Saturday, January 28, 2006

Generate your way through the Verification quagmire

I was thinking some more about an idea I had when commenting on someone's verification problem here:
http://verificationguild.com/modules.php?name=Forums&file=viewtopic&p=3312&highlight=#3312
It is very hard to verify today's chip designs, but they are already quite regular.
My idea is to design IP whose size is more fully parameterised and generatable, so instead of designing a 16-bit UART, you would design an n-bit UART with n able to take a range of values.
If the design of the UART has no 'corner cases' for increasing values of n, i.e. the internal architectural algorithm doesn't change as n increases, then new ways to test the IP become available (a rough sketch follows the list below):
  • Formally test that the IP generator performs its function over all n.
    It may be easier to formally prove the functionality of the IP generator over all n than to formally prove, for example, the functionality of a fixed 16-bit instance of the IP.
    Assertions would have to be written specifically for the generator so that they too are parameterised by n.
    Questions: Are assertion languages up to it? Are formal checkers able to handle the complexity in the assertions?
  • More thoroughly verify the IP for a sequence of much smaller values of n, and infer the quality of the IP generator for large values of n, where verification (especially formal and simulation-based verification) is much harder, if not impossible.
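
To make the idea concrete, here is a toy sketch in Python (my own hypothetical names, not a real UART and not any particular tool's flow) of an IP generator parameterised by n: it emits Verilog for an n-bit transmit shift register together with an assertion that is itself written in terms of the same n.

def generate_tx_shifter(n):
    """Return Verilog for an n-bit parallel-load, serial-out shift register,
    the sort of block a UART transmitter is built around."""
    assert n >= 2, "sketch assumes at least 2 bits"
    return f"""\
module tx_shifter_{n} (
    input  wire           clk,
    input  wire           load,
    input  wire [{n - 1}:0] data_in,
    output wire           serial_out
);
    reg [{n - 1}:0] shift_reg;
    always @(posedge clk)
        if (load) shift_reg <= data_in;
        else      shift_reg <= {{1'b0, shift_reg[{n - 1}:1]}};
    assign serial_out = shift_reg[0];

    // Assertion generated in terms of the same n: one cycle after a load,
    // the LSB of the loaded word must appear on serial_out (SVA-style sketch).
    // assert property (@(posedge clk) load |=> serial_out == $past(data_in[0]));
endmodule
"""

if __name__ == "__main__":
    # Verify small instances thoroughly; rely on the generator for large n.
    for n in (2, 4, 16):
        print(generate_tx_shifter(n))

The point is only that the generator and its checks share the one parameter; whether today's assertion languages and formal tools cope with that over all n is exactly the open question above.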
Another example could be in designing and verifying some next-generation 32-by-32 channel switch that will take hundreds of millions of gates to implement. Design an m-by-m switch instead. Maybe you will get the 2-by-2, 3-by-3, 4-by-4, 6-by-6, and 8-by-8 versions onto an FPGA and be able to do throughput tests that then allow you to extrapolate the throughput of the 32-by-32 target version.
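
The extrapolation step itself could be as mundane as curve fitting. A hedged sketch, with made-up throughput figures and an assumed roughly-linear scaling model, just to show the shape of it:

import numpy as np

# Switch sizes that fit on the FPGA, and hypothetical measured aggregate
# throughput (Gb/s) for each build - these numbers are invented.
m    = np.array([2, 3, 4, 6, 8])
gbps = np.array([3.9, 5.7, 7.6, 11.1, 14.4])

# Assume throughput scales roughly linearly with port count and extrapolate.
fit = np.poly1d(np.polyfit(m, gbps, deg=1))
print(f"extrapolated 32-by-32 throughput ~ {fit(32):.0f} Gb/s")

Obviously the honest part of the work is justifying the scaling model; the fit itself is trivial.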

In time, system design and verification could be transformed as the requisite IP arrives fully parameterised, and with information on how performance scales with parameter sizes. System simulations for, say, a 32-bit CPU connected to a 64-bit bus and with a 16-bit and a 32-bit peripheral could be done by the system simulation engine working out and configuring a simulation of a much smaller system, maybe a 4-bit CPU with an 8-bit bus and 2-bit and 4-bit peripherals. The system simulator could insert monitors, run the reduced simulation, then extrapolate the performance of the larger system. The smaller simulation should still be able to show faults that would be present in the larger system, since the system as a whole is just scaled.
One performance result that might yield to such extrapolation could be system power calculations based on toggle counts of individual IP blocks.
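
As a sketch of how that might look, assuming (and it is only an assumption) that per-block toggle counts scale with block width, and using the textbook C·V² dynamic power estimate, with every figure invented:

# Extrapolate dynamic power from a scaled-down system simulation.
# Assumes per-block toggle counts scale linearly with block width and uses
# 0.5*C*V^2 energy per toggle. All numbers here are illustrative only.

VDD          = 1.2      # supply voltage (V), assumed
F_CLK        = 200e6    # clock frequency (Hz), assumed
C_PER_TOGGLE = 50e-15   # effective switched capacitance per toggle (F), assumed
SIM_CYCLES   = 10_000   # cycles covered by the small simulation, assumed

# (block, toggles counted in the small sim, simulated width, full-size width)
blocks = [
    ("cpu",        120_000, 4, 32),
    ("bus",         45_000, 8, 64),
    ("peripheral",  30_000, 2, 16),
]

sim_time = SIM_CYCLES / F_CLK
total = 0.0
for name, toggles, sim_w, full_w in blocks:
    scaled_toggles = toggles * (full_w / sim_w)       # crude width scaling
    power = (scaled_toggles / sim_time) * 0.5 * C_PER_TOGGLE * VDD**2
    total += power
    print(f"{name:10s} ~ {power * 1e3:5.1f} mW")
print(f"{'total':10s} ~ {total * 1e3:5.1f} mW (extrapolated)")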

Is anyone doing this at the moment? I know of configurable CPU generators but they seem to be tailoring the instruction set to the software that is to be run.

- Paddy