the day, simulators only had point delays, so all gate
libraries had buffers on their I/O. Now that we tend to use path, or
pin-to-pin, delays in gate-level models, things have become more
difficult for me.
My task was to randomly inject
some stuck-at faults into a large gate-level Verilog design for
simulation. Individual gates of the design are connected by nets, and
although I can force a net, I cannot force a node: every node on a net
is affected by a force on that net, and you cannot get at an individual
node by forcing a net inside a particular gate model because there are
no input buffers. The other problem with forcing nets/nodes through the
simulator's interactive commands is that the design has to be compiled
sub-optimally: all the nets have to remain accessible, which stops the
compiler/elaborator from performing certain speed optimizations on what
will be an execution-speed-critical set of simulations.
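To make the net/node distinction concrete, here is a minimal sketch (the module, gate, and net names are made up) of the kind of fan-out involved; a force applied to the net is seen by every gate pin connected to it:

// One net, n1, fans out to two gates. With pin-to-pin delay models there is
// no input buffer per pin, so the only thing that can be forced is the net
// itself, which stucks BOTH receiving pins at the same value.
module fanout_example (input a, b, c, output x, y);
  wire n1;
  and  g1 (n1, a, b);
  nand g2 (x, n1, c);   // a force on n1 is seen here...
  nor  g3 (y, n1, c);   // ...and here; a single pin cannot be faulted
endmodule

// Interactive simulator command (hypothetical hierarchical path):
//   force tb.dut.n1 = 1'b0;   // stuck-at-0 on the whole net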
Brainstorming
with a colleague, we came up with the concept of modifying the netlist
to insert logic that either passes a net through or forces its output
to 1 or 0 under external control.
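A minimal sketch of what such an inserted cell could look like (the module and port names here are my own invention, not the cell we actually used):

// Hypothetical fault-injection cell, spliced into a cut net: transparent by
// default, or forcing its output to 0 or 1 when enabled from outside.
module fault_inject (
  input  wire in,         // driver side of the original net
  input  wire force_en,   // 1 = inject the stuck-at fault
  input  wire force_val,  // stuck-at value to drive (0 or 1)
  output wire out         // load side of the original net
);
  assign out = force_en ? force_val : in;
endmodule

With the control inputs tied off to pass-through, the cell reduces to a buffer, so the instrumented netlist simulates normally until a fault is enabled.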
So off I went to ScriptEDA (http://embedded.eecs.berkeley.edu/Alumni/pinhong/scriptEDA/)
(great page - awful backdrop), and installed his Python Interface for
Gate-level Netlist Engine (http://www-cad.eecs.berkeley.edu/%7Epinhong/scriptEDA/pythongnd-0.35.tar.gz),
which I then tried on my gate-level netlist. Unfortunately our netlist
failed to parse because it used concatenations of nets, which the parser
couldn't handle. I would need to extend the parser to support
net concatenations.
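For illustration, this is the kind of construct that tripped it up (the cell and net names are made up so the fragment is self-contained):

// Made-up library cell so the example elaborates on its own.
module and4 (input [3:0] i, output o);
  assign o = &i;
endmodule

// Netlist fragment using net concatenations.
module concat_example (input a, b, c, d, output y);
  wire [1:0] pair = {a, b};            // concatenation driving a vector net
  and4 u1 (.i({pair, c, d}), .o(y));   // concatenation in a port connection
endmodule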
Looking at the
source of the parser, it was a C program using lex
(http://en.wikipedia.org/wiki/Flex_lexical_analyser) and yacc to create
C data structures, which were then interfaced to Python via a
SWIG-generated wrapper (http://www.swig.org/). I don't like maintaining
C programs, and I needed to flex my parsing abilities, so I decided to
spend a short time researching a pure Python solution.
PLY
The
obvious choice of Python parsing tool seemed to be PLY, as it
stands for Python Lex Yacc, so a download later, and after poking around in
the install, I came across yply.py
(http://code.google.com/p/ply/source/browse/trunk/example/yply/yply.py?r=43):
style="margin-left: 40px;">
yply.py
This example implements a program yply.py that converts a UNIX-yacc
specification file into a PLY-compatible program. To use, simply
run it like this:
% python yply.py [-nocode] inputfile.y >myparser.py
The output of this program is Python code. In the output,
any C code in the original file is included, but is commented out.
If you use the -nocode option, then all of the C code in the
original file is just discarded.
To use the resulting grammar with PLY, you'll need to edit the
myparser.py file. Within this file, some stub code is included that
can be used to test the construction of the parsing tables. However,
you'll need to do more editing to make a workable parser.
Disclaimer: This is just an example I threw together in an afternoon.
It might have some bugs. However, it worked when I tried it on
a yacc-specified C++ parser containing 442 rules and 855 parsing
states.