If you thought that timescale didn’t matter in SystemVerilog – think again!
And truth be told – I thought that timescale had been obviated by SystemVerilog’s new features – and I’m having to think again!
I had assumed we had finally rid ourselves of the `timescale albatross with SystemVerilog.
If you don’t remember the (old) issue here’s a refresher:
`timescale is a compiler directive used to configure the units and precision of a # delay in Verilog. The syntax is something like this:
`timescale 1ns / 10ps
which means that the time unit of a delay is 1ns and the precision is 10ps (or two decimal places of precision in ns, as x.xx ns).
Since a delay in Verilog doesn’t have units it relies on the timescale to determine how long the delay will last in simulation.
No problem, right?
Here’s the issue: a `timescale directive has file scope and is “sticky” – once a `timescale is encountered during compilation, everything in that file and in every subsequently compiled file uses that time unit and precision until the next `timescale is encountered.
And that was the issue – if you didn’t include a `timescale in your file then you relied on compilation order to get your timescale. If the compilation order changed, or an upstream file changed its units, then all of a sudden your delays were wrong.
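A hypothetical two-file sketch of the hazard (file and module names are mine, purely illustrative):

```systemverilog
// file_a.sv -- compiled first
`timescale 1ns / 1ps
module a;
  initial #5 $display("a: delay of 5ns");
endmodule

// file_b.sv -- compiled second, with no `timescale of its own.
// It silently inherits 1ns/1ps from file_a.sv. If the compile
// order flips, or file_a.sv changes to `timescale 1us / 1ns,
// this #5 suddenly means something entirely different.
module b;
  initial #5 $display("b: delay of 5 of *whatever* unit is in effect");
endmodule
```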
SystemVerilog solved this problem in two ways:
- delays now have units! So I no longer rely on the timescale time unit to determine how long my delay is. Instead of #1; I now say #1ns.
- introduction of timeunit and timeprecision keywords – these allow the specification of a timescale within the declaration body of a module, interface, or package; no longer do we rely on a `timescale compiler directive.
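A minimal sketch of both fixes in one module (delay values are arbitrary):

```systemverilog
module dut;
  // Declared locally in the module body -- immune to file
  // compilation order, unlike a `timescale directive.
  timeunit 1ns;
  timeprecision 10ps;

  initial begin
    #1;      // 1ns, per the local timeunit declaration
    #2.5ns;  // explicit unit -- no reliance on any timescale at all
  end
endmodule
```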
Unfortunately the problem wasn’t solved…
While delays can now be specified with units – effectively obviating the need for a timescale – there’s a (minor) catch.
If a delay is specified that is more precise than the timescale precision in effect, it is rounded to match that precision.
`timescale 1ns / 100ps
#1.75ns  // rounded to 1.8ns
But that’s a minor hiccup – compared to this:
The LRM doesn’t explicitly define what happens when the timeunits aren’t aligned. It seems that when a time (or realtime) typed variable is passed from one timescale domain to another the units are CHANGED to the destination timescale.
This is crazy. If I define a realtime variable in a package with one timescale and then pass that variable to a function that is *defined* under another timescale (in another package or file), the time *units* are changed to match the destination timescale!
What does that mean?
It means that if I have 1.75ns in a 1ns/1ps timescale and pass it to a class declared in a 1ms/1us timescale that the time will be changed to 1.75ms.
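A minimal sketch of that scenario, as the post describes it (package and function names are mine):

```systemverilog
// pkg_fast.sv
`timescale 1ns / 1ps
package pkg_fast;
  realtime t = 1.75ns;  // held as 1.75 in 1ns units
endpackage

// pkg_slow.sv
`timescale 1ms / 1us
package pkg_slow;
  function void show(realtime x);
    // The raw value 1.75 arrives unchanged, but it is now
    // interpreted in this package's 1ms units -- i.e. 1.75ms.
    $display("%t", x);
  endfunction
endpackage

module top;
  initial pkg_slow::show(pkg_fast::t);
endmodule
```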
An example follows – after the break.