Jim Lewis
Forum Replies Created
June 18, 2025 at 19:51 #2737
Jim Lewis
Member
Hi Charlie,
What I suspect you are doing is that in the checker model you have created your requirements as follows:

Req1ID <= NewReqID("Req1", 1, ...) ;
Req2ID <= NewReqID("Req2", 1, ...) ;
When a build finishes, the requirements from the tests that use this checker model are merged into the final report. That report also lists the items that were merged together.
With the merged results, how should the requirements goal be tabulated? The default behavior (for historical reasons) is to take the highest requirement goal across the test cases and use it as the total requirement goal. This requires merging the requirement specification into the merged requirements report. The other behavior is to sum the coverage goals from each test case to form the total goal for that requirement. For example, if Req1 has a goal of 5 in one test case and a goal of 3 in another, the default yields a total goal of 5, while summing yields 8.
These behaviors are controlled by a Tcl setting named USE_SUM_OF_GOALS. Its default is in Scripts/OsvvmSettingsDefault.tcl. You override it by creating an OsvvmSettingsLocal.tcl. See Documentation/OsvvmSettings_user_guide.pdf for details on which directory OsvvmSettingsLocal.tcl goes in.
You can also change the setting for a particular build by doing the following in your *.pro script:
set ::osvvm::USE_SUM_OF_GOALS "true"
If you want the requirements to be handled separately, you need to give them different names. If you are running with OSVVM scripts, you should already be setting the test name in the Control process of the test sequencer:
architecture T1 of TestCtrl is
  . . .
begin
  ControlProc : process
  begin
    SetTestName("T1") ;  -- make sure this is done before any wait statements
    . . .
Then in your checker model you can do:
wait for 0 ns ;  -- make sure TestName is set before reading it
Req1ID <= NewReqID(GetTestName & ".Req1", 1, ...) ;
Req2ID <= NewReqID(GetTestName & ".Req2", 1, ...) ;
This gives you unique requirements for each test case.
Note that by default, if a requirement in a given test case does not reach its goal, the test fails with an error.
Best Regards,
Jim

June 3, 2025 at 00:26 #2722
Jim Lewis
Member
Hi Isaac,
I think Patrick’s solution will be the de facto solution for integration with VUnit, particularly since he is working on generating reports too – really cool.

Before I talked to Patrick today, I worked on the script I mentioned above. For each OSVVM library, it creates a list of the files associated with that library. Note the file list may be different for different simulators, since there are files with workarounds for particular simulator issues.
Run this by starting up your simulator with the OSVVM scripts, per the OSVVM Script_users_guide.pdf in the OsvvmLibraries/Documentation directory. For Questa, ModelSim or Riviera-PRO, all you do is:
source <path-to-OsvvmLibraries>/OsvvmLibraries/Scripts/StartUp.tcl
Then do:
source $OsvvmLibraries/Scripts/VendorScripts_CompileList.tcl
And then run the OSVVM Scripts (takes only seconds):
build $OsvvmLibraries
You will find the scripts in files named _ .files. You will find VendorScripts_CompileList.tcl on the dev branch of OsvvmLibraries.
Best Regards,
Jim

June 2, 2025 at 15:08 #2717
Jim Lewis
Member
Hi Isaac,
From time to time, the OSVVM compile scripts are updated, and it is hard to maintain more than one approach. The reason we developed the OSVVM scripts is to provide a better reporting mechanism. In addition to the ordinary JUnit reports, OSVVM produces a more comprehensive build summary. We also produce test case summaries, functional coverage reports, requirements tracking, detailed alert reports, scoreboard reports, and more – and all of this is automatic.

The easiest way to work with another flow is to build OsvvmLibraries with the OSVVM *.pro scripts and then link the libraries into the other methodology. Unless you are editing things in OsvvmLibraries, you can compile it once and use it as a resource library.
The good thing about Tcl is that it is already there – either in the tool GUI or, if you are running Linux, in the standard OS installation.
I don’t use VUnit – is there a compile script format for VUnit? The low-level layer of OSVVM scripting, VendorScripts_***.tcl, is adaptable, and it should be easy to create a variant that generates VUnit compile scripts rather than compiling the design.
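As a rough sketch of that idea (the proc name here is hypothetical, not the actual OSVVM internals – see the real VendorScripts_*.tcl files for the layer being adapted):

```tcl
# Hypothetical sketch: instead of invoking the compiler, record each
# analyzed file in a per-library ".files" list that another flow
# (e.g. VUnit) could consume. The hook name "vendor_analyze" is an
# assumption; the actual OSVVM vendor-layer proc names may differ.
proc vendor_analyze {LibraryName FileName} {
  set fp [open "${LibraryName}.files" a]   ;# append mode
  puts $fp $FileName
  close $fp
}
```

This is essentially the approach VendorScripts_CompileList.tcl mentioned elsewhere in this thread takes: run the normal build flow, but capture file names rather than compiling.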
Best Regards,
Jim

May 26, 2025 at 23:59 #2715
Jim Lewis
Member
Hopefully I answered this in: https://osvvm.org/forums/topic/issues-with-ghdl-when-compiling-osvvm-packages
May 26, 2025 at 23:25 #2713
Jim Lewis
Member
Are you running a current version of GHDL? Our GitHub Actions run with GHDL every day. See: https://github.com/OSVVM/OsvvmLibraries/actions
Did you use the directions in OsvvmLibraries/Documentation/Scripts_user_guide.pdf?
Here they are:
GHDL in Windows
~~~~~~~~~~~~~~~

Initialize the OSVVM Script environment by doing:

winpty tclsh
source <path-to-OsvvmLibraries>/OsvvmLibraries/Scripts/StartUp.tcl

To simplify this, put

source <path-to-OsvvmLibraries>/OsvvmLibraries/Scripts/StartUp.tcl

in the .tclshrc file and add a Windows shortcut that runs:

C:\tools\msys64\mingw64.exe winpty tclsh
GHDL in Linux
~~~~~~~~~~~~~

Initialize the OSVVM Script environment by doing:

rlwrap tclsh
source <path-to-OsvvmLibraries>/OsvvmLibraries/Scripts/StartUp.tcl

To simplify this, put

source <path-to-OsvvmLibraries>/OsvvmLibraries/Scripts/StartUp.tcl

in the .tclshrc file, and in bash add

alias gsim='rlwrap tclsh'

to your .bashrc.
Once you have done these, you build OsvvmLibraries by doing:
build $OsvvmLibraries/OsvvmLibraries.pro
Compiling the whole OsvvmLibraries in GHDL takes less than 60 seconds.
May 22, 2025 at 14:04 #2710
Jim Lewis
Member
You need XSIM if you are using Xilinx’s encrypted IP. However, XSIM is very slow.
GHDL and nvc both support OSVVM well; however, they need a third-party waveform viewer such as Surfer.
OSVVM is updated several times each year. Even if a simulator includes an OSVVM release, it may be out of date.
OSVVM is fast to compile the project. Just start the scripts and then run:

build $OsvvmLibraries

For all simulators except XSIM this takes less than a minute. For more on running OSVVM scripts, see OsvvmLibraries/Documentation.
May 17, 2025 at 23:16 #2706
Jim Lewis
Member
You need OSVVM 2025.02 (or newer) and XSIM 2024.2.
May 6, 2025 at 07:45 #2699
Jim Lewis
Member
I have released these updates as part of 2025.4.
April 30, 2025 at 19:06 #2695
Jim Lewis
Member
Hi David,
Your pictures did not post for some reason, so I will give a more general perspective.

When a CoverageID is created, a corresponding AlertLogID is also created. If the ID is created with NewID, it is used for tracking errors – such as when an error bin is encountered. This ID also determines what level an error signals at: FAILURE, ERROR, or ERROR + no printing.
If the ID is created with NewReqID (a beta feature of 2025.02 – documentation just posted to the dev branch, meaning I updated it after your post and before mine), it is used for tracking errors and treats coverage bins as requirements. The requirements goal is the sum of the number of coverage bins. The requirements count is the number of coverage bins that meet their coverage goals.
The AlertLog report does end up with lots of extra stuff in it. In the call to NewID / NewReqID, ReportMode can be changed to either DISABLED (do not show in summary table) or NONZERO (only show in summary table if non-zero). AlertLogIDs report up, so even if there is an ERROR, it will be reported in the next higher level – which worst case is the DEFAULT bin.
For FIFOs, I always use ReportMode DISABLED in the call to NewID. FIFOs have one FAILURE: pop with an empty FIFO. In this case, the test ends with a FAILURE from the FIFO and then the report. Printing that twice is not necessary.
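A hedged sketch of that pattern (ReportMode is passed by name and the other parameters are elided here; check the ScoreboardGenericPkg/AlertLogPkg documentation for the full NewID parameter list):

```vhdl
-- Sketch: create a scoreboard/FIFO ID whose AlertLogID is excluded
-- from the summary table. Only the parameters discussed above are
-- shown; the rest of the signature is assumed.
SB <= NewID("DataFifo", ReportMode => DISABLED) ;
```

With ReportMode DISABLED, a pop from an empty FIFO still ends the test with a FAILURE; it simply is not repeated in the summary table.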
Cheers,
Jim

April 30, 2025 at 03:08 #2692
Jim Lewis
Member
Hi Jake,
The fixes for both of these issues are now on the OSVVM dev branch. If you are allowed to fetch OSVVM from GitHub, you can get the dev branch by doing:

git clone --recursive --branch dev https://github.com/osvvm/OsvvmLibraries
Otherwise let me know and I can push it to main and do a release to osvvm.org.
Jim
April 29, 2025 at 21:52 #2690
Jim Lewis
Member
Hi Jake,
> I am using OSVVM version 2024.07 and the Siemens Questa version 2025.1_1 simulator.
>
> I am looking to moving to the most recent version of OSVVM as soon as possible, but sadly the aforementioned
> version of Questa doesn’t support all the VHDL-2019 features that OSVVM leverages.

This mainly controls whether to use RandomPkg2019 or not. RandomPkg2019 was tested in Questa 2024.2 and 2024.3 and it passed all of our regression tests. I am not sure what happened to make them think it is not working. Questa has had a history of issues with impure functions – including the protected type version of RandomPkg.
If you comment out the lines in OsvvmLibraries/Scripts/VendorScripts_Siemens.tcl that set VHDL-2019, everything will be fine:
# if {[expr [string compare $ToolVersion "2024.2"] >= 0]} {
#   SetVHDLVersion 2019
# }

This will be commented out in the next release, or updated for 2025.2 (if VHDL-2019 is supported there).
With respect to the generic naming, the files had to be renamed to incorporate the generic values to ensure a unique name for each results file (in the event the test case is run with multiple different generic values). That said, this was not an anticipated use model, so obviously a fix is needed. Brainstorming just a little: all '/', '\', and ':' characters in a name could be replaced by '_'. Does this sound like it will work?
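A minimal sketch of that substitution in Tcl (the proc name is hypothetical; this just shows the character replacement being proposed):

```tcl
# Replace the file-name-unfriendly characters '/', '\' and ':'
# with '_' before embedding a generic value in a results-file name.
proc SanitizeName {name} {
  return [string map [list / _ \\ _ : _] $name]
}

puts [SanitizeName {path/to:tb\top}]   ;# prints path_to_tb_top
```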
April 18, 2025 at 16:29 #2682
Jim Lewis
Member
All of the OSVVM VCs have test cases for each feature. For AxiStream, they are in the directory AXI4/AxiStream/TestCases.
The ones that work the way you are thinking are TbStream_AxiTiming1.vhd and TbStream_AxiTiming2.vhd. You can also use random delays; see TbStream_SendGetRandom1.vhd and TbStream_SendGetRandom2.vhd.
April 17, 2025 at 16:23 #2680
Jim Lewis
Member
Below is what I see running XSIM with $OsvvmLibraries/RunAllTests.pro on the current dev branch. There are 8 failures in RunAllTests.pro. They all have to do with passing integer and time values through the transaction record. With time it never works; with integer, it seems to work some of the time. For integer it may be possible to trace why; for time it will require an XSIM update.
April 17, 2025 at 04:39 #2678
Jim Lewis
Member
From the Scripts_user_guide.pdf:
To run OSVVM scripts in XSIM, start Vivado and then run the StartXSIM using:
source <path-to-OsvvmLibraries>/OsvvmLibraries/Scripts/StartXSIM.tcl
This will then set the simulator to XSIM and instead of analyzing MemoryGenericPkg.vhd, it will do:
analyze deprecated/MemoryGenericPkg_xilinx.vhd
All of the tests in $OsvvmLibraries/RunAllTests.pro should run. I was testing on Windows 10 with Xilinx 2024.2.
I noted that it is terribly slow – takes 8X longer to run OSVVM regressions than the next slowest simulator.
The OsvvmBuild.log is created when you start a build. It exists until the build completes (successfully or not) and is then moved into the log/ directory as a .log file.

Are you running your own test case? Be aware that OSVVM’s memory model creates an array that is 2**(Address_Length-10) in length (for example, Address_Length = 34 gives an array of 2**24 entries). Generally, the package generates warnings when the AddressBits are 34. I am looking into new data structures for the memory model to allow the memory to be larger.
April 15, 2025 at 15:54 #2676
Jim Lewis
Member
The write bin values above were not generated by the code you posted. See the annotations in the code.