Description
What I want to achieve
I am considering using your great project to drive integration/system tests for a project I am responsible for in the NA61/SHINE experiment at CERN.
I have a directory structure like this:
IntegrationTests
├── Examples
│   ├── App1
│   └── App2
└── Standard
    └── AppA
Each App directory would contain a single test_spec.sh file defining our expectations for the given app. There would be on the order of 100 apps. These apps may actually run quite long (from seconds to several minutes each), so it is important to run them in parallel. Each test_spec.sh file would essentially define a separate, independent test suite with multiple example blocks.
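To make this concrete, a minimal test_spec.sh for one app might look like the sketch below. The script name app1.sh and the expected strings are placeholders I made up, not files from the real repository; the Describe/It/When/The keywords are shellspec's DSL as I understand it from the README.

```shell
# Examples/App1/test_spec.sh -- hypothetical sketch; app1.sh and the
# expected strings are placeholders, not from the real project.
Describe 'App1'
  It 'runs successfully on the sample input'
    When run script ./app1.sh
    The status should be success
    The output should include "done"
  End

  It 'rejects a missing input file'
    When run script ./app1.sh no-such-file.xml
    The status should be failure
    The stderr should include "no-such-file"
  End
End
```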
The tests would be run with GitLab CI, so both a terse output showing progress online and the JUnit generator would be needed.
The goal is to run shellspec in the top directory executing all tests in subdirectories. It would also be nice to be able to execute it from the subdirectory.
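For reference, this is the invocation I have in mind, based on my reading of the option list in shellspec --help; the job count is just an example, and whether the JUnit generator is selected exactly this way is an assumption on my part:

```shell
# Run all spec files found under the current tree, 8 in parallel,
# with the documentation formatter on the terminal and a JUnit XML
# report generated on the side.
shellspec --jobs 8 --format documentation --output junit
```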
The demonstration of my use case
I prepared an example in the https://gitlab.com/amarcine/test-shellspec repository. The apps are scripts but in real life these would be compiled programs or sets of xml files constituting an input to the main executable of the project (available in the $PATH when shellspec is executed). If you grep for "NOTE", you get my comments regarding what I would like to have and what I dislike in the current behaviour.
The problems
working directory of the test
Unfortunately my demonstration does not currently work, because each spec is run from the directory in which shellspec was executed, while the executables under test expect to be run from the directories they reside in (App1, etc.). I understand that I could make this work by putting a cd with the relative path to the App directory in each test_spec.sh file, but this is very ugly. Furthermore, it would then be impossible to execute shellspec from within the App directory, and the test would become location dependent (it could not be moved).
I understand that currently there is no such feature, but would it be possible to add an option to shellspec, e.g. -C as in tar or make, that cds to the directory of each spec file before executing it?
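For completeness, the per-file workaround I called ugly above would look roughly like this at the top of every test_spec.sh. Whether $0 actually points at the spec file when shellspec sources it is an assumption on my part; the snippet below only demonstrates the pattern.

```shell
#!/bin/sh
# Hypothetical workaround: resolve the directory containing this file to
# an absolute path, then cd there, so the spec stays location independent
# no matter where the runner was started from.
spec_dir=$(CDPATH= cd -- "$(dirname -- "$0")" && pwd)
cd "$spec_dir" || exit 1
```

A built-in -C option would make this boilerplate unnecessary in all ~100 spec files.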
formatter
Right now, using any formatter that prints messages about which tests are run (documentation, tap), I have two problems:
- I can only see a test after it has finished, so I don't actually know which test is currently running, which is rather problematic with long tests.
- All examples are listed, so the list is longer than I would expect.
Our project uses ctest for its unit test suites. I show a (truncated) example of its output below. What I like about this formatting is that I get one message when a test starts and another when it ends. I also get the progress expressed as the sequential number of the current test / the number of all tests. I am only told which test fails, without the failure report, which is fine because that appears in the XML report. Last but not least, I get the timing for each test.
I would like to get something similar from shellspec. The "tests" printed should correspond to the unit of parallelization, i.e. a spec file, so instead of printing the titles of all examples, I should get only the top-level example group title. Rather than a full report, this would be a terse summary of all the test suites run.
Looking at the documentation formatter's output, I have the impression that this should be possible, but I couldn't figure out how to implement a custom formatter myself. Could you give some instructions on how to do that (unless you consider it a nice feature to have and implement it yourself)?
The ctest -j 8 example output
Test project /home/antek/Install/Shine/dev/build
Start 34: testMagneticField
Start 68: testTPCDEDXCalculatorSG
Start 37: testTPC
Start 35: testTimeBinMode
Start 22: testTime
Start 57: testTPCCalibManagers
Start 33: testDetector
Start 36: testTOF
1/69 Test #36: testTOF .................................. Passed 1.72 sec
Start 38: testTarget
2/69 Test #33: testDetector ............................. Passed 1.75 sec
Start 42: testGasBags
(....)
6/69 Test #37: testTPC .................................. Passed 2.39 sec
Start 67: testBPDReconstructorSG
7/69 Test #34: testMagneticField ........................ Passed 2.59 sec
Start 69: testMagneticTrackFitter
8/69 Test #68: testTPCDEDXCalculatorSG ..................***Failed 2.83 sec
Start 15: testMath
9/69 Test #15: testMath ................................. Passed 0.37 sec
Start 56: testBPDManagers
(....)
66/69 Test #29: testSparseVector ......................... Passed 0.13 sec
67/69 Test #23: testTrackParameters ...................... Passed 0.13 sec
68/69 Test #14: testPolynomialInterpolation .............. Passed 0.10 sec
69/69 Test #67: testBPDReconstructorSG ................... Passed 3.36 sec
99% tests passed, 1 tests failed out of 69
Total Test time (real) = 5.76 sec
The following tests FAILED:
68 - testTPCDEDXCalculatorSG (Failed)
Errors while running CTest