On 8/1/24 13:15, Nícolas F. R. A. Prado wrote:
On Wed, Jul 31, 2024 at 05:19:45PM -0600, Shuah Khan wrote:
On 7/24/24 15:40, Nícolas F. R. A. Prado wrote:
Introduce a new test to identify regressions causing devices to go missing on the system.
For each bus and class on the system, the test checks the number of devices present against a reference file, which needs to have been generated beforehand by the same program on a known-good kernel. Any missing devices are reported.
Can you elaborate on how to generate the reference file? It isn't clear.
Indeed, I'll make that information clearer in future versions.
The reference file is generated by passing the --generate-reference flag to the test:
./exist.py --generate-reference
It will be printed to standard output.
How about adding an option that takes a filename for the generated file? That would make it easier to use.
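For illustration, such an option could be a small argparse addition (the -o/--output name here is an assumption, not part of the current exist.py interface):

    import argparse
    import sys

    parser = argparse.ArgumentParser()
    parser.add_argument('--generate-reference', action='store_true',
                        help='generate a reference for this system instead of running the test')
    parser.add_argument('-o', '--output',
                        help='write the generated reference to this file instead of stdout')
    args = parser.parse_args()

    if args.generate_reference:
        out = open(args.output, 'w') if args.output else sys.stdout
        # ... dump the per-bus/per-class reference to 'out' ...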
Signed-off-by: Nícolas F. R. A. Prado <nfraprado@collabora.com>
Hi,
Key points about this test:
- Goal: Identify regressions causing devices to go missing on the system
- Focus:
- Ease of maintenance: the reference file is generated programmatically
- Minimal false positives: the script makes as few assumptions as possible about the stability of device identifiers, so that renames/refactors don't trigger false positives
- How it works: For each bus and class on the system, the test checks the number of devices present against a reference file, which needs to have been generated beforehand by the same program on a known-good kernel. Any missing devices are reported (a minimal sketch of the counting follows this list).
- Comparison to other tests: It might be possible(*) to replace the discoverable devices test [1] with this. The benefits of this test are that it's easier to set up and maintain, and it has wider coverage of devices.
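For context, here is a minimal illustration of the kind of per-bus and per-class counting described above. This is a sketch assuming the standard sysfs layout under /sys/bus/*/devices and /sys/class/*, not the actual exist.py code:

    import os

    def count_devices(parent, subdir=None):
        # Count the entries under each bus's 'devices' directory (or directly
        # under each class directory) in sysfs.
        counts = {}
        for name in sorted(os.listdir(parent)):
            path = os.path.join(parent, name, subdir) if subdir else os.path.join(parent, name)
            if os.path.isdir(path):
                counts[name] = len(os.listdir(path))
        return counts

    bus_counts = count_devices('/sys/bus', 'devices')  # e.g. {'pci': ..., 'usb': ...}
    class_counts = count_devices('/sys/class')         # e.g. {'net': ..., 'tty': ...}

Counts like these would then be compared, key by key, against the counts recorded in the reference file.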
Additional detail:
- Having more devices on the running system than the reference does not cause a failure, but a warning is printed in that case to suggest that the reference be updated.
- Missing devices are detected per bus/class based on the number of devices. When the test fails, the known metadata for each of the expected and detected devices is printed, and a simple similarity comparison is done to suggest which devices are most likely to be missing (see the sketch after this list).
- The proposed place to store the generated reference files is the 'platform-test-parameters' repository in KernelCI [2].
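As a rough illustration of the similarity step mentioned above (an assumed approach, not necessarily what exist.py implements), the comparison could be as simple as difflib ratios over the devices' metadata strings:

    import difflib

    def likely_missing(expected, detected, threshold=0.8):
        # Expected devices whose metadata no detected device closely resembles
        # are the most likely candidates for having gone missing.
        candidates = []
        for entry in expected:
            best = max((difflib.SequenceMatcher(None, entry, d).ratio()
                        for d in detected), default=0.0)
            if best < threshold:
                candidates.append(entry)
        return candidates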
How would a user run this on their systems - do they need to access this repository in KernelCI?
No, that repository would just be a place where people could find pre-generated reference files (which we'll be using when running this test in KernelCI), but anyone can always generate their own reference files and store them wherever they want.
Thanks for the clarification. This would be a good addition to the documentation. I think this test could benefit from a README or howto.
This is what I see when I run the test on my system:
make -C tools/testing/selftests/devices/exist/ run_tests
make: Entering directory '/linux/linux_6.11/tools/testing/selftests/devices/exist'
TAP version 13
1..1
# timeout set to 45
# selftests: devices/exist: exist.py
# TAP version 13
# # No matching reference file found (tried './LENOVO,20XH005JUS.yaml')
First generate the reference file for your system like so:
tools/testing/selftests/devices/exist/exist.py --generate-reference > tools/testing/selftests/devices/exist/LENOVO,20XH005JUS.yaml
Worked - I see
TAP version 13
# Using reference file: ./LENOVO,20XH005JUS.yaml
1..76
---
# Totals: pass:76 fail:0 xfail:0 xpass:0 skip:0 error:0
Things to improve:
- Have the script take a file path instead of assuming that the reference file is in the current directory, e.g.: exist.py -f reference_file (a rough sketch follows).
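A rough sketch of that suggestion (the -f/--reference name and the DMI-based default are assumptions, not existing exist.py behavior):

    import argparse
    import os

    def default_reference_path():
        # Hypothetical default: build './<vendor>,<product>.yaml' from DMI info,
        # roughly matching the './LENOVO,20XH005JUS.yaml' name seen above.
        def dmi(field):
            with open(os.path.join('/sys/class/dmi/id', field)) as f:
                return f.read().strip()
        return './{},{}.yaml'.format(dmi('sys_vendor'), dmi('product_name'))

    parser = argparse.ArgumentParser()
    parser.add_argument('-f', '--reference',
                        help='path to the reference file (default: machine-named file in the current directory)')
    args = parser.parse_args()

    reference_path = args.reference or default_reference_path()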
thanks,
-- Shuah