Hello,
The LITE team appreciates the bootstrapping of Zephyr-related LAVA testing done by the LAVA, LAVA Lab, B&B and QA teams. Getting more involved with LAVA testing had long been on our backlog, and hopefully the time has come ;-).
I've reviewed the current status of on-device testing for Zephyr CI jobs and see the following picture (feel free to correct me if something is wrong or missing): the "zephyr-upstream" and "zephyr-upstream-arm" CI jobs (https://ci.linaro.org/view/lite-iot-ci/) submit a number of tests to LAVA (via https://qa-reports.linaro.org/) for the following boards: arduino_101, frdm_k64f, frdm_kw41z, qemu_cortex_m3. Here's an example of a cumulative test report for these platforms: https://qa-reports.linaro.org/lite/zephyr-upstream/tests/
That's really great! (Though the list of tests to run in LAVA seems to be hardcoded: https://git.linaro.org/ci/job/configs.git/tree/zephyr-upstream/submit_for_te...)
But we'd like to test things beyond the Zephyr testsuite, for example application frameworks (JerryScript, Zephyr.js, MicroPython) and the mcuboot bootloader. For starters, we'd like to perform just a boot test to make sure that each application can boot and start up, and later hopefully extend that to functional testing.
The most basic testing would be just to check that after boot there's an expected prompt from each of the apps, i.e. to test in a "passive" manner, similar to the Zephyr unittests discussed above. I tried this with Zephyr.js and was able to make it work (with manual submission so far): https://validation.linaro.org/scheduler/job/1534097 . A peculiarity in this case is that the default test app of Zephyr.js outputs just a single line, "Hello, ZJS world!", whereas LAVA's test/monitors job config specifies a testsuite begin pattern, an end pattern, and testcase patterns, and I suspected that each of these needs to appear on a separate line. But I was able to make it pass with the following config:
- test:
    monitors:
    - name: foo
      start: ""
      end: "Hello, ZJS world!"
      pattern: (?P<result>(PASS|FAIL))\s-\s(?P<test_case_id>\w+)
So, the "start" substring is empty, and perhaps matches a line output by the USB multiplexer or the board's bootloader. The "end" substring is actually the expected single-line output. And "pattern" is unused (I don't know whether it can be dropped without causing a syntax error in the definition file). Is there a better way to handle single-line test output?
Well, beyond simple output matching, it would be nice even for the initial "smoke testing" to actually send some input to the application and check for the expected output (e.g., input: "2+2", expected output: "4"). Is this already supported for LAVA "v2" pipeline tests? I imagine the same kind of functionality would be required to test bootloaders like U-Boot on Linux boards.
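To make the question concrete, here's a sketch of the kind of job fragment I have in mind for such send-input/match-output testing. I'm not certain whether LAVA v2 has a test action like this, so the action name ("interactive") and the keys (prompts, script, successes) are assumptions on my part, not verified syntax:

```yaml
# Hypothetical sketch: drive a REPL-style app over serial.
# Wait for a prompt, send a command, and match the expected reply.
- test:
    interactive:
    - name: zjs-smoke
      # Prompt string the app prints when ready for input (assumed)
      prompts: ["> "]
      script:
      # Send "2+2"; pass if "4" appears in the output (assumed keys)
      - command: "2+2"
        name: eval-addition
        successes:
        - message: "4"
```

If nothing like this exists yet, even a generic "send string, wait for regex" primitive would cover both the app-framework smoke tests and U-Boot-style bootloader testing.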
Thanks, Paul
Linaro.org | Open source software for ARM SoCs Follow Linaro: http://www.facebook.com/pages/Linaro http://twitter.com/#!/linaroorg - http://www.linaro.org/linaro-blog