On 19 June 2017 at 13:47, Paul Sokolovsky <paul.sokolovsky@linaro.org> wrote:
> Hello,
>
> The LITE team appreciates the bootstrapping of Zephyr-related LAVA testing done by the LAVA, LAVA Lab, B&B and QA teams. Getting more involved with LAVA testing had long been a backlogged task for us, and hopefully the time has come ;-).
>
> I've reviewed the current status of on-device testing for Zephyr CI jobs and see the following picture (feel free to correct me if something is wrong or missing): the "zephyr-upstream" and "zephyr-upstream-arm" CI jobs (https://ci.linaro.org/view/lite-iot-ci/) submit a number of tests to LAVA (via https://qa-reports.linaro.org/) for the following boards: arduino_101, frdm_k64f, frdm_kw41z, qemu_cortex_m3. Here's an example of a cumulative test report for these platforms: https://qa-reports.linaro.org/lite/zephyr-upstream/tests/
>
> That's really great! (Though the list of tests to run in LAVA seems to be hardcoded: https://git.linaro.org/ci/job/configs.git/tree/zephyr-upstream/submit_for_te...)
It is, as I wasn't really sure what to test. The build job needs to prepare the test templates to be submitted to LAVA. In the case of Zephyr, each test is a separate binary, so we end up with a number of file paths to substitute into the template. Hardcoding was the easiest way to get things running, but I see no reason why it couldn't be replaced with some smarter code that discovers the binaries. The problem with this approach is that some of these tests are build-time only: they have no meaning when run on the board and need to be filtered out somehow (a rough sketch of such discovery is below).
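For illustration, a minimal Python sketch of what that discovery could look like. The outdir layout, the zephyr.bin name and the testcase.ini "build_only" flag are assumptions about the sanitycheck build tree, not something the current submit script does:

    import configparser
    import os

    def discover_test_binaries(outdir):
        """Walk a sanitycheck-style output tree and yield paths of test
        binaries meant to be run on a board, skipping build-only tests.

        Assumes each built test directory carries a testcase.ini whose
        sections may set a 'build_only' flag -- adjust to wherever the
        test metadata actually lives.
        """
        for root, dirs, files in os.walk(outdir):
            if "zephyr.bin" not in files:
                continue
            ini = os.path.join(root, "testcase.ini")
            if os.path.exists(ini):
                cfg = configparser.ConfigParser()
                cfg.read(ini)
                # Skip tests that are only meant to be compiled, not run.
                if any(cfg.getboolean(sec, "build_only", fallback=False)
                       for sec in cfg.sections()):
                    continue
            yield os.path.join(root, "zephyr.bin")

    if __name__ == "__main__":
        for path in discover_test_binaries("outdir/frdm_k64f"):
            print(path)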
> But we'd like to test things beyond the Zephyr testsuite, for example application frameworks (JerryScript, Zephyr.js, MicroPython) and the mcuboot bootloader. For starters, we'd like to perform just a boot test to make sure that each application can boot and start up, then later hopefully extend that to functional testing.
>
> The most basic testing would be to just check that after boot there's an expected prompt from each of the apps, i.e. to test in a "passive" manner, similar to the Zephyr unit tests discussed above. I tried this with Zephyr.js and was able to make it work (with manual submission so far): https://validation.linaro.org/scheduler/job/1534097 . A peculiarity in this case is that the default test app of Zephyr.js outputs just a single line, "Hello, ZJS world!", whereas LAVA's test/monitors job config specifies a testsuite begin pattern, an end pattern, and testcase patterns, and I had a suspicion that each of them needs to match a separate line. But I was able to make it pass with the following config:
> - test:
>     monitors:
>     - name: foo
>       start: ""
>       end: Hello, ZJS world!
>       pattern: (?P<result>(PASS|FAIL))\s-\s(?P<test_case_id>\w+)
> So, the "start" substring is empty, and perhaps matches a line output by a USB multiplexer or the board bootloader. The "end" substring is actually the expected single-line output. And "pattern" is unused (I don't know if it can be dropped without a job definition syntax error). Is there a better way to handle single-line test output?
You're making a silent assumption that if there is a matching line, the test passed. In the case of other tests (the Zephyr unit tests), that's not so: 'start' matches a line which is displayed while Zephyr is booting, 'end' matches the line which is displayed after all testing is done, and the pattern follows the unit test output format.
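For comparison, a minimal sketch of what a Zephyr unit test monitor looks like (the suite name and the exact start/end strings are illustrative and vary per test and Zephyr version):

    - test:
        monitors:
        - name: kernel_common
          # Part of the Zephyr boot banner, printed before tests start.
          start: Booting Zephyr OS
          # Printed by the test runner once all test cases have finished.
          end: PROJECT EXECUTION SUCCESSFUL
          # Each test case reports a line like "PASS - test_byteorder."
          pattern: (?P<result>(PASS|FAIL))\s-\s(?P<test_case_id>\w+)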
> Well, beyond simple output matching, it would be nice even for the initial "smoke testing" to actually send some input to the application and check for the expected output (e.g., input: "2+2", expected output: "4"). Is this already supported for LAVA "v2" pipeline tests? I imagine the same kind of functionality would be required to test bootloaders like U-Boot on Linux boards.
I didn't use anything like this in v2 so far, but you're probably best off having the test application print something like

    test 2+2=4 PASS

Then you can easily create a pattern that will filter the output. In the case of Zephyr, a pattern is the only way to filter things out, as there is no shell (?) on the board.
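A rough sketch of how such output could be matched with the existing monitors mechanism (the name, start/end strings and regex are made up for illustration, assuming the app prints one such line per check):

    - test:
        monitors:
        - name: smoke
          start: Booting Zephyr OS
          end: smoke test done
          # Matches lines like "test 2+2=4 PASS", capturing the checked
          # expression as the test case id and PASS/FAIL as the result.
          pattern: test\s(?P<test_case_id>\S+)\s(?P<result>(PASS|FAIL))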
milosz