On 28 April 2017 at 14:20, Guillaume Tucker <guillaume.tucker@collabora.com> wrote:
Hi Neil,
Thanks for your explanation, and sorry if I've somewhat misused the terms job, shell, and test. Trying to piece everything together, I've made a small test definition to see how this would all work in practice. Here's a run:
In such cases it's easier to use inline tests for prototyping, I guess.
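For reference, an inline test definition lives directly in the job YAML instead of a separate git repository, which makes it handy for prototyping. A minimal sketch following the LAVA v2 Lava-Test format (the names, timeout, and steps here are just illustrative placeholders):

```yaml
- test:
    timeout:
      minutes: 5
    definitions:
    - from: inline
      name: smoke-test
      path: inline/smoke.yaml
      repository:
        metadata:
          format: Lava-Test Test Definition 1.0
          name: smoke-test
          description: "Prototype inline smoke test"
        run:
          steps:
            # Each step is a shell command run on the device under test.
            - uname -a
            - lava-test-case uname-runs --shell uname -a
```

Once the steps settle down, the same `run:` block can be moved into a standalone test definition in a repository and shared across jobs.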
[cut]
We've tried to indicate this direction in the docs:
https://staging.validation.linaro.org/static/docs/v2/writing-tests.html#best...
Let us know if those guidelines can be expanded or clarified.
Essentially, anything a developer would need to do outside LAVA to be able to run the scripts used in your LAVA tests should be within the scripts themselves. LAVA needs to provide certain pieces of data and some handlers to report test cases, but apart from that, the drive is towards portability and away from hidden steps performed by LAVA.
OK, I think I generally understand this. One part I'm not too sure about is claiming on the one hand that LAVA test scripts should be portable, and on the other that LAVA should not be involved in how test scripts actually manage to run on a system. It sounds to me like this makes portability the user's problem: LAVA will happily schedule any job that can be successfully submitted, regardless of whether the test scripts involved would also manage to run on any other system.
I think this is a matter of convention. If your test only targets one build/OS/device and you don't care about sharing the test, no problem: LAVA will happily run it. But if you plan to re-use the test on different OSes, different shells, and different boards, portability becomes an issue.
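As an illustration of what a portable test script can look like, the script can carry its own fallback for the LAVA reporting helper, so the same file runs unchanged on a developer's machine and inside a LAVA job. A minimal sketch, assuming `lava-test-case` is the only LAVA-provided piece the script relies on (the test-case names and the check itself are placeholders):

```shell
#!/bin/sh
# Report a test case result. Uses the lava-test-case helper when the
# script runs inside a LAVA job, and falls back to plain output when
# it runs standalone, so the script stays portable.
report() {
    name="$1"
    result="$2"
    if command -v lava-test-case >/dev/null 2>&1; then
        lava-test-case "$name" --result "$result"
    else
        echo "$name: $result"
    fi
}

# Placeholder check: does uname run at all?
if uname -a >/dev/null 2>&1; then
    report uname-runs pass
else
    report uname-runs fail
fi
```

The same pattern extends to other LAVA helpers: keep every dependency and setup step inside the script, and gate the LAVA-specific calls behind a plain-shell fallback.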
milosz