I've started on some basic documentation for LAVA at:

https://wiki.linaro.org/Platform/Validation/LAVA/Documentation

This is mostly aimed at those who are trying to set it up for themselves, or get started on development for it. However, I'm also starting to cover example jobs, and will expand that section further to list the available actions and the parameters they can take. It's a wiki, so feel free to add to it, or let me know if there's a particular section you have questions about and want to see expanded sooner.
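In the meantime, to give a flavour of it: a job file is just a JSON description of the actions to run, something along these lines (the job name, board name, URLs, and result stream here are made-up placeholders, and the exact action names and parameters may shift as things develop; the wiki will carry the authoritative list):

    {
      "job_name": "panda-boot-and-test",
      "target": "panda01",
      "timeout": 18000,
      "actions": [
        {
          "command": "deploy_linaro_image",
          "parameters": {
            "hwpack": "http://example.com/hwpack_linaro-panda.tar.gz",
            "rootfs": "http://example.com/nano.tar.gz"
          }
        },
        {
          "command": "boot_linaro_image"
        },
        {
          "command": "lava_test_install",
          "parameters": {
            "tests": ["stream"]
          }
        },
        {
          "command": "lava_test_run",
          "parameters": {
            "test_name": "stream"
          }
        },
        {
          "command": "submit_results",
          "parameters": {
            "server": "http://validation.linaro.org/launch-control",
            "stream": "/anonymous/panda-daily/"
          }
        }
      ]
    }

The actions run in order, so a deploy/boot/install/run/submit sequence like this covers the common case.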
Thanks, Paul Larson
On Tue, Jun 28, 2011 at 09:55:07AM +0100, Paul Larson wrote:
> I've started on some basic documentation for LAVA at https://wiki.linaro.org/Platform/Validation/LAVA/Documentation
Good job! I think you should probably split the "setting up" section from the "writing jobs" and hacking parts, since they are pretty different audiences.
From the developer's perspective, I can think of two initial use cases: I am in a WG and want to see a test run for this branch I'm working on, and I am in a Platform unit or Landing team and I want to see how an image we're generating is behaving on a certain piece of hardware. Can we make it easy for these people to figure out what to do?
On Tue, Jun 28, 2011 at 9:03 PM, Christian Robottom Reis <kiko@linaro.org> wrote:
> On Tue, Jun 28, 2011 at 09:55:07AM +0100, Paul Larson wrote:
> > I've started on some basic documentation for LAVA at https://wiki.linaro.org/Platform/Validation/LAVA/Documentation
> Good job! I think you should probably split the "setting up" section from the "writing jobs" and hacking parts, since they are pretty different audiences.
Good point. To an extent, they are kinda the same at this point in time, because a lot of the questions we get are about:

1. How do I set it up to demo/test it in my environment?
2. How do I create a job to test it out?
3. How do I add support for XYZ?
However, as we go forward, I can see all of these sections growing; they should probably be better targeted at their intended audiences and crosslink to other sections as required.
> From the developer's perspective, I can think of two initial use cases: I am in a WG and want to see a test run for this branch I'm working on, and I am in a Platform unit or Landing team and I want to see how an image we're generating is behaving on a certain piece of hardware. Can we make it easy for these people to figure out what to do?
For case 1, we really need tests to support those kinds of loads. We can install images, and in the not-so-distant future we hope to have support for injecting kernels as well. This will capture the needs of some, but not all, WGs once they have tests in place. We're also currently in talks with the toolchain WG, for instance, to figure out how we can best proceed on the things they need.
For case 2, I think a lot of that centers around reports. We added a new report for this release which gives a better snapshot of the status of the latest test run of a particular image type, on each board type [1]. I'm sure we'll come up with other, better ways to look at the data, but this is a good starting point. Reports are still a *bit* difficult to write, but we're considering how we can better document that, provide more examples, and make it easier in other ways as well.
Thanks, Paul Larson
[1] http://validation.linaro.org/launch-control/dashboard/reports/boot-status-ov... (N.B. This actually tracks job completion status at the moment, which should not be equated with whether the board booted. A failure just means that *something* caused the test run to end prematurely; it could have been a failed test, or even just a timeout.)