Hi David, first off, thanks for bringing this forward. We really appreciate getting additional tests into LAVA, especially those that the engineers really care about.
Is this test part of an existing test suite, for instance LTP? If so, then we just need to make sure we are running the right version of LTP to pick it up.
If it's a whole new test suite, then there are two routes we could go. In the short term, I'd look to help you create a wrapper so that lava-test can run it. Ignore the JSON sections you posted below; that's not something you need to touch for this step. In fact, it's just something we will need to add to the daily template that gets run on all images, so it isn't anything you need to worry about at all. The test wrapper is just a small bit of Python code that tells lava-test how to install the test, run it, and parse the results. Most of them are pretty simple.
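The "parse the results" part of such a wrapper usually boils down to a regex applied to the test's output. As a rough, self-contained sketch of that idea (the names, pattern, and output format here are illustrative, not the actual lava-test API):

```python
import re

# Illustrative pattern: matches output lines like "pthread_stress: PASS"
PATTERN = re.compile(r"^(?P<test_case_id>\w+):\s+(?P<result>PASS|FAIL)$")

def parse_results(output):
    """Turn raw test output into a list of {test_case_id, result}
    dicts -- the kind of structured result a wrapper hands back."""
    results = []
    for line in output.splitlines():
        match = PATTERN.match(line)
        if match:
            results.append(match.groupdict())
    return results

if __name__ == "__main__":
    sample = "pthread_stress: PASS\nmutex_contention: FAIL\nsome noise\n"
    print(parse_results(sample))
```

Lines that don't match the pattern are simply ignored, which is typically what you want when the test binary also prints diagnostic chatter.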
Long term, I believe both the kernel and dev platform teams are working on some test suites to go into LAVA for the express purpose of sanity and regression testing of Linaro images.
Could you point me at where I might find these pthread tests right now so that I can have a look?
Thanks, Paul Larson
On Mon, Nov 7, 2011 at 5:14 AM, David Gilbert <david.gilbert@linaro.org> wrote:
On 7 November 2011 09:57, Zygmunt Krynicki <zygmunt.krynicki@linaro.org> wrote:
On 04.11.2011 at 15:35, David Gilbert wrote:
Hi, I've got a pthread test that is the fallout of a bug fix; it's a good test of the kernel and libc, and I'd like to add it into LAVA.
I'm told that I currently have to pick a hardware pack and image, but for this case what I really want is to create a regression test that gets run on whatever the latest kernel is on any of a set of platforms (basically anything of Panda speed or faster).
LAVA is happy to run your test on any hardware pack and root fs you want. The first step is to make your test visible to the stack. To do that, you should wrap it with our lava-test framework. If your test can be compiled to a simple executable and invoked to check for correctness, then wrapping it should be easy as pie. Look at the lava-test documentation: http://lava-test.readthedocs.org/en/latest/index.html (and report back on missing pieces).
The next step is to determine when your test should run. I think that there is nothing wrong with simply running it daily on all the hardware packs.
Hi, thanks for the response. My question was prompted by reading some notes from ams; in those, the example has a .json file with a section that looks like:
{
    "command": "deploy_linaro_image",
    "parameters": {
        "rootfs": "ROOTFS-URL-HERE",
        "hwpack": "HWPACK-URL-HERE"
    }
},
Now you say "I think that there is nothing wrong with simply running it daily on all the hardware packs" -- what do I change that section to in order to indicate that? (All hardware packs is a bit over the top; it would be good to restrict it a little to save time, but it would still be a good start.)
I don't see references to that json file in the docs you link to.
Dave
linaro-dev mailing list
linaro-dev@lists.linaro.org
http://lists.linaro.org/mailman/listinfo/linaro-dev