Hey folks.
The initial batch of LAVA tests on fast models is now running in the Linaro validation lab. This initial run is designed to see how the setup behaves in practice and to check for omissions that only show up away from my computer.
The branch that I've deployed is lp:~zkrynicki/lava-core/demo-3 (it depends on an unreleased json-document tree from GitHub; if you want to try it out, there are instructions in the tree).
We've got the licensing server set up for production usage and have started an (arguably dummy) stream of lava-test runs based on hwpack_linaro-vexpressdt-rtsm_20120511-22_armhf_supported.tar.gz and linaro-precise-developer-20120426-86.tar.gz, which is the combination I was using locally.
Over the next few days we'll be working on improving the process so that we can start accepting more realistic tests. Initially, do expect a high failure rate due to imperfections in the technology, configuration issues, and the like.
The plan is to quickly move to practical use cases. So far I'm aware of the switcher tests that the QA team is using, and the kvm tests, but I have not checked either in practice on a fast model yet.
My request to you is to give me pointers (ideally as simple, practical instructions that show it works) for things that you want to run. I'm pretty ignorant about the Android side of the story, so any message from our Android team would be appreciated.
Please note that the interaction cycle is very slow. It takes 10+ hours to do trivial things (running apt-get update, installing a few packages, compiling a trivial program and getting it to run for a short moment). Please don't ask us to run monkey for you, as we would be wasting time at this point.
My goal is to understand what's missing and to estimate how long given tests typically take, so that we can see how our infrastructure measures up against your needs.
Many thanks,
Zygmunt Krynicki