Hi all,
Inspired by (though in the end mostly distinct from) Spring's work on user
notifications, I've been working on adding "test run filters" to LAVA.
The idea is that you can define criteria for which test results you are
interested in. The next round of the work will be to allow subscribing
to filters (so you get an email when results matching the filter's
criteria are submitted), and maybe to allow sharing filters so other
people can view and subscribe to your filters.
The feature is live on staging now, please see:
http://staging.validation.linaro.org/dashboard/filters/
The code is at
https://code.launchpad.net/~mwhudson/lava-dashboard/test-run-filter
and has its crufty corners, but I think the design of the feature is
mostly OK.
Please let me know what you think!
Cheers,
mwh
Hi, I was looking into the daily pre-built tests and saw that panda still
hasn't been tested since August 2nd. There's an unrelated problem with the
prebuilt image being generated on Jenkins, but I saw that on at least 2 days
it did manage to build successfully, yet no LAVA job was started.
The server returned:
xmlrpclib.Fault: <Fault 400: "tag 'panda4430' does not exist">
I checked in the admin panel, and it looks like panda4430 and panda4460
tags were renamed to panda and panda-es. Was this intentional? Is it going
to stay this way? If we're going to use tags to identify which is which,
then we really need to make sure that they are stable names as we have
scripts depending on them already.
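To make the breakage concrete, here is a rough sketch (not one of our actual
scripts; the endpoint and job fields are assumptions) of how a tag name ends
up hard-coded in a submission, so a server-side rename turns the call into
exactly the Fault above:

import json
import xmlrpclib  # Python 2, matching the traceback above

server = xmlrpclib.ServerProxy('https://staging.validation.linaro.org/RPC2/')
job = {
    "device_type": "panda",
    "tags": ["panda4430"],  # hard-coded tag name; breaks once it is renamed
    "actions": [],          # omitted for brevity
}
# Raises an xmlrpclib.Fault like the one above if the tag no longer exists.
server.scheduler.submit_job(json.dumps(job))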
Thanks,
Paul Larson
Dave's on vacation and Michael's in NZ, so let's hold off on a meeting
tomorrow. We should probably rethink when/how we sync up now that the team
has changed so much.
On 30.07.2012 16:10, Alexander Sack wrote:
> Hi,
>
> big items we need to sort out:
>
> + linaro-media-create install support
> + lava-test support
> + lava support
I have had some thoughts on this (OpenEmbedded builds and LAVA), and they
lead to a few questions.
1. Does the tested image need to have the LAVA client code installed?
If it does, then I would have to add the LAVA client components to
OpenEmbedded, because I cannot use the Ubuntu packages, as they are not
compatible with an OpenEmbedded rootfs.
2. Does a LAVA build require rootfs + hwpack, or can it boot with just a
rootfs?
The rootfs will contain the kernel in /boot/uImage and the u-boot
configuration in /boot/boot.scr (or another defined location). If a hwpack
is required, then I will have to create one with OE and add support for it
in l-m-c (as we cannot use Ubuntu packages for anything other than the
kernel/bootloader, because there is no guarantee of binary compatibility).
> I think if the fast model route is blocked we should take the pain to
> work on real board.
For now, using a real board is less painful than using the fast model. I
have real boards available at home, so I can test images on them.
Just an FYI,
While doing some acceptance testing before deploying a new version of
the dispatcher to production I noticed a bug:
https://bugs.launchpad.net/lava-dispatcher/+bug/1032467
I'm not sure how our external users of LAVA have things configured, but
this could potentially cause you some breakage. I worked around our
issue in Linaro by adding:
tester_hostname = linaro
to the device-defaults.conf file.
I'm not really finding good sources of help with defining JSON schema rules,
so I thought I'd ask here.
I'd like to update our current boot-linaro-image action in the
lava-dispatcher to take a new optional parameter called "boot_options", e.g.:
{
    "command": "boot_linaro_image",
    "parameters": {
        "options": [ "coretile.cache_state_modelled=1" ]
    }
}
I tried to do this by updating parameters_schema in boot_control.py with:
+_boot_schema = {
+ 'type': 'object',
+ 'properties': {
+ 'options': {'type': 'array', 'items': {'type': 'string'},
+ 'optional': True},
+ },
+ 'additionalProperties': False,
+ }
However, now "parameters" is always required, which would be horrible for
backward compatibility. Does anybody know how I should go about this?
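One possible direction (a sketch only, not tested against the dispatcher):
keep the schema as above, but treat a missing "parameters" block as an empty
dict before validating, so existing job files without "parameters" still
pass. The validate_parameters helper below is hypothetical; the real hook
would live wherever boot_control.py runs its schema check.

def validate_parameters(command_entry):
    # command_entry is the parsed job command, e.g.
    # {"command": "boot_linaro_image"} with no "parameters" key at all.
    parameters = command_entry.get('parameters', {})
    # ... run the usual schema validation of `parameters` against
    # _boot_schema here; an empty dict satisfies the schema above ...
    return parameters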
-andy
+ @linaro-android
+ @linaro-validation
On 27 July 2012 09:32, YongQin Liu <yongqin.liu(a)linaro.org> wrote:
>
>
> On 13 July 2012 11:19, Michael Hudson-Doyle <michael.hudson(a)linaro.org>wrote:
>
>> YongQin Liu <yongqin.liu(a)linaro.org> writes:
>>
>> > Hi, All
>> >
>> > Here are just some thoughts about the implementation of the black-box test.
>> > If you have any ideas, or I have missed something, please comment.
>> > Any feedback will be appreciated.
>>
>> Thanks for sending this.
>>
>> > ------------------------------------------------------------
>> > *Glue between LAVA and Android*
>> > In Android there is a directory /data/blackbox_test/; under it are three
>> > directories: TODO, TESTING, and DONE.
>> >
>> > - TODO: the flags for tests that need to run will be put here
>> > - TESTING: the flags for tests that are running will be put here.
>> > Normally there should be only one entry; in the future there will be more
>> > entries when we support test execution via threads
>>
>> Do we actually want to support running more than one test at once? It
>> doesn't seem like a super good idea to me.
>>
>
> Yeah, it seems supporting only one test at a time is more realistic for now.
> If so, it seems we just need one action that will do
> install/run/wait/get_result.
>
>
>> > - DONE: the flags for tests that have been completed will be put here
>> >
>> > As for the entry format, it will use JSON or just key/value pairs, but it
>> > needs to have the following two features:
>> > 1. one item to indicate the command to run
>> > 2. other items used to pass information between the Android test tool and
>> > the LAVA job
>> >
>> > *Black-box test framework on Android*
>> > On Android, a test framework will check the entries in TODO and run the
>> > command indicated in each entry.
>> > Before a test starts to run, the framework will move its entry to TESTING,
>> > and after the test finishes it will move the entry to DONE.
>> > When running the test command, the framework will pass the entry file to
>> > the command as a parameter.
>> >
>> > The black-box test framework on Android mainly does the following:
>> > 1. it is invoked after boot-up, once the home screen is displayed;
>> > it is also in charge of preparing the test environment, e.g. unlocking
>> > the screen and disabling suspend
>> > 2. it is in charge of invoking the test commands and changing the status
>> > of each test
>> >
>> > *Framework on LAVA*
>> >
>> > Will have 2 actions
>> > 1. install_black-box_test
>>
>> Will this action have a list of tests to install?
>>
> I'd like to install one test with one action,
> as I described in another mail.
>
>>
>> > 2. wait_black_test_finish
>> > will loop and check until there is no entry in TODO or TESTING;
>> > it will also check whether the test framework is running: if it is not
>> > in ps and there are still tests to be run, it will invoke it again;
>> > it will show the output of the running test
>>
>> Do we want to reboot between tests?
>>
> I think this can be done outside the lava-dispatcher, like the system-reboot
> on android-build now.
>
>
> Thanks,
> Yongqin Liu
>
Hi, All
Here are just some thoughts about the implementation of the black-box test.
If you have any ideas, or I have missed something, please comment.
Any feedback will be appreciated.
------------------------------------------------------------
*Glue between LAVA and Android*
In Android there is a directory /data/blackbox_test/; under it are three
directories: TODO, TESTING, and DONE.
- TODO: the flags for tests that need to run will be put here
- TESTING: the flags for tests that are running will be put here.
Normally there should be only one entry; in the future there will be more
entries when we support test execution via threads
- DONE: the flags for tests that have been completed will be put here
As for the entry format, it will use JSON or just key/value pairs, but it
needs to have the following two features:
1. one item to indicate the command to run
2. other items used to pass information between the Android test tool and
the LAVA job
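For example, a TODO entry could be a small JSON file along these lines
(purely illustrative; none of these field names are decided yet):

{
    "command": "/data/blackbox_test/run_monkey_test.sh",
    "test_id": "monkey-stress",
    "timeout": 3600,
    "results_dir": "/data/blackbox_test/results/monkey-stress"
}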
*Black-box test framework on Android*
On Android, a test framework will check the entries in TODO and run the
command indicated in each entry.
Before a test starts to run, the framework will move its entry to TESTING,
and after the test finishes it will move the entry to DONE.
When running the test command, the framework will pass the entry file to the
command as a parameter.
The black-box test framework on Android mainly does the following:
1. it is invoked after boot-up, once the home screen is displayed;
it is also in charge of preparing the test environment, e.g. unlocking the
screen and disabling suspend
2. it is in charge of invoking the test commands and changing the status of
each test
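A minimal sketch of that loop, assuming the directory layout above (on the
device this would more likely be a shell script or an Android service, and
all of the names here are illustrative):

import json
import os
import shutil
import subprocess

BASE = "/data/blackbox_test"
TODO, TESTING, DONE = (os.path.join(BASE, d) for d in ("TODO", "TESTING", "DONE"))

def run_pending_tests():
    # Run the pending tests one at a time, in a stable order.
    for name in sorted(os.listdir(TODO)):
        entry = os.path.join(TESTING, name)
        shutil.move(os.path.join(TODO, name), entry)    # mark as running
        with open(entry) as f:
            command = json.load(f)["command"]           # field name assumed
        # The entry file itself is passed to the test command as its argument.
        subprocess.call([command, entry])
        shutil.move(entry, os.path.join(DONE, name))    # mark as finished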
*Framework on LAVA*
It will have three actions:
1. install_black-box_test
2. wait_black_test_finish
will loop and check until there is no entry in TODO or TESTING;
it will also check whether the test framework is running: if it is not in ps
and there are still tests to be run, it will invoke it again;
it will show the output of the running test
3. collect_test_result
will parse the results and upload them to LAVA
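On the LAVA side, wait_black_test_finish could be a simple polling loop
along these lines (a sketch only; "session", list_entries() and the other
helpers are stand-ins for whatever the dispatcher's Android client actually
provides):

import time

def wait_black_test_finish(session, poll_interval=30):
    # Poll the board until every entry has moved out of TODO and TESTING.
    while True:
        todo = session.list_entries("/data/blackbox_test/TODO")
        testing = session.list_entries("/data/blackbox_test/TESTING")
        if not todo and not testing:
            return                       # everything has reached DONE
        if todo and not session.framework_running():
            session.start_framework()    # restart the framework if it died
        session.show_current_output()    # stream the running test's output
        time.sleep(poll_interval)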
------------------------------------------------------------
*Related BPs*
1. lava side:
https://blueprints.launchpad.net/lava-dispatcher/+spec/black-box-test-actio…
2. android-side:
https://blueprints.launchpad.net/linaro-android/+spec/support-blackbox-test
Thanks,
Yongqin Liu
I'd like to move the cbuild infrastructure out of my home office and
domains and into the validation lab. That makes things one step
cleaner than the current setup and one step closer to hooking things
into LAVA.
There are terser notes at:
https://wiki.linaro.org/MichaelHope/Sandbox/CBuildMove
but here's how I'd deploy it:
* Create cbuild.linaro.org to replace ex.seabright.co.nz
* Add a real or virtual medium capacity machine to run the web
server, scheduler, snapshotter, storage, and other administrative
stuff
* Add 500 GB+ of backup storage
* Add a reverse proxy to expose the server to the internet
* Delete orion, which currently runs ex.seabright.co.nz
* Delete the EC2 micro instance that runs apus.seabright.co.nz
* Redirect and delete builds.linaro.org
tcserver01 stays as a build and benchmark machine.
I'm not happy with control being the web server, bounce host, and a
build machine. It's too taxing and unreliable. I'd like an unloaded
minimal host there instead.
Thoughts? Who can drive this?
-- Michael