Hello everyone,
Collabora has been working on `lqa', a tool to submit and manage LAVA jobs. It makes many of the LAVA job administration and monitoring tasks convenient to perform from the command line.
`lqa' brings a new API, the lqa_api Python module: a complete set of classes for interacting with LAVA easily, which at the same time offers a clean API on top of which further applications can be built (like `lqa' itself).
It has a templating system (using the jinja2 package) that allows variables to be used in JSON job files (in the future it could be expanded to support YAML). Variable values can be specified either in a profile file or directly on the command line, making dynamic assignment of template variables possible during `lqa' command execution. The templating mechanism can also handle groups of jobs, which makes it easier to submit jobs in bulk.
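As a rough illustration, here is a minimal sketch of rendering a JSON job template with jinja2; the template variables, file contents, and merge order below are assumptions for the example, not lqa's actual code:

    # Minimal sketch: render a LAVA JSON job template with jinja2, taking
    # values from a profile dict plus command-line overrides. All names
    # here (device_type, image_url, job_name) are hypothetical.
    import json
    from jinja2 import Template

    TEMPLATE = """
    {
      "device_type": "{{ device_type }}",
      "job_name": "{{ job_name }}",
      "actions": [
        {"command": "deploy_linaro_image",
         "parameters": {"image": "{{ image_url }}"}}
      ]
    }
    """

    profile = {"device_type": "beaglebone-black",
               "image_url": "http://example.com/image.img",
               "job_name": "default-job"}
    overrides = {"job_name": "smoke-test"}  # e.g. parsed from the command line

    values = dict(profile, **overrides)  # the command line wins over the profile
    job = json.loads(Template(TEMPLATE).render(**values))
    print(job["job_name"])  # -> smoke-test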
`lqa' also features a flexible profile system (in YAML) which allows a 'main-profile' to be specified from which further sub-profiles can inherit values, avoiding duplication of information between similar profiles.
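A minimal sketch of how such inheritance can work, assuming a hypothetical profile layout and key names (not lqa's actual schema):

    # Minimal sketch: a sub-profile starts from the main profile's values
    # and overrides only what it defines itself. Layout and keys are made up.
    import yaml

    PROFILES = yaml.safe_load("""
    main-profile:
      server: https://lava.example.com/RPC2
      device_type: beaglebone-black
    bbb-ltp:
      tests: [ltp]
    """)

    def resolve(name):
        merged = dict(PROFILES.get("main-profile", {}))
        merged.update(PROFILES.get(name, {}))
        return merged

    print(resolve("bbb-ltp")["server"])  # inherited from main-profile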
Other current features include:
- Test report generation with the 'analyse' subcommand.
- Polling to check for job completion (see the sketch after this list).
- All the operations offer logging capabilities.
- Independent profile and configuration files.
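For the polling feature, here is a minimal sketch of the general approach, assuming LAVA's XML-RPC scheduler.job_status call and placeholder server credentials (a sketch of the idea, not lqa's actual code):

    # Minimal sketch: poll a LAVA server over XML-RPC until a job finishes.
    # Assumes the LAVA v1 scheduler.job_status method; URL/token are placeholders.
    import time
    import xmlrpc.client

    server = xmlrpc.client.ServerProxy(
        "https://user:token@validation.example.org/RPC2")

    def wait_for_job(job_id, interval=30, timeout=3600):
        deadline = time.time() + timeout
        while time.time() < deadline:
            status = server.scheduler.job_status(job_id)["job_status"]
            if status in ("Complete", "Incomplete", "Canceled"):
                return status
            time.sleep(interval)
        raise RuntimeError("job %s did not finish in time" % job_id)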
We invite everyone to check out its official git repo at:
https://git.collabora.com/cgit/singularity/tools/lqa.git/
Suggestions and comments are welcome.
--- Luis
Luis,
I'm now doing a similar thing. The only difference is that the target is a web application rather than a command line tool. Looking at your code, it seems there are some common parts. You can check the data polling code here: https://git.linaro.org/people/milosz.wasilewski/dataminer.git
On 17 June 2015 at 16:35, Luis Araujo luis.araujo@collabora.co.uk wrote:
[...]
- Test report generation with the 'analyse' subcommand.
I'm not sure if _find_missing_tests [1] works properly for you. The tests in the JSON job definition are identified using the git repository URL and the YAML file path. In the result bundle you have the git repository URL, the commit ID, and the test name (which comes from the metadata->name property). So in order to check what is missing, you need to check out the proper commit from the repository, go through all the YAML files, find the proper metadata->name, and match it to the file name. Since the names in metadata are not guaranteed to be unique, you can't be 100% sure you're hitting the right YAML file :(
[1] https://git.collabora.com/cgit/singularity/tools/lqa.git/tree/lqa_tool/comma...
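To make the mismatch concrete, a toy illustration with made-up repository, commit, and test names (only the data shapes follow the description above):

    # Toy illustration: the JSON job definition identifies a test by
    # (repo URL, YAML path), while the result bundle records
    # (repo URL, commit ID, metadata->name). All values are made up.
    requested = [
        ("https://git.example.org/tests.git", "ltp/ltp.yaml"),
        ("https://git.example.org/tests.git", "ltp/ltp-fs.yaml"),
    ]
    reported = [
        ("https://git.example.org/tests.git", "a1b2c3d", "ltp"),
    ]
    # There is no shared key to join 'requested' and 'reported' on: mapping
    # "ltp" back to a YAML path requires checking out commit a1b2c3d and
    # scanning every YAML file for metadata->name == "ltp" - and since the
    # names are not guaranteed to be unique, even that can be ambiguous.
    print(len(requested), "tests requested,", len(reported), "results reported")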
milosz
Hello Milosz,
On 06/18/2015 09:52 PM, Milosz Wasilewski wrote:
I'm now doing a similar thing. The only difference is that the target is a web application rather than a command line tool. Looking at your code, it seems there are some common parts. You can check the data polling code here: https://git.linaro.org/people/milosz.wasilewski/dataminer.git
This is really interesting. I am thinking of adding similar DB support to lqa to allow some of the query options you have there.
[...]
I'm not sure if _find_missing_tests [1] works properly for you. The tests in the JSON job definition are identified using the git repository URL and the YAML file path. In the result bundle you have the git repository URL, the commit ID, and the test name (which comes from the metadata->name property). So in order to check what is missing, you need to check out the proper commit from the repository, go through all the YAML files, find the proper metadata->name, and match it to the file name. Since the names in metadata are not guaranteed to be unique, you can't be 100% sure you're hitting the right YAML file :(
The main idea of this method is to find the tests that are specified in the JSON job file but have no results available in the final bundle (maybe a more accurate name would be _find_missing_results).
So far, it has been working fine, properly reporting the missing results.
Maybe your point is more about finding the missing test definitions in the repositories?
[1] https://git.collabora.com/cgit/singularity/tools/lqa.git/tree/lqa_tool/comma...
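A minimal sketch of the gist of that check, assuming a plain set comparison between the testdefs requested in the job and the test names reported in the bundle (an assumption about the approach, not lqa's actual code):

    # Sketch: report every testdef requested in the job definition that
    # has no result entry at all in the final bundle. Names are illustrative.
    def find_missing_results(requested_testdefs, bundle_test_names):
        return sorted(set(requested_testdefs) - set(bundle_test_names))

    print(find_missing_results(["ltp", "kselftest-net"], ["ltp"]))
    # -> ['kselftest-net']

Note that a set-based comparison can only see a testdef as present or absent; it cannot notice that a testdef requested twice (e.g. with different parameters) was reported only once, which becomes relevant later in this thread.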
On 19 June 2015 at 11:46, Luis Araujo luis.araujo@collabora.co.uk wrote:
[...]
This is really interesting. I am thinking of adding similar DB support to lqa to allow some of the query options you have there.
I already have a DB. It should be rolled out to qa-reports this or next week (still fixing a few bugs).
[...]
The main idea of this method is to find the tests that are specified in the JSON job file but have no results available in the final bundle (maybe a more accurate name would be _find_missing_results).
So far, it has been working fine, properly reporting the missing results.
Sounds strange; the code shouldn't work. I'll try it locally and let you know how that looks.
Maybe your point is more about finding the missing test definitions in the repositories?
No, I'm talking about exactly the same case - finding out whether the test shell produced results or not.
milosz
I checked and it doesn't work (it doesn't detect missing results). Here is an example job: https://validation.linaro.org/scheduler/job/382325 (I'm not sure it's publicly available). There are a couple of LTP test shells with parameters. Results for TST_CMDFILES=fs are missing and lqa doesn't show that. Here is the output I got:
./lqa -c examples/lqa.yaml analyse 382325
Generating lqa report for job(s): 382325

Report for job(s) (Wed Jun 24 19:52:31 2015):
382325
1 test job(s) ran: 1 complete (0 fully successful, 1 with failures), 0 incomplete
* --- Failed Jobs --- *
(F) Jobs with failed tests:
382325: https://ci.linaro.org/jenkins/job/linux-linaro-stable-lsk-v3.14/hwpack=vexpr...
=========================================================================================================
2075 passed, 44 failed, 2 skipped, 0 unknown
FAILED | kselftest-net:net
FAILED | kselftest-net:psock_fanout test
FAILED | kselftest-net:psock_tpacket test
FAILED | ltp:LTP_admin_tools
FAILED | ltp:su01
FAILED | ltp:LTP_containers
FAILED | ltp:netns_devices
FAILED | ltp:netns_devices2
FAILED | ltp:netns_isolation
FAILED | perf:perf report test
FAILED | perf:perf test - vmlinux symtab matches kallsyms
FAILED | perf:perf test - detect open syscall event
FAILED | perf:perf test - detect open syscall event on all cpus
FAILED | perf:perf test - read samples using the mmap interface
FAILED | perf:perf test - parse events tests
FAILED | perf:perf test - Test breakpoint overflow signal handler
FAILED | perf:perf test - Test breakpoint overflow sampling
FAILED | perf:perf test - Test tracking with sched_switch
FAILED | kselftest-vm:vm
FAILED | kselftest-vm:vm
FAILED | kselftest-vm:hugetlbfstest
FAILED | ltp:LTP_syscalls
FAILED | ltp:accept4_01
FAILED | ltp:connect01
FAILED | ltp:fsync02
FAILED | ltp:ftruncate04
FAILED | ltp:ftruncate04_64
FAILED | ltp:fanotify06
FAILED | ltp:recv01
FAILED | ltp:recvfrom01
FAILED | ltp:recvmsg01
FAILED | ltp:send01
FAILED | ltp:sendfile02
FAILED | ltp:sendfile02_64
FAILED | ltp:sendfile04
FAILED | ltp:sendfile04_64
FAILED | ltp:sendfile05
FAILED | ltp:sendfile05_64
FAILED | ltp:sendfile06
FAILED | ltp:sendfile06_64
FAILED | ltp:sendmsg01
FAILED | lava:wait_for_master_image_boot_msg
FAILED | lava:lava_test_shell
FAILED | lava:wait_for_master_image_boot_msg
Job: https://validation.linaro.org/scheduler/job/382325
Bundle: https://validation.linaro.org/dashboard/permalink/bundle/9d3a579ed6ccb89073b...
Does the list show the missing test shells? Am I missing some options to show the missing stuff?
milosz
On 06/25/2015 02:59 AM, Milosz Wasilewski wrote:
[...]
I checked and it doesn't work (it doesn't detect missing results). Here is an example job: https://validation.linaro.org/scheduler/job/382325 (I'm not sure it's publicly available)
I cannot access it (even when logged in with my Launchpad account).
Can you send me a publicly available link showing this same problem? I really would like to check this out.
[...]
Does the list show the missing test shells? Am I missing some options to show the missing stuff?
No missing options; it should just work when running the command like that.
On 25 June 2015 at 16:32, Luis Araujo luis.araujo@collabora.co.uk wrote:
[...]
I cannot access it (even when logged in with my Launchpad account).
Can you send me a publicly available link showing this same problem? I really would like to check this out.
I'll try to reproduce it with some other device.
milosz
Luis,
Not exactly the same case, but I'm able to show that the missing results are not detected. Here is the job: https://validation.linaro.org/scheduler/job/410147
Here is my output:
Generating lqa report for job(s): 410147

Report for job(s) (Tue Jun 30 14:33:11 2015):
410147
1 test job(s) ran: 1 complete (0 fully successful, 1 with failures), 0 incomplete
* --- Failed Jobs --- *
(F) Jobs with failed tests:
410147: https://ci.linaro.org/jenkins/job/linux-linaro-stable-lsk-v3.18/hwpack=beagl...
====================================================================================================================
319 passed, 17 failed, 27 skipped, 0 unknown
FAILED | rcutorture:rcutorture-start
FAILED | rcutorture:lava-test-shell-run
FAILED | pwrmgmt:cputopology_01
FAILED | usb-test-basic:list-all-usb-devices
FAILED | usb-test-basic:examine-all-usb-devices
FAILED | usb-test-basic:print-supported-protocols
FAILED | usb-test-basic:print-supported-speeds
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
Job: https://validation.linaro.org/scheduler/job/410147
Bundle: https://validation.linaro.org/dashboard/permalink/bundle/f022ec4a6ce822e154f...
In this case the LTP entries are there in the bundle; however, one is missing the parameters. Anyway, I'll try to reproduce the original problem in some other test (maybe an artificial one). In the case of the original problem, the entry for one LTP shell is missing from the bundle entirely.
milosz
On 06/30/2015 09:39 PM, Milosz Wasilewski wrote:
[...]
In this case the LTP entries are there in the bundle; however, one is missing the parameters. Anyway, I'll try to reproduce the original problem in some other test (maybe an artificial one). In the case of the original problem, the entry for one LTP shell is missing from the bundle entirely.
Right, the method won't report missing results for this job, since all the testdefs specified in the YAML are available in the bundle:
https://validation.linaro.org/scheduler/job/410147/definition https://validation.linaro.org/dashboard/streams/anonymous/mwasilew/bundles/f...
The _find_missing_tests method is actually only used to check for testdefs from the lava_test_shell command that for some reason were not reported in the bundle stream at all. Now, if the reported tests themselves have missing results, the method certainly won't check for those, but that could be a later improvement for sure.
Luis,
On 1 July 2015 at 08:21, Luis Araujo luis.araujo@collabora.co.uk wrote:
[...]
The _find_missing_tests method is actually only used to check for testdefs from the lava_test_shell command that for some reason were not reported in the bundle stream at all. Now, if the reported tests themselves have missing results, the method certainly won't check for those, but that could be a later improvement for sure.
As I wrote, this isn't exactly the same issue as the one I reported previously. If you take one of the LTP result entries out of the bundle, you will get what I wrote about before. So the situation is: you ask for 2 LTP test shells with different parameters, and the result bundle contains only one result for LTP. The other one is missing entirely.
If you want to do the missing-results detection a bit better, take a look here: https://git.linaro.org/qa/qa-reports.git/blob/refs/heads/refactoring:/utils/... Eventually it should be fixed on the LAVA side, but for the moment this is what we get.
milosz
Luis,
I have the exact case that fails for the restricted job. Here it is:
./lqa -c ./examples/lqa.yaml analyse 415150
Generating lqa report for job(s): 415150

Report for job(s) (Thu Jul 02 16:16:09 2015):
415150
1 test job(s) ran: 1 complete (0 fully successful, 1 with failures), 0 incomplete
* --- Failed Jobs --- *
(F) Jobs with failed tests:
415150: https://ci.linaro.org/jenkins/job/linux-linaro-stable-lsk-v3.18/hwpack=beagl...
====================================================================================================================
116 passed, 4 failed, 0 skipped, 0 unknown
FAILED | lava:test_kernel_exception_1
FAILED | lava:test_kernel_exception_1
FAILED | lava:lava_test_shell
FAILED | lava:test_kernel_exception_1
Job: https://validation.linaro.org/scheduler/job/415150
Bundle: https://validation.linaro.org/dashboard/permalink/bundle/d54379110280ae1d33c...
Hope that helps (one instance of mmtest is missing - compare results with definition). As I wrote before, I have the code that fixes the issue if you're interested. For simple cases your code works well (checked earlier).
milosz
Hello Milosz,
Thanks for chasing this issue that far and pointing me to its cause.
I glanced over your code, and properly reporting the exact missing testdefs from LAVA will certainly involve quite a few operations. I also noticed that one of the problems behind lqa not reporting the 'mmtests' in this particular case was that duplicated test names from the bundles were not being handled properly.
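A minimal sketch of that kind of fix: compare the requested tests and the reported results as multisets rather than sets, so a testdef requested twice is only satisfied by two result entries (an illustration of the idea, not the actual patch):

    # Sketch: multiset comparison, so duplicated test names are counted.
    # An illustration of the idea only, not the actual lqa patch.
    from collections import Counter

    def find_missing_results(requested, reported):
        missing = Counter(requested) - Counter(reported)
        return sorted(missing.elements())

    # Two LTP shells requested (with different parameters), one reported:
    print(find_missing_results(["ltp", "ltp", "mmtests"], ["ltp", "mmtests"]))
    # -> ['ltp']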
I put together a simple solution that should handle this case better; the patch is in my branch at:
https://git.collabora.com/cgit/user/araujo/lqa.git/log/?h=WIP-fix-find-missi...
Even though we don't detect the exact same missing tests, at this stage we are mainly interested in getting an idea of what went wrong with any of the tests.
Let me know if you still don't get the missing results with that patch, or if you find another case giving such a failure.
Cheers,
Luis
Luis,
I got the patch, but I can't test it right now as we have LAVA maintenance/migration happening this week. I'll get back to it when the migration is done.
milosz
Hello Milosz,
I wonder if you had the chance to test this patch after all.
Cheers,
Luis
Hi Luis,
I totally forgot about that - adding to my 'todo' list.
milosz