Hi Ricardo,
On 11/20/23 15:30, Ricardo Cañuelo wrote:
> On Wed, Nov 15 2023 at 19:43:49, Nikolai Kondrashov <Nikolai.Kondrashov@redhat.com> wrote:
>> Introduce a new tag, 'Tested-with:', documented in the Documentation/process/submitting-patches.rst file. The tag is expected to reference the documented test suites, similarly to the 'V:' field, and to certify that the submitter executed the test suite on the change, and that it passed.
> I think the 'V:' field in MAINTAINERS is a good addition to document what developers are supposed to test for every subsystem. In the case of the per-commit 'Tested-with:' tag, though, I think its real value would be in accountability and traceability: linking to the actual results of the (automated) tests that were used to validate a commit.
> This would provide two important features:
>
> - Rather than trusting that the tester did things right and that the test environment used was appropriate, we'd have proof that the test results are as expected and a way to reproduce the steps.
>
> - A history of test results for future reference. When a regression is introduced, we'd have more information about how things worked back when the test was still passing.
>
> This is not trivial, both because tests vary a lot and we'd first need to define which artifacts to link to, and because whatever is linked (test commands, output log, results summary) would need to be stored forever. But since we're already doing that for basically all kernel mailing lists, I wonder if something like "public-inbox for test results" could be possible some day.
I agree that it would be good to have a record of the actual test results uploaded somewhere. To start with, I think it's fine to have them uploaded to places like Dropbox or Google Drive, or anywhere else that can be accessed with an unauthenticated URL.
Otherwise, I'm seriously considering opening up submissions to KCIDB for the general (authenticated) public, with pre-moderation and whitelisting. That would require a fair amount of work, though. We already have a basic artifact-mirroring system, but it relies on the submitter hosting the files somewhere in the first place, so we'd have to add a file-upload service on top of it. And then we'd have to think really hard about how to keep access public while not going bankrupt from somebody scraping our archive out of S3 storage. Any help would be super-welcome!
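To make the mirroring part concrete: a KCIDB submission references its artifacts by URL, roughly like this (a from-memory sketch of the I/O schema, so the exact field names and schema version may be off, and all the values are made up):

    {
        "version": {"major": 4, "minor": 1},
        "tests": [
            {
                "id": "myorigin:test-1",
                "build_id": "myorigin:build-1",
                "origin": "myorigin",
                "status": "PASS",
                "output_files": [
                    {
                        "name": "test.log",
                        "url": "https://example.com/artifacts/test.log"
                    }
                ]
            }
        ]
    }

It's those "output_files" URLs that the mirroring system picks up, which is why the submitter has to host the files somewhere first, and why we'd need an upload service to remove that requirement.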
I think we can make space for the results URL after the test name in the Tested-with: tag. We can probably make up some syntax for the 'V:' field to say whether the URL is required, but it could also just always be accepted, leaving it to the maintainer's call whether to require it.
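For illustration, with a made-up suite name, URL, and 'V:' syntax (none of this is settled), a commit could carry:

    Tested-with: xfstests https://example.com/results/20231115-1

with the corresponding (hypothetical) MAINTAINERS entry:

    V: xfstests (results-url-required)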
I think it should be up to each test suite to define what its output should be; it would be very hard (and not that useful) to unify them into a single output format (speaking from Red Hat's CKI experience executing many different suites for the kernel). The test name in the Tested-with: tag would help identify the format, if necessary.
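As a rough sketch of what that dispatch could look like on the tooling side (everything here is an assumption: the suite names, the URL syntax, and treating both suites' output as TAP):

    import re

    def count_tap_passes(text):
        """Count passing test lines in TAP-style output ("ok N ...")."""
        return sum(1 for line in text.splitlines() if line.startswith("ok "))

    # Hypothetical map from suite names to result-format parsers;
    # kselftest and KUnit can both produce TAP, so they share one here.
    PARSERS = {
        "kselftest": count_tap_passes,
        "kunit": count_tap_passes,
    }

    # Assumed trailer form: "Tested-with: <suite> [<results URL>]"
    TRAILER_RE = re.compile(r"^Tested-with:\s*(\S+)(?:\s+(https?://\S+))?\s*$")

    def parse_trailer(line):
        """Return (suite, URL or None, parser or None), or None if no match."""
        match = TRAILER_RE.match(line)
        if not match:
            return None
        suite, url = match.groups()
        return suite, url, PARSERS.get(suite)

    print(parse_trailer("Tested-with: kselftest https://example.com/run/1"))

The point being that the consumer only needs the suite name to pick a parser; unknown suites can simply be left for the maintainer to judge.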
Nick