Hello everyone,
one of the discussions we had during Connect was about finding (and
using) a common testing framework for unit tests (and maybe beyond).
Whatever we pick should probably be a framework that still supports all
the unittest-based tests we already have in our projects.
Following other people's suggestions as well, I looked around and did
some initial testing. Here is a list of what we could use:
- pytest: http://pytest.org/latest/
- nose: https://nose.readthedocs.org/en/latest/
- stick with Python's built-in unittest: no need to install anything else
Personally I do not dislike unittest, even if it is the most verbose of
the three, but pytest is a powerful, handy tool and easier to use (you
don't even need to inherit from TestCase).
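Just to illustrate that last point, a complete pytest test can be as
small as this (the add() function is made up for the example):

    # test_sample.py -- run with "py.test test_sample.py"
    # No TestCase subclass needed: plain functions plus assert.

    def add(a, b):
        # Stand-in for real project code.
        return a + b

    def test_add():
        assert add(2, 3) == 5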
I haven't played with pytest and Django tests, but it looks like they
can be integrated easily:
http://pytest-django.readthedocs.org/en/latest/
Tools:
These kinds of tools are targeted more at mocking/patching objects or
behavior; should we avoid them or use them? (no flame wars, please! :-)
Personally I find that sometimes (probably too often) I need them. If
we do need them, we should agree on which one to use and stick with it:
- mock: http://www.voidspace.org.uk/python/mock/
- mocker: https://pypi.python.org/pypi/mocker
There are more, but these two are the ones I know about or have been using.
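To give a flavour of the style, here is roughly how mock's patch()
works (a sketch only; urllib2 is just a convenient patch target, not a
recommendation):

    import urllib2
    import mock

    # Replace an external call for the duration of the block so the
    # test never touches the network.
    with mock.patch('urllib2.urlopen') as mocked:
        mocked.return_value.read.return_value = 'fake data'
        assert urllib2.urlopen('http://example.com').read() == 'fake data'
        mocked.assert_called_once_with('http://example.com')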
Other suggestions?
Ciao!
--
Milo Casagrande | Automation Engineer
Linaro.org <www.linaro.org> │ Open source software for ARM SoCs
At LCE13, we proposed checking code against PEP8
(http://cards.linaro.org/browse/LAVA-485) prior to merges, and the
initial thought was that we could just rely on PyCharm showing the code
as "all green" to achieve PEP8 compatibility.
It turns out that PyCharm doesn't restrict itself to PEP8 and can
override PEP8 with project-specific settings in some areas (typically
line length).
Line length is actually a bit awkward in places: I've been using the
default PyCharm length of 120, whilst PEP8 advises 79. Keeping to 79
while keeping the code readable could involve a number of new
single-use subroutines to reduce the indenting. Is this worth doing? If
not, we can look at a project-specific [pep8] setting in setup.cfg.
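For example (a minimal sketch, assuming we standardise on the
command-line pep8 tool, which reads a [pep8] section from setup.cfg):

    [pep8]
    max-line-length = 120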
Other tools for PEP8 include pep8 itself, pylint and pychecker.
The command-line pep8 tool has various false positives and false
negatives in its bug history, so we may all have to run the same
version, and then mandate that version when others test the code too.
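Pinning could be as simple as this (version number picked arbitrarily
for the example):

    $ pip install pep8==1.4.6
    $ pep8 --max-line-length=120 lava_dispatcher/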
pylint disagrees with pep8 and PyCharm on what is an error (e.g. line
length 80).
pychecker is more of a source code checker; it tries to import all
modules before checking, which can lead to errors. For example, when
checking my MultiNode files, it complained about missing SSL imports,
whereas there is no such warning when running the code or in PyCharm.
PyCharm itself includes its own error checking on top of PEP8 and
merges PEP8 with PyCharm's own validation output. For example, mutable
default arguments raise a warning of the same type as a PEP8 warning:
lava_dispatcher/actions/lava_android_test.py
    def run(self, commands=[], command_file=None, parser=None,
            timeout=-1):

needs to become:

    def run(self, commands=None, command_file=None, parser=None,
            timeout=-1):
        if not commands:
            commands = []
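For background, a mutable default is created once, at function
definition time, and shared between calls - a quick standalone
illustration (not LAVA code):

    def append_item(item, bucket=[]):
        # The same list object is reused by every call that
        # relies on the default.
        bucket.append(item)
        return bucket

    print append_item(1)   # [1]
    print append_item(2)   # [1, 2] -- the default kept its state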
Arguably, this is probably a correct change, but there are other
situations where PyCharm appears to get it wrong:
e.g. in lava_dispatcher/signals/__init__.py (a file which has had lots
of changes in MultiNode). PyCharm thinks that this (working) code is an
error:
    def signal(self, name, params, context=None):
        self.context = context
        handler = getattr(self, '_on_' + name, None)
        if not handler and self._cur_handler:
            handler = self._cur_handler.custom_signal
            params = [name] + list(params)
        if handler:
            try:
                handler(*params)
            except:
                logging.exception("handling signal %s failed", name)
PyCharm complains about two things here: on handler, "'object' object
is not callable", and on the except clause, "Too broad exception
clause". The exception
warning is a "weak warning" in PyCharm and wouldn't stop the code
getting a "green light". However, the handler code works just fine and
it's not clear to me how to change the code to add a __call__ function
which will reference the correct handler. Most of the changes I have
considered are quite intrusive for this part of the code.
We could document these as "overrides" using a comment but that won't
change how PyCharm shows the file, yet this "error" has nothing to do
with PEP8 AFAICT.
How intrusive do we want to go for PEP8? PyCharm isn't going to be the
PEP8 checker used during packaging or deployment, so what checker are
we going to use and how?
--
Neil Williams
=============
http://www.linux.codehelp.co.uk/
Hi,
I am trying to get a couple of new device types into LAVA. I have a
new config file in a checkout of lava-dispatcher, which is linked in
<instance>/code/current.
Path to code I want to run:
/srv/lava/instances/lab/code/current/local/lava-dispatcher/lava_dispatcher/default-config/lava-dispatcher/device-types/
I am running bin/buildout in /srv/lava/instances/lab/code/current/, but
the new code isn't ending up in
/srv/lava/.cache/branch-cache/lava-dispatcher/checkouts.
Clearly I am missing something!
Any ideas?
--
James Tunnicliffe
Hi, All
From this file:
http://bazaar.launchpad.net/~linaro-validation/lava-lab/salt-states/view/he…
we can see that the /usr/local/bin/adb file should be a symbolic link
to /usr/local/android-sdk-linux/platform-tools/adb,
but now it's a normal file (the adb wrapper from before):
liuyq0307@fastmodels01:~$ ll `which adb`
-rwxr-xr-x 1 root root 877 May 31 03:54 /usr/local/bin/adb*
liuyq0307@fastmodels01:~$
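For comparison, the state described by the salt rule would be recreated
with something like this (a sketch, just to show the expected layout):

    $ sudo ln -sf /usr/local/android-sdk-linux/platform-tools/adb /usr/local/bin/adb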
Does anyone know what is wrong?
--
Thanks,
Yongqin Liu
---------------------------------------------------------------
#mailing list
linaro-android@lists.linaro.org
http://lists.linaro.org/mailman/listinfo/linaro-android
linaro-validation@lists.linaro.org
http://lists.linaro.org/pipermail/linaro-validation
Hi,
I have a board here that doesn't boot to a prompt: it boots, but you
don't get a prompt until you hit Enter. At that point it is the most
basic sh prompt ("# "). I am hoping to get the developer to make it
boot to a prompt, but until then it would be good to work around this
if I can.
I imagine LAVA uses a regexp to detect when a board has booted, but I
can't find it in the source. Is it waiting for the command prompt or
can I apply some expect style scripting? Do I need a prompt in a
specific form for testing? I have seen LAVA with the return code of
the previous command in the prompt before, but don't know if I need to
get images set up to do that or if I can script setting that up as
part of a test or boot process.
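To illustrate what I mean by expect-style scripting, something like
this pexpect sketch (the connection command and prompt pattern are made
up):

    import pexpect

    # Attach to the board's serial console (command is an example only).
    conn = pexpect.spawn('conmux-console mydevice01')

    # The board only prints its prompt after a keypress, so send a
    # newline first, then wait for the minimal "# " shell prompt.
    conn.sendline('')
    conn.expect('# ', timeout=300)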
Thanks,
--
James Tunnicliffe
Hi all,
We need to start the switchover from Launchpad single sign-on, and to do this there will need to be a short downtime of the LAVA server - it should be less than 5 minutes, maybe less than a minute if I do it right.
I intend to do this on Wednesday at 11:00 UTC (12:00 BST). I would do it tomorrow, but I have a doctor's appointment in the morning.
Thanks for your patience
Dave
Hey Lava devs,
I realise that right now might not be the best time in the development
cycle to mention this, but I've just watched an interesting talk on
really using postgres with Django:
http://www.youtube.com/watch?v=S-kbfpRpVsY
In particular, I think the (already postgres-specific) horrific code I
wrote for matching tags on job dispatch could be replaced with simple
tag arrays on Job and Device and the "<@" (is-contained-by) operator.
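For reference, "<@" is Postgres's array containment test, so the
dispatch check would boil down to something like this (table and column
names invented for the sketch):

    -- A job can run on a device when every tag the job asks for
    -- is present in the device's tag array.
    SELECT device.hostname
      FROM device, job
     WHERE job.id = 42
       AND job.tags <@ device.tags;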
There are some other bits that could potentially be used (e.g. the JSON
stuff, maybe hstore for test results) but this part jumped out at me!
Cheers,
mwh
Hi all,
We need to take staging down for a while to move it from its current cloud node to the master node, so that we can release a node from the cloud for the multi-node dispatcher to run on bare metal. I plan to do the migration tomorrow morning, starting at 08:00 UTC. If all goes well, staging should be back up before 12:00 UTC.
If anyone knows a reason why I should delay this, please let me know by e-mail before 08:00 UTC tomorrow.
Thanks
Dave
Hi All,
Something I've been meaning to do for a long time is synchronise the latest LAVA master images with lava-create-master, and I've now completed this and tidied things up.
The definitive master images can now be found at:
http://images.validation.linaro.org/lava-masters/
I've tidied up the directory above this, so if you were relying on the master images being there, please note the change.
Thanks
Dave