Hi Tyler,
Thanks for your information; it was useful.
Current status: I can pass "android_install_binaries" but am getting stuck
in "boot_linaro_android_image". I think the problem is still related to the
panda driver.
The page you posted (https://developers.google.com/android/nexus/drivers)
has four files for the PandaBoard driver:
20111114
20111216
20120430
20120807
I don't know which one is suitable for my PandaBoard, so I have tried
every one.
None of them can pass "boot_linaro_android_image".
The logs are as follows:
20111114:
[ 521.131805] PVR_K:(Error): BridgedDispatchKM: Initialisation failed.
Driver unusable. [4812,
/mnt/jenkins/workspace/linaro-android-member-ti_panda-linaro-13.02-release/build/kernel/drivers/gpu/pvr/bridged_pvr_bridge.c]
[ 521.202728] PVR_K:(Error): BridgedDispatchKM: Initialisation failed.
Driver unusable. [4812,
/mnt/jenkins/workspace/linaro-android-member-ti_panda-linaro-13.02-release/build/kernel/drivers/gpu/pvr/bridged_pvr_bridge.c]
[ 521.552642] init: untracked pid 3766 exited
[ 521.626464] init: untracked pid 3762 exited
LEO COMMENT: PVR_K error. Is this driver not suitable for my PandaBoard?
20111216:
linaro-13.02-release/build/kernel/drivers/gpu/pvr/bridged_pvr_bridge.c]
[ 37.700134] PVR_K:(Error): BridgedDispatchKM: Initialisation failed.
Driver unusable. [4812,
/mnt/jenkins/workspace/linaro-android-member-ti_panda-linaro-13.02-release/build/kernel/drivers/gpu/pvr/bridged_pvr_bridge.c]
[ 38.766967] init: untracked pid 1771 exited
[ 41.124938] init: untracked pid 1774 exited
LEO COMMENT: It looks the same as 20111114.
20120430:
<LAVA_DISPATCHER>2013-03-27 02:30:45 PM DEBUG: expect (1800): '['Displayed
com.android.launcher/com.android.launcher2.Launcher:']'
logcat -s ActivityManager:I
--------- beginning of /dev/log/main
--------- beginning of /dev/log/system
[ 96.093536] warning: `zygote' uses 32-bit capabilities (legacy support
in use)
I/ActivityManager( 1776): Memory class: 48
I/ActivityManager( 1776): Enabled StrictMode logging for AThread's Looper
LEO COMMENT: No error message with this version, but it hangs and nothing
more is printed. There is no console command line when I press the Enter key.
20120807:
[ 53.677490] PVR_K:(Error): BridgedDispatchKM: Driver initialisation not
completed yet. [4836,
/mnt/jenkins/workspace/linaro-android-member-ti_panda-linaro-13.02-release/build/kernel/drivers/gpu/pvr/bridged_pvr_bridge.c]
[ 53.737884] PVR_K:(Error): BridgedDispatchKM: Driver initialisation not
completed yet. [4836,
/mnt/jenkins/workspace/linaro-android-member-ti_panda-linaro-13.02-release/build/kernel/drivers/gpu/pvr/bridged_pvr_bridge.c]
[ 54.163696] init: untracked pid 1781 exited
[ 55.792907] init: untracked pid 1784 exited
LEO COMMENT: It looks the same as 20111114.
One thing worth mentioning: version 20120807 has no "pvrsrvinit", only
"pvrsrvctl". Are these two files the same?
LEO
On 26 March 2013 20:57, Tyler Baker <tyler.baker(a)linaro.org> wrote:
> Hi Leo Wu,
>
> The "android_install_binaries" dispatcher command will download, unpack,
> and deploy proprietary shared objects to the Android file system.
>
> You can get these binaries from here:
> https://developers.google.com/android/nexus/drivers - Choose the correct
> binaries based on your Android build target.
>
> You will then have to create the following directory structure and tgz it:
>
> ./bin:
> pvrsrvinit
>
> ./vendor:
> lib
>
> ./vendor/lib:
> egl hw libglslcompiler.so libIMGegl.so libpvr2d.so
> libpvrANDROID_WSEGL.so libPVRScopeServices.so libsrv_init.so
> libsrv_um.so libusc.so
>
> ./vendor/lib/egl:
> libEGL_POWERVR_SGX540_120.so libGLESv1_CM_POWERVR_SGX540_120.so
> libGLESv2_POWERVR_SGX540_120.so
>
> ./vendor/lib/hw:
> gralloc.omap4.so
>
> At this point you have created your own panda-driver.tgz
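A sketch of the packaging step described above, assuming the Google
self-extractor has already been run; the extraction directory name and the
cp lines are illustrative (shown commented out), while the mkdir/tar part is
the piece that guarantees the archive paths start with ./bin and ./vendor:

```shell
# Staging directory for the tarball layout described above.
STAGE=$(mktemp -d)

mkdir -p "$STAGE/bin" "$STAGE/vendor/lib/egl" "$STAGE/vendor/lib/hw"

# Copy the blobs into place (illustrative; "imgtec-panda" is a hypothetical
# extraction directory, filenames taken from the listing above):
#   cp imgtec-panda/pvrsrvinit        "$STAGE/bin/"
#   cp imgtec-panda/lib*.so           "$STAGE/vendor/lib/"
#   cp imgtec-panda/egl/*.so          "$STAGE/vendor/lib/egl/"
#   cp imgtec-panda/gralloc.omap4.so  "$STAGE/vendor/lib/hw/"

# Tar from inside the staging dir so entries are relative (./bin, ./vendor).
tar -czf panda-drivers.tgz -C "$STAGE" .
```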
>
> Now you will need to host it on a web server/file system that your LAVA
> server can access.
>
> For instance, my LAVA server has the following defined:
>
> android_binary_drivers = http://192.168.1.2/panda-drivers.tgz <-- You
> have to host this URL
>
> Hopefully this clears up any confusion you may have. Thanks.
>
>
>
>
>
>
> On 26 March 2013 02:15, Leo Wu <leo.wu(a)linaro.org> wrote:
>
>> Hi:
>> Does anyone know how to use the "android_install_binaries" command in a json
>> file? Please give me some suggestions.
>> I am trying to perform a LAVA Android test on a PandaBoard. Currently, I
>> face a problem when the json file executes "android_install_binaries".
>> The log shows it tries to connect to
>> http://192.168.1.21/LAVA_HTTP/android-binaries/panda-drivers.tgz.
>>
>> log: http://192.168.1.21/LAVA_HTTP/android-binaries/panda-drivers.tgz Connecting
>> to 192.168.1.21:80... failed:Connection timed out.
>> RuntimeError: Extracting
>> http://192.168.1.21/LAVA_HTTP/android-binaries/panda-drivers.tgz on
>> target failed
>>
>> LAVA SERVER IP: 222.222.222.4
>> Board: pandaboard
>> Master Image: Linaro PreBuild Image
>> Test Image: LEB android
>>
>>
>> 1. My LAVA server IP is 222.222.222.4. I am confused: why does it try to
>> connect to 192.168.1.21?
>> 2. What is panda-drivers.tgz? Does it already exist on the LAVA server, or
>> do I need to download it manually and store it locally first?
>> 3. How do I configure it?
>>
>> Thank you
>> Leo
>>
>> _______________________________________________
>> linaro-validation mailing list
>> linaro-validation(a)lists.linaro.org
>> http://lists.linaro.org/mailman/listinfo/linaro-validation
>>
>>
>
>
> --
> Tyler Baker
> Technical Architect, Automation & CI
> Linaro.org | Open source software for ARM SoCs
> Follow Linaro: http://www.facebook.com/pages/Linaro
> http://twitter.com/#!/linaroorg - http://www.linaro.org/linaro-blog
>
Hi,
I can't create stream anymore:
XML-RPC error 403: Only a member of group 'linaro' could create this stream
PS: this used to work, and I've been creating streams on a weekly basis,
so it's most likely a recent change.
Cheers,
--
Fathi Boudra
Builds and Baselines Manager | Release Manager
Linaro.org | Open source software for ARM SoCs
Hi,
I made a small modification to the lava-scheduler app code that enables
direct linking to line numbers in the full log view. The branch was posted
to Launchpad:
http://bazaar.launchpad.net/~mwasilew/lava-scheduler/log_linenumbers/revisi…
The solution isn't perfect, as it hardcodes HTML tags into a Django
templatetag, but I wanted minimal changes in the scheduler app code.
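For context, the core of the idea can be shown with a plain function
(hypothetical names; the real branch does this inside a Django templatetag):
each log line is wrapped in an element with an id, so a URL fragment like
#L42 jumps straight to line 42.

```python
from html import escape  # stdlib escaping (Python 3.2+)

def number_log_lines(log_text):
    """Wrap each log line in a span with an id so '#L42' links to line 42."""
    out = []
    for n, line in enumerate(log_text.splitlines(), start=1):
        out.append('<span id="L%d"><a href="#L%d">%d</a> %s</span>'
                   % (n, n, n, escape(line)))
    return "\n".join(out)
```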
If you like it I will request the merge :)
Best Regards,
milosz
Hello everyone,
Today we deployed a change to validation.linaro.org to drop the
/lava-server prefix. This was motivated by the fact that new LAVA
installs do not have any prefix by default, and having a different
format in the main LAVA instance out there caused some confusion.
So all requests to /lava-server/$ADDRESS are now being automatically
redirected to /$ADDRESS.
A special case is the API URL for job submission. Before you would use
this:
https://validation.linaro.org/lava-server/RPC2/
Now, you should use this:
https://validation.linaro.org/RPC2/
API requests at the old address still work, but we kindly request that
everyone stop using it and start using the new URL.
We want to drop support for the old URL in the future to reduce the
delta between our LAVA instance and a standard setup, so please do
upgrade automated scripts and static job files you maintain to use the
new URL instead of the old one. In a few months we will come back to you
announcing a deadline for definitively discontinuing support for the old
URL format.
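For scripts that still carry the old prefix, the update is mechanical; a
Python 3 sketch (the commented scheduler.submit_job call is shown only as an
illustration of where the URL ends up being used):

```python
import xmlrpc.client

def drop_prefix(url):
    """Rewrite an old '/lava-server'-prefixed RPC2 URL to the new form."""
    return url.replace("/lava-server/RPC2/", "/RPC2/")

old = "https://validation.linaro.org/lava-server/RPC2/"
new = drop_prefix(old)  # -> https://validation.linaro.org/RPC2/

# server = xmlrpc.client.ServerProxy(new)
# server.scheduler.submit_job(open("job.json").read())  # illustrative
```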
--
Antonio Terceiro
Software Engineer - Linaro
http://www.linaro.org
Hello everyone,
one of the discussions we had during Connect was about finding (and using)
a common testing framework for unit tests (and maybe beyond).
Whatever we choose should probably be a framework that still supports all
the unittest-based tests we already have in our projects.
Following other people's suggestions as well, I looked around and did some
initial testing. What follows is a list of what we could use:
- pytest: http://pytest.org/latest/
- nose: https://nose.readthedocs.org/en/latest/
- stick with Python's built-in unittest: no need to install anything else
Personally, I don't dislike unittest even though it is the most verbose of
the three, but pytest is a powerful, handy tool and easier to use (you
don't even need to inherit from TestCase).
I haven't played with pytest and Django tests, but it looks like it is
possible to integrate them easily:
http://pytest-django.readthedocs.org/en/latest/
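To illustrate the difference, here is the same trivial check written both
ways (the function under test is a made-up example, not project code):

```python
import unittest

def parse_version(s):
    # Toy function under test: "13.02" -> (13, 2)
    return tuple(int(p) for p in s.split("."))

# unittest style: class, inheritance, assert* methods
class TestParseVersion(unittest.TestCase):
    def test_parse(self):
        self.assertEqual(parse_version("13.02"), (13, 2))

# pytest style: a plain function with a bare assert is enough
def test_parse_version():
    assert parse_version("13.02") == (13, 2)
```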
Tools:
These kinds of tools are more targeted at mocking/patching objects or
behavior; should we avoid them or use them? (No flame wars, please! :-)
Personally, I find that sometimes (probably too often) I need them. If we
do need them, we should agree on which one to use and stick with it:
- mock: http://www.voidspace.org.uk/python/mock/
- mocker: https://pypi.python.org/pypi/mocker
There are more, but these two are the ones I know about or have been using.
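For reference, mock's stubbing style looks roughly like this (mock is a
separate package on Python 2; it later landed in the standard library as
unittest.mock in Python 3.3, which is what this sketch imports; the
fetch_status function is a made-up example):

```python
from unittest import mock  # "import mock" with the standalone package

def fetch_status(client):
    # Made-up code under test: asks a collaborator and cleans the result.
    return client.get("/status").strip()

# Replace the collaborator with a Mock and stub its return value.
fake = mock.Mock()
fake.get.return_value = "  running  "

assert fetch_status(fake) == "running"
fake.get.assert_called_once_with("/status")
```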
Other suggestions?
Ciao!
--
Milo Casagrande | Automation Engineer
Linaro.org <www.linaro.org> │ Open source software for ARM SoCs
At LCE13, we proposed to check code against PEP8
(http://cards.linaro.org/browse/LAVA-485) prior to merges and the
initial thought was that we could just rely on PyCharm to show the code
as "all green" to achieve PEP8 compatibility.
Turns out that PyCharm doesn't restrict itself to PEP8 and can override
PEP8 with project-specific settings in some areas (typically line
length).
Line length is actually a bit awkward in places and I've been using the
default PyCharm length of 120 whilst PEP8 advises 79. To keep the code
readable, this could involve a number of new single-use sub-routines to
reduce the indenting. Is this worth doing? If not, we can look at a
project-specific [pep8] setting in setup.py.
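A sketch of such an override, matching the 120-character length mentioned
above; note that, as far as I know, the command-line pep8 tool reads its
[pep8] section from setup.cfg or tox.ini rather than setup.py itself, so
treat the exact file as an assumption to verify:

```ini
[pep8]
max-line-length = 120
```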
Other tools for PEP8 include pep8 itself, pylint and pychecker.
The command-line pep8 tool has various false positives and false negatives
in its bug history, so we may all have to run the same version, and then
mandate that version when others test the code too.
pylint disagrees with pep8 and PyCharm on what is an error (e.g. line
length 80).
pychecker is more of a source code checker and it tries to import all
modules before checking which can lead to errors. e.g. when checking my
multinode files, it complained about missing SSL imports when there is
no such warning when running the code or in PyCharm.
PyCharm itself includes its own error checking on top of PEP8 and merges
PEP8 with PyCharm's own validation output, e.g. mutable default arguments
raise a warning of the same type as a PEP8 warning:
lava_dispatcher/actions/lava_android_test.py

    def run(self, commands=[], command_file=None, parser=None,
            timeout=-1):

needs to become:

    def run(self, commands=None, command_file=None, parser=None,
            timeout=-1):
        if not commands:
            commands = []
Arguably, this is probably a correct change, but there are other
situations where PyCharm appears to get it wrong:
e.g. in lava_dispatcher/signals/__init__.py (a file which has had lots
of changes in MultiNode). PyCharm thinks that this (working) code is an
error:
    def signal(self, name, params, context=None):
        self.context = context
        handler = getattr(self, '_on_' + name, None)
        if not handler and self._cur_handler:
            handler = self._cur_handler.custom_signal
            params = [name] + list(params)
        if handler:
            try:
                handler(*params)
            except:
                logging.exception("handling signal %s failed", name)
PyCharm complains of two things here. handler: "'object' object is not
callable" and except: "Too broad exception clause". The exception
warning is a "weak warning" in PyCharm and wouldn't stop the code
getting a "green light". However, the handler code works just fine and
it's not clear to me how to change the code to add a __call__ function
which will reference the correct handler. Most of the changes I have
considered are quite intrusive for this part of the code.
We could document these as "overrides" using a comment but that won't
change how PyCharm shows the file, yet this "error" has nothing to do
with PEP8 AFAICT.
How intrusive do we want to go for PEP8? PyCharm isn't going to be the
PEP8 checker used during packaging or deployment, so what checker are
we going to use and how?
--
Neil Williams
=============
http://www.linux.codehelp.co.uk/
Hi,
I am trying to get a couple of new device types into LAVA. I have a
new config file in a checkout of lava-dispatcher, which is linked in
<instance>/code/current.
Path to code I want to run:
/srv/lava/instances/lab/code/current/local/lava-dispatcher/lava_dispatcher/default-config/lava-dispatcher/device-types/
Running bin/buildout in /srv/lava/instances/lab/code/current/
The new code isn't ending up in
/srv/lava/.cache/branch-cache/lava-dispatcher/checkouts.
Clearly I am missing something!
Any ideas?
--
James Tunnicliffe
Hi, all
From this file:
http://bazaar.launchpad.net/~linaro-validation/lava-lab/salt-states/view/he…
we can see that /usr/local/bin/adb should be a symbolic link to
/usr/local/android-sdk-linux/platform-tools/adb,
but right now it's a normal file (the old adb wrapper):
liuyq0307@fastmodels01:~$ ll `which adb`
-rwxr-xr-x 1 root root 877 May 31 03:54 /usr/local/bin/adb*
liuyq0307@fastmodels01:~$
Does anyone know what went wrong?
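One way to check, and restore, the link by hand (paths as in the salt state
above; the ln line is commented out since it needs root and should really be
left to salt):

```shell
ADB_TARGET=/usr/local/android-sdk-linux/platform-tools/adb
ADB_LINK=/usr/local/bin/adb

if [ -L "$ADB_LINK" ]; then
    echo "symlink -> $(readlink "$ADB_LINK")"
else
    echo "regular file; should be replaced with a symlink:"
    # sudo ln -sf "$ADB_TARGET" "$ADB_LINK"
fi
```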
--
Thanks,
Yongqin Liu
---------------------------------------------------------------
#mailing list
linaro-android(a)lists.linaro.org
http://lists.linaro.org/mailman/listinfo/linaro-android
linaro-validation(a)lists.linaro.org
http://lists.linaro.org/pipermail/linaro-validation
Hi,
I have a board here that doesn't boot to a prompt: it boots, but you
don't get a prompt until you hit Enter, and at that point it is the most
basic sh prompt (# ). I am hoping to get the developer to make it boot to
a proper prompt, but until then it would be good to work around this if I can.
I imagine LAVA uses a regexp to detect when a board has booted, but I
can't find it in the source. Is it waiting for the command prompt or
can I apply some expect style scripting? Do I need a prompt in a
specific form for testing? I have seen LAVA with the return code of
the previous command in the prompt before, but don't know if I need to
get images set up to do that or if I can script setting that up as
part of a test or boot process.
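The matching itself is typically just a regular expression applied to the
accumulated serial output; a stdlib-only sketch of the idea (the pattern and
function name are hypothetical, not LAVA's actual internals):

```python
import re

# Match either a typical shell prompt like "root@panda:~# " at the end of
# the buffer, or the bare "# " this board emits after hitting Enter.
PROMPT_RE = re.compile(r"(?:^|\n)(?:[\w.@:~/-]+[#$] |# )$")

def saw_prompt(buffer):
    """Return True once the accumulated serial output ends in a prompt."""
    return PROMPT_RE.search(buffer) is not None
```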
Thanks,
--
James Tunnicliffe
Hi all,
We need to start the switch over from Launchpad single sign-on, and to do this there will need to be a very small downtime of the LAVA server: it should be less than 5 minutes, maybe less than a minute if I do it right.
I intend to do this on Wednesday at 11:00UTC (12:00BST). I would do it tomorrow, but I have a doctor's appointment in the morning.
Thanks for your patience
Dave