Introducing “Python for Android”

I’m glad to share a new project called Python for Android. The goal of this project is to package your Python application into an APK.

https://github.com/kivy/python-for-android

The project is under the umbrella of the Kivy organization, but it is not designed to be limited to Kivy only. Read the documentation to correctly install the Android NDK/SDK and set the needed environment variables.

The packaging is done in 4 steps:
1. Ensure you have the Android SDK/NDK downloaded and correctly installed
2. Ensure you have the needed environment variables set
3. Create a Python distribution containing the selected modules
4. Use that distribution to build an APK of your Python application

Creating the Python distribution is as simple as this:
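For example, from a checkout of the repository (a sketch; the module list here is just an example, -m takes the recipes you want):

$ ./distribute.sh -m "pil kivy"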

A dist/default directory will be created, containing the result of the whole ARM compilation.
Available libraries as of today: jpeg pil png sdl sqlite3 pygame kivy android libxml2 libxslt lxml ffmpeg openssl.

The second step is a little bit harder, since you need to provide more information for Android:
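Roughly like this (a sketch: the touchtracer path and the identifiers are placeholders for your own application):

$ cd dist/default
$ ./build.py --package org.test.touchtracer --name touchtracer \
      --version 1.0 --dir ~/kivy/examples/demo/touchtracer debug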

Then you’ll get a nice bin/touchtracer-1.0-debug.apk

Pros:

  • A blacklist.txt file that can be used to exclude files from the final APK
  • A reusable distribution for other applications
  • A modular recipe architecture
  • The ability to build independent Python distributions

Cons:

  • You need a main.py file that will be used to start your application
  • Only one Java bootstrap is available, using OpenGL ES 2.0.
  • Only the Kivy toolkit is working. I’m sure other people can enhance it to add other toolkit recipes, but pygame, for example, is not going to work because the Android project is OpenGL ES 2.0: pygame drawing will not work.

I hope you’ll like it :)

We would like to thank Renpy / PGS4A for the initial pygame-for-android project.

Kivy on Android, part 2

Hi guys,

Looks like people are following my blog and waiting for an Android version of Kivy.
We have a launcher that you can already use; check it out.

Maybe during the next release, or a little bit after, I’ll release a tool to create an Android package of a Kivy application. The code is already on Launchpad, but it’s still a work in progress. As soon as I’m finished, I’ll publish it on the kivy-dev mailing list. If you haven’t subscribed yet, do it now! :)

More to come by the end of the week!

Kivy (next PyMT) on Android, step 1 done!

Tonight is a wonderful night.

I know I haven’t announced Kivy officially yet, but I’ll do it in another blog post very soon. You just need to know that Kivy is the next PyMT version. Over the last 2 years, Thomas and I have regularly had doubts and reflections about using Python for PyMT. I’ve started to look more at the future, and I was deeply convinced that, for our own sake, we must be able to run in a web browser. The goal is simple: the same code for every platform, at least the ones we use every day: Linux / Windows / Mac OS X / Android / iOS.

Android and iOS are new OSes, and we thought that, apart from running in a web browser, we would never be able to run on them. So we started to target a future with fewer dependencies, OpenGL ES 2.0 compatibility, and so on. This vision has been named Kivy. These last days, I’ve removed the numpy and PyOpenGL dependencies. Pygame is now the only library required to run an application with widgets (minimal doesn’t mean fully featured).

And I’ve started to look at the Android platform, since Tom from the Ren’Py project has delivered a pygame subset for Android. He did an awesome job. My part was just to understand how it works, and get Kivy compiling.

For now, here is what I’ve got:

OK, but what have I got exactly?

  • Python/Pygame running from renpytom’s project
  • A failed attempt to use numpy on Android
  • Kivy adaptation for Android (OpenGL debug mode, removing numpy and PyOpenGL, linking against OpenGL ES 2.0…)
  • A pygame change to create an OpenGL ES 2.0 context
  • Various patches on the build system

And here is my step 2:

  • Send all the build-system patches upstream
  • Resolve the symbol conflict when 2 compiled modules have the same name (kivy.event and pygame.event… nice naming)
  • Add a way to detect the Android platform from Python (see the sketch after this list)
  • Add multitouch support to pygame and/or Kivy
  • Add Android sleep/wakeup support in Kivy
  • Write documentation about how to compile a Kivy application for Android
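For the platform detection, here is a sketch of one approach: make the Java bootstrap export an environment variable, so the Python side only has to look for it. ANDROID_ARGUMENT is the name Kivy later settled on; at the time of this post it was still a todo:

import os

def is_android():
    # the Android bootstrap sets this variable before starting the interpreter
    return 'ANDROID_ARGUMENT' in os.environ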

For now, sleep time! Enjoy.

PyMT used in Fourvière Museum

Last week, the Museotouch project was inaugurated in the Fourvière Museum. The project was done by many actors, including Erasme and Mucho-Media.

One part of this project is a multitouch application to explore the museum’s content. You can select the origin and the period/era, and the matching museum content is displayed on the table. Then you can select each item to get more information.
As a user, you have another way to communicate with the application: when you come into the museum, they give you a badge with RFID inside. Walk around the museum, use your badge to “save” objects, and then put your badge on the table to explore what you have selected.
You can read the Erasme Museotouch page for more information, and see the Museotouch photos.

This application was developed by Nadège Bourguignon :)

Joojoo and multitouch on Ubuntu 10.10

Hi everyone,

Some people are asking how to get multitouch working on the Joojoo. It’s now possible, starting from Ubuntu 10.10 + some manipulations.

Utouch ppa

The working drivers for the Joojoo weren’t finished and ready for the Ubuntu 10.10 release. However, the driver is available from the utouch-team PPA. So add the PPA:
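Something like this (the exact PPA path is from memory, so double-check it on Launchpad):

$ sudo add-apt-repository ppa:utouch-team/utouch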

And then, install the driver:
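The package name here is a guess on my side; check the PPA listing for the real one:

$ sudo apt-get update
$ sudo apt-get install hid-egalax-dkms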

The driver should compile live, and will be available on the next reboot. But don’t reboot now, we have more things to do.
I had some trouble with the default drivers: the DKMS one doesn’t load if the old driver still exists.
So, ensure you have no other drivers in your /lib/modules:
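For example (the module name pattern is an assumption; remove only matches outside the dkms tree):

$ find /lib/modules -name '*egalax*'
# delete any old .ko found outside the dkms directory, then rebuild the module index:
$ sudo depmod -a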

udev rules

The Joojoo screen never gets the same /dev/input/eventX; it changes from time to time. And it’s not readable by regular users.
We will create a udev rule to change the permissions, and to symlink the /dev/input/eventX to /dev/input/event-joojoo.
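Create a new rules file (the filename is my choice; any file in /etc/udev/rules.d will do):

$ sudo nano /etc/udev/rules.d/99-joojoo.rules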

And put this content in the file:
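This matches the eGalax controller by its USB IDs (the same 0eef:720c we’ll see in lsusb further down); the mode and symlink values are a sketch, adapt them to your setup:

SUBSYSTEM=="input", ATTRS{idVendor}=="0eef", ATTRS{idProduct}=="720c", MODE="0666", SYMLINK+="input/event-joojoo"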

Reboot.

You should now have your screen correctly working with one finger.

PyMT

If you want to test multitouch, you can test PyMT:
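On Maverick, PyMT should be installable from the repositories (the package name is an assumption; otherwise, install it from source):

$ sudo apt-get install python-pymt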

Launch PyMT one time, and hit escape:
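Any PyMT app will do; here is a minimal sketch that just opens an empty window (~/.pymt/config is generated on the first run):

$ python -c 'from pymt import MTWindow, runTouchApp; MTWindow(); runTouchApp()'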

Edit the configuration file ~/.pymt/config, and add in the [input] section:
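A sketch, assuming PyMT’s hidinput provider and the symlink we created above (the provider name and line format are from memory; check the input documentation):

joojoo = hidinput,/dev/input/event-joojoo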

Then, you can test a quick demo or the full desktop:
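For example, from a PyMT source checkout (the example paths below are from memory and may differ; browse the examples directory for the real ones):

$ python examples/pictures/pictures.py
$ python examples/desktop/desktop.py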

Using the pymt-dev version

If you are using the pymt-dev version from GitHub, there are new features to make the Joojoo experience more fun:

Disable mouse on activity

If you still want to use the mouse, but not while the touchscreen is being used, you may want this feature. It will automatically disable the mouse when the touchscreen is used. Just change your [input] section to:
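Something like this (disable_on_activity is the parameter name I remember from pymt-dev; double-check it against the provider documentation):

[input]
mouse = mouse,disable_on_activity
joojoo = hidinput,/dev/input/event-joojoo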

Sleeping when no activity detected

This is highly experimental: the sleep module. If you don’t touch the screen for a while, the sleep module will introduce a sleep() call inside the main loop, in order to reduce the framerate. The sleep ramp and time can be configured. Check the sleep module documentation.

To activate it, put in ~/.pymt/config:
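A sketch (PyMT modules are activated by naming them in the [modules] section; the empty value just means “no arguments”):

[modules]
sleep =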

Conclusion

That’s all. The Xorg part is missing right now; it will be added soon :)

PyMT -to JavaScript-, the end.

As planned in September, I’ve explored how PyMT could be translated to JavaScript, using the Pyjamas project. Everything is possible, and this translation could be done, but I would actually need more than 2 weeks. It’s more like 4 months. Let’s check the details…

Pyjamas… not fully Python compatible!

Pyjamas doesn’t currently support:

  1. tuple unpacking in loops (for x, y in tuplelist / for x, y in zip(a[::2], a[1::2]))
  2. the with keyword on custom classes (actively used in PyMT: gx_matrix, gx_begin…)
  3. preprocessing. It would be very nice if we had a way to NOT parse some Python, and not resolve all imports, like #ifdef / #endif in C. For example, in the touch providers we test whether the system is Linux, Mac, or Windows to include the right input providers. But Pyjamas doesn’t care about the if, since it’s a Python translator, so it will try to convert the Windows part too. (See the snippet below.)
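To make these three points concrete, here is the kind of code Pyjamas chokes on (gx_matrix and drawRectangle are PyMT’s graphx names; the provider modules in point 3 are placeholders, not the real layout):

from pymt import gx_matrix, drawRectangle

# 1. tuple unpacking in a for loop
points = [(0, 0), (10, 5), (20, 8)]
for x, y in points:
    print x, y

# 2. the with keyword on a custom class: gx_matrix pushes/pops the
#    OpenGL matrix around the block
with gx_matrix:
    drawRectangle(pos=(0, 0), size=(50, 50))

# 3. platform-conditional imports: Pyjamas translates every branch,
#    even the ones that can never run in a browser
import sys
if sys.platform == 'win32':
    import wm_touch_provider      # placeholder name
elif sys.platform == 'darwin':
    import mac_touch_provider     # placeholder name
else:
    import linux_touch_provider   # placeholder name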

Numpy

When the scatter widget was rewritten, we focused on performance. All the math now depends on Numpy, a fully C/Python library. For the translation, we would have to rewrite all the parts we use as a pure JavaScript library.

pyOpenGL / graphx

Same as for Numpy: we use OpenGL everywhere. PyOpenGL performs well when we use contiguous arrays; that’s also why we use Numpy when passing data to OpenGL.
But the most important thing is… that none of our current calls are WebGL compatible.

As defined on the Khronos website, WebGL is based on the OpenGL ES 2.0 API. Direct-mode rendering (glBegin/glEnd/glVertex…) is not available in the OpenGL ES subset. Our entire graphx package, and all our widgets, would need to be rewritten.

And you know what? We are moving to our new graphics package for the next PyMT version. But it will not solve the issue, because the new graphics package is written… in Cython, a C/Python language. So it’s not compatible with Pyjamas either.

So after the move, the graphics library would also have to be completely rewritten for JavaScript.

Core providers

I didn’t reach this part. The goal would be to create a JavaScript platform for every provider, and implement it using JS() from Pyjamas. Not a big deal.

Conclusion

The end? The work to do is huge. It’s possible, but not alone. So if anyone wants to contribute to the JavaScript translation, please contact me :)

PyMT 0.5 out, what’s next?

One week ago, we released PyMT 0.5 (release notes). I’m very glad to see this release. We’ve worked so much to make it stable, and we’ll continue.

The good part of this release is its availability in the next Ubuntu, Maverick. If you didn’t know, Ubuntu Maverick will support multitouch. Before the announcement was made public, we joined the HCI group, and they have written documentation about Multitouch in Ubuntu, and PyMT in Ubuntu.
That’s very good news, and I hope we’ll be able to enhance our collaboration with Ubuntu, and share experience.

So, what’s next?

Since people are always asking for a roadmap, the PyMT Roadmap has been updated.
The next version will be PyMT 0.5.1, with the release scheduled for 6 September. It will include another set of fixes found since the 0.5 release, and it will also be the version shipped in the next Ubuntu: auto-detection of multitouch hardware, fixes for text rendering outside of the text input, the missing gstreamer appsink, more unit tests…

Starting in September, the “PyMT Week Report” will be back. Since the community is growing, it will be a good thing to know what’s new in the toolkit every week, and to explain whether we are hitting trouble or all the development is going fine :)

Then… I have lots of ideas in mind, and will focus on:

  • Spatialization: be able to handle 1 million widgets, added on an infinite plane.
  • Graphics: finish the graphics library (new in 0.5) and use it for all of our widgets.
  • App subsystem: replace the old MTPlugin with a new Application subsystem that lets the user browse and launch any PyMT application available on their system
  • Enhanced CSS writing (like less or sass-lang)

Then in November, I’ll try to port PyMT to JavaScript… using Pyjamas and HTML5.

PyMT is really going to be… awesome!

Multitouch experience on Joojoo

I’m writing this blog post just to remember what I’ve done, and how; I don’t want to forget it for the next time :) It’s incomplete, and written for developers.

Last week, I got a Joojoo, formerly known as the “Crunchpad”. The hardware looks good: Intel Atom N270 + NVIDIA ION (GeForce 9400M) + 4GB SSD + 1GB RAM + 1.3M webcam + USB + eGalax dual-touch screen + Wifi. At least, it looks like good hardware to hack on. Before you ask, I bought this tablet because of the NVIDIA ION: it supports OpenGL 3.0, enough to run PyMT! I won’t say anything about the default OS installed on the Joojoo… but will go straight into how to install a custom OS on it.

How to install a custom OS on the JooJoo

  1. You can use any USB device on the Joojoo (CD or USB). I used UNetbootin to create a custom USB boot key for Ubuntu Lucid. Don’t select the net install; the wifi driver is not included in the setup.
  2. Plug a USB hub + USB keyboard + USB key into the USB port of the Joojoo
  3. Hold down the DEL key and press the power button twice (once to make the Joojoo logo appear, and a second, longer press to power the screen)
  4. In the boot menu, change the order of the Joojoo SSD and your USB key, to make your USB key primary.
  5. Reboot by pressing the power button twice, as described.

If you see nothing on the screen, reset the Joojoo with the tool, and press the power button twice again.
Twice is really important; without that, the screen will stay black. On the second press, you should see the screen power up.

About drivers:

  • The default wifi driver is not working, so go to http://www.realtek.com/downloads/, search for 8192SE, and take the RTL8192SE drivers for Linux 2.6.
  • For the NVIDIA ION, just download the latest drivers from http://nvidia.com/.
  • For the touchscreen… that’s the whole problem. It’s not working by default. You can use the manufacturer’s driver available at http://home.eeti.com.tw/web20/eGalaxTouchDriver/linuxDriver.htm, but dual touch is not working.

So, I bought this tablet for dual touch + OpenGL 3.0… Let’s check if Windows works with dual touch.

How to install Windows 7 on the JooJoo

For this part, I would like to thank Fabien a lot for his experience in this domain.
The problem is that with a 4GB SSD, you can’t install a normal Windows Seven. You must build a custom Windows. To do that:

  1. go to Microsoft Windows 7 Embedded
  2. download the Standard 7 Toolkit
  3. launch the image configurator
  4. select all the things you want in Windows… like the Windows desktop, USB boot, power management… I’ll upload my “answer” file tomorrow.
  5. your image must be < 2GB.
  6. burn your answer file to USB (Tools menu). You’ll have a USB boot key ready to install Windows Seven Embedded!

Then, it’s the same steps as before: plug, reboot, install.

After installing the eGalax Windows driver, the touchscreen works with dual touch!

How to capture USB on Windows Seven?

Since Wireshark is not able to capture USB on Windows, and no free toolkit can do it at the moment, I used HHD USB Monitor, in its trial version. Just run it, select Raw capture, hit Start, play with the screen, reboot the screen, play with the screen again (1 finger, then 2), and export the whole capture to HTML.
And here is the Joojoo eGalax USB Capture.

And that’s a bingo. We have bulk messages with a 0x6 length, and others with a 0xc length (double of 0x6). It really looks like we got 2 touches! I don’t know what the setup messages are, but it looks like these messages activate the screen! Now… let’s go back to Linux to test :)

Check eGalax dual touch support with python-usb

Enable usb_debug (optional)

OK, this was really easy, thanks to the old experience of hacking the Dell SX2210T screen. Even if that experience was not a success, learning how USB works was really fun. First thing, activate USB sniffing to check that we are doing things right:


$ sudo modprobe usb_debug
$ sudo cat /sys/kernel/debug/usb/usbmon/0u   # 0u is where my device is connected

You’ll get a bunch of messages as soon as you play with the device. You can read the usbmon/usb_debug documentation about the format of these messages.

Search your device with python-usb

When you’re looking for the device in dmesg, you’ll see:

generic-usb 0003:0EEF:720C.0001: input,hiddev96,hidraw0: USB HID v2.10 Pointer [eGalax Inc. USB TouchController] on usb-0000:00:06.0-1/input0

You can also use lsusb:

Bus 004 Device 002: ID 0eef:720c D-WAV Scientific Co., Ltd

The 0EEF is your vendor ID, and 720C is your product ID. Let’s search for our device on the USB busses:
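A sketch with the old pyusb 0.x API (the one python-usb shipped at the time):

import usb

def find_device(vendor=0x0eef, product=0x720c):
    # walk all busses and devices until we hit the eGalax controller
    for bus in usb.busses():
        for dev in bus.devices:
            if dev.idVendor == vendor and dev.idProduct == product:
                return dev

dev = find_device()
handle = dev.open()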

Send SETUP configuration

To understand more about how USB works, you can check USB in a NutShell; the first part talks about the Setup packet.
This is exactly what we want, since in the capture we have:


000006: Control Transfer (UP), 02.08.2010 01:50:04.275 +0.0. Status: 0x00000000
Pipe Handle: 0x855dbe94

0A 01 41 0A 01 41

Setup Packet

40 00 00 00 00 00 06 00

Recipient: Device
Request Type: Vendor
Direction: Host->Device
Request: 0x0 (Unknown)
Value: 0x0
Index: 0x0
Length: 0x6

We have exactly 2 setup packets to transfer to the device. The most complicated part is understanding python-usb.

  • Recipient -> usb.RECIP_DEVICE
  • Request Type -> usb.TYPE_VENDOR
  • Direction -> usb.ENDPOINT_OUT

So, let’s send the first setup packet. This packet tells the device to start talking to us (after it, we receive lots of packets from it).
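With pyusb 0.x, that maps to a single controlMsg() call; the 6 data bytes are the ones from the capture above:

# bmRequestType 0x40 = ENDPOINT_OUT | TYPE_VENDOR | RECIP_DEVICE
handle.controlMsg(usb.ENDPOINT_OUT | usb.TYPE_VENDOR | usb.RECIP_DEVICE,
                  0x00, [0x0a, 0x01, 0x41, 0x0a, 0x01, 0x41],
                  value=0, index=0)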

And the second request activates dual-touch support (without it, we receive only one touch):
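Same call shape; I’m not reproducing the payload here, take the bytes of the second setup packet from the capture (or from the full source linked below):

# second_payload = [...]  # the data bytes of the second packet in the capture
handle.controlMsg(usb.ENDPOINT_OUT | usb.TYPE_VENDOR | usb.RECIP_DEVICE,
                  0x00, second_payload, value=0, index=0)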

And, that’s done! When we read the device, we receive packets!
The format is:

  • Flags: 8 bits (161 = ALIVE, 160 = UP)
  • X: 16 bits
  • Y: 16 bits
  • ID: 8 bits (65 = touch 1, 66 = touch 2)

The flags and id are not really flags/ids… because the values are not standard ones. But they don’t change with pressure, and not over time either… Dunno why.
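Putting the format together, here is a parsing sketch (6 bytes per packet; the byte order and the 0x81 read endpoint are assumptions on my side, the full source below has the real values):

import struct

def parse_packet(data):
    # flags(8) x(16) y(16) id(8) == 6 bytes, matching the 0x6 length above
    flags, x, y, touch_id = struct.unpack('<BHHB', ''.join(map(chr, data)))
    return flags == 161, x, y, touch_id   # alive?, position, touch id

data = handle.bulkRead(0x81, 6, 100)   # endpoint, size, timeout in ms
print parse_packet(data)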

Still, here is the full source code of the eGalax Joojoo dual touchscreen with python-usb.

And after?

Normally, 2.6.35 includes an eGalax driver, but it’s not working. I’ll check another day why the kernel driver doesn’t support dual touch for this screen!

FITG 2010 finished, debriefing.

Another awesome event, organized by Damien Marchal and Nicolas Roussel. And for the first time, the PyMT core team was all in the same place. Thanks to all my friends in Lille for coming during the show; it was really nice to see you too!

We gave one conference talk, using the “presemt” application. For all who wanted to use our presentation application, you can get it from the PyMT Apps repository. The slides will be available as a PDF soon. A lot of people were interested in, and impressed by, PyMT itself. I was very glad to meet some PyMT users, especially Jay!

At the barcamp, we ran 3 sessions:

  1. Write your first PyMT application (Mathieu)
  2. Make a game with PyMT + Pymunk (Christopher + Thomas)
  3. Advanced OpenGL optimization + Python optimization

(Content will be available soon)

During the event, some work was done:

  • finished the blob tracker in Movid
  • the ability to record/save PyMT events (event-recording branch)
  • the ability to share events over the network (event-share branch)
  • using PyMT + Ogre 3D with Anthony Martinet

And we thought a lot about the future of PyMT, and where we want to go for 1.0. More information soon :)

Also, the MINT team is currently looking for an intern to work on a project with PyMT. If anyone is interested, don’t hesitate to send your CV to Nicolas.

It looks like the future of PyMT is very bright, and it’s just the start…

THSF 2010 finished, debriefing.

First of all, thanks to all the awesome people I met at this event, in particular Marc Bruyère, who gave me a nice place to stay in his apartment. And all the other guys in the Tetalab!

I stayed at Mix’art Myrys, where a lot of artists live; very nice. Creativity and an open-minded spirit are the keywords of the place.
The event involved Corpus Media, and they put on a fantastic show based on the body, with video visualization and voice/audio transformation. I took a lot of pictures of Mix’art Myrys; just browse them a little bit to feel the spirit :)

I gave a conference talk on PyMT on Saturday. Unfortunately, the streaming was cut due to some guys using the free Internet connectivity for downloading torrents… Thanks, guys ): Part of the streaming is available, but the audio is out of sync. Specially for this event, you can see the Presemt application, made by Thomas and me. It’s available in PyMT-Apps: http://github.com/tito/pymt-apps/tree/master/presemt/. Presemt is an application for making dynamic presentations, based on PyMT.
I’ll record a little video, and push it to the website!
I think people really appreciated the conference, and we already got new users :)


During the workshop day, Jimmy asked me if we could do an SMS wall with PyMT, and play with it… I spent more than an hour checking how I could get the SMS from my HTC Hero, and I found Remote Notifier. Once a message is received on my phone, it broadcasts a TCP packet over Wifi. 30 minutes later, we had our first version of the SMSWall (a sketch of the receiving side is below).
With a nice girl from the Tetalab (I don’t remember her name :/), we started a V2 on Saturday, with some other visualizations. But I wasn’t able to finish the code…
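For the curious, here is roughly what that receiving side looks like; the port number and the plain-text payload handling are assumptions, not the actual SMSWall code:

import socket

# listen for Remote Notifier packets on the local network
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('', 10600))   # port is an assumption
server.listen(1)
while True:
    conn, addr = server.accept()
    data = conn.recv(1024)  # one notification per connection
    print 'notification from %s: %s' % (addr[0], data)
    conn.close()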


On the last day, I wanted to play with their LED wall, which is under construction. An awesome work by Lionel and Fabrice. This was my first attempt at using an Arduino… and a success! After fixing some issues in the Arduino code, we were able to use a sketch: Camera -> Pixelize -> Grayscale -> Send to Arduino.

Now, you can check the videos:
THSF – PyMT SMSWall
THSF – PyMT SMSWall + Deaf people
THSF – Led Wall!

And the pictures : http://picasaweb.google.fr/txprog/THSF2010