Kivy (the next PyMT) on Android, step 1 done!

Tonight is a wonderful night.

I know that I haven't officially announced Kivy yet; I'll do it in another blog post very soon. For now, you just need to know that Kivy is the next version of PyMT. For two years, Thomas and I have had regular doubts and reflections about using Python for PyMT. Looking further into the future, I was deeply convinced that, for our own sake, we must be able to run in a web browser. The goal is simple: the same code for every platform, at least the ones we use every day: Linux / Windows / Mac OS X / Android / iOS.

Android and iOS are new OSes, and we thought that, short of running in a web browser, we would never be able to run on them. So we started to target a future with fewer dependencies, OpenGL ES 2.0 compatibility, and so on. This vision has been named Kivy. These last days, I've removed the numpy and PyOpenGL dependencies. Pygame is now the only library required to run an application with widgets (minimal doesn't mean it's not full featured).

And I've started to look at the Android platform, since Tom from the Ren'Py project has delivered a Pygame subset for Android. He has done an awesome job. My part was just to understand how it works, and get Kivy to compile.

For now, here is what I've got:

OK, but what did I get exactly?

  • Python/Pygame running, from the renpytom project
  • A failed attempt to use numpy on Android
  • Kivy adaptation for Android (OpenGL debug mode, removing numpy and PyOpenGL, linking to OpenGL ES 2.0…)
  • A Pygame change to create an OpenGL ES 2.0 context
  • Various patches on the build system

And here is my step 2:

  • Send all the build-system patches upstream
  • Resolve symbol conflicts when 2 compiled modules have the same name (kivy.event and pygame.event… nice naming)
  • Add a way to detect the Android platform from Python
  • Add multitouch support to Pygame and/or Kivy
  • Add Android sleep/wakeup support in Kivy
  • Write documentation about how to compile a Kivy application for Android
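
For the platform-detection item, here is a possible sketch. Both markers it checks are assumptions on my side: the environment variable name is whatever the launcher may export (it might be named differently), and the build.prop path only exists on a real Android filesystem.

```python
import os

def is_android():
    """Guess whether we run under an Android Python launcher.

    Heuristic only: the ANDROID_ARGUMENT variable name is an
    assumption about what the launcher exports, and
    /system/build.prop only exists on an Android filesystem.
    """
    if 'ANDROID_ARGUMENT' in os.environ:
        return True
    return os.path.exists('/system/build.prop')
```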

For now, sleep time! Enjoy.

Receiving SMS in Python, for cheap.

For a few months now, I've been using my Android phone to forward new SMS to my computer using Android Notifier: every new SMS was sent over Wifi/UDP. Recently I started looking for a standalone and cheap alternative. Hack a Day published an article, "Cheap and easy SMS via GSM for your MCU", which mentions something very interesting: a USB GSM modem for only $25.

To use this modem, I also need a SIM card. There are lots of possibilities to explore, but the one I took is a prepaid card from Bouygues, for only 9.90€. You only get 5 minutes of calls… but that doesn't matter. What does matter is how long the phone number stays available: in this case, the number remains valid for 6 months. 9.90€ for a 6-month number is totally enough for just receiving SMS.

The GSM modem is internally a BenQ M23 (specifications). You can talk to it from a simple serial command line:

$ cu -l /dev/ttyUSB0


SW ver: 1.80
HW ver: 1.00
FS ver: 1.00
Build Date: 2004/6/25
Build Time: 18:40:37
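
The same conversation can be scripted with pyserial. This is only a sketch: the device path and the fact that the modem answers on the default serial settings are assumptions to verify against the M23 datasheet.

```python
def at(port, cmd):
    """Send one AT command and return the stripped response lines."""
    port.write((cmd + '\r\n').encode('ascii'))
    return [line.decode('ascii', 'replace').strip()
            for line in port.readlines()]

# Usage sketch (assumes the modem enumerates as /dev/ttyUSB0 and
# answers with the default serial settings):
#   import serial
#   modem = serial.Serial('/dev/ttyUSB0', timeout=2)
#   at(modem, 'ATI')          # same identification as the cu session
#   at(modem, 'AT+CMGF=0')    # switch to PDU mode
#   at(modem, 'AT+CMGL=4')    # list all stored messages (4 = "ALL" in PDU mode)
```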


After exploring some documentation and specifications, here is what I found out:

  • Messages can be read in 2 forms: text and PDU
  • Text mode is not OK for our French encoding: all special characters are stripped.

Example of a session in text mode (AT+CMGF=1):

+CMGL: 1,"REC UNREAD","336XXXXXXXXX",,"11/01/07,12:15:53+04",145,21
Voici les rsultats. 


The correct word is "résultats", not "rsultats". When switching to PDU mode (AT+CMGF=0), we obtain:

+CMGL: 1,1,,38


The big hexadecimal string is the PDU. To decode it, we can use the Python Messaging library, which is very simple to use:

from messaging.sms import SmsDeliver

sms = SmsDeliver(pdu)  # pdu = the hexadecimal string returned by AT+CMGL
print sms.text
# output => u'Voici les r\xe9sultats. '

Here we are!

Next blog post: how to read MMS from Bouygues Telecom 🙂

PyMT used in Fourvière Museum

Last week, the Museotouch project was inaugurated at the Fourvière Museum. The project was done by many actors, including Erasme and Mucho-Media.

One part of this project is a multitouch application to explore the museum content. You can select the origin and the period/era, and the matching museum items are displayed on the table. Then you can select each of them to get more information.
As a user, you have another way to communicate with the application: when you enter the museum, they give you a badge with an RFID chip inside. Walk through the museum, use your badge to "save" the objects you like, and then put your badge on the table to explore what you have selected.
You can read the Erasme Museotouch page for more information, and see the Museotouch photos.

[vimeo clip_id=17859244]

This application has been developed by Nadège Bourguignon 🙂

Joojoo and multitouch on Ubuntu 10.10

Hi everyone,

Some people are asking how to make multitouch work on the Joojoo. It's now possible, starting from Ubuntu 10.10 + some manipulations.

Utouch PPA

The working driver for the Joojoo wasn't finished in time for the Ubuntu 10.10 release. However, it is available from the utouch-team PPA. So add the PPA:

sudo add-apt-repository ppa:utouch-team/utouch
sudo apt-get update

And then, install the driver :

sudo apt-get install hid-egalax-dkms

The driver should compile on the fly, and will be available at the next reboot. But don't reboot now, we have more things to do.
I had some trouble with the default driver: the DKMS one doesn't load if the old driver still exists.
So, ensure you have no other egalax driver in your /lib/modules:

$ sudo rm /lib/modules/2.6.35/kernel/drivers/hid/hid-egalax.ko
$ sudo depmod -a

udev rules

The Joojoo screen never keeps the same /dev/input/eventX; it changes from time to time. And it's not readable by normal users.
We will create a udev rule to change the permissions, and symlink the /dev/input/eventX device to /dev/input/event-joojoo:

$ sudo gedit /etc/udev/rules.d/80-joojoo.rules

And put this content in the file:

SUBSYSTEM=="input", ATTRS{idVendor}=="0eef", ATTRS{idProduct}=="720c", MODE="0644", SYMLINK="input/event-joojoo"


You should now have your screen working correctly with one finger.


If you want to test multitouch, you can try PyMT:

$ sudo apt-get install python-pymt

Launch PyMT once, and hit escape:

$ python -m

Edit the configuration file ~/.pymt/config and add to the [input] section:

joojoo = mtdev,/dev/input/event-joojoo
# And comment tuio and mouse input
# mouse = mouse
# default = tuio,

Then, you can test the rapid demo or the full desktop:

# rapid demo
$ python -m -a

# full desktop
$ python /usr/share/pymt-examples/desktop/ -a

Using pymt-dev version

If you are using the pymt-dev version from github, there are new features to make the Joojoo experience more fun:

Disable mouse on activity

If you want to keep using the mouse, but not while the touchscreen is in use, you may want this feature. It will automatically disable the mouse when the touchscreen is used. Just change your [input] section to:

mouse = mouse,disable_on_activity

Sleeping when no activity detected

This is highly experimental: the sleep module. If you don't touch the screen for a while, the sleep module will introduce a sleep() call inside the main loop, in order to reduce the framerate. The sleep ramp and time can be configured; check the sleep module documentation.

To activate it, put in ~/.pymt/config:

sleep =


That's all. The Xorg part is missing right now; it will be added soon 🙂

PyMT to Javascript: the end.

As planned in September, I've explored how PyMT could be translated to Javascript, using the Pyjamas project. Everything is possible, and this translation could be done, but I would actually need more than 2 weeks; it's more like 4 months. Let's look at the details…

Pyjamas… not fully Python-compatible!

Pyjamas doesn't currently support:

  1. tuple unpacking (for x, y in tuplelist / for x, y in zip(a[::2], a[1::2]))
  2. the with keyword on custom classes (actively used in PyMT: gx_matrix, gx_begin…)
  3. preprocessing. It would be very nice to have a way to NOT parse some Python, like #ifdef / #endif in C. For example, in the touch providers we test whether the system is Linux, Mac or Windows to include the right input providers. But Pyjamas doesn't care about the if, since it's a Python translator, so it will try to convert the Windows part anyway.
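
To illustrate, the three constructs above are all perfectly fine in CPython (gx_matrix here is a stand-in for the real PyMT helper, and the provider names are placeholders):

```python
import sys
from contextlib import contextmanager

# 1. Tuple unpacking in a for loop: pairing up a flat coordinate list
a = [0, 1, 2, 3, 4, 5]
pairs = [(x, y) for x, y in zip(a[::2], a[1::2])]
# pairs == [(0, 1), (2, 3), (4, 5)]

# 2. `with` on a custom context manager, gx_matrix-style
@contextmanager
def gx_matrix():
    # in PyMT this would push/pop the OpenGL matrix stack
    yield

with gx_matrix():
    pass

# 3. A platform check: CPython only executes one branch, but Pyjamas
# tries to translate all of them
if sys.platform.startswith('win'):
    provider = 'windows_touch'
else:
    provider = 'native_touch'
```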


When the scatter widget was rewritten, we focused on performance. All the math now depends on Numpy, a C/Python library. For the translation, we would have to rewrite all the parts we use as a pure Javascript library.

pyOpenGL / graphx

Same as Numpy, we use OpenGL everywhere. PyOpenGL performs well when using contiguous arrays; that's also why we use Numpy when passing data to OpenGL.
But most importantly… none of our current GL calls are compatible with WebGL.

As defined on the Khronos website, WebGL is based on the OpenGL ES 2.0 API. Immediate-mode rendering (glBegin/glEnd/glVertex…) is not available in the OpenGL ES subset. Our entire graphx package, and all our widgets, would need to be rewritten.
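
The gap looks like this in a sketch: immediate mode emits one call per vertex, while ES 2.0 / WebGL wants one contiguous buffer and a single draw call. The GL calls are left as comments since no GL context is set up here.

```python
from array import array

# Desktop GL immediate mode -- exactly what ES 2.0 / WebGL removed:
#   glBegin(GL_TRIANGLES)
#   glVertex2f(0, 0); glVertex2f(1, 0); glVertex2f(0, 1)
#   glEnd()

# ES 2.0 / WebGL style: pack the vertices into one contiguous float
# buffer, bind it to a shader attribute, and issue a single draw call:
vertices = array('f', [0.0, 0.0,   # one triangle, 2 floats per vertex
                       1.0, 0.0,
                       0.0, 1.0])
#   glVertexAttribPointer(position_loc, 2, GL_FLOAT, False, 0, vertices)
#   glDrawArrays(GL_TRIANGLES, 0, 3)
```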

And you know what? We are moving to our new graphics package for the next PyMT version. But it will not solve the issue, because the new graphics package is written… in Cython, a C/Python language. So it's not compatible with Pyjamas either.

So even after the move, the graphics library would also have to be completely rewritten for Javascript.

Core providers

I didn't reach this part. The goal would be to create a Javascript platform for every core provider, and implement it using JS() from Pyjamas. Not a big deal.


The end? The work to do is huge. It's possible, but not alone. So if anyone wants to contribute to the Javascript translation, please contact me 🙂

PyMT 0.5 out, what’s next ?

One week ago, we released PyMT 0.5 (release notes). I'm very glad to see this release. We've worked hard to make it stable, and we'll continue.

The good part of this release is its availability in the next Ubuntu, Maverick. If you don't know yet, Ubuntu Maverick will support multitouch. Before the announcement went public, we joined the HCI group, and they have written documentation about Multitouch in Ubuntu, and PyMT in Ubuntu.
That's very good news, and I hope we'll be able to enhance our collaboration with Ubuntu, and share experience.

So, what’s next ?

Since people always ask us for a roadmap, the PyMT Roadmap has been updated.
The next version will be PyMT 0.5.1, with the release scheduled for September 6. It will include another set of fixes found since the 0.5 release, and it will also be the version shipped in the next Ubuntu: auto-detection of MT hardware, fixed text rendering outside of the text input, the missing gstreamer appsink, more unit tests…

Starting in September, the "PyMT Week Report" will be back. Since the community is growing, it will be a good thing to know what's new in the toolkit every week, and to explain whether we are hitting trouble or all the developments are going fine 🙂

Then… I've got lots of ideas in mind, and will focus on:

  • Spatialization: be able to handle 1 million widgets, added on an infinite plane.
  • Graphics: finish the graphics library (new in 0.5) and use it for all of our widgets.
  • App subsystem: replace the old MTPlugin with a new application subsystem that lets the user browse and launch any PyMT application available on his system
  • Enhanced CSS writing (less or sass-lang)

Then in November, I'll try to port PyMT to Javascript… using Pyjamas and HTML5.

PyMT is really going to be… awesome!

Joojoo, funny part :)

    description: Desktop Computer
    product: PM235
    vendor: nVidia
    version: To Be Filled By O.E.M.
    serial: To Be Filled By O.E.M.
    width: 32 bits
    capabilities: smbios-2.6 dmi-2.6
    configuration: boot=normal chassis=desktop uuid=00020003-0004-0005-0006-000700080009
       description: Motherboard
       product: To be filled by O.E.M.
       vendor: To be filled by O.E.M.
       physical id: 0
       version: To be filled by O.E.M.
       serial: To be filled by O.E.M.
       slot: To Be Filled By O.E.M.

Multitouch experience on Joojoo

I'm writing this blog post just to remember what I've done, and how; I don't want to forget it for next time 🙂 It's incomplete, and written for developers.

Last week, I got a Joojoo, formerly known as the "Crunchpad". The hardware looks good: Intel ATOM N270 + NVidia ION (GeForce 9400M) + 4GB SSD + 1GB RAM + 1.3M webcam + USB + eGalax dual-touch screen + Wifi. At the very least, it looks like good hardware to hack on. Before you ask: I bought this tablet because of the NVidia ION; it supports OpenGL 3.0, enough to run PyMT! I'll not say anything about the default OS installed on the Joojoo… and will go straight to how to install a custom OS on it.

How to install a custom OS on the JooJoo

  1. You can boot the Joojoo from any USB device (CD or USB). I've used Unetbootin to create a custom USB boot key for Ubuntu Lucid. Don't select the net install; the wifi driver is not included in the setup.
  2. Plug a USB hub + USB keyboard + USB key into the USB port of the Joojoo
  3. Hold down the DEL key + press the power button twice (once to make the Joojoo logo appear, and a second, longer press to power on the screen)
  4. In the boot menu, change the order of the Joojoo SSD and your USB key, to make your USB key primary.
  5. Reboot by pressing the power button twice, as described.

If you see nothing on the screen, reset the Joojoo with the tool, and press the power button twice again.
Twice is really important; without that, the screen will stay black. On the second press, you should see the screen power up.

About drivers :

  • The default wifi driver is not working; search for 8192SE, and take the RTL8192SE driver for Linux 2.6.
  • For the NVidia ION, just download the latest drivers from NVidia.
  • For the touchscreen… that's the whole problem. It doesn't work by default. You can use the manufacturer's driver, but dual touch doesn't work with it.

So, I bought this tablet for dual touch + OpenGL 3.0… Let's check if Windows works with dual touch.

How to install Windows 7 on the JooJoo

For this part, I would like to thank Fabien a lot for his experience in this domain.
The problem is, with a 4GB SSD, you can't install a normal Windows Seven. You must build a custom Windows. To do that:

  1. go to Microsoft Windows 7 Embedded
  2. download the Standard 7 Toolkit
  3. launch the image configurator
  4. select all the things you want in Windows, like the Windows desktop, USB boot, power management… I'll upload my "answer" file tomorrow.
  5. your image must be < 2GB.
  6. burn your answer file to USB (Tools menu). You'll have a USB boot key ready to install Windows Seven Embedded!

Then, it's the same steps as before: plug, reboot, install.

After installing the eGalax Windows driver, the touchscreen works with dual touch!

How to capture USB on Windows Seven?

Since Wireshark is not able to capture USB on Windows, and no free toolkit can do it at the moment, I've used HHD USB Monitor, in trial version. Just run it, select Raw capture, Start, play with the screen, reboot the screen, play again with the screen (1 finger, then 2), and export the whole capture to HTML.
And here is the Joojoo eGalax USB Capture.

And bingo: we have bulk messages with a 0x6 length, and others with a 0xc length (double of 0x6). It really looks like we got 2 touches! I don't know what the setup messages are for, but it looks like they activate the screen! Now… back to Linux to test 🙂

Check eGalax dual touch support with python-usb

Enable usb_debug (optional)

OK, this part was really easy, thanks to my earlier experience hacking the Dell SX2210T screen. Even if that experience was not a success, learning how USB works was really fun. First thing, activate USB sniffing to check that we are doing it right:

$ sudo modprobe usb_debug
$ sudo cat /sys/kernel/debug/usb/usbmon/0u (0u is where my device is connected)

You'll see a bunch of messages as soon as you play with the device. You can read the usbmon/usb_debug documentation about the message format.

Search your device with python-usb

When you're searching for the device in dmesg, you'll see:

generic-usb 0003:0EEF:720C.0001: input,hiddev96,hidraw0: USB HID v2.10 Pointer [eGalax Inc. USB TouchController] on usb-0000:00:06.0-1/input0

You can also use lsusb:

Bus 004 Device 002: ID 0eef:720c D-WAV Scientific Co., Ltd

0EEF is the vendor ID, and 720C is the product ID. Let's search for our device on the USB busses:

import usb  # python-usb (PyUSB 0.x legacy API)

def get_egalax_device(vendor=0x0eef, product=0x720c):
    # walk every bus and every device until both ids match
    busses = usb.busses()
    for bus in busses:
        for device in bus.devices:
            if device.idVendor != vendor:
                continue
            if device.idProduct != product:
                continue
            return device
    return None

Send SETUP configuration

To understand more about how USB works, you can check USB in a NutShell; the first part talks about the Setup packet.
This is exactly what we want, since in the capture we have:

000006: Control Transfer (UP), 02.08.2010 01:50:04.275 +0.0. Status: 0x00000000
Pipe Handle: 0x855dbe94

0A 01 41 0A 01 41

Setup Packet

40 00 00 00 00 00 06 00

Recipient: Device
Request Type: Vendor
Direction: Host->Device
Request: 0x0 (Unknown)
Value: 0x0
Index: 0x0
Length: 0x6

We have exactly 2 setup packets to transfer to the device. The most complicated part was understanding python-usb:

  • Recipient -> usb.RECIP_DEVICE
  • Request Type -> usb.TYPE_VENDOR
  • Direction -> usb.ENDPOINT_OUT
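
These three flags are just the bit fields of the setup packet's first byte (bmRequestType); ORing them together reproduces the 0x40 seen in the capture:

```python
# bmRequestType layout (USB 2.0 spec, chapter 9):
#   bit 7      direction  (0 = host -> device)
#   bits 6..5  type       (2 = vendor)
#   bits 4..0  recipient  (0 = device)
ENDPOINT_OUT = 0x00   # usb.ENDPOINT_OUT in python-usb
TYPE_VENDOR  = 0x40   # usb.TYPE_VENDOR
RECIP_DEVICE = 0x00   # usb.RECIP_DEVICE

req_type = ENDPOINT_OUT | TYPE_VENDOR | RECIP_DEVICE
# req_type == 0x40, the first byte of "40 00 00 00 00 00 06 00"
```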

So, let's send the first setup packet. This packet makes the device start talking to us (we then receive lots of packets from it).

reqType = usb.TYPE_VENDOR | usb.RECIP_DEVICE | usb.ENDPOINT_OUT
conf = '\x40\x00\x00\x00\x00\x00\x06\x00'
handle.controlMsg(reqType, usb.REQ_SET_CONFIGURATION, conf)

And the second request activates dual-touch support (without it, we receive only one touch):

conf = '\x0a\x01\x41\x0a\x01\x41'
handle.controlMsg(reqType, usb.REQ_SET_CONFIGURATION, conf)

And, that's done! When we read from the device, we receive packets!
The format is:

  • Flags: 8 bits (161 = ALIVE, 160 = UP)
  • X: 16 bits
  • Y: 16 bits
  • ID: 8 bits (65 = touch 1, 66 = touch 2)

The flags and the id are not really a flag/id… the values are not the usual ones. But they don't change with pressure, nor over time… dunno why.
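
Decoding one 6-byte report with struct looks like this. The byte order of X/Y is an assumption (little-endian) to be verified against the usbmon output:

```python
import struct

def parse_egalax_packet(data):
    """Decode one 6-byte eGalax report: flags, x, y, id.
    X/Y byte order is assumed little-endian -- check against usbmon."""
    flags, x, y, touch_id = struct.unpack('<BHHB', data)
    return {
        'alive': flags == 161,                # 161 = ALIVE, 160 = UP
        'x': x,
        'y': y,
        'touch': 1 if touch_id == 65 else 2,  # 65 = touch 1, 66 = touch 2
    }

# synthetic ALIVE packet for touch 1 at (1000, 2000)
pkt = struct.pack('<BHHB', 161, 1000, 2000, 65)
```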

Still, here is the full source code for the eGalax Joojoo dual touchscreen with python-usb.

And after?

Normally, 2.6.35 includes an eGalax driver, but it's not working. I'll check another day why the kernel driver doesn't support dual touch for this screen!