Initially, this article was supposed to be “How to add an external flash to a Firefox OS phone”, but my project did not work out. The idea was to add an external light to the camera of the Geeksphone Keon, which doesn’t have one built in. Unfortunately, I couldn’t complete the project, and here is why.
My plan was to build a simple photo app that could trigger an external light through the headphone jack. Every smartphone has this socket, and using it usually doesn’t require any “hardware certification”. Square does something similar with their card reader, which is pretty cool.
The hardware part of the project was fairly easy. All you need is:
an Arduino Uno
a soft modem
some ultra-bright LEDs and resistors
Communicating with a phone through the headphone socket doesn’t require much electronics. Schematics are available on the web, so you can build a soft modem from simple components. Another option is to buy one ready to use on eBay, Sparkfun or DX.
These kinds of modems use FSK modulation for the communication. Like good old dial-up modems, they use audible tones: if you plug in a speaker, you can hear very fast sequences of blips (think R2-D2).
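To make the idea concrete, here is a minimal sketch of FSK encoding in JavaScript: each bit becomes a short sine burst at one of two frequencies. The frequencies, baud rate and sample rate below are illustrative values I picked for the example, not the defaults of any particular soft modem.

```javascript
// Illustrative FSK encoder: one sine burst per bit, at one of two audible
// frequencies. All constants here are example values, not modem defaults.
const SAMPLE_RATE = 44100; // samples per second
const BAUD = 1225;         // bits per second (divides SAMPLE_RATE evenly)
const FREQ_LOW = 2450;     // tone used for a 0 bit
const FREQ_HIGH = 4900;    // tone used for a 1 bit

function encodeFsk(bits) {
  const samplesPerBit = Math.round(SAMPLE_RATE / BAUD);
  const out = new Float32Array(bits.length * samplesPerBit);
  let phase = 0; // keep the phase continuous across bits to avoid clicks
  for (let i = 0; i < bits.length; i++) {
    const freq = bits[i] ? FREQ_HIGH : FREQ_LOW;
    const step = (2 * Math.PI * freq) / SAMPLE_RATE;
    for (let j = 0; j < samplesPerBit; j++) {
      out[i * samplesPerBit + j] = Math.sin(phase);
      phase += step;
    }
  }
  return out;
}
```

Played back through the headphone jack, a sample buffer like this is exactly the kind of blip sequence the soft modem decodes on the other end.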
There is a SoftModem library for the Arduino that does the work of generating and reading FSK signals. It only works with the Arduino Uno (as it needs specific timers), and you will need a patched version to compile it with Arduino 1.0.
Unexpectedly, building the software was the most difficult part of the project.
1. The Keon doesn’t redirect sound to the headphone socket
The project will probably not work with the Keon as initially planned. It appears that the device only redirects sound to the headphone socket if you plug actual headphones into it. Plugging in a bare TRRS cable doesn’t work; the sound comes out of the internal speaker instead.
The iPhone supports this feature, so why not try it on the iPhone?
2. Safari on iOS cannot play sounds from a data URI
Even though the iPhone redirects sound to the headphone socket, it cannot play the generated FSK signal when it is rendered as a Base64 data URI. It just doesn’t.
My solution was to pre-record the signal and play it as a WAV file (served with the proper MIME type, which is important). You could also use the more advanced features of the Web Audio API, but since I only have a few commands, pre-recording is fine.
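Pre-recording just means turning the raw samples into a real WAV file once, ahead of time. As a sketch of what that involves, here is a minimal mono 16-bit PCM WAV encoder following the standard RIFF/WAVE layout (this is my own illustration, not code from the project):

```javascript
// Minimal mono 16-bit PCM WAV encoder (standard RIFF/WAVE layout).
// Takes Float32 samples in [-1, 1] and a sample rate; returns the file bytes.
function encodeWav(samples, sampleRate) {
  const dataSize = samples.length * 2;       // 16-bit = 2 bytes per sample
  const buf = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buf);
  const writeStr = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeStr(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);    // RIFF chunk size
  writeStr(8, 'WAVE');
  writeStr(12, 'fmt ');
  view.setUint32(16, 16, true);              // fmt chunk size
  view.setUint16(20, 1, true);               // audio format: PCM
  view.setUint16(22, 1, true);               // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * 2, true);  // byte rate
  view.setUint16(32, 2, true);               // block align
  view.setUint16(34, 16, true);              // bits per sample
  writeStr(36, 'data');
  view.setUint32(40, dataSize, true);
  for (let i = 0; i < samples.length; i++) {
    const clamped = Math.max(-1, Math.min(1, samples[i]));
    view.setInt16(44 + i * 2, clamped * 0x7fff, true);
  }
  return new Uint8Array(buf);
}
```

Serve the resulting file as `audio/wav` and a plain `<audio>` element can play it, data URI not required.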
At this point, I can send commands from the phone to the external flash. Let’s build the camera app.
3. WebRTC support is very poor
My idea was to use getUserMedia to get a live stream from the camera. Unfortunately, getUserMedia isn’t supported by any mobile browser at the moment. It works in Firefox for Mac, but not in Firefox OS. It works in Safari for Mac, but not in Safari for iOS. Even PhoneGap doesn’t help on the iPhone.
The closest alternative would be a file input, but taking a picture that way usually happens outside of the browser.
No getUserMedia, no photo app.
In the end, the HTML5 camera app with an external flash only works well in desktop browsers. On mobile, all I can do is trigger the flash, not take a picture. Pointless, isn’t it?
Still, communicating from an HTML5 app to a third-party device works, and we can probably build cooler things with it.