Streaming Video for Pebble
Back in 2012, I participated in the Kickstarter for Pebble, a smart watch that talks to your smart phone via Bluetooth. I was looking forward to writing apps for it. Unfortunately, my first Pebble had a display problem and by the time I got around to getting it exchanged, all the easy watch apps had been written.
I racked my brain for an application that hadn’t already been written. Then it hit me — streaming video! I could take a movie, dither it, and send it over Bluetooth from my iPhone to the Pebble. The only problem was: how would I get the video source?
Then I remembered, “Duh, I just wrote an app for that.” CVFunhouse was ideal for my purposes, since it converts video frames into easier-to-handle OpenCV image types, and then back to UIImages for display. All I had to do was process the incoming video into an image suitable for Pebble display, and then ship it across Bluetooth to the Pebble.
My first iteration just tried to send a buffer of data the size of the screen to the Pebble, and then have the Pebble copy the data to the screen. This failed fairly spectacularly. The hard part about debugging on the Pebble is that there’s no feedback. You build your app, copy it to the watch, and then run it. It either works or it doesn’t. (Internally, your code may receive an error code, but unless you do something to display it, you’ll never know about it.) Also, if your Pebble app crashes several times in rapid succession, the watch goes into “safe mode” and forces you to reinstall the Pebble OS from scratch. I had to do this several times during this process.
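If you do want to see those error codes, you have to put them on the screen yourself. Here’s a minimal sketch of the idea, written against the SDK 2.x-style AppMessage callbacks rather than whatever I actually had at the time, and assuming a TextLayer named s_status_layer already exists in the window:

```c
#include <pebble.h>

// Assumed to be created and added to the window during window load.
static TextLayer *s_status_layer;
static char s_status_text[32];

// Called when an incoming message is dropped; `reason` is an AppMessageResult
// such as APP_MSG_BUFFER_OVERFLOW.
static void inbox_dropped_handler(AppMessageResult reason, void *context) {
  snprintf(s_status_text, sizeof(s_status_text), "inbox error %d", (int)reason);
  text_layer_set_text(s_status_layer, s_status_text);
}

static void init_messaging(void) {
  app_message_register_inbox_dropped(inbox_dropped_handler);
  app_message_open(3360, 64);  // naive first attempt: ask for a whole-frame inbox
}
```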
Eventually, I wrote a simple binary display routine, and lo and behold, I was getting errors. APP_MSG_BUFFER_OVERFLOW errors, to be exact, even though my buffer should have been more than large enough to handle the data the watch was receiving. It turns out there is a maximum allowed Bluetooth receive buffer size on the Pebble, and if you exceed it, you’ll either get an error or crash the watch entirely. I wanted to send 3360 bytes of data to the Pebble, but I found empirically that the most I could send in one packet was 116 bytes. (AFAIK, this is still not documented anywhere.) Once I realized this, I was able to send image data to the Pebble in fairly short order, albeit only 5 scan lines at a time.
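Those 3360 bytes are one full frame: the display is 144 × 168 pixels at one bit each, so a row is 18 bytes of pixels padded to 20, and 168 rows × 20 bytes = 3360. Five scan lines is 100 bytes of pixel data, which fits under the 116-byte ceiling. A rough sketch of the watch side of that scheme, again against the SDK 2.x AppMessage API and with two made-up dictionary keys (KEY_ROW_OFFSET and KEY_ROW_DATA) that the phone app would have to match:

```c
#include <pebble.h>

// Hypothetical dictionary keys; the phone side must use the same values.
#define KEY_ROW_OFFSET 0
#define KEY_ROW_DATA   1

#define SCREEN_W    144
#define SCREEN_H    168
#define ROW_BYTES   20                      // 144 px = 18 bytes, padded to 20
#define FRAME_BYTES (ROW_BYTES * SCREEN_H)  // = 3360

static uint8_t s_frame[FRAME_BYTES];
static Layer *s_video_layer;  // created in window load with
                              // layer_set_update_proc(s_video_layer, video_layer_update)

// Each packet carries a starting row index plus a few rows of pixel data.
static void inbox_received_handler(DictionaryIterator *iter, void *context) {
  Tuple *row = dict_find(iter, KEY_ROW_OFFSET);
  Tuple *data = dict_find(iter, KEY_ROW_DATA);
  if (!row || !data) return;

  uint32_t offset = (uint32_t)row->value->uint16 * ROW_BYTES;
  if (offset + data->length <= FRAME_BYTES) {
    memcpy(&s_frame[offset], data->value->data, data->length);
  }
  layer_mark_dirty(s_video_layer);
}

// Draw the assembled one-bit frame (assuming LSB-first bytes, 1 = white).
static void video_layer_update(Layer *layer, GContext *ctx) {
  graphics_context_set_stroke_color(ctx, GColorBlack);
  for (int y = 0; y < SCREEN_H; y++) {
    for (int x = 0; x < SCREEN_W; x++) {
      if (!(s_frame[y * ROW_BYTES + x / 8] & (1 << (x % 8)))) {
        graphics_draw_pixel(ctx, GPoint(x, y));
      }
    }
  }
}

static void init_messaging(void) {
  app_message_register_inbox_received(inbox_received_handler);
  app_message_open(124, 32);  // small inbox: every packet stays at or under 116 bytes
}
```

Tagging each packet with its row offset also means a dropped packet only costs five stale scan lines instead of throwing the whole frame out of sync.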
All that remained was to dither the image on the iPhone side. From back in the monochrome Mac days, I remembered a name: Floyd-Steinberg dithering. I Googled it, and it turned out that the Wikipedia article includes the algorithm, and it’s all of 10 lines of code. Once I coded that, I had streaming video.
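For reference, here’s roughly what it looks like in plain C (a sketch of the standard kernel from the Wikipedia article applied to an 8-bit grayscale buffer, not the exact code from the app): each pixel is snapped to black or white, and the quantization error is spread 7/16 right, 3/16 below-left, 5/16 below, and 1/16 below-right.

```c
#include <stdint.h>
#include <stdlib.h>

// Floyd-Steinberg error diffusion: quantize an 8-bit grayscale image to pure
// black (0) / white (255) in place, pushing each pixel's quantization error
// onto its not-yet-processed neighbors.
static void floyd_steinberg(uint8_t *gray, int width, int height) {
  // Work in a signed buffer so diffused error can go below 0 or above 255.
  int16_t *buf = malloc((size_t)width * height * sizeof(int16_t));
  if (!buf) return;
  for (int i = 0; i < width * height; i++) buf[i] = gray[i];

  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      int i = y * width + x;
      int old = buf[i];
      int quant = (old < 128) ? 0 : 255;
      int err = old - quant;
      buf[i] = quant;
      if (x + 1 < width)                   buf[i + 1]         += (err * 7) / 16;
      if (x > 0 && y + 1 < height)         buf[i + width - 1] += (err * 3) / 16;
      if (y + 1 < height)                  buf[i + width]     += (err * 5) / 16;
      if (x + 1 < width && y + 1 < height) buf[i + width + 1] += (err * 1) / 16;
    }
  }

  // Every entry is now exactly 0 or 255.
  for (int i = 0; i < width * height; i++) gray[i] = (uint8_t)buf[i];
  free(buf);
}
```

From there, each 0/255 pixel just gets packed, low bit first, into the 20-byte-per-row one-bit layout the watch expects before it goes out over Bluetooth.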
Unfortunately, the video only streamed at around 1 FPS on an iPhone 5. How I got it streaming faster is a tale for another day.