Monday, May 28, 2012

Eyes and nervous system

The USB hub, the cameras and the GPS have been stripped of unnecessary plastic and attached to the hull. And they actually work... sort of... except for the cameras.

Since all these posts are backdated... I'll compress the story of the cameras a bit in this post.

At the time of these photos, we started playing with video. We were done last Monday - on September 3rd!

First idea: VLC

Let's just install it and tell it to broadcast a stream. The stream format of choice - of course: HTTP Live Streaming. Hey! It'll work on the iPad.

Fail

VLC is just too slow. There's a lag of about a second just from camera to local screen.

Camera is connected and operational... and now we know that VLC is NOT an option.

Second idea: ffmpeg with custom segmenter for HLS

The only official segmenter for creating HLS runs exclusively on OS X. But there is an OSS alternative. It just refuses to compile on the Panda. We spent the whole night fiddling with header files and wreaking all kinds of havoc on the Debian install on the Panda... but we got it working! That's when we realized that HLS SUCKS!
HTTP Live Streaming is the only format of live video that is supported in HTML on iOS devices. It works like this:
  1. You record the stream and pipe it into a
  2. segmenter, which chops the video into little bits and shoves them into a folder. It then updates an
  3. m3u file. Yes, you got it! It updates a playlist. So we serve the files and the playlist through a
  4. nodejs static folder. Now all we have to do is place a video tag on the page and we're done.
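To make step 3 concrete: the m3u playlist is just a text file the segmenter keeps rewriting as new segments land in the folder. A sketch (function and segment names are made up, not the segmenter's actual code) of what building such a live playlist looks like:

```javascript
// Hypothetical helper: build a live HLS playlist from the newest segments.
// A real segmenter writes this file itself; this only shows the format.
function buildPlaylist(segments, firstSequence, targetDuration) {
  const lines = [
    '#EXTM3U',
    '#EXT-X-VERSION:3',
    `#EXT-X-TARGETDURATION:${targetDuration}`,
    // MEDIA-SEQUENCE tells the player which segment the list starts at,
    // so old segments can be deleted from the folder.
    `#EXT-X-MEDIA-SEQUENCE:${firstSequence}`,
  ];
  for (const seg of segments) {
    lines.push(`#EXTINF:${targetDuration.toFixed(1)},`); // segment duration
    lines.push(seg);                                     // segment file name
  }
  return lines.join('\n') + '\n';
}
```

The player re-fetches this playlist on every target duration, which is one of the places the lag sneaks in: a segment has to be fully recorded, chopped, listed, and downloaded before a single frame of it plays.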

Fail

Well... apart from the CPU running at about 90% and the Panda almost catching on fire, we have a lag of about 30 seconds. By tweaking the settings for ffmpeg and the segmenter, we manage to get it down to about 15 seconds. That's BAD NEWS when you're using the video to control a boat.

Third idea: ffmpeg to produce a stream of jpegs

We don't really care about framerate, do we? We care about lag! If a rock is coming at you, you don't give a crap if the video is choppy, as long as you get the information in time to evade it.

So let's just let ffmpeg produce a series of jpegs and tell the browser to update regularly!
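The browser side of that idea is a few lines. A sketch (element id and path are made up) of the polling approach, with a cache-busting query string so the browser actually refetches the file:

```javascript
// Hypothetical client-side polling: make each request a unique URL so the
// browser can't serve the image from cache.
function nextUrl(base) {
  return base + '?t=' + Date.now();
}

// In the page, something like:
// setInterval(() => {
//   document.getElementById('cam').src = nextUrl('/cam.jpg');
// }, 200); // 5 updates per second
```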

Fail

This is how browsers work when you swap an image: they render the new one on screen but keep the old ones cached in memory. That's good for optimization but CRAP for running a boat for half an hour with image updates 5 times a second.

Fourth idea: multipart/x-mixed-replace

At Devsum in Stockholm, I start talking to Eric Lawrence, author of Fiddler. He mentions the multipart/x-mixed-replace content type, which emulates server push. It doesn't work on Internet Explorer but who cares? We try it by building a node module which spawns an ffmpeg child process, captures an image and writes it to the response... and it works!
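For reference, this is roughly what goes down the wire (boundary name is our own choice, and this is a sketch, not the module's actual code): the response is opened once with `Content-Type: multipart/x-mixed-replace;boundary=frame`, and then each jpeg is written as one part, which the browser renders in place of the previous one:

```javascript
const BOUNDARY = 'frame';

// Hypothetical helper: wrap one jpeg Buffer as a multipart part.
// Each part the server writes replaces the previous image in the browser.
function framePart(jpeg) {
  const head = Buffer.from(
    `--${BOUNDARY}\r\n` +
    `Content-Type: image/jpeg\r\n` +
    `Content-Length: ${jpeg.length}\r\n\r\n`
  );
  return Buffer.concat([head, jpeg, Buffer.from('\r\n')]);
}

// Usage sketch: res.write(framePart(jpegBuffer)) for every captured frame,
// without ever ending the response.
```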

Fail

When I say works, I mean it works but it's sloooow. After thinking about it for a while, we realize what we're actually doing is this:
  1. Spawn a child process starting ffmpeg
  2. Wait for it to connect to the camera and return a jpeg
  3. Let the process die
  4. Repeat
This just isn't going to cut it.

Fifth idea: build a node module which parses an mjpeg stream

What we really want to do is keep the camera streaming and just push jpegs to the response. So we use ffmpeg to start an mjpeg stream and parse the contents of stdout to cut it up into individual jpegs.
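The splitting itself hinges on the JPEG byte markers: every jpeg starts with 0xFFD8 and ends with 0xFFD9. A simplified sketch of that scan (our own names, and it ignores buffering frames split across stdout chunks, which a real parser has to handle):

```javascript
// Hypothetical splitter: scan a Buffer of mjpeg bytes and slice out each
// complete jpeg between a start marker (0xFFD8) and an end marker (0xFFD9).
function extractJpegs(buffer) {
  const frames = [];
  let start = -1;
  for (let i = 0; i < buffer.length - 1; i++) {
    if (buffer[i] === 0xff && buffer[i + 1] === 0xd8) {
      start = i; // found a start-of-image marker
    } else if (start !== -1 && buffer[i] === 0xff && buffer[i + 1] === 0xd9) {
      frames.push(buffer.slice(start, i + 2)); // include the end marker
      start = -1;
    }
  }
  return frames;
}
```

Doing this byte scan in JavaScript on every stdout chunk is exactly the kind of work that ate the CPU.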

Fail

Again it works but it's too slow. And the CPU is running at about 70%.

Interlude: 4008

It's about this time we feel life is just too easy. Let's make it a bit more interesting! Let's connect the second webcam! Say hello to our new friend: Error 4008. It turns out the webcams are bandwidth hogs. With two cameras running, we simply exhaust the USB bandwidth. By this time, we've switched to the Raspberry Pi (you can read about that in later posts). We have since updated Raspbian twice AND upgraded the firmware, but 4008 still haunts us.

Sixth idea: let's go native!

So this is where Kristian decides JavaScript sucks and reverts to C. He builds a node module which cats the stream from the camera and emits events when a jpeg is ready to deliver.

Fail

Threading, threading, threading! While the code works, it just locks the process. When we start the camera, everything just dies.

Seventh idea: let's go native AND not be idiots!

The C lib gets rewritten. This time, we pass it a writable stream (i.e. the response). The module spawns a child process which cats the stream, finds a jpeg, writes it to the stream and then sleeps for 1M/framerate microseconds.
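The pacing is the whole trick: instead of busy-scanning the stream, the process sleeps a full frame period after every jpeg, so the CPU is idle between frames. A JavaScript analogue of that arithmetic (the C module itself is not shown here):

```javascript
// Hypothetical helper mirroring the C module's pacing: one full frame
// period, in microseconds, between jpegs. 1M / framerate.
function framePeriodMicros(framerate) {
  return Math.round(1e6 / framerate);
}

// Sketch of the child process loop:
//   find jpeg in stream -> write it -> usleep(framePeriodMicros(fps))
```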

Win

It works. It uses about 3% CPU. It pumps jpegs like there's no tomorrow. Still 4008 with two cameras though... but we've found a solution for that too: I taped shut the hole in the boat for the back cam.

Pandaboard and stripped down (massive) USB hub

GPS and forward camera

Tuesday, May 8, 2012

The Raspberry is a Panda

Well we finally got tired of waiting for the Raspberry Pi. It's currently scheduled to be delivered in late June.

So we ordered a Panda Board. It's 3 x the size, 6 x the price and 2 x the power - what's not to like? So on with installing Debian!