On Slideshare’s 7th birthday, the engineers at Slideshare were treated to Slideshare’s special treat: the Internal Hackday. If you have been anywhere near Slideshare’s Hackdays before, you know that they are a breed of their own. There are people pulling all-nighters, and others struggling to stay awake but succumbing to fatigue and sleep. There are people who lock themselves in a conf room and yell at anyone who tries to feed them. And there are those who like to walk around dispensing product advice. But one thing is for sure: you know something is up when you look at the haggard, unkempt bunch with a frenzy in their eyes. Plus, sales at the cigarette shop at the corner go up as worried geeks pace about, puffing anxiously. What I am trying to say is that the energy at these Hackdays has to be seen to be believed. This one was no different, as teams began their pre-Hackday brainstorming, shortlisting ideas and discarding them before finally settling on the one they thought would win them the prizes.
I have been in a few of Slideshare’s Hackdays before and decided another one with just a script and a server wasn’t going to be fun. I wanted to do something different. A few ideas later, I began to gravitate towards getting an ARDrone for this hack. The fact that I had encountered ARDrone demos at both RubyConf and JSFoo probably helped. The past year has also seen much more support for the ARDrone across various languages. Add to that the past few months of colleague after colleague unboxing LeapMotion devices, and it doesn’t take long for a nerd’s mind to make the leap. I named the project ‘Minority Report’, pulled up the submission form and signed on the dotted line.
While I had grand plans to have the ARDrone detect faces, recognize people and shoot nerf ammo at them, the reality of a two-day hackday meant that I had to trim it down to a much leaner and sweeter MVP.
As the gong rang, I lugged my ARDrone into the office, and the surprise on my colleagues’ faces set the mood for the hackday. The fact that they were worried about competing with something that made them drool made it all the more fun. I spent an hour playing with it and amusing them (boys will be boys, eh?) and then settled down to code.
Turns out that writing code to interface with real-life devices has its own unique challenges. Who imagined that I would have to go get a muscle relaxant spray after a couple of hours because my shoulder ached painfully from holding my arm over the LeapMotion? Or that you always had to be prepared to duck when the drone was a meter away? Or that you had to factor in an hour’s recharge time after a 15-minute flight? Or that you had to include a safety-instruction slide in your demo pitch so that people knew what to do if your drone decided to fly into them?
Well, as I said, interesting. And tough. I had initially wanted to use OpenCV for face detection and facial recognition, but it turns out that what you want to do and what you can get done are two entirely different things. After struggling for a few hours with compile flags, 32-bit vs 64-bit compile modes and commenting out lines in .h files, I decided that trying to compile OpenCV at 3 a.m. while still having to work through the actual product’s Ruby code the next day wasn’t going to be practical. Time to fall back to plan B.
The hack started off well, as getting gesture information into the Ruby program was simple. The setup was quick and the packaged software was reliable and smooth. The problem, however, was sensitivity. The LeapMotion is sensitive, let me tell you. Since it can detect a few predefined gestures, I had initially planned to hook into LeapMotion’s swipe gestures to control the drone. A few minutes of swiping later, it was obvious that it wasn’t going to cut it. For one, getting the gestures right was difficult. If your gestures aren’t smooth enough, LeapMotion doesn’t pick them up. Or if you swipe up and bring your hand back down to swipe up again, LeapMotion detects a swipe-down gesture as well. You don’t really want to be struggling with your LeapMotion gestures when your drone has decided to careen towards the demo audience, right? So I settled for the hand objects that LeapMotion provided data about and wrote my own gesture detection code with gesture thresholds and so forth.
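The threshold-based detection can be sketched roughly like this. This is a minimal sketch, not the hack’s actual code: the shape of the palm-position hash, the neutral point and the dead-zone value are all assumptions standing in for LeapMotion’s real hand objects.

```ruby
DEAD_ZONE = 40.0                          # mm of palm travel to ignore, so jitters hover
NEUTRAL   = { x: 0.0, y: 200.0, z: 0.0 }  # assumed resting palm position

# Map a palm position to a single command symbol. Axes follow the LeapMotion
# convention: x is right/left, y is up/down, z is toward/away from you.
def gesture_for(palm)
  deltas = {
    x: palm[:x] - NEUTRAL[:x],
    y: palm[:y] - NEUTRAL[:y],
    z: palm[:z] - NEUTRAL[:z]
  }

  # Pick the dominant axis so one hand movement maps to exactly one command.
  axis, delta = deltas.max_by { |_, v| v.abs }
  return :hover if delta.abs < DEAD_ZONE

  case axis
  when :x then delta.positive? ? :right : :left
  when :y then delta.positive? ? :up    : :down
  when :z then delta.positive? ? :back  : :forward
  end
end
```

The dead zone is the important part: without it, every tiny tremor of the hand becomes a command, which is exactly the over-sensitivity problem described above.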
Once I had a decent setup going for how to control the drone, I hooked the drone up. It’s worth knowing that the setup for the drone is a bit tricky. You need to connect to the ARDrone’s wifi network, and since the protocol runs over UDP, you can’t really be sure your drone is receiving your signals. In any case, coding the drone was fairly simple. I tried using artoo-ardrone, but finally settled on using argus directly. The drone has a bit of a lag from the time you give it a command to when it actually executes it. Just a second or so, but it is obvious, and it is a bit disorienting when you are waving your hands about trying to get it to stay away from a wall. Again, one of those things about the real world.
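Wiring the gestures to the drone then becomes a small dispatch step, roughly like the sketch below. The flight method names (take_off, forward, hover and friends) follow the argus gem’s AR.Drone API as I understand it, but treat them as assumptions; the dispatcher only needs an object that responds to them, which also makes it easy to exercise with a stub drone instead of real hardware.

```ruby
SPEED = 0.3 # conservative speed fraction; a fast drone indoors is scary

# Translate one detected gesture into one drone call. Anything we don't
# recognize means "stay put" -- the safest default near an audience.
def dispatch(drone, gesture)
  case gesture
  when :forward then drone.forward(SPEED)
  when :back    then drone.backward(SPEED)
  when :left    then drone.left(SPEED)
  when :right   then drone.right(SPEED)
  when :up      then drone.up(SPEED)
  when :down    then drone.down(SPEED)
  else               drone.hover
  end
end

# Against the real gem the main loop would look roughly like:
#   require "argus"
#   drone = Argus::Drone.new
#   drone.start
#   drone.take_off
#   loop { dispatch(drone, gesture_from_leapmotion) }
#   drone.land
```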
The drone’s battery also runs for only about 15 minutes and then has to be recharged for an hour. I should probably have gotten some spare batteries; I would have been able to test the program in much more detail.
It took a few hours to get the code right and add in the necessary checks to ensure that the program would not crash on a wrong gesture or any of the obvious edge cases, and then it was time to write something relevant to the hackday host. The camera seemed interesting, so I set up some code to pick up some image frames from the video feed and upload them to Slideshare. Here are a few of those slides. Hack done! Time to make the presentation and finally demo it.
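The frame-capture step can be sketched roughly as below. This is a hedged sketch: decoding the drone’s video stream and the actual Slideshare upload call are elided, and `frame_bytes` is a hypothetical stand-in for one already-decoded JPEG frame.

```ruby
require "fileutils"

# Dump a frame from the video feed to a timestamped file on disk,
# so the saved images can later be stitched into a deck and uploaded.
def save_frame(frame_bytes, dir: "frames", clock: Time.now)
  FileUtils.mkdir_p(dir)
  path = File.join(dir, "frame-#{clock.strftime('%H%M%S%L')}.jpg")
  File.binwrite(path, frame_bytes)
  path
end
```

Writing frames to disk first, rather than uploading them inline, keeps the flight loop from blocking on the network while the drone is in the air.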
The demo was great. We cleared out a large area towards the front and set the drone on a disassembled table-tennis table. A few safety slides later, the drone was up. The audience actually enjoyed the demo. It turned out to be one of those cool hacks, and when the judges asked the audience what they thought should win the first prize, the unanimous vote was for the drone. But since it didn’t really have much to do with Slideshare itself, the judges had to pass. Here is a demo of the hack:
I probably need to spend some time cleaning up the code a little bit and working on the face detection part. Maybe in a week or so. Feel free to look at the code over at Github. And do check out the official blog post on the hackday.