Dropbox full-text search announced!

Woot! My work on making searching your Dropbox even better is on TechCrunch! Here’s a screenshot:

Fulltext Search Screenshot


Check if a reservation is available

I wrote a bash script that checks if a reservation is available on OpenTable.

It’s handy if you want to:

  • Be notified immediately when a reservation opens up
  • Check which of your favorite restaurants have availability at a certain time

You can find the script here.
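The core of such a checker is just a poll-and-notify loop. Here’s a minimal sketch of the idea in Python (the real script is bash, and `check_availability` below is a hypothetical stand-in for whatever request you make against OpenTable):

```python
import time

def watch(check_availability, interval_sec=300, max_checks=None):
    """Poll check_availability() until it reports an open slot.

    check_availability: zero-argument callable returning True when a
    reservation is open (a hypothetical stand-in for an HTTP check).
    interval_sec: seconds to wait between polls.
    max_checks: optional cap on the number of polls before giving up.
    """
    checks = 0
    while max_checks is None or checks < max_checks:
        if check_availability():
            return True  # hook your notification (email, push, etc.) here
        checks += 1
        time.sleep(interval_sec)
    return False
```

You’d pass in a function that fetches the reservation page and returns True when a slot shows up, and replace the `return True` with whatever notification you like.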

 


How to quickly pick movies at a film festival using jQuery, Google Spreadsheets, and Python

My dad and I are going to TIFF’12, one of the biggest film festivals in the world. Unlike many other top-tier film festivals, TIFF goes to incredible lengths to make the festival accessible to everyone, not just people in the film industry. There are roughly 400 films this year, and each is shown at least two or three times over the week and a half that TIFF takes over virtually every movie screen in Toronto.

Picking movies for TIFF is always a complicated process, and one that changes every year. This is the first time you can sign up online – you used to fill out a crazy sheet of paper listing your first and second choices and hope your sheet got picked first. This year, you get a randomly assigned one-hour window, but you can now see which films are already sold out and pick replacements.

I’m finishing up my internship at Dropbox this week, and I’m on the opposite coast from my dad, so picking movies was going to be especially tough this time around. But I think I have a good system down.

  1. I scraped the TIFF webpage to build a spreadsheet with the name of each film and a link to its description, stills, and trailer.
  2. I shared the spreadsheet with my dad using Google Docs.
  3. We both rated each film.
  4. I wrote a script that organized the films based on our ratings.
  5. I’ll use this sorted list to pick films.

In case anyone else wants to do this, I’m making my scripts available:

Scraping the TIFF movie list:

Since the TIFF site uses jQuery and has a nice DOM hierarchy for each film, this was pretty easy to do in my browser’s JavaScript console: parse-film-page.js


The result is a CSV-formatted list of the movies. Just put that in a Google Spreadsheet (or some other spreadsheet program) and you can quickly click each link, decide how interesting the movie looks, and add your rating.

In case you just want the CSV, you can download it here: tiff-12-films.csv

Sorting film picks

My dad and I rated each movie from 1 (avoid) to 5 (must see). Google Spreadsheets made it really easy for both of us to do this at the same time. Since we’re both fully employed and the rating process takes hours, this was very helpful.

I exported the spreadsheet with scores back out to a CSV and implemented a Python script that put each movie into one of six buckets:

  1. Both of us rated '5'
  2. One of us rated '5'
  3. Both of us rated '4'
  4. One rated '4', one rated '3'
  5. One rated '4', one rated '2'
  6. Both rated '3'
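The bucket assignment is easy to express directly. Here’s a minimal Python sketch of that logic (the names are my illustration, not necessarily what pick-films.py actually uses):

```python
def bucket(mine, dads):
    """Return the priority bucket (1 = best) for a pair of ratings,
    or None if the pair doesn't land in any bucket."""
    lo, hi = sorted((mine, dads))
    order = [
        (lo, hi) == (5, 5),   # 1. both of us rated '5'
        hi == 5,              # 2. one of us rated '5'
        (lo, hi) == (4, 4),   # 3. both rated '4'
        (lo, hi) == (3, 4),   # 4. one '4', one '3'
        (lo, hi) == (2, 4),   # 5. one '4', one '2'
        (lo, hi) == (3, 3),   # 6. both rated '3'
    ]
    for i, matched in enumerate(order, start=1):
        if matched:
            return i
    return None  # everything lower-rated gets skipped
```

Sorting the rows by this value (with `None` sorted last) reproduces the ordering above.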

The script spit out another CSV, which I imported back into the Google Spreadsheet; I’ll use it when picking movies in my one-hour window. I’ll try to fit in as many films from the first bucket as possible before moving down to the next, until I’ve picked all twenty movies we’ll see over the five days we’ll be in Toronto.

You can find the sorting script here: pick-films.py

Good luck to everyone doing their TIFF’12 picks this weekend!


Helicopter camera live test

Figured out how to record video from the helicopter’s optical flow camera. Here’s me waving my hand over it quickly:

Video [mp4]


Remotely displaying the Helicopter Brain Prototype’s camera.

Now that vSPI lets me quickly send data between the HBP FPGA board and my computer, I’ve been able to see what the helicopter sees in real time (~100fps).

It only took a day (well, a grad student day) to set up. Most of the time was debugging the proprietary camera interface and figuring out how to render video using OpenCV.

The Python script running on my desktop computer, which grabs frames from the FPGA over vSPI and renders the video, is only 44 lines. The code running on the FPGA itself to send the image data to the computer is less than 10 lines.
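The frame-handling half of such a script can be sketched in a few lines. Both the resolution (30x30 below) and the `spi.read` call are placeholders of mine, not the real camera geometry or vSPI API:

```python
import numpy as np

WIDTH, HEIGHT = 30, 30  # assumed resolution; optical-flow imagers are tiny

def bytes_to_frame(raw, width=WIDTH, height=HEIGHT):
    """Reshape raw 8-bit pixel bytes into a 2-D grayscale frame."""
    return np.frombuffer(raw, dtype=np.uint8).reshape(height, width)

# Display loop (sketch): `spi.read(n)` stands in for whatever the vSPI
# Python library actually exposes for reading n bytes from the FPGA.
# import cv2
# while True:
#     frame = bytes_to_frame(spi.read(WIDTH * HEIGHT))
#     cv2.imshow("helicopter", cv2.resize(frame, (300, 300),
#                                         interpolation=cv2.INTER_NEAREST))
#     if cv2.waitKey(1) & 0xFF == ord('q'):
#         break
```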


Announcing vSPI: An Open Source SPI Slave

I’d like to announce the v1.0 release of vSPI!

It’s a fast serial port (up to 27.9 Mbps on my Atlys Spartan-6 devkit) that lets anyone transfer data between their FPGA or ASIC project and their computer. I’m planning to use it to monitor the RoboBee brain in our Helicopter Brain Prototype (HBP). Pretty soon you’ll be able to see what the helicopter sees on my computer screen!

It’s written in Verilog, has an optional Xilinx EDK peripheral interface, and comes with an easy-to-use Python library for your PC. Go try it!

You can access the project here: https://github.com/mjlyons/vSPI

The license is pretty flexible. All you have to do to use it is let me know that you’re using it; I’ll keep a public list on the website of who is using vSPI. If you need to use it secretly, let me know and we’ll figure out an alternative licensing scheme. The one firm condition is that you don’t use it for things that hurt or kill people (such as missile guidance systems).


HBPv2 boards are back and working

Great news!

The HBPv2 boards came back last Friday, and I’ve been testing them since.

The boards, fully assembled, weigh 4.16g. We calculated that the boards needed to be under 6.5g to take off, but our goal was 4.5g to achieve a longer battery life, easier flight control, and room for additional sensors. Doing better than our goal lets us add even more sensors and get longer flight times since the rotors don’t have to work as hard.

Our two big concerns with the v2 boards were the regulator circuit and the interface with our optical flow camera, since both had problems in the v1 design. The regulator circuit was easy to test: just plug it in and see if it regulates. I’m happy to say that it did. Yessssssssss!

After a little debugging, we got the camera up and running too:

I drew a test pattern on a Post-It note (left), then photographed it using an OF camera attached to the HBP board; the result is the image in the middle. If you stand back, you can faintly see the same test pattern in the lower right corner of the image. I took the same image and turned up the contrast and smoothed it a bit so that it’s easier to see. Software running on the HBP does a similar operation to compensate for the OF camera’s low range.
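That contrast-and-smooth step is a standard stretch-and-blur. Here’s a minimal NumPy sketch of the operation (my own illustration, not the actual HBP code):

```python
import numpy as np

def enhance(frame):
    """Stretch a low-range grayscale frame to the full 0-255 range,
    then soften it with a 3x3 box blur."""
    f = frame.astype(np.float32)
    span = float(f.max() - f.min())
    stretched = (f - f.min()) / (span if span else 1.0) * 255.0
    # 3x3 box blur: average the nine edge-padded shifted copies
    p = np.pad(stretched, 1, mode="edge")
    h, w = stretched.shape
    blurred = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return blurred.astype(np.uint8)
```

The stretch maps the camera’s narrow output range onto the full 8-bit range, which is what makes the faint test pattern visible; the blur just knocks down pixel noise.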

Now we know the HBP hardware works. Next we’ll refine the RoboBee brain RTL and the software running on top of it so the helicopter can fly on its own.

