Machine Learning for Photo Search

After nine months, I’ve completed a Certificate in Machine Learning from the University of Washington.

My final project was a photo search engine that uses deep learning to find similar images. I trained the neural net on a subset of ImageNet, then used it to search 25,000 photos from Caltech-256.
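The search itself is simple once each photo has been reduced to a feature vector (for example, the activations of the net’s penultimate layer). Here is a minimal sketch in NumPy of ranking a gallery by cosine similarity — `nearest_images` is an illustrative helper, not code from the project:

```python
import numpy as np

def nearest_images(query_vec, gallery, k=5):
    """Return the indices of the k gallery vectors most similar to query_vec.

    Uses cosine similarity on L2-normalized feature vectors, such as those
    produced by a CNN's penultimate layer.
    """
    q = query_vec / np.linalg.norm(query_vec)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = g @ q                   # cosine similarity per gallery image
    return np.argsort(-sims)[:k]   # most similar first
```

With 25,000 photos a brute-force scan like this is fast enough; much larger galleries would call for an approximate nearest-neighbor index.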

My slides are here: UWML410_Kelleher_Presentation

[Example search results: killer whale, German Shepherd, touring bike, laptop, hummingbird, faces, butterfly, cactus]

Augmented reality and computer vision

I don’t often post, because I’m too busy learning and building. But you guys, I am so excited!

I’ve been deep in study as I continue to learn about computer vision. Reading an endless stack of papers, building sexy black-on-black computers with powerful GPUs, running late-night experiments.

A revolution is coming – a revolution in wearable computing and augmented reality. Smart glasses that see, understand, and augment our environment will be the biggest thing since the Internet and mobile phones.



In years to come, as you walk through the world, your glasses will know when you’re at the store, or home in the kitchen. What you’re looking at, and who you’re with. The glasses will insert anything you can think of, including other people in distant cities.

They will help us shop, create, learn, tell stories, attend concerts, and wander unfamiliar neighborhoods. They will help us see in the dark, and find our friends in crowded stadiums.

Old notions of space and distance, reality and unreality, will break down.

We will share games and art that move in the space between us. Imagine fighting a zombie horde – on your college campus. Or watching Godzilla attack your city – while you stare open-jawed at the sky. Or planting a Japanese garden with your friends, taking it with you, and visiting it whenever.

I fully expect flash mobs of penguins will appear on my morning bus commute.

Our homes and tools will sprout powerful virtual interfaces.

All of this requires computers that can understand the space around us, and seamlessly render virtual objects into it. Convolutional neural nets have emerged as a key technology, breaking records when it comes to computer vision. And you can play with them at home if you have a photo library and a gaming PC.
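These nets are built from stacks of convolutional layers, and the core operation is easy to see in isolation. Here is a minimal sketch of a single-channel 2-D convolution (technically cross-correlation, as deep-learning frameworks implement it) — illustrative only, not code from my experiments:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and take
    a dot product at each position -- the core op of a conv layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out
```

Sliding a kernel like `[[-1, 1]]` over an image responds strongly at vertical edges; a trained network learns many such kernels automatically.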

More about that in my next posts.

Happy Game featured by 71Squared!

Happy Game for iOS has been featured in 71Squared’s showcase!

71Squared makes great tools for indie developers, like Particle Designer and Glyph Designer. These saved me so much time when making Happy Game, and really helped kick it up a notch. …

Computer vision with the Tegra K1

Just unboxed this bad boy: the Nvidia Tegra K1 dev kit.

It’s roughly a desktop-class GPU in a low-power (5W) mobile package. The next-gen iPad and iPhone will likely have similar GPUs (the PowerVR Rogue 6XT).

I’m excited to play with the latest Tegra. In recent years, deep convolutional neural networks have revolutionized computer vision. …

New camera from Exo Labs

I am proud to announce that Exo Labs has released the Model 2 camera!

This camera has a much better image sensor, with smooth illumination from corner to corner and greater dynamic range.

It also supports 60 frames per second (640 x 480) and 15 fps (2048 x 1536). Compare that to 8 fps / 1 fps …

Open sourced “Happy Game” for iOS

I have released all the source code and assets for “Happy Game,” the mobile game I began writing in 2010.

https://github.com/skelleher/HappyGame

Because the assets are included, you can build and run the game on real hardware. You can step through it, modify it, and even submit it to the App Store.

Perhaps this will be educational, or …

30 Years of Mac, featuring my startup

The iOS camera system we make at Exo Labs was featured in Apple’s “30 Years of Mac” video:

The Focus camera lets students and scientists use a microscope interactively, record videos, save images, and more. I do firmware and iOS programming for …

Happy Game for iOS now available!

Well, it’s been a long time coming. Almost three years ago I set out to make my first mobile game – and it’s finally in the App Store.

My goal was to ship quickly and keep the scope small, but still release a shining example of its genre. Two out of three? Paying jobs …

Debugging embedded systems with Saleae Logic

When not making video games, I love to hack on embedded systems. Hardware has never been cheaper, and there are wonderful sources for the DIY maker.

Lately I’ve been playing with the Texas Instruments MSP430. Cost for the dev board: $10 (I got a few on sale for $4.30 each!). Be sure to download …

Critters: beta

After being on the backburner for too long, my upcoming iOS game “Critters” is alive and kicking.

I’m working with the very talented artist Sam Strick to give it that extra polish!

Beta testers love the game, calling it addictive and charming. But as usual, the final 10% is taking 90% of …