Do you ever walk down the street, see someone way cooler than you, and wonder where you can get the outfit they’re wearing? Wouldn’t it be neat (and not super creepy at all) if you could just look at them and know? Intellivision works with Google Glass and the Myo armband to give you a rich shopping experience based on what you see in the world around you.

Intellivision is a live, augmented reality, image-tagging Android application that uses Google Glass and the Myo armband to let you “shop” the world around you. Whatever you see through Google Glass is automatically recognized and added to a list using the Clarifai API. If a tag pops up that you’re interested in, you can “click” it, and you’ll be taken to a link to buy the item on Amazon.
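
To give a rough sense of that last hop from tag to storefront, here’s a minimal sketch of what “clicking” a tag could look like on Android. The Amazon search URL and the TagShopper helper are our own illustration, not necessarily how Intellivision builds its links; the real wiring lives in the GitHub repo.

```java
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Hypothetical helper: turns a recognized tag (e.g. "sunglasses") into an
// Amazon search link and opens it in the browser. Illustration only; the
// actual app may build its links differently.
public final class TagShopper {

    private TagShopper() {}

    public static void shopFor(Context context, String tag) {
        try {
            String query = URLEncoder.encode(tag, "UTF-8");
            // Plain Amazon keyword search; a product or affiliate link would be opened the same way.
            Uri amazonSearch = Uri.parse("https://www.amazon.com/s?k=" + query);
            Intent view = new Intent(Intent.ACTION_VIEW, amazonSearch);
            // Needed when launching from a non-Activity context (e.g. a background service).
            view.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(view);
        } catch (UnsupportedEncodingException e) {
            // UTF-8 is always available on Android, so this branch is unreachable in practice.
            throw new AssertionError(e);
        }
    }
}
```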

WHY WE ❤ IT

Augmented reality opens up a whole new way for people to interact with their surroundings, and it’s a perfect fit for Clarifai’s visual recognition technology. We love the idea of being able to shop on the go, whether we’re commuting to work or at a baseball game. Plus, it’s always cool to discover some swag new gear and look like Iron Man while we’re buying it! Check out the GitHub repo for this app and try it yourself.

HOW YOU DO IT

We asked Ron Wright, a Ph.D. student in Computer Engineering at the University of Illinois at Urbana-Champaign and one of the creators of Intellivision, to explain his inspiration behind the app and how he built it.

Clarifai: What inspired your idea for Intellivision?

Ron: The inspiration for this project centered around Google Glass and the Clarifai API. We knew that the image-capturing hardware and software built into Google Glass was a natural fit for the Clarifai API, and that the combination could have broad, everyday applications if implemented correctly.

What did you use to build your app?

We wrote the Android application in Java, using Android Studio. We also built a web frontend that displays a histogram of every instance of a word that was collected; it’s written as a NodeJS web application in HTML/JavaScript. We interfaced the Android application and the web frontend using Firebase. As for challenges, there was very little developer support available for Google Glass, so we had to seek help from mentors or else find as many examples as possible to understand what we needed to do and which issues needed to be resolved. Even finding the right SDK to get development on Google Glass started was a big challenge.
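
The Firebase piece is what ties the Glass app to the web histogram. As a minimal sketch, here’s one way a recognized word could be appended from the Java side using Firebase’s REST API; the project URL and the /tags path are placeholders, and the team may well have used the Firebase Android SDK instead:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hypothetical bridge: appends a recognized word under /tags in Firebase via the
// REST API (a POST to <path>.json pushes a new child with a generated key).
// Assumes the database rules allow this write; the URL below is a placeholder.
public class FirebaseTagLogger {

    private static final String FIREBASE_TAGS_URL =
            "https://your-project.firebaseio.com/tags.json";

    public static void logTag(String tag) throws Exception {
        // Clarifai tags are simple words, so naive JSON assembly is fine for a sketch.
        byte[] body = ("{\"word\":\"" + tag + "\",\"timestamp\":" + System.currentTimeMillis() + "}")
                .getBytes(StandardCharsets.UTF_8);

        HttpURLConnection conn = (HttpURLConnection) new URL(FIREBASE_TAGS_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }

        // A 200 means the child was created; the web frontend can then count
        // occurrences of each word to draw its histogram.
        System.out.println("Firebase responded: " + conn.getResponseCode());
        conn.disconnect();
    }
}
```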

What did you enjoy most about hacking with the Clarifai API?

The best part of the Clarifai API is its incredible ease of use. You can send a stream of bytes representing a JPEG image and receive back the words that best fit the image. Overall, the Clarifai API is not that challenging to use; the trickiest part is understanding how to pass the image data correctly to the API call.

Thanks for sharing, Ron!
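
Ron mentions sending a stream of JPEG bytes and getting words back. To make that concrete, here’s roughly what a call to Clarifai’s v1 tag endpoint looks like in plain Java; the endpoint, the encoded_data field, and the response shape follow the public v1 docs from around that time, so treat this as our reconstruction rather than Intellivision’s exact code:

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

// Rough sketch of tagging a JPEG by posting the raw bytes as a multipart
// "encoded_data" part to the v1 tag endpoint. Drop in your own access token.
public class ClarifaiTagger {

    private static final String TAG_URL = "https://api.clarifai.com/v1/tag/";
    private static final String BOUNDARY = "----IntellivisionBoundary";

    public static String tagJpeg(byte[] jpegBytes, String accessToken) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(TAG_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Authorization", "Bearer " + accessToken);
        conn.setRequestProperty("Content-Type", "multipart/form-data; boundary=" + BOUNDARY);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(("--" + BOUNDARY + "\r\n"
                    + "Content-Disposition: form-data; name=\"encoded_data\"; filename=\"frame.jpg\"\r\n"
                    + "Content-Type: image/jpeg\r\n\r\n").getBytes(StandardCharsets.UTF_8));
            out.write(jpegBytes);  // the camera frame itself
            out.write(("\r\n--" + BOUNDARY + "--\r\n").getBytes(StandardCharsets.UTF_8));
        }

        // The JSON response holds the words (per the v1 docs of the time,
        // under results[0].result.tag.classes).
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] frame = Files.readAllBytes(Paths.get("frame.jpg"));
        System.out.println(tagJpeg(frame, "YOUR_ACCESS_TOKEN"));
    }
}
```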

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Ron some props in the comments below. Until next time!