Clarifai Featured Hack: Block unwanted content and browse the internet safely with Distill

Distill is a Google Chrome extension that creates a better internet experience by helping people browse the web safely and block unwanted images and text from their view. For example, this Valentine’s day, you might want to block all images related to romance, so you’re not constantly bombarded with reminders that you’re forever alone. Just sayin’.

We’ve all been there – innocently browsing the wild interwebs when suddenly we see an image we just can’t unsee (never google “trypophobia” … trust us on this one). Don’t you wish there was a better way to browse the internet without all the unpleasant surprises? Distill solves that problem.


Distill is a quick, live-censoring Google Chrome extension that helps users browse the web safely. You can select categories of things you despise (like spiders) or things that make you upset (like blood and gore) and Distill automatically obscures those images in your browser. And, with the power of Clarifai’s Custom Training technology, you can easily train a model to recognize new concepts (like your ex’s dumb face … Happy Valentine’s Day!) and use Distill to block them from your virtual life. Handy, huh?


Besides being a fun and useful app for those of us who prefer not to be ambushed by unpleasantness, Distill also helps make the internet a better place for people recovering from real trauma and diseases, like drug addiction, alcohol addiction, post-traumatic stress disorder (PTSD), sexual assault, etc. Visit Distill’s GitHub repo to learn more!


We caught up with Jack Mask, one of the creators of Distill, to talk about how his team built the Distill app.

Clarifai: What inspired your idea for Distill?

Jack: After an intense bout of mind-melting brainstorming, our team got distracted and ended up browsing the internet. While doing so, one member of our team came across some not-so-pleasant content on the web. “I wish there were a way to not see this nonsense!” they exclaimed. That was when we knew we needed to “distill” the web-browsing experience. We also took into account those using the product for more serious reasons, like veterans recovering from PTSD being triggered by certain images.

How did you build the app?

We used JavaScript almost exclusively for our program’s back-end, while HTML/CSS handles the front-end splash page. We used Clarifai’s API for our image tagging. A challenge we faced was the limit on how many requests could be sent to Clarifai at once on the free hackathon plan, as well as how often they could be sent. To counter this, we sent small-batch requests to Clarifai so that images were processed from the top of the page down, often finishing by the time the user scrolled that far.
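Jack’s workaround generalizes nicely: keep the page’s images in top-to-bottom order, tag them a few at a time, and hide anything whose tags hit a blocked category. Here’s a rough sketch of the idea in Python for readability (the real extension is JavaScript, and `tag_batch` is a stand-in for an actual Clarifai client call):

```python
def chunk(items, size):
    """Split an ordered list into consecutive batches of at most `size`."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def censor_page(image_urls, tag_batch, blocked, batch_size=4):
    """Tag page images in small batches, top of the page first, and return
    the URLs to obscure. `tag_batch` maps a list of URLs to a dict of
    {url: set-of-concept-tags} -- a stand-in for a real Clarifai call."""
    to_hide = []
    for batch in chunk(image_urls, batch_size):
        tags = tag_batch(batch)
        for url in batch:
            if tags.get(url, set()) & blocked:  # any blocked concept present?
                to_hide.append(url)
    return to_hide

# Example with a fake tagger:
fake_tags = {"a.jpg": {"spider"}, "b.jpg": {"cat"}, "c.jpg": {"blood"}}
hidden = censor_page(list(fake_tags),
                     lambda batch: {u: fake_tags[u] for u in batch},
                     blocked={"spider", "blood"})
# hidden == ["a.jpg", "c.jpg"]
```

Because each batch is small and processed in page order, images above the fold get classified first and rate limits stay respected.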

What was the best part about working with the Clarifai API?

The Clarifai API was a blast to use. Exploring the patterns of tags it would give was an exciting, often humorous experience. It was challenging at times to experiment with its machine learning capabilities. The Clarifai API is quite literally the best thing ever.

Thanks for sharing, Jack!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Jack and the Distill team some props in the comments below. Until next time!

Inside Clarifai HQ: Watch out, San Francisco - here we come!

We’re excited to announce that we’ve opened an office in San Francisco, the first of many steps in our post-Series B expansion plans. The SF office is where Clarifai is building a stellar machine learning research team led by Andrea Frome, formerly of Google Brain and the Hillary for America campaign. 

As New Yorkers, we know we live in the world’s greatest city. But, that doesn’t mean we don’t think San Francisco is a close second (joking, geez!). Really, though, we know there are tons of talented machine learning researchers and engineers out in Silicon Valley, so naturally, we decided to open an office in SF!

Our new San Francisco office will focus on research – particularly Clarifai’s unique brand of research, which means solving problems we don’t have solutions for today so that we can productize those solutions quickly and make them accessible to developers everywhere. Leading the effort, Andrea has big plans for Clarifai’s research team and is focused on hiring a diverse and talented group.

“One great thing in terms of research for Clarifai is that there’s just so much. I want to build an innovation engine for the entire company that can advise, mentor and teach across the entire organization.” – Andrea Frome

Even though we’re opening a new office, let’s not forget our NYC HQ is growing like crazy, too! We’ve recently hired a handful of AI curators from Twitter who will help us create the best visual recognition models for every use case. On the business side, our focus is building partnerships with other great platforms. These efforts will be led by our new VP of Business Development Matt Molinari who has joined us from Indeed.

We’re excited to share this news with the world, and we hope you’ll join us on our journey – we’re hiring for many roles from product to engineering to marketing to developer evangelism, so check out our careers page to see where you fit in the #Clarifam. Next stop, the moon!


Interested in joining the machine learning revolution? Head over to our careers page and browse all our open positions!

Clarifai Featured Hack: Use Clarifai for audio recognition with ADKI

Clarifai is well-known in the artificial intelligence world for being the best image recognition technology out there. But, did you know that Clarifai can be used to recognize audio as well? ADKI is an app that extends Clarifai’s API to work with audio – here’s how!


Sound can be expressed in many ways – as beautiful music, cacophonous yelling, soothing white noise, or even as a colorful mosaic image. ADKI is an app that extends Clarifai to work with audio by converting sounds into images using space filling curves. With Clarifai’s Custom Training product, the ADKI team was able to teach the Clarifai API new audio-image concepts and train it to understand audio.
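The space-filling-curve trick is easy to sketch: walk a Hilbert curve through a square image and write successive audio samples along it, so samples that are close in time land close in space. Here’s a minimal version (the `d2xy` conversion is the standard iterative Hilbert-curve algorithm; ADKI itself used an off-the-shelf scurve library, so treat this as an illustration rather than their exact code):

```python
def d2xy(n, d):
    """Convert a 1-D Hilbert-curve index d into (x, y) coordinates on an
    n x n grid (n must be a power of two). Standard iterative algorithm."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/flip the quadrant as needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def audio_to_image(samples, n):
    """Lay up to n*n audio samples (values 0..255) onto an n x n pixel grid,
    following the Hilbert curve so nearby samples become nearby pixels."""
    grid = [[0] * n for _ in range(n)]
    for d, value in enumerate(samples[:n * n]):
        x, y = d2xy(n, d)
        grid[y][x] = value
    return grid
```

The key property is locality: consecutive curve indices are always adjacent pixels, which gives the resulting “mosaic” visual structure a model can learn from.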


We love when hackers push the envelope of Clarifai’s visual recognition technology beyond its obvious use. Applying Clarifai’s visual recognition to music is pretty novel – read more about ADKI on Devpost or check out the GitHub repo!


We caught up with one of the creators of ADKI, Kristin Moser (NYU student, designer, developer, and full-time dog enthusiast), to talk about her team’s inspiration for the ADKI app.

Clarifai: What inspired your idea for ADKI?

Kristin: Music is really important to everyone on our team so we felt like we really wanted to work with audio; however, we also really love Clarifai so we thought it would be great to combine the two.

How did you build the app?

We used Python with Clarifai’s API and an scurve library we found on GitHub. One challenge was not having enough data. Because the images created were very abstract to Clarifai, we really had to train it with a lot of data for it to recognize patterns.

What was the best part about working with the Clarifai API?

It was so fun to use. It made machine learning more understandable and made all of us really want to continue this in the future.

Thanks for sharing, Kristin!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Kristin some props in the comments below. Until next time!

Auto-tagging user-generated content in Pixide's mobile photo app



Use case

Categorize user-uploaded images


Built an AI-powered mobile app that automatically segments user-generated content.

Pixide is a global photo contest app that allows amateur photographers to showcase and compete with their pics. Launched in November 2016, Pixide already has a vibrant community of over 5,000 photographers and many thousands of user-uploaded pictures.

Stefano Giacone is the CEO and Co-Founder of Pixide. He’s an Italian mathematician, full-stack developer, and passionate photographer. He was able to combine these three passions in his latest project, Pixide.


How do you categorize large volumes of user-generated content?

Pixide is an online platform that allows photographers to share their pictures and compete with their peers in weekly photo contests.

With user-generated content playing a large role in their app’s game mechanics, Pixide needed a solution that would provide a highly-scalable, quick-to-implement system for automatically categorizing user-generated images to enhance the user experience.

“Clarifai allowed us to validate our idea and build a market-ready product without the large technical overhead and time cost of building everything in-house.” – Stefano Giacone, CEO and Co-Founder of Pixide


Pixide used Clarifai to power its app’s game mechanics and create a more engaging user-experience.

In the Pixide app, photographers are encouraged to share their pictures and rate others’ submissions. The goal of the app’s picture rating algorithm is to remove the “popularity contest” element that exists on other social networks (e.g. celebrities getting the most “likes” even though their photos may not be great). This is achieved by directly comparing a series of photos side-by-side to surface the best one. By creating a purely skill-based competition on who can take the best shot, Pixide is able to build a stronger, more engaged online community that recognizes photographers for their work.

In order to create a fair competition comparing apples to apples, Pixide used Clarifai’s visual recognition technology to automatically categorize photos so they could be grouped by similar content and surfaced to app users for comparison and rating.


Automatically “see” and understand images

Using Clarifai’s visual recognition product, Pixide was able to create a “sticky” gamified experience based on the ability for the user to directly compare photos of similar subject matter and choose the best one. Once a photographer uploads a photo to the platform, Clarifai’s API automatically recommends relevant and accurate categories and tags to the user.

“Clarifai allowed us to build a better user experience and smarter app using visual recognition and machine learning. The Clarifai API is a cornerstone of how our app works.”

Serve the right content to the right users

With all user-generated content automatically categorized through Clarifai’s API, Pixide is then able to pair images of similar subject matter and surface them to users to vote on through the slick contest interface. This ensures that the rating mechanics enable users to discover the “best” photos in the community and keep them coming back for more.
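Conceptually, the matchmaking is simple once every photo carries tags: bucket photos by their strongest concept, then draw head-to-head pairs from within each bucket. A hypothetical sketch (Pixide’s actual pairing logic isn’t public, so this is only an illustration of the idea):

```python
from collections import defaultdict

def pair_by_category(photos):
    """photos: list of (photo_id, top_concept) tuples, where top_concept is
    the strongest tag returned by a visual recognition model.
    Returns head-to-head (a, b, concept) pairs drawn only from photos that
    share a category -- apples compared to apples."""
    buckets = defaultdict(list)
    for photo_id, concept in photos:
        buckets[concept].append(photo_id)
    pairs = []
    for concept, ids in buckets.items():
        # Consecutive pairing for simplicity; a real app would randomize
        # and rotate matchups. An odd leftover photo just waits for the
        # next round (zip drops it silently).
        for a, b in zip(ids[::2], ids[1::2]):
            pairs.append((a, b, concept))
    return pairs
```

For example, `pair_by_category([(1, "sunset"), (2, "dog"), (3, "sunset"), (4, "dog")])` matches the two sunsets against each other and the two dogs against each other, never a sunset against a dog.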

“Clarifai allowed us to find new ways to explore the world of photography.”

Quick and easy implementation

Pixide’s CEO and Co-Founder, Stefano Giacone, tested visual recognition APIs like Imagga, Microsoft Computer Vision API, and Google Cloud Vision API before deciding that Clarifai offered the best possible solution for launching his mobile app. He selected Clarifai based on its superior accuracy, customer support, easy-to-follow documentation and starters, and multi-language support. With a product team of six people, Stefano was able to integrate Clarifai in just a couple of days and take Pixide from concept to production rollout in nine months.

“It was simple and straightforward to get started with Clarifai and use the API to power our mobile app. We’re excited about Clarifai’s upcoming mobile SDK, which will allow us to tag and categorize user-generated photos even faster and provide an even smoother user experience!”

DIY with Clarifai

Now that you’ve been inspired by Pixide’s mobile app, it’s time to build your own app. Clarifai’s core model includes tags for over 11,000 concepts you can apply to your business. Or, you can use Clarifai’s Custom Training solution to teach our AI new concepts. All it takes is three simple lines of code – sign up for a developer API account to get started for free!
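For a sense of what getting started looks like, a predict call to the v2 API is just an authenticated POST of an image URL to a model’s `outputs` endpoint. A sketch of assembling that request (the model ID and key are placeholders, and the exact payload shape should be verified against the current docs):

```python
import json

API_BASE = "https://api.clarifai.com/v2"

def build_predict_request(api_key, model_id, image_url):
    """Assemble the URL, headers, and JSON body for a predict call.
    The shape follows Clarifai's v2 REST API as documented at the time;
    check current documentation before relying on it."""
    url = f"{API_BASE}/models/{model_id}/outputs"
    headers = {"Authorization": f"Key {api_key}",
               "Content-Type": "application/json"}
    body = {"inputs": [{"data": {"image": {"url": image_url}}}]}
    return url, headers, json.dumps(body)

# To actually send it, pair this with any HTTP client, e.g.:
#   import requests
#   url, headers, body = build_predict_request(MY_KEY, MODEL_ID, PIC_URL)
#   response = requests.post(url, headers=headers, data=body).json()
```

The response contains a ranked list of concepts with confidence scores – the tags that power apps like Pixide’s categorization.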

Clarifai Featured Hack: Improve your emotional health with FeelyBot

It’s a new year, which means everyone is refocusing on what’s important to them, whether it’s getting back in shape or taking a year off to travel. One thing we believe to be super important is emotional health – that’s why we love FeelyBot, an app that senses your mood and improves it through positive interactions.

FeelyBot is an app that aims to be your closest and most loyal friend; he will always be there for you through the good and the bad. His sole purpose is to improve your day through positive interactions via the “super cute” Jibo platform and, in the long term, help treat behavioral and emotional disorders like depression and social anxiety.

Through a user’s social media accounts, FeelyBot can evaluate the strength of the user’s five basic emotions: happiness, sadness, anger, disgust, and fear. It then uses that information along with a contextual understanding of text and images to interact with the user and improve their mood!



When you’re feeling down, getting affirmations from FeelyBot is a lot more effective and healthy than eating your feelings. Just saying. Read more about FeelyBot on DevPost or check out the GitHub repo!


We caught up with Minh Hoang, one of the creators of FeelyBot, to talk about how his team built the app that gives us all the feels.

Clarifai: What inspired the idea for FeelyBot?

Minh: I like to take pictures and I figured that it would be really cool to have a timeline of all my photographs based on the feelings that they captured. Then, we figured that we could use that information to help people with emotional disorders. There are millions of people in the world who suffer from depression, social anxiety, or other disorders. We asked ourselves how we could use the Jibo platform and its super cute interface to cheer up or improve a user’s mood.

How did you build the app?

We used JavaScript pretty much for everything. The server tracks whether there is a new Facebook post and runs sentiment analysis on it. There are two types of posts to analyze: text posts and image posts. For text posts, FeelyBot calls AlchemyAPI, which scores the different emotions of the post. FeelyBot then interprets the result and responds to the user in a positive way. For image posts, FeelyBot collects the image, then calls the Microsoft Emotion API to analyze the emotional elements and Clarifai to get an overall description of the image. FeelyBot combines the results from both APIs and uses them to determine the user’s emotional state. All APIs are called via Rapid API or their native interfaces.

The Jibo platform POSTs information to the Node server to retrieve the processed results and interacts with the user through text-to-speech. The user’s input is then sent back to the Node server, where it is continuously analyzed and used to make the backend’s AI more personalized.
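In other words, the flow Minh describes boils down to a dispatcher: route each post by type, score it on the five basic emotions, and merge the results. A simplified sketch with stand-in scoring functions (the real app is JavaScript and calls AlchemyAPI, Microsoft’s Emotion API, and Clarifai; the stubs here are hypothetical):

```python
EMOTIONS = ("happiness", "sadness", "anger", "disgust", "fear")

def analyze_post(post, score_text, score_image):
    """Route a post to the right analyzer and return all five basic
    emotions scored 0.0-1.0. `score_text` and `score_image` are stand-ins
    for the AlchemyAPI and Emotion-API-plus-Clarifai calls."""
    if post["type"] == "text":
        scores = score_text(post["text"])
    else:
        scores = score_image(post["image_url"])
    # Normalize: every emotion gets a score, missing ones default to 0.0.
    return {e: scores.get(e, 0.0) for e in EMOTIONS}

def dominant_emotion(scores):
    """Pick the strongest emotion so the bot can choose how to respond."""
    return max(scores, key=scores.get)
```

A happy post would come back with `dominant_emotion` of `"happiness"`, and the bot picks an affirmation to match.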

What was the best part about working with the Clarifai API?

The Clarifai API is very friendly to use. Thanks to the Clarifai team, all the documentation for the API is straightforward and easy to read.

Thanks for sharing, Minh!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Minh some props in the comments below. Until next time!

Clarifai 2016 Year in Review

Amazing gifs on this page by the talented James Curran!

Let's be real - 2016 was a weird year.

It’s safe to say that 2016 was, for a lot of people, a very shitty year. We lost a lot of good humans this year – Muhammad Ali, David Bowie, Gene Wilder, Prince, Florence Henderson, Nancy Reagan … even goddamn Severus Snape and Princess Leia – that no amount of artificial intelligence can ever replicate (we would know). And the world turned upside down all across the globe.

But, if there’s a silver lining in this crazy, crazy year, we hope it’s the promise of machine learning and artificial intelligence. What excites us is the idea that developers all over the world are using our product to build a better and more inclusive, more just, more kind world for 2017 and beyond. Thanks for being the best developer community a startup could ask for!

360+ HACKS

Whether it was life-saving, world-changing, ground-breaking apps or punny projects, our developers built crazy, amazing things using the Clarifai API. 

See all our Featured Hacks

“We saw the Clarifai API and realized the possibilities were endless.” – creators of Pocket Hipster

100s of Businesses

This year, we supercharged hundreds of businesses and startups with AI, helping them go to market faster, scale quicker, and create better user experiences.

Read all the case studies

“We’ve been evaluating image recognition and learning for 3 years. The Clarifai team and product were always on the leading edge and, with the release of their custom training and similar image API, are now our solution of choice. With image recognition, seeing is believing; we’ve benchmarked every service out there and we believe in Clarifai.” – Peter Gerber, Chief Product Officer @ Architizer

v2 API

In 2016, we launched incredible products at a pace that would make Usain Bolt jealous. From Custom Training to Visual Search to developer-friendly metered pricing to a sleek and intuitive UI, our new-and-improved API has the whole AI world talking!

Read the docs

71 Champions

This year, we mentored two classes of Clarifai Champions to become better developers, technical writers, speakers, and evangelists. Many of them have already gone on to do great things - including work for us full-time!

What is a developer evangelist?

Causes We Supported in 2016


Did we also mention we raised a shit-ton (metric) of money to build the best machine learning products for developers? Thanks to you, our community, we're leading the charge in the AI revolution!

Here's wishing you a happy 2017, Clarifools!


Clarifai Featured Hack: Find the perfect gift with Gifted, an AI-powered recommendation engine

Gifted is an AI-powered recommendation engine that takes the guesswork out of gift-giving by looking at a photo and suggesting an appropriate present. You can send Gifted a photo of the gift recipient to build a gifting profile perfectly suited to their personality!

We all have that person in our lives who’s impossible to shop for, whether it’s because they’re a spoiled princeling who has everything or because they’re a total Scrooge who enjoys nothing. Gifted is an AI-powered recommendation engine that takes the guesswork out of gift-giving. The app finds suitable gifts just by looking at the photo of the person or their interests!



Giving someone the perfect gift is hard. It requires a level of thoughtfulness we reserve for only our most deserving friends and family. SO. MUCH. EFFORT. For all the other people in our lives (we’re looking at you, Secret Santa!), we’re glad to have Gifted to pick out a great present without all the work.


We caught up with Michael Jordan (not that one), founder of Gifted, to talk about his great app idea!

Clarifai: What inspired your idea for Gifted?

Michael: To pick the perfect gift for someone, you need to describe that person. Describing people is hard and takes a lot of words. A picture is worth 1,000 words, so naturally the quickest way to describe someone is to look at some pictures of them.

How did you build the app?

I had been working on building this from scratch for three weeks and was running out of time as the holidays approached. So I threw it all away and started over. I decided to try Clarifai, and in one weekend I built the finished product with a serverless architecture. The Clarifai integration took 15 minutes tops, and I had a fully functional product in two days!

Clarifai had clear documentation and took less than 10 lines of code total to integrate. I tested Google Vision as well, but the amount of set-up and documentation they throw at you is daunting compared to Clarifai, whose set-up and implementation reminds me of Stripe (highest compliment possible). The deciding factor was the speed with which I could integrate Clarifai, test results, and iterate. Also, the developer community had some great public examples of Clarifai integrations, so I was able to see how others integrated Clarifai and tailor my own integration with their examples.

I didn’t consider pricing at first because my first question was, “Can I build this, and will the tagger give me good enough results that will enable me to turn this into a product?” This is why Clarifai’s free tier was crucial. I wouldn’t have used something if I had to pay up front because I wasn’t sure I could build the product in the first place.

What was the best part about working with the Clarifai API?

Clarifai empowered Gifted the same way AWS empowered a generation of startups, and Stripe did for a generation of commerce. It’s the future. If it wasn’t for Clarifai, we wouldn’t exist!

Thanks for sharing, Michael!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Michael some props in the comments below. Until next time!

Clarifai Featured Hack: Blu Vision is a visual alarm clock for the hearing impaired

We’ve all experienced the terrible feeling of jolting awake after our blaring alarm clocks force us into consciousness. For the hearing-impaired, a blaring alarm clock usually means an intrusive and uncomfortable vibrating wristband. What if there were a better way to wake? Blu Vision is a visual alarm clock that uses blue light to gradually wake you up, adjusting its intensity by learning user behavior through the Clarifai API.

Blue light has a reputation as the light that’s “better than coffee.” Research has confirmed that blue light can actually improve our cognitive abilities, including memory, alertness, reaction time, and executive function. So, what better way to wake up than by the gradual brightening of blue light? Blu Vision is a visual alarm clock for the hearing-impaired that uses blue light and Clarifai’s API to deliver a more pleasant, healthier wake-up.

Blu Vision adjusts intensity by learning user behavior, and the only way to turn it off is by getting out of bed and making a gesture to a camera running the best image recognition software in the world – Clarifai’s API.


We love any hack that improves quality of life. We also love the creative use of the Clarifai API to turn off the alarm – now, if only someone would make an alarm that would cook us bacon when we wake up, too. Oh wait, that’s a thing. Read more about Blu Vision on DevPost!


We caught up with Kheng Wei Ang from Malaysia to ask him how he and his team at Dubhacks came up with the idea for Blu Vision and went about building it.

Clarifai: What inspired your idea for Blu Vision?

Kheng Wei: My friend Navid worked on SignAloud, a glove that translates sign language into speech, which won the Lemelson-MIT Student Prize. It also helps that Trevor has an annoying roommate who doesn’t turn off his alarm clock for hours.

What tools and languages did you use to build the app?

We used Java, Python, and C++. We brought a lot of hardware to Dubhacks, including many Arduinos, RGB LEDs, and a soldering iron. In the end, we had a Raspberry Pi running as a server that communicates with the Arduino, which in turn triggers the LEDs. We faced many challenges, such as the devices not communicating with each other, unfamiliarity with a new programming language, and time constraints, especially since we were dealing with hardware. We also did not have much time to train Clarifai’s software to recognize a thumbs-up gesture, but it worked fairly well during the demonstration.
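The wake-up itself is just a timed brightness ramp that the Pi pushes toward the Arduino, held at full intensity until the camera sees the dismissal gesture. A hedged sketch of that server-side loop (the callbacks here are invented for illustration; `saw_thumbs_up` stands in for a Clarifai gesture check, and the real Pi-to-Arduino link would be serial or GPIO):

```python
def brightness_schedule(ramp_seconds, steps):
    """Precompute the blue-LED ramp: `steps` evenly spaced brightness
    levels from 0 up to 255, spread over `ramp_seconds`.
    Returns a list of (delay_seconds, level) pairs."""
    delay = ramp_seconds / steps
    return [(delay, round(255 * (i + 1) / steps)) for i in range(steps)]

def run_alarm(schedule, set_brightness, saw_thumbs_up, sleep):
    """Ramp the light up gradually, then hold at full brightness until the
    gesture is recognized. All four arguments are injected callbacks so the
    hardware and vision pieces can be swapped out or faked in tests."""
    for delay, level in schedule:
        sleep(delay)
        set_brightness(level)
    while not saw_thumbs_up():   # poll the camera until the gesture appears
        sleep(1)
    set_brightness(0)            # alarm dismissed: lights out
```

A ten-minute wake-up is just `run_alarm(brightness_schedule(600, 60), …)` – sixty gentle steps instead of one blaring jolt.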

What was the best part about working with the Clarifai API?

It was definitely fun! Andrew enjoyed working with Cassidy (Clarifai’s developer evangelist) and got the image recognition software running on our hardware in just a few hours. It was the first hackathon for Jason, Andrew, and me, and the second for Trevor, and we managed to build a device that turns off by recognizing gestures. Friends thought that we were cool, but that’s Clarifai’s API being cool. That is proof of how easy it is!

Thanks for sharing, Blu Vision team!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Kheng Wei, Trevor, Jason, and Andrew some props in the comments below. Until next time!

Clarifai Featured Hack: How Many Rocks Are There is an app that counts how many rocks there are, naturally

Do you ever stay awake at night, cursing your brain for keeping you up by replaying every awkward social blunder you’ve ever made or bugging you with pointless questions like “How many rocks are there in the world?” While we can’t save you from your social awkwardness, we do have an app that can help you count rocks. You’re welcome.

How Many Rocks Are There is a massively multiplayer mobile app designed to figure out how many rocks there are in the world and where they live. All you need to do is download the iOS app, snap a pic of a rock, and upload it to the app. The more people who play, the more fun the game. With riveting testimonials like “This is an app that counts rocks” and “I uploaded a picture of a rock,” how could you resist giving it a try?


Here are some of the many features of How Many Rocks Are There app:

+ Create a username
+ Take pictures of rocks
+ Name your rocks and record notes about your discoveries
+ Keep tabs on the latest tally of how many rocks there are
+ Let our scanning technology determine if your discoveries are, in fact, rocks
+ Locate rocks found by other users on a virtual map
+ Browse photos of the latest rocks discovered by the community
+ Check if your rock has already been discovered with our “nearby rocks” function
+ Purchase discovery rights to other users’ rocks
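That “nearby rocks” check is a classic geo query: compute the great-circle distance from your rock to every known rock and flag anything within a small radius. A sketch of how it might work, assuming rocks are stored as latitude/longitude pairs (the app’s actual backend is node + mongo, so this Python version is purely illustrative):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters,
    via the haversine formula."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def nearby_rocks(lat, lon, rocks, radius_m=50):
    """Return ids of already-discovered rocks within `radius_m` meters.
    `rocks` is a list of (rock_id, lat, lon) tuples."""
    return [rock_id for rock_id, rlat, rlon in rocks
            if haversine_m(lat, lon, rlat, rlon) <= radius_m]
```

If the list comes back non-empty, your rock has probably already been claimed – better luck with the next pebble.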


Actually, you might laugh, but you’d be surprised how hard it is to find a damn rock in New York City. For example, one might have an awesome succulent plant that needs to be surrounded by chic, grayish-white-striped, medium-sized pebbles in order to perfectly offset the other hipster decor in one’s home – how else would one find a rock in NYC that meets that criteria? #FirstWorldProblems

Really, though, we ultimately want our visual recognition technology to answer any question. Sometimes, the question is, “How do we prevent climate change?” or “How do we stop human trafficking?” or “How do we end weapons proliferation?” Other times, the question is, “How many rocks are there in the world?” Who’s to say which question is the most important? 


We asked Alec Cohen, freelance video director and comedy writer from Brooklyn who’s taking the world by storm, to give us some insight into what inspired such a thought-provoking app and how he and his team built it at Comedy Hack Day.

Clarifai: What inspired your idea for How Many Rocks Are There?

Alec: I was challenging myself to come up with the dumbest thing I could ask on Twitter. “How many rocks are there, anyway?” was the tweet, but there was something about it that stuck with me. I knew I was onto something but it wasn’t until a few months later at Comedy Hack Day that I realized how amazing my ideas are.

What did you use to build your app?

We used Swift for the iOS app, node + express + mongo for the backend, with other integrations with Clarifai and AWS S3.

What was the best part about working with Clarifai?

The Clarifai API was ridiculously easy to use. At Comedy Hack Day, we had maybe 36 hours from pitch to demo. While brainstorming features, we joked that adding actual, functional rock recognition would be incredible but basically impossible. Like, “it would take 5 years and a research team” level impossible. But after we found Clarifai, it actually only took 20 minutes and 10 lines of code. Matt, our iOS developer, got in touch with Keeyon at Clarifai who was super helpful. So cool.

Thanks for sharing, “rock” star!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Alec and the rest of the How Many Rocks Are There team – Matt, Mark, and Michael – some props in the comments below. Until next time!

#ClarifaiMyHoliday - Win a pair of Snapchat Spectacles by showing us how you see the holidays

The holiday season is already upon us and to celebrate, we wanted to give back to our community by giving you a chance to win Snapchat Spectacles – this holiday’s hottest gift! All you have to do is submit a photo of how you see the “holidays.”

Inclusion is a core mission of Clarifai, and it’s no secret that our CEO Matt Zeiler is on a mission to make AI accessible to everyone. But the challenge of diversity in the AI world is real, which is why we’re curious to see how you perceive the “holidays.” We invite you to help train our AI technology to define what “holiday” even means – just post an image of how you see the “holidays” mentioning @Clarifai and #ClarifaiMyHoliday on Twitter. By doing this, you can help teach our AI about the holiday spirit and also enter for a chance to win this holiday season’s hottest gift – a pair of Snapchat Spectacles!

Here’s how the contest works:

  1. To enter, simply take a photo that captures the holiday spirit – maybe it’s window displays, people ice-skating, or something as simple as your decorated living room.

  2. Post the photo to Twitter tagging @Clarifai and #ClarifaiMyHoliday. Follow @Clarifai to be eligible to win!

  3. Check for live updates on how well your photo represents the “holidays” based on our visual recognition AI. Each photo is given a “holiday prediction score,” and the highest score at the end of the contest wins the Snapchat Spectacles. You can submit multiple photos for more chances to win!

  4. Contest ends at 11:59 pm ET on 12/31/16, and the winner will be contacted after 1/2/17.
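Under the hood, picking a winner from step 3 is simple: each photo’s “holiday prediction score” is the model’s confidence in the holiday concept, and the entry with the highest score wins. A sketch with a stand-in predictor (the function names here are hypothetical, not our production code):

```python
def holiday_score(predictions, concept="holiday"):
    """predictions: list of (concept, confidence) pairs from a visual
    recognition model. Returns the confidence for the target concept,
    or 0.0 if the model didn't see it at all."""
    return next((conf for name, conf in predictions if name == concept), 0.0)

def pick_winner(entries, predict):
    """entries: {twitter_handle: image_url}. `predict` stands in for a
    Clarifai predict call returning (concept, confidence) pairs.
    Returns the handle with the highest holiday score plus all scores."""
    scored = {handle: holiday_score(predict(url))
              for handle, url in entries.items()}
    return max(scored, key=scored.get), scored
```

Submitting multiple photos just means multiple entries in the dict – each one gets its own shot at the top score.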

For full contest details and legal mumbo jumbo, click here. Good luck and have fun with it!


Are you ready to enter the contest? Go to Twitter and tweet us a pic of your holidays now!

10 AI hacks we’re thankful for this year

Thanksgiving is a time for reflection and appreciation. At Clarifai, we’re most thankful for our awesome community of developers who create mind-blowing apps on our visual recognition API. We’ve put together a roundup of apps that reflect the things we’re grateful for this Thanksgiving holiday – enjoy!

Thanksgiving is a holiday where you get together with your friends and family and give thanks for the many blessings in your lives. At Clarifai, we celebrate with our #Clarifam by eating a ton of food and reflecting on the many amazing things you, our developer community, have given us over the last year. We’re thankful for many things, from the 350+ developer hacks you’ve built and showcased on Devpost to the $30M Series B funding you helped us raise. So, we just wanted to share ten hacks we’re thankful for this Thanksgiving!

Food, obviously

Given that the American tradition of Thanksgiving was originally started to celebrate a bountiful harvest, we have to say we’re pretty thankful for the plentiful eats in our lives. We’ve been spoiled this year with pizza-fueled hackathons, professionally catered team meetings, and healthy yet delicious office snacks. If you’re as gluttonous as we are, you’ll also appreciate the many food-based Clarifai hacks that are perfect for this time of year – Foodifai, which can tell you how healthy your meal is based on a photo, and YUMMIfai, which recommends recipes for you to cook based on a photo of the contents of your fridge.

Family and friends

At Clarifai, we’re really lucky to be working with our friends, not just co-workers. This year, we’re especially thankful for the loved ones who support us and the customers and users whom we’ve gotten to know pretty well over the course of many emails, calls, meetups, and tweets. You guys are seriously the best. We’re glad we have hacks like Found, which helps people find lost loved ones, and FindALostPet, which helps people find lost pets, to keep tabs on you guys and never let you leave us!

A sense of humor

We love that you guys appreciate our quirky humor, from our cheeky Not Safe For Work nudity recognition model launch to our gif-tastic tweet wars. It’s really important to us to solve the world’s problems while also maintaining a sense of humor about what we do – it’s what separates us from the robots (*koff* rhymes with Moogle *koff*). We’re thankful you guys share our fun-loving worldview, with hacks like Pocket Hipster, which generates hipster poetry based on images, and PicAPun, which generates puns based on images.

Having a home

We’re thankful for our incredible home in the world’s greatest city (maybe we’re biased), New York City. This year, our company grew, and we moved into a swanky office. Next year, we’ll move into an even bigger, swankier new office! We’re appreciative of the Real Estate Genius hack, which tells us exactly how much our new office is worth on the market based on photos.

Good health

At Clarifai, we try to stay happy and healthy, both mentally and physically. We’re thankful for our flexible work policy that lets us have time to exercise, the shower in our office that helps us not stink up the room after our exercise, and the abundance of healthy snacks to make sure we don’t counteract the good effects of our exercise. We’re also glad to have mental health breaks and a great vacation policy! We love to see our technology promote good health for others who might not have the same advantages, which is why we partnered with i-Nside to build a medical app that helps doctors better diagnose patients in areas without robust medical services.

Bountiful swag

As anyone who’s encountered #Clarifools in the wild knows, we’re pretty fashionable creatures. From Clarifai-branded Herschel backpacks to Clarifai-branded flip flops to Clarifai-branded hoodies, we’re pretty much known in the NYC tech scene as “that AI company with the cool swag.” We’re thankful we’re able to rep our brand and share our swag with our community. That being said, we’re glad our hackers built Claridrobe, a personal stylist AI, and Intellivision, an augmented-reality shopping app, to help us fill the gaps in our wardrobe for the few occasions where a Clarifai t-shirt might not be appropriate.

Another Star Wars movie!

Historically, we’ve had to wait years (even decades!) for another Star Wars movie. Not this year! Right on the heels of Star Wars: The Force Awakens in 2015, we get Rogue One: A Star Wars Story in 2016 – if that’s not something to be thankful for, we don’t know what is. This year, we’re thankful for Lucasfilm and our incredible hacker who built the world’s first AI-powered, 3D-printed lightsaber!

Does Clarifai sound like a place you’d like to work? Well, you’re in luck – we’re hiring! Whether you’re an engineer, businessperson, or robot masquerading as a human, we’ve got a place for you in our #Clarifam.

Clarifai Featured Hack: Robor is the world’s first AI robot designed to help kids learn

Robor is the first smart AI learning buddy for kids. Using Clarifai, Robor creates opportunities for toddlers to learn by mimicking the natural learning interactions between children and adults and helping your child better understand the world around them.

Remember the first time you saw your child point at something and call it out by name? Teaching your child how to interpret the world is not only an emotional gift but one that continues to make an impact on their development for years to come. Robor is an educational toy that helps your kids significantly increase their vocabulary and understanding of their environment while tracking and reporting their progress back to you.

Robor is a small, super-cute doll with a camera, a mic, a speaker, and a pair of OLED eyes. It’s modeled after how kids naturally learn about the world by associating what they see with what they hear from their parents. For example, when kids hold up their cat toy to Robor, it recognizes the cat and says, “Here is a cat and it meows, meows, meows!” giving kids more exposure to object-word association – take a look!


Not only is Robor one of the cutest applications of Clarifai’s visual recognition technology we’ve ever seen, but it’s also one of the most novel and well-executed. An adorable robot toy that also helps children learn? Sign us up! We’ve already contributed to Robor’s Indiegogo campaign to fund their product launch and you should, too. Here’s the link!


We asked Sun Peng, a product and industrial designer from Shanghai, China who loves cats and designing cool gadgets, to explain her inspiration for Robor and how she was able to launch her prototype using the Clarifai API.

Clarifai: What inspired your idea for Robor?

Sun: I came across Fei-Fei Li’s TED talk “How we’re teaching computers to understand pictures” while I was working on my graduating project. I was deeply moved when she said “First, we teach them to see. Then, they help us to see better. For the first time, human eyes won’t be the only ones pondering and exploring our world.” The lightbulb went off: “I can build a kids’ companion to explore the world with them!”

What did you use to build your app?

The first prototype was pretty easy. I got help from a friend who knows some Android programming, and we simply linked Clarifai’s API to a simple text-to-speech engine. We did it in a few hours! This quick validation gave us the confidence to continue to design and develop Robor.
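The prototype loop Sun describes – recognize an object, then speak a sentence about it – can be sketched in a few lines. This is our own illustrative reconstruction, not Robor’s actual code: it assumes the top recognized concept name is already in hand (e.g. from a Clarifai predict call), and the animal-sound table and phrasing are made up for the example.

```python
# Hypothetical sketch of the recognize-then-speak loop.
# Input: the top concept name from a visual-recognition API.

ANIMAL_SOUNDS = {"cat": "meows", "dog": "barks", "cow": "moos"}

def build_phrase(concept):
    """Turn a recognized concept into the sentence the toy speaks."""
    sound = ANIMAL_SOUNDS.get(concept.lower())
    if sound:
        return f"Here is a {concept} and it {sound}, {sound}, {sound}!"
    return f"Here is a {concept}!"

phrase = build_phrase("cat")
print(phrase)  # prints: Here is a cat and it meows, meows, meows!

# A text-to-speech engine would then voice the phrase, e.g.:
# tts.speak(phrase)  # hypothetical TTS call
```

Keeping the phrase-building separate from the recognition and speech calls is what made a few-hour prototype feasible: each piece can be swapped or tested on its own.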

What was the best part about working with Clarifai?

The Clarifai API made it so easy for us to validate the idea in hours! It was fun, and we were over the moon when our app recognized the cat and said, “CAT!”

Thanks for sharing, Sun!

To learn more, check out our documentation and sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.

And give Sun some props in the comments below. Don’t forget to help fund the Robor Indiegogo campaign! Until next time!