Build any app using our NSFW model! The ‘Not Safe For Work’ model analyzes images and videos and returns probability scores on the likelihood that the image or video contains pornography. Don’t be afraid to be risqué and offend our eyes; we can take it.
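As a rough sketch of what a submission's first API call might look like, the snippet below builds a predict request against the Clarifai v2 REST endpoint and pulls the ‘nsfw’ probability out of the response. The API key and model ID are placeholders (get real values from your Clarifai developer dashboard), and the response-parsing assumes the standard v2 shape where each output lists concepts with a `name` and a `value`.

```python
import json
import urllib.request

# Placeholders -- substitute your own key and the NSFW model ID
# from your Clarifai developer dashboard.
API_KEY = "YOUR_CLARIFAI_API_KEY"
NSFW_MODEL_ID = "YOUR_NSFW_MODEL_ID"


def build_request(image_url):
    """Build a Clarifai v2 predict request for a single image URL."""
    payload = json.dumps(
        {"inputs": [{"data": {"image": {"url": image_url}}}]}
    ).encode("utf-8")
    return urllib.request.Request(
        "https://api.clarifai.com/v2/models/%s/outputs" % NSFW_MODEL_ID,
        data=payload,
        headers={
            "Authorization": "Key " + API_KEY,
            "Content-Type": "application/json",
        },
    )


def nsfw_score(response_json):
    """Extract the 'nsfw' probability from a predict response."""
    concepts = response_json["outputs"][0]["data"]["concepts"]
    return next(c["value"] for c in concepts if c["name"] == "nsfw")


# To actually call the API (requires a valid key and model ID):
# with urllib.request.urlopen(build_request("https://example.com/photo.jpg")) as r:
#     print(nsfw_score(json.load(r)))
```

From there, a hack just needs to act on the score: gate an upload, blur a thumbnail, or flag a video frame once the ‘nsfw’ value crosses whatever threshold suits your app.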
Deadline: September 21, 2017
Requirements: Must use the Clarifai API, create a live demo (app or video), and fit the bounty theme.
Reward: One winner will be featured on our website and social media (U.S. and international) and will also receive a sweet Clarifai skateboard (U.S. only, sorry!).
Rules: Legal mumbo jumbo.
Submit: Send us your hack!