
Manual vs AI Moderation: How to Decide Which Option is Best for Your Business

By Natalie Fletcher

Few technologies have impacted the world more than artificial intelligence (AI). Computer vision (CV), for example, has significantly changed and improved the way businesses across varying industries moderate their visual content, and developments like Clarifai’s end-to-end moderation solution now make doing so easy and affordable.

That said, anyone looking to use the technology to address their moderation needs must first determine whether replacing or augmenting a manual moderation team with an AI platform is truly worth the investment. After all, while AI is a powerful technology that should never be discounted, there are still things only humans can do, and tasks humans handle better than AI. Of course, there are also things AI is better equipped to handle, as we’ll discuss below.

Here’s a short guide on determining whether AI or manual moderation is better for your business.

1) What is the source of your visual content?

Nowadays, more and more companies are allowing for and even relying on user-generated content (UGC). However, getting content from third parties is risky: despite terms and conditions, these businesses have no control over what users actually post. Given the sheer volume many of them receive daily, they rely on AI to filter this content for them.

If, however, your visual content is wholly or mostly sourced internally, from third parties that you can trust (e.g. stock image companies), or is just much more modest in size, a CV AI platform is likely too much for you, at least for moderation. Companies that may not have a lot of UGC but otherwise have a deluge of visual content to comb through may still benefit from using CV AI to organize and tag their images and videos. Otherwise, having human moderation teams that review the content and determine whether it is appropriate for your purposes is a reliable enough means of moderating content.

 

2) How much visual content do you need to moderate?

One of CV AI’s biggest values is that it enables computers, which can already store far more data than we can, to recognize and learn from all that data. Unlike human beings, computers can work for hours and hours without a break. For those sites and businesses that receive thousands of pieces of visual UGC, manual moderation is, therefore, inefficient and ineffective. Prior to incorporating CV AI into their platform, Photobucket, for instance, was only able to moderate 1% of these uploads.


Now, with Clarifai’s CV AI, all 2 million of the UGC uploads they receive every day pass through our NSFW filter before they are posted to the site.

Still, while CV AI is relatively affordable, unless you receive at least 250,000 new images and/or 8,300 (30-second) new videos per month or have over 5,000,000 images and/or 165,000 (30-second) videos backlogged, it may not be wholly suitable for your needs. Though the platform would be able to quickly moderate all of this content, the return on investment would simply be insufficient to justify the cost.
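The volume thresholds above can be expressed as a simple rule of thumb. The threshold values come from this article; the function itself is purely illustrative, not part of any Clarifai product:

```python
# Illustrative sketch: does content volume meet the rough thresholds above?
MONTHLY_IMAGE_THRESHOLD = 250_000      # new images per month
MONTHLY_VIDEO_THRESHOLD = 8_300        # new 30-second videos per month
BACKLOG_IMAGE_THRESHOLD = 5_000_000    # backlogged images
BACKLOG_VIDEO_THRESHOLD = 165_000      # backlogged 30-second videos

def ai_moderation_likely_worthwhile(new_images, new_videos,
                                    backlog_images=0, backlog_videos=0):
    """Return True if volume meets any of the rough thresholds above."""
    return (new_images >= MONTHLY_IMAGE_THRESHOLD
            or new_videos >= MONTHLY_VIDEO_THRESHOLD
            or backlog_images >= BACKLOG_IMAGE_THRESHOLD
            or backlog_videos >= BACKLOG_VIDEO_THRESHOLD)

# Photobucket-scale volume (~2 million uploads a day) easily clears the bar.
print(ai_moderation_likely_worthwhile(new_images=2_000_000 * 30, new_videos=0))  # True
```

A smaller operation receiving, say, a thousand images a month would fall well below every threshold, which is exactly the case where the return on investment becomes hard to justify.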

 

3) What are you moderating for?

While certain moderation tasks are well-suited for CV AI, others, like moderating for counterfeit items or misinformation on social media, are far more difficult. As a general rule of thumb, any moderation task that is currently challenging for humans will be as difficult, if not more so, for AI.

This is because the AI we have today is “narrow AI.” While it is intelligent enough to perform particular tasks (like moderating NSFW images) very effectively and efficiently, it cannot match, let alone surpass, human intelligence. As I said above, its value “over” humans is not based on the quality of the moderation itself, especially since that quality depends on how well humans train it. Rather, it is based on the volume of content it can handle.

And this is very important, particularly where the content is disturbing. Just as humans have physical thresholds, so too do we have mental ones. Having to look at NSFW or offensive content every single day can be very damaging, so if the content you are trying to moderate for is of this nature, AI may be worth thinking about to protect your employees or users.

 

4) How are you currently moderating content?

During our recent moderation webinar, our Director of Data Operations, Liz O’Sullivan, outlined several ways businesses moderate content. Which method is most appropriate for you depends on points 1 to 3 above.

  • If you are dealing with a high quantity of varied UGC and want to filter along more common lines like offensive content, computer moderation may be best for you.
  • If you are dealing with homogeneous content that is mostly generated internally or by a small group of trusted third parties, having humans moderate according to set rules may be adequate.
  • If you need to filter along more nuanced guidelines, like determining whether an item is counterfeit, human moderation may be best for you.
  • If you have a large, dedicated community of users, community moderation may work best for your needs.
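The four scenarios above can be sketched as a rough decision helper. The function and its inputs are illustrative simplifications, not a real product feature:

```python
# Illustrative only: a rough encoding of the four moderation scenarios above.
def suggest_moderation_method(high_volume_ugc: bool,
                              nuanced_guidelines: bool,
                              dedicated_community: bool) -> str:
    if nuanced_guidelines:
        return "human moderation"           # e.g. counterfeit detection
    if high_volume_ugc:
        return "computer moderation"        # common filters at scale
    if dedicated_community:
        return "community moderation"
    return "rule-based human moderation"    # trusted or internal content

# A marketplace flooded with varied UGC and generic filtering needs:
print(suggest_moderation_method(True, False, False))  # computer moderation
```

In practice the branches overlap, which is why many companies end up combining techniques rather than picking exactly one.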

While many companies would benefit most from using a combination of moderation techniques, human moderation tends to be more expensive. Though it isn’t foolproof, if you have a small budget, a large amount of content, and generic moderation needs, you could rely solely on computer vision platforms like our end-to-end moderation solution.

 

5) What does moderation mean for you?

Finally, it is important to determine what exactly moderation is helping you to avoid. That is, if your moderation technique misses an image, video or post, what are the risks?

One of our customers, Momio, a social platform for preteens, risked bringing harm to their young users. They needed to protect these users, not only by preventing them from having to view inappropriate content but also by keeping them from posting content that could come back to haunt them as they grow up. With computer vision, they are now able to moderate the millions of uploads they receive every day and keep their users safe.

For other companies, however, the risk is running afoul of the law. With the introduction of the FOSTA-SESTA bills, for instance, site owners are now liable when their sites are used to promote sex trafficking. Previously, these sites were not deemed responsible for what users posted, allowing for more lax moderation. Other platforms, like Facebook and Twitter, have been taken to task for the impact misinformation posted on their sites may have had on several major socio-political events around the world.

With the mixed media content of these platforms, they will likely also have to consider natural language processing-based AI moderation techniques like keyword moderation (where the platform filters out certain terms or hashtags) or sentiment analysis (where AI analyzes large blocks of text to determine whether a post is appropriate). While Facebook has tried to combine AI techniques with a moderation team that consists of thousands of moderators, many businesses do not have this option. As such, some have chosen to shut down the parts of their site that allow for such posts entirely to avoid these risks.
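At its simplest, keyword moderation can be sketched as checking posts against a blocked-term list. The terms and helper below are hypothetical stand-ins; production systems layer normalization, stemming, and sentiment analysis on top of this:

```python
# Minimal sketch of keyword moderation: reject posts containing blocked
# terms or hashtags. The term list is a placeholder for illustration.
BLOCKED_TERMS = {"#spamhashtag", "blockedword"}

def passes_keyword_filter(post_text: str) -> bool:
    """Return True if no token in the post matches a blocked term."""
    tokens = {token.lower().strip(".,!?") for token in post_text.split()}
    return tokens.isdisjoint(BLOCKED_TERMS)

print(passes_keyword_filter("a perfectly ordinary post"))   # True
print(passes_keyword_filter("spam spam #spamhashtag"))      # False
```

Sentiment analysis would replace the set lookup with a trained classifier scoring the whole block of text, but the gating logic around it looks much the same.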

For companies interested in moderation because of the effect inappropriate content could have on their brand and sales, life is a lot easier. Customers rely on UGC when making purchasing and selling decisions, so online travel agencies and marketplaces need to ensure this content is of the utmost quality and relevancy, especially as this has been shown to impact conversion rates. Irrelevant or low-quality content is likely to drive users away, making moderation critical to protecting your business. Thankfully, these businesses just need to determine whether they have the human resources needed to adequately moderate this content. If not, computer vision is available and ready to get to work.

If you are having trouble deciding which moderation technique is best for you, you don’t have to decide alone. Here at Clarifai, we work with our potential clients to guide them as best we can in making the decision that is right for their business. Whether you download our ebooks or contact us through support or sales, don’t hesitate to talk to an expert.
