Besedo Knowledge Partners

All change: A quick look at content moderation’s big trends

So, as another year draws to a close and a new one dawns, it's that time again when we reflect on what has happened – and take a moment to think about what the future holds. Living in these disrupted, dramatic times, though, it can be hard to see the wood for the trees: with so much going on, and so much uncertainty about what it all means, taking stock might feel like a tougher job than usual this year.

In the case of content moderation, for example, many of the most important recent changes have actually been impacts felt by the technology industry as a whole. Even as much of the world emerges from the other side of the pandemic, we're seeing little pull-back from the spike in technology usage that it triggered. Commerce is more online than ever, work is more remote than ever, lives are more tied to platforms than ever, and social expectations about how we use technology have been set on a new path.

Looking back

All of which makes it easy to miss the fact that, even taken in isolation, content moderation has had a really interesting year. Even while they have been dealing – along with everyone else – with changing user habits, professionals in the space have had to keep one eye on how upcoming legislation will soon reshape how businesses handle user-generated content.

Perhaps most prominent has been the EU’s Digital Services Act which, when it is ratified, will require much more extensive reporting, complaints mechanisms, and (for the largest platforms) auditing. It’s not alone: Australia’s Online Safety Act 2021, passed this summer, created a governmental eSafety Commissioner position with oversight of online content, while the UK is currently working on an Online Safety Bill to regulate content.

It’s likely that businesses will still have a long way to go in order to prepare for these regulations: research we ran towards the start of the year found that few were prepared for – and many were unaware of – the Digital Services Act.

Growing government attention to how content moderation operates, however, might not even be the biggest thing facing the industry right now.

Looking forward

That’s because businesses are looking at an even more immediate impact in terms of how users actually use their platforms. We can tell the story in a few key statistics: nearly 50% of internet users look for videos related to a product or service before visiting a store; 72% of customers would rather learn about a product or service by way of video; social video generates 1200% more shares than text and image content combined.

In other words, before thinking about changes in how content is moderated, we need to deal with the fact that what is being moderated is changing rapidly. Whether the task is automated or taken on by a human, video is significantly more difficult and time-consuming to moderate than plain text. Even outside of video, as we’ve recently discussed, AI-led content moderation needs to improve its capacity to keep up with the speed at which human communication evolves online.

If we’re going to make a prediction for the next year of content moderation trends, then, we should start where businesses should always start: by thinking about the user or customer, what they need and want, and how we can step up to meet those desires.

From that perspective, here at Besedo we think that the story of 2022 for content moderation is going to be one of rising as a strategic priority across many different kinds of business. User habits and expectations are clearly changing, but the kinds of user-generated content available (and, more importantly, the quality of that content) are still very unevenly spread across different businesses and platforms. Where one clothes shop, for example, might be enabling users to upload videos, another might only just have introduced text reviews. That makes content, when done well, a powerful competitive differentiator, in a way which will come to the fore as our new assumptions about how we use the internet solidify.

Historically, content moderation has often been seen as a defensive measure, protecting businesses against negative outcomes, and new legislation may well sharpen what that looks like. The real opportunity coming up, however, is to see how it can be an asset to the customer experience, ensuring that this is not just content they have the option of seeing, but content they really want to see.

Luke Smith

Luke is the Editor for Global Dating Insights. Originally from London, he achieved a BA in Journalism from De Montfort University, Leicester. An experienced content writer, he enjoys a variety of sports, with a keen passion for his football team, Fulham FC.

Global Dating Insights is part of the Industry Insights Group. Registered in the UK. Company No: 14395769