Hail to the Social Media Gatekeepers | Teleperformance

Today, almost 3.5 billion people are active social media users, spending more than two hours a day generating massive volumes of digital content. Our current love affair with social media is accelerating at a never-before-seen pace. According to Domo's Data Never Sleeps report, these are the numbers generated every minute of the day:

  • 390,030 apps downloaded
  • 4,500,000 YouTube videos viewed
  • 511,200 tweets
  • 277,777 Instagram stories

A lot of these videos, photos, and tweets are fluid, polymorphous, erratic content, and it is very difficult to separate the wheat from the chaff.

Social media is an integral part of human life today: it keeps us connected, enlarges our communities, makes us more aware, and is a formidable source of information and education. But like any powerful invention in the digital era, social media is also used to spread fake news, defame individuals, diffuse hate speech, promote terrorism, and brainwash audiences into religious fanaticism and hate crimes.

The user-generated data being shared is both constructive and destructive in nature, and this is why social media needs a reliable and efficient gatekeeping system to protect its users.

But how can we control 2.5 quintillion bytes of data shared by 3.5 billion users across the world?

By merging cutting-edge technology with the human touch, we can isolate online risk through content moderation.

When it comes to easily recognizable illegal, criminal, and vicious content, Artificial Intelligence (AI) is the perfect choice. But as I always say, no matter how advanced the technology is, it lacks humanness. AI does not understand human EQ, culture, and context. So, after the initial triage by AI-led content moderation programs, social media gatekeepers have to review the remaining content and decide whether it is accurate, legal, and acceptable.
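The triage-then-escalate workflow described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual system: the `violation_score`, the threshold values, and the `triage` routing function are all hypothetical, standing in for whatever classifier and policy a real moderation pipeline would use.

```python
from dataclasses import dataclass

# Assumed thresholds for illustration only; real platforms tune these per policy.
AUTO_REMOVE_THRESHOLD = 0.95  # near-certain violations are removed automatically
AUTO_ALLOW_THRESHOLD = 0.10   # clearly benign content passes through untouched

@dataclass
class Post:
    post_id: str
    violation_score: float  # hypothetical AI-model probability of a policy violation

def triage(post: Post) -> str:
    """Route a post: auto-remove, auto-allow, or escalate to a human moderator."""
    if post.violation_score >= AUTO_REMOVE_THRESHOLD:
        return "remove"
    if post.violation_score <= AUTO_ALLOW_THRESHOLD:
        return "allow"
    # The ambiguous middle band is where human EQ, culture, and context matter.
    return "human_review"

# Example: three incoming posts with assumed classifier scores
queue = [Post("a", 0.99), Post("b", 0.05), Post("c", 0.60)]
routes = {p.post_id: triage(p) for p in queue}
# routes == {"a": "remove", "b": "allow", "c": "human_review"}
```

The key design point is the middle band: only content the model cannot confidently classify reaches the human gatekeepers, which keeps their queue small while reserving judgment calls for people.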

Taming the social media trolling beast

The role of content moderators—the Social Media Gatekeepers—is a difficult one; they have to review large quantities of data and are exposed to disturbing, shocking, and violent content. They are like field cops and emergency services professionals: deeply needed, but few people are ready and able to do their jobs. One needs a strong sense of ethics and commitment to serving the community, as well as the resilience to handle the stress that comes with the mission.

Today, social media companies and content moderation firms are extremely conscious of the importance of protecting the users and have policies in place to better guide their Social Media Gatekeepers, such as:

  • Selecting content moderators based on stable and resilient psychological profiles
  • Reducing effective working hours compared with standard labor practices
  • Providing permanent psychological support
  • Offering a better work environment
  • Giving higher compensation than traditional customer service jobs

The job of Social Media Gatekeepers is definitely not a sinecure; it is an essential function in the evolving social media ecosystem. I am sure that many cops, emergency rescue personnel, veterans, and security officers would appreciate getting the same working conditions and care provided to their counterparts in the virtual world.

And to all the Social Media Gatekeepers teams around the world, A GREAT THANK YOU FOR YOUR SERVICE!
