
Technology Companies Need To Be Responsible Concerning Young Hearts And Minds

Updated: May 17

I shared briefly in my last post what Content Moderation is. This post delves a little deeper into those involved in the work of moderating content.


Mega Alert!

What I find shocking, to say the least, is the terrible decision of technology companies to hire freshers: young, impressionable people who lack the wisdom or experience to fathom the horrid content they are exposed to every day.


Moderators spend 9-10 hours a day watching content that is not at all appropriate for normal viewing. Pornography, violence, terrorism, suicide, mutilation, and other harmful graphic content form part of their daily decision-making as they sift through 250-300 videos a day.


“Some mods view as many as 6000 images throughout a workday. The average time to make a judgment call: 4.5 seconds.” - Livemint


More than 100,000 content moderators have been hired across the world. BPOs are sprouting up in Indian cities like Hyderabad, Bangalore, Mumbai, and Gurugram.


Is this one of the worst jobs for young hearts and minds?


What has been the result so far? Growing mental health issues among employees: anxiety, depression, and suicidal thinking. While a few claim to be doing fine, only time will reveal how harmful the work has been for them.


An article by The Verge exposed the working conditions of content moderators in America. The details are shocking: employees suffer in a terrible work environment.

Employees who have paid the price at work continue to suffer even after quitting their jobs. Organizations get away with it by having employees sign a document acknowledging the dangers of the job. PTSD, caused by experiencing or witnessing a terrifying incident, is a common and growing problem among moderators today. Its symptoms include acute anxiety, flashbacks, and intrusive thoughts.


Read about the ordeal Ramya went through: bad managers, a poor salary, and hardly any access to counseling. The everyday drudgery of working in such a volatile environment is already too much for young people in emotional pain, and job losses owing to the pandemic have worsened the situation for many. The pandemic gave companies an opportunity to implement AI-based tools, only to see them mostly fail.


AI promises a bright future for content moderation as a profession, but what about the human condition?


The role has a promising future as organizations go digital, which means more young people will likely be recruited, hopefully with support from AI. If organizations do not rethink their strategy now, mental health issues will only grow more severe.

Technology companies claim that as Artificial Intelligence (AI) improves, content moderation roles will shift to highly skilled professionals. It still concerns me that a matter of the heart cannot be solved by developing the intellect alone. We need to learn from at least the last 20 years.


YouTube's CEO is considering reducing moderators' daily viewing time to four hours. Will it help? We'll need to wait and see.


As humans, we need a social (offline) approach, which will require a change of culture internally, not the usual cattle mentality of hiring youngsters and working them nonstop. This is a serious job, a war against young hearts and minds that cannot be taken lightly. It is spiritual warfare, and only one side is aware.



[Image: A person looking out of a glass window]
