NEWS
Infinit Care Spotlight Series: A closer look at worker profiles that need workplace mental health support
May 26, 2022 8:00 p.m.
Being on the internet has become a regular part of our everyday life. According to the latest statistics, 3.96 billion people use social media globally, with each person spending an average of 147 minutes, or two hours and 27 minutes, on digital platforms every day.
These are significant figures if you look at them from the context of the level of exposure we get from the digital platforms we access.
Over the last few years, the internet has played a pivotal role in society: building businesses and new industries, creating new needs, and of course, shaping the mindset of the public. Without a doubt, the internet has become powerful enough to shape entire generations and the way they think and act.
But have you ever wondered how information is sifted and checked in the online worlds we love to immerse ourselves in?
Websites and applications, big and small, have community guidelines that protect their users from being exposed to harmful information, but who exactly are the people working behind the scenes and doing the heavy lifting of screening this information?
In this article, we will talk about the sentinels of the internet and the plight that comes with their profession.
Meet the Content Moderators.
Content Moderation in a Nutshell
Content moderation, at its simplest, is the process of screening and monitoring user-generated content posted on online platforms.
Whenever a user submits or uploads something to a website, moderators go through the content to make sure that the material follows the community guidelines and is not illegal in nature.
Some examples of banned content that content moderators screen are those that contain sexual themes, drugs, bigotry, homophobia, harassment, and racism.
While content moderation is applied to a majority of online platforms, it is practiced even more heavily on websites with a heavy lean toward user-generated uploads.
This includes social media platforms, online marketplaces, communities and forums, the sharing economy, and even dating sites.
There are two different types of content moderation that websites use: AI-automated and human moderation.
In the first type, a machine learning system is designed to moderate posts based on previous data gathered from the internet.
AI moderation is significantly faster, sometimes taking only seconds to review a post, but it is not always 100 percent accurate because it relies on machine learning, which may not always pick up the right cues.
Human moderation, on the other hand, is a manual type of process that involves an actual person who reviews the posts.
Under this category, the screener follows specific platform rules and guidelines to check the user-generated content submitted to the website. While this type of moderation is more reliable than its counterpart, it also takes more time due to its manual nature.
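The two-tier setup described above can be sketched as a toy pipeline: an automated pre-filter handles clear-cut cases in seconds, and anything ambiguous is escalated to a human reviewer. This is purely illustrative and not any platform's actual system; the term lists, function names, and decision rules below are invented for demonstration.

```python
# Hypothetical blocklists for demonstration only; real systems use
# trained classifiers and far richer platform-specific guidelines.
BANNED_TERMS = {"scam", "gore"}      # clear guideline violations
SUSPECT_TERMS = {"fight", "pills"}   # grey-area terms needing judgment

def auto_moderate(post: str) -> str:
    """First tier: fast automated screening of a submission."""
    words = set(post.lower().split())
    if words & BANNED_TERMS:
        return "reject"              # obvious violation, no human needed
    if words & SUSPECT_TERMS:
        return "needs_human_review"  # escalate to the second tier
    return "approve"

def moderate(post: str, human_review) -> str:
    """Full pipeline: automated pre-filter, then human judgment when unsure."""
    verdict = auto_moderate(post)
    if verdict == "needs_human_review":
        # Slower manual review, but it catches cues the machine misses.
        return human_review(post)
    return verdict
```

The design mirrors the trade-off in the article: the automated tier is cheap and instant but blunt, while the human tier absorbs every case the machine cannot confidently decide, which is exactly where the workload and the exposure to disturbing material concentrate.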
Moreover, it also presents a serious problem within its workforce that, unfortunately, is often poorly addressed: mental distress.
The Dark Side of Content Moderation
While content moderation remains a discreet profession, at least in the Philippines, more and more people who have worked in the field have come forward over recent years to speak about the challenges and dangers that are prevalent in the industry.
A riveting 2018 internationally produced documentary titled ‘The Cleaners’ gave an exhaustive look at the plight of moderators in the country who worked for online giants like Facebook, Twitter, and Google, and tackled the mental health struggles stemming from their work.
Facebook itself has acknowledged the difficulties that come with the profession while Microsoft has faced lawsuits from former employees who claim that they were not given proper support despite the psychological dangers of their job.
Moderators sift through hundreds of submissions that contain triggering content not limited to depictions of death, torture, mutilation, and violence for hours, sometimes with only limited time for breaks.
The nature of the work can lead to the development of mental distress and psychological issues such as post-traumatic stress disorder (PTSD), anxiety, and even depression.
This is something that is also supported by data from other studies in journalism, law enforcement, and child protection which claim that repeated trauma exposure can lead to psychological distress.
On top of that, workers in the said areas have also been reported to suffer more from burnout, relationship challenges, and even suicide.
The following are other mental health problems that can arise from exposure to toxic content:
- Panic attacks – some moderators have reported experiencing attacks when around animals and children, fearing something will happen to them, after repeated exposure to violent videos.
- Normalization/Desensitization to disturbing humor and language – repetitive exposure to disturbing content can change the mindsets and perspectives of its audience, leading to inappropriate humor and language.
- Self-destructive habits – alcoholism, drug use, and indiscriminate sexual behavior have also reportedly been observed among moderators, who presumably engage in them as a form of emotional escape from their job.
- Skewed beliefs – in some cases, content moderators can also develop fringe views (e.g. believing conspiracy theories) that are not supported by hard facts because of constant exposure to such material.
The Cost of Internet Safety
Without a doubt, content moderators serve as the first layer of protection of the general public from disturbing and harmful materials.
Unfortunately, they are not always properly protected from the rigors that come with their profession.
Unlike other workplaces (for example, those in the health sector, law enforcement, and journalism), which have more solid guidelines for taking care of their workforce's mental health needs, the content moderation industry has an obvious lack of the same system.
In an article published by Harvard, it is said that companies are very restrictive about letting others investigate their existing procedures and treatment of these workers.
Not only are there no third parties monitoring the welfare of employees, but people working in the industry are also commonly asked to refrain from talking about their work through non-disclosure contracts.
Fortunately, some companies have also taken the initiative to develop workplace guidelines that can improve the treatment of those in the industry.
Facebook, for example, helped create the Technology Coalition which then designed the Employee Resilience Guidebook, a guide that outlines rules protecting the occupational health and safety of workers reviewing distressing content.
While the guidelines were made primarily for employees dealing with child pornography, they also contain terms that can be applied to other professions that expose workers to distressing imagery and content.
Specifically, the guide includes rules such as the provision of mandatory individual and group counseling sessions with a certified trauma specialist, limiting exposure to disturbing content to four hours, giving employees the choice to opt out of viewing specific disturbing content, encouraging them to switch to other projects as a form of relief, and giving them enough time to take a break and recover from their work.
Protecting the Protectors
While overarching guidelines are already being developed on a global scale, it cannot be debated that a huge chunk of the responsibility should fall on the shoulders of the employers who are in a better position to observe and improve the best practices in this area.
Here at Infinit Care, for example, we follow a tried and tested framework, the Mental Health Continuum, to make sure that every employee working in high-risk professions gets the mental health support that they need, wherever they are on the scale – whether they are excelling, surviving, or in crisis. (Click here to know more about the Mental Health Continuum.)
Our Head of Clinical Care, Shyne Mangulabnan, suggests several ways employers can put this to work.
“Having a counseling professional who can help these employees is essential as well as having a solid support and assessment system for them. For example, surveys given to agents which can be used as a reference for the design of a wellness strategy is a good place to start. Constant monitoring of employees should also be done to make sure that their needs are met.”
On top of that, Mangulabnan also suggests creating proper escalation procedures for concerns relating to the mental health challenges of content moderators.
Proper education of important stakeholders within the company (human resource team, upper management) about mental health risks of the job is also necessary since they are the decision-makers who create systems that take care of employees.
“It would be best to have an end-to-end solution: an onboarding process that gives candidates the training and education they need to understand the risks and concepts of well-being, round-the-clock onsite and virtual counseling services, community support groups, yoga and meditation activities, and workshops are just some of the many things that employers can initiate to make sure that they give the support that their workforce needs.”
True enough, it is the responsibility of employers to make sure that they ‘protect the protectors’ of the internet.
However, it’s not only the content moderators who should be given this kind of support, especially with 43 percent of the global workforce saying that the COVID-19 pandemic has increased the stress they suffer from work.
This story is just the first chapter of a series that will shed light on the professions that need mental health support most in these trying times.
Do you need help on how you can start caring for your employees in this aspect? We’d be more than happy to guide you here at Infinit Care. We are a company that helps other companies provide comprehensive mental health care support to their employees through the use of science-backed methodologies. You can reach out to us here to know more about how we can help.
NEWS
BFAR cites success of annual fishing ban in increasing galunggong stocks
1:11 a.m. March 8, 2026
The annual closed fishing season for roundscad (galunggong) has been instrumental in ensuring sustainable yield and enhanced volume production, the Bureau of Fisheries and Aquatic Resources (BFAR) said, citing the success of the science-based approach in Palawan.
BFAR National Director Elizer Salilig said this fishing cycle, which has been enforced for over 10 years, allows the galunggong to thrive in Palawan waters, ensuring ecological balance and economic success for the local fishing industry.
Galunggong is a dining staple among Filipino families, known to be relatively more affordable than other options. In Palawan, it sells for between P150 and P200 per kilo.
“The success of the annual roundscad fishing cycle in Palawan shows what we can do together through science and discipline. It proves that science-based conservation is not a hindrance to the fishing industry, but its greatest ally,” said Salilig.
The National Stock Assessment Program has confirmed the positive impact of the annual fishing hiatus, said Salilig.
The annual fishing ban contributed to improved volume of roundscad production at 3,363.75 MT in the third quarter of 2024, a sequential increase of 55.1 percent. It was the top contributor to the total commercial fisheries production in MIMAROPA, accounting for 27.1 percent of the total fish catch.
Enforcement of the fishing cycle has also led to a decrease in fishing mortality and the amount of immature galunggong in the catch. The roundscad also showed an increase in average length from 16.8 cm in 2015 to 17.6 cm in 2024, suggesting “improved growth and a healthier, more mature population.”
The closed season for commercial fishing of roundscad in Northern Palawan is in effect every November 1 to January 31, with fishing allowed from February 1 to October 31. This fishing cycle, implemented via a 2015 administrative order, covers the West Philippine Sea and the Northern Sulu Sea.
This policy leaves the galunggong breeding ground largely undisturbed during the closed season, protecting the species during its peak spawning period.
Salilig thanked the Palawan fishing community for continuing to comply with the annual fishing cycle.
“By respecting the natural spawning cycles of the galunggong, we are not just protecting an ecosystem; we are securing the food supply and the livelihoods of thousands of Filipino fishers for years to come,” he said. #
NEWS
Converge hosts Project PIGLET in Baclayan, donates learning tablets for Mangyan children
8:26 p.m. March 7, 2026
Leading fiber broadband and technology provider Converge ICT Solutions Inc. has come back to Brgy. Baclayan in Oriental Mindoro in celebration of Safer Internet Month, bringing its online safety campaign to the Baclayan Mangyan School while strengthening the digital capabilities of its students.
Through its Corporate Governance and Data Privacy (CGDP) Group, Converge engaged pupils in Grades 5 and 6 in its Project PIGLET (Privacy Integration through Guided Learning of Emerging Technologies), a program that aims to enhance digital literacy and privacy awareness among school children.
Converge AVP and Head of Data Privacy and Information Security Compliance Team Eumir Paolo Espiritu highlighted that kids nowadays are more susceptible to cyber threats such as deception, fraud, identity theft, malware and computer viruses.
He noted that as early as possible, children should be educated on ways to protect themselves against these, which was why they launched Project PIGLET in 2024.
“Converge intends to visit different regions across the country to reach the unserved and underserved and increase the awareness of kids when it comes to data privacy and information security. At Converge, we leave no one behind, so this Safer Internet Month, we visited the indigenous children here in Mindoro to also promote the more responsible and positive use of digital technology,” he said.
Converge hosted a storytelling session, and distributed school supplies and snacks to the kids.
The company also strengthened the school’s digital capabilities by donating learning tablets and a smart TV to the Stairway Foundation ICT Learning Center, which has been supporting the students’ learning activities, with the help of Mindoro-based child-care organization Stairway Foundation.
“Because ours is a Mangyan school, we have little access to equipment, so the additional materials are a big help for the children’s learning. Initiatives like this are also important so that students become aware of what is happening around them and do not just use gadgets carelessly,” said Baclayan Mangyan School Head Teacher III Noemi Bonquin.
The Baclayan Mangyan School provides education to nearly 250 pupils from the different Mangyan tribes in Puerto Galera.
Converge first engaged with the Mangyan community in Brgy. Baclayan in 2023, in collaboration with Stairway Foundation. The team-up paved the way for Converge to power the said ICT learning center with free fiber internet and to donate learning tablets to support the students in honing their digital skills.
Moreover, the company has also worked with the nonprofit in its online safety advocacy in an effort to combat cyber threats, particularly online sexual abuse and exploitation of children (OSAEC).
NEWS
Maxim launches motorcycle taxi services in Iligan
7:11 p.m. March 6, 2026
Iligan City — Maxim Rides & Food Delivery is now authorized to operate motorcycle taxi services in Iligan, offering residents a convenient new way to get around the city.
With fares starting at Php 20, the service provides an affordable commuting option for daily travelers. The launch also creates new income opportunities for local driver-partners, enabling more residents to earn through a flexible platform.
Driver-partners operate under a commission scheme designed to be more favorable than many platforms in the market, helping them keep more of what they earn. Whether working full-time or part-time, driver-partners can earn based on their availability and goals. All motorcycle taxi driver-partners undergo proper onboarding and an orientation on motorcycle taxi guidelines to ensure safety and service quality for every ride.
“Our goal in Iligan is to make daily travel easier for everyone while helping local residents earn extra income through flexible timetables,” said Myrrh Ornopia, Head of Maxim Iligan. “We want our service to benefit both riders and the community.”
Maxim continues to strengthen its motorcycle taxi operations in key cities, including Metro Manila, Batangas, and Cagayan de Oro. Through close coordination with local government units and transport authorities, the company remains committed to developing safe, reliable, and well-regulated motorcycle transportation services nationwide.