
NEWS

Infinit Care Spotlight Series: A closer look at worker profiles that need workplace mental health support

May 26, 2022 8:00 p.m.

Being on the internet has become a regular part of our everyday lives. According to the latest statistics, 3.96 billion people use social media globally, with each person spending an average of 147 minutes, or two hours and seven minutes, on digital platforms every day.

These are significant figures when you consider the level of exposure we get from the digital platforms we access.

Over the last few years, the internet has played a pivotal role in society–building businesses and new industries, creating new needs, and of course, shaping the mindset of the public as a whole. Without a doubt, the internet has become so powerful that it can shape entire generations and the way they think and act.

But have you ever wondered how information is sifted and checked in the online worlds we love to immerse ourselves in?

Websites and applications, big and small, have community guidelines that protect their users from being exposed to harmful information, but who exactly are the people working behind the scenes and doing the heavy lifting of screening this information?

In this article, we will talk about the sentinels of the internet and the plight that comes with their profession.

Meet the Content Moderators.

Content Moderation in a Nutshell

Content moderation, at its simplest, is the process of screening and monitoring user-generated content posted on online platforms.

Whenever a user submits or uploads something to a website, moderators go through the content to make sure that the material follows the community guidelines and is not illegal in nature.

Examples of banned content that moderators screen for include material containing sexual themes, drugs, harassment, and bigotry such as homophobia and racism.

While content moderation is applied to the majority of online platforms, it is practiced even more heavily on websites that lean toward user-generated uploads.

This includes social media platforms, online marketplaces, communities and forums, the sharing economy, and even dating sites.

There are two different types of content moderation that websites use: AI-automated and human moderation.

In the first type, a machine learning system is designed to moderate posts based on previous data gathered from the internet.

AI moderation is significantly faster–sometimes taking only seconds to review a post–but it is not always accurate, because the machine learning behind it may not always pick up the right cues.

Human moderation, on the other hand, is a manual type of process that involves an actual person who reviews the posts.

Under this category, the screener follows specific platform rules and guidelines to check the user-generated content submitted to the website. While this type of moderation is more reliable than its counterpart, it also takes more time due to its manual nature.
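For illustration, the division of labor between the two approaches can be sketched in a few lines of code. This is purely a conceptual sketch; every term list, rule, and function name below is hypothetical and not drawn from any real platform:

```python
# Illustrative sketch only: a hybrid moderation pipeline where a fast
# automated pre-screen handles clear-cut posts and routes ambiguous ones
# to a human reviewer. All terms, rules, and names here are hypothetical.

BANNED_TERMS = {"banned_term_a", "banned_term_b"}  # placeholder blocklist

queue_for_humans = []  # posts awaiting manual review under platform guidelines

def automated_screen(post: str) -> str:
    """Return 'reject', 'approve', or 'needs_human_review'."""
    words = set(post.lower().split())
    if words & BANNED_TERMS:
        return "reject"              # clear violation: removed in seconds
    if "reported" in words:          # e.g. user-flagged posts are ambiguous
        return "needs_human_review"
    return "approve"

def moderate(post: str) -> str:
    decision = automated_screen(post)
    if decision == "needs_human_review":
        queue_for_humans.append(post)  # a human applies the full guidelines
    return decision
```

In practice, the automated layer would be a trained classifier rather than a word list, but the split is the same: the machine handles the clear cases in seconds, while the ambiguous ones go to a human moderator for careful review.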

Moreover, it also presents a serious problem within its workforce that, unfortunately, is often not well addressed: mental distress.

The Dark Side of Content Moderation

While content moderation remains a discreet profession, at least in the Philippines, more and more people who have worked in the field have come forward in recent years to speak about the challenges and dangers prevalent in the industry.

A riveting 2018 internationally produced documentary titled ‘The Cleaners’ gave an exhaustive look at the plight of moderators in the country who worked for online giants like Facebook, Twitter, and Google, and tackled the mental health struggles stemming from their jobs.

Facebook itself has acknowledged the difficulties that come with the profession, while Microsoft has faced lawsuits from former employees who claim that they were not given proper support despite the psychological dangers of their jobs.

Moderators sift through hundreds of submissions containing triggering content–including depictions of death, torture, mutilation, and violence–for hours at a time, sometimes with only limited breaks.

The nature of the work can lead to the development of mental distress and psychological issues such as post-traumatic stress disorder (PTSD), anxiety, and even depression.

This is also supported by data from studies in journalism, law enforcement, and child protection, which find that repeated exposure to trauma can lead to psychological distress.

On top of that, workers in these fields have also been reported to suffer more from burnout, relationship problems, and even suicide.

The following are other mental health problems that can arise from exposure to toxic content:

  • Panic attacks – some moderators have reported experiencing panic attacks around animals and children–fearing something will happen to them–after repeated exposure to violent videos.
  • Normalization/Desensitization to disturbing humor and language – repeated exposure to disturbing content can change the mindsets and perspectives of its audience, leading to inappropriate humor and language.
  • Self-destructive habits – alcoholism, drug use, and indiscriminate sexual behavior have reportedly also surfaced in moderators’ workplaces, presumably as a form of emotional escape from the job.
  • Skewed beliefs – in some cases, content moderators can also develop fringe views (e.g., believing conspiracy theories) that are not supported by hard facts because of constant exposure to such materials.

The Cost of Internet Safety

Without a doubt, content moderators serve as the first layer of protection of the general public from disturbing and harmful materials.

Unfortunately, they are not always properly protected from the rigors that come with their profession.

Unlike other workplaces (for example, those in the health sector, law and policing, and journalism) that have more solid guidelines for taking care of the mental health needs of their workforce, there is an obvious lack of a comparable system for those working in the content moderation industry.

An article by Harvard even notes that companies are very restrictive about letting others investigate their existing procedures and their treatment of these workers.

Not only are there no third parties monitoring the welfare of employees, but people working in the industry are also commonly asked to refrain from talking about their work through non-disclosure contracts.

Fortunately, some companies have also taken the initiative to develop workplace guidelines that can improve the treatment of those in the industry.

Facebook, for example, helped create the Technology Coalition which then designed the Employee Resilience Guidebook, a guide that outlines rules protecting the occupational health and safety of workers reviewing distressing content.

While the guidebook was made for those focused on employees dealing with child pornography, it also has terms that can be applied to other professions that expose workers to distressing imagery and content.

Specifically, the guide includes rules such as mandatory individual and group counseling sessions with a certified trauma specialist, limiting exposure to disturbing content to four hours, giving employees the choice to opt out of viewing specific disturbing content, encouraging them to switch to other projects as a form of relief, and giving them enough time to take breaks and recover from their work.

Protecting the Protectors

While overarching guidelines are already being developed on a global scale, it is undeniable that a huge chunk of the responsibility should fall on the shoulders of employers, who are in a better position to observe and improve best practices in this area.

Here at Infinit Care, for example, we follow a tried and tested framework, the Mental Health Continuum, to make sure that every employee working in a high-risk profession gets the mental health support that they need, wherever they are on the scale – whether they are excelling, surviving, or in crisis.

Our Head of Clinical Care, Shyne Mangulabnan, suggests several ways employers can put this into practice.

“Having a counseling professional who can help these employees is essential as well as having a solid support and assessment system for them. For example, surveys given to agents which can be used as a reference for the design of a wellness strategy is a good place to start. Constant monitoring of employees should also be done to make sure that their needs are met.”

On top of that, Mangulabnan also suggests creating proper escalation procedures for concerns relating to the mental health challenges of content moderators.

Proper education of important stakeholders within the company (human resource team, upper management) about mental health risks of the job is also necessary since they are the decision-makers who create systems that take care of employees.

“It would be best to have an end-to-end solution: an onboarding process that gives candidates the training and education they need to understand the risks and concepts of well-being,  round-the-clock onsite and virtual counseling services, community support groups, yoga and meditation activities, and workshops are just some of the many things that employers can initiate to make sure that they give the support that their workforce needs.”

True enough, it is the responsibility of employers to make sure that they ‘protect the protectors’ of the internet.

However, it’s not only the content moderators who should be given this kind of support, especially with 43 percent of the global workforce expressing that the COVID-19 pandemic has increased the stress that they suffer from work.

This story is just the first chapter of a series that will shed light on the professions that most need mental health support in these trying times.

Do you need help on how you can start caring for your employees in this aspect? We’d be more than happy to guide you here at Infinit Care. We are a company that helps other companies provide comprehensive mental health care support to their employees through the use of science-backed methodologies. You can reach out to us here to know more about how we can help.

NEWS

COA files 4 fraud audit reports worth over ₱275 million for Bulacan flood control projects

9:19 p.m. February 13, 2026

THE Commission on Audit (COA) has filed four Fraud Audit Reports (FARs) before the Office of the Ombudsman involving more than ₱275 million worth of flood control projects in Bulacan, citing alleged ghost projects, unauthorized site relocations, payments for pre-existing structures, and serious documentation deficiencies.

The projects were implemented by the Department of Public Works and Highways (DPWH)–Bulacan 1st District Engineering Office and awarded to SYMS Construction Trading and Wawao Builders.

COA said the filing of the cases underscores its commitment to transparency and accountability to ensure that public funds intended for flood mitigation are properly used.

Based on physical inspections, geotagged photographs, and historical satellite imagery, state auditors reported recurring irregularities:

  • Ghost projects: No flood control or riverbank protection structures were found at approved project sites, despite reports that the projects were completed or substantially accomplished.
  • Unauthorized relocation of sites: In several instances, DPWH representatives allegedly led inspectors to locations different from those specified in approved plans and contracts, without revised plans or written authority.
  • Payments for pre-existing structures: Satellite imagery showed that some riverbank protection structures already existed prior to contract effectivity, raising the possibility that payments were made for works not newly constructed.
  • Documentation deficiencies: Required documents, including as-built plans, detailed cost breakdowns, Statements of Work Accomplished, and approved master plans, were either incomplete or missing, undermining the credibility of reported accomplishments and payments.

Audit Coverage

The fraud audit stemmed from a directive issued on Aug. 12, 2025 by COA Chairperson Gamaliel A. Cordoba ordering an immediate review of DPWH flood control projects in Bulacan covering July 1, 2022 to May 30, 2025, following public concerns over alleged ghost projects and corruption.

Disputed Projects

Hagonoy, Bulacan (SYMS Construction Trading)

The ₱67.55-million project involved the construction of a reinforced concrete flood control structure at Barangay Santa Monica (Purok 6 to Purok 7). COA reported that no such structure was found at the designated site despite the project being declared 100 percent complete as of June 11, 2024 and fully paid by June 19, 2024. Auditors also noted indications of unauthorized site changes and missing required documents.

Pandi, Bulacan (SYMS Construction Trading)

The ₱39.60-million riverbank protection project at Barangay Malibong Bata was allegedly built at a location different from that specified in approved engineering plans, without documented authority for relocation. Structures found at both the approved and identified sites could not be conclusively linked to the contract. Several key documents were also missing.

Baliuag, Bulacan (Wawao Builders)

The ₱72.37-million Phase IV riverbank protection project at Barangay San Roque was reportedly constructed at a site different from that indicated in the approved bid plans. The structure bore markings corresponding to another project. Geotagged progress photos used to support payments were taken before the issuance of the Notice to Proceed and pointed to a different barangay. COA also cited overlapping project locations with another flood control contract and incomplete documentation.

Plaridel, Bulacan (Wawao Builders)

The ₱96.50-million flood control structure along the Angat River in the Lumang Bayan section was found to have existing structures at the site at least 90 days before contract effectivity, based on satellite imagery and inspection. The structures bore markings of different contract IDs and differed in design from approved plans. Despite this, the project was reported 100 percent complete within 65 days from contract effectivity. Auditors again noted missing supporting documents.

Possible Violations

COA said those involved may face charges for violations of Republic Act No. 3019, or the Anti-Graft and Corrupt Practices Act, as well as malversation and falsification of documents under the Revised Penal Code. Possible violations of COA Circular No. 2009-001 were also cited.

The audit body said additional reports may be filed with the Ombudsman as investigations continue, in line with President Ferdinand Marcos Jr.’s call for transparency and accountability in government spending.


NEWS

ILO study says TNVS drivers earn way above minimum wage

8:49 p.m. February 11, 2026

Transport network vehicle services (TNVS) riders and drivers earn above the mandated minimum wage in the Philippines, according to a recent study commissioned by the International Labour Organization (ILO).

The “2025 Platform Work Survey: Philippines,” presented during the Department of Labor and Employment’s (DOLE) 2026 National Tripartite Conference, also noted that digital platforms are a major source of livelihood in the country because of the flexible working arrangements they offer.

According to the survey, which covered 12 of the country’s 17 regions, the average net earnings of a TNVS rider or driver reach P6,704 per week, net of costs, compared with the government-set minimum wage of approximately P498 to P695 per day, or about P4,865 weekly.

The survey was conducted from June to December 2025 and interviewed 400 respondents from nine platforms providing food delivery, logistics and parcel delivery, and ride-hailing services. It has a margin of error of 5 percent.

The ILO commissioned a comprehensive survey on platform work, including delivery and TNVS riders and drivers, to analyze the working conditions of workers in the platform economy, document labor practices, assess the impact of digital platforms on employment, and inform enterprise formalization and social protection strategies.

Based on the ILO study, nearly 90 percent of the riders and drivers indicated that they have access to social protection provided by the platform, including health insurance, insurance for workplace injury, and a pension plan or retirement benefit.

Among the top reasons TNVS riders cited for choosing this industry are flexibility, which allows them to select their schedules and attend to family and personal matters, and decent earnings, which they deemed better than other available jobs.

According to riders and drivers, there are platform initiatives to improve their working conditions, such as increasing earnings and incentives, enhancing training and safety, and improving operational support and communication channels.

The study also noted that ride-hailing app platforms are specifically focusing on facilitating mandatory government benefits—Social Security System (SSS), PhilHealth and Pag-IBIG—to their drivers.


NEWS

DigiPlus deepens investments in Customer Care across BingoPlus, ArenaPlus, and GameZone

6:12 p.m. February 10, 2026

DigiPlus Interactive Corp. (DigiPlus), the pioneer and leading digital entertainment provider behind BingoPlus, ArenaPlus, and GameZone, continues to strengthen its investments and capabilities in customer care, reinforcing its commitment to providing reliable, player-first support across its platforms.

The company reports that its 24/7 customer service operations are now backed by a 450-strong workforce, reflecting sustained investment in high-caliber talent, intensive training, and rigorous service standards. These investments underpin DigiPlus’ efforts to build a scalable customer support organization that champions service quality and upholds Responsible Gaming for players.

“As a leader in digital entertainment, we recognize our responsibility to build and sustain a customer-first service culture,” said Carlos Feliciano, Customer Service Director at DigiPlus. “By designing a scalable, future-ready framework and streamlining processes for simplicity and speed, we aim to make support effortless and intuitive—and elevate the overall customer experience for BingoPlus, ArenaPlus, and GameZone players.”

A more robust training framework to build a high-caliber, human-centered team

Great service starts with a strong training foundation. In 2025 alone, the DigiPlus customer service team collectively logged over 87,000 training hours. DigiPlus has since expanded its customer care training programs to ensure teams are equipped to thrive in fast-paced and complex business operations. Recognizing the need for more immersive learning beyond traditional classroom instruction, the company enhanced its training framework to better prepare customer service teams for real-world scenarios.

The updated approach blends foundational learning with guided, hands-on experience, allowing frontliners to apply skills early while receiving structured coaching from senior team members over an extended, progressive training period. This ensures that BingoPlus, ArenaPlus, and GameZone customer-facing teams are confident, capable, and ready to deliver consistent, high-quality service.

Alongside capability-building, DigiPlus emphasizes human-centered service. Customer care teams are trained to prioritize meaningful conversations over scripted responses, respect players’ time, and resolve concerns more effectively by viewing each interaction as part of a broader customer journey.

Readiness to provide Responsible Gaming support for players

Responsible Gaming remains a key pillar of DigiPlus’ customer care strategy. Customer service teams also undergo a dedicated Responsible Gaming training module that equips them to recognize potential indicators of gaming-related concerns among customers and respond with professionalism, empathy, and appropriate support.

As part of this approach, customer care teams are trained to guide players through available Responsible Gaming tools and safeguards on the platforms, such as options to manage gaming duration or schedule, set limits on deposits or spending, or request self-exclusion or temporary account deactivation. These Responsible Gaming tools are designed and pioneered by DigiPlus to help protect players and encourage more mindful and balanced gameplay.

Where customers require additional well-being support, customer frontliners may also direct players to further resources, including the EmbracePLUS mental health helplines (Smart: 0908-235-2351, Globe: 0956-392-1924; open daily from 12:00 PM to 8:00 PM), which provide Psychological First Aid, and other independent support organizations.

Scaling customer engagement efficiencies in 2026

Looking ahead to 2026, DigiPlus aims to further strengthen customer service operations by driving greater efficiency through innovation. The company plans to continue enhancing processes and responsibly leveraging technology to streamline workflows, improve response times, and enable smarter, more personalized customer support—laying the groundwork for a scalable and future-ready service experience.
