NEWS
Infinit Care Spotlight Series: A closer look at worker profiles that need workplace mental health support

May 26, 2022 8:00 p.m.
Being on the internet has become a regular part of our everyday life. According to the latest statistics, 3.96 billion people use social media globally, with each person spending an average of 147 minutes or two hours and seven minutes on digital platforms every day.
These are significant figures when you consider the level of exposure we get from the digital platforms we access.
Over the last few years, the internet has played a pivotal role in society: building businesses and new industries, creating new needs, and, of course, shaping the mindset of the public. Without a doubt, the internet has become so powerful that it can shape how entire generations think and act.
But have you ever wondered how information is sifted and checked in the online worlds we love to immerse ourselves in?
Websites and applications, big and small, have community guidelines that protect their users from being exposed to harmful information, but who exactly are the people working behind the scenes and doing the heavy lifting of screening this information?
In this article, we will talk about the sentinels of the internet and the plight that comes with their profession.
Meet the Content Moderators.
Content Moderation in a Nutshell
Content moderation, at its simplest, is the process of screening and monitoring user-generated content posted on online platforms.
Whenever a user submits or uploads something to a website, moderators go through the content to make sure that the material follows the community guidelines and is not illegal in nature.
Examples of banned content that moderators screen for include material containing sexual themes, drugs, bigotry, homophobia, harassment, and racism.
While content moderation is applied to the majority of online platforms, it is practiced even more heavily on websites that rely on user-generated uploads.
This includes social media platforms, online marketplaces, communities and forums, the sharing economy, and even dating sites.
There are two different types of content moderation that websites use: AI-automated and human moderation.
In the first type, a machine learning system is designed to moderate posts based on previous data gathered from the internet.
AI moderation is significantly faster, sometimes taking only seconds to review a post, but it is not always accurate because the machine learning models behind it may not pick up the right cues.
Human moderation, on the other hand, is a manual type of process that involves an actual person who reviews the posts.
Under this category, the screener follows specific platform rules and guidelines to check the user-generated content submitted to the website. While this type of moderation is more reliable than its counterpart, it also takes more time due to its manual nature.
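To make the distinction more concrete, here is a minimal, hypothetical sketch of how an automated moderation pipeline might hand off to humans: a classifier decides only when it is confident and escalates everything else to the review queue. This is an illustration only, not how any particular platform operates; the toy data, the confidence threshold, and the choice of scikit-learn are all assumptions.

```python
# Hypothetical sketch (not any platform's real system): an ML classifier
# auto-decides only when confident and escalates the rest to humans.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data standing in for previously reviewed posts.
posts = [
    "Congratulations on the new job!",
    "Check out the photos from our trip",
    "I will hurt you if you post that again",
    "Buy illegal substances here, fast delivery",
]
labels = ["allow", "allow", "block", "block"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

def moderate(post: str, threshold: float = 0.8) -> str:
    """Return an automated decision, or flag the post for a human."""
    probabilities = model.predict_proba([post])[0]
    confidence = probabilities.max()
    decision = model.classes_[probabilities.argmax()]
    if confidence >= threshold:
        return decision              # automated decision
    return "escalate_to_human"       # uncertain cases go to moderators

print(moderate("fast delivery of illegal substances"))
```

In practice, the posts the model is unsure about are exactly the ones that land in front of human moderators, which is part of why their queues skew toward the most ambiguous and often most disturbing material.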
Moreover, it also presents a serious problem within its workforce that, unfortunately, is not often well addressed: mental distress.
The Dark Side of Content Moderation
While content moderation remains a largely hidden profession, at least in the Philippines, more and more people who have worked in the field have stepped forward in recent years to speak about the challenges and dangers prevalent in the industry.
A riveting 2018 internationally produced documentary titled ‘The Cleaners’ gave an exhaustive look at the plight of moderators in the country who worked for online giants like Facebook, Twitter, and Google, and tackled the mental health struggles brought on by their job.
Facebook itself has acknowledged the difficulties that come with the profession while Microsoft has faced lawsuits from former employees who claim that they were not given proper support despite the psychological dangers of their job.
For hours on end, moderators sift through hundreds of submissions containing triggering content, including depictions of death, torture, mutilation, and violence, sometimes with only limited time for breaks.
The nature of the work can lead to the development of mental distress and psychological issues such as post-traumatic stress disorder (PTSD), anxiety, and even depression.
This is supported by data from studies of journalism, law enforcement, and child protection workers, which report that repeated exposure to trauma can lead to psychological distress.
On top of that, workers in these fields have also been reported to suffer more from burnout, relationship challenges, and even suicide.
The following are other mental health problems that can arise from exposure to toxic content:
- Panic attacks – after repeated exposure to violent videos, some moderators have reported experiencing attacks when around animals and children, fearing something will happen to them.
- Normalization/Desensitization to disturbing humor and language – repeated exposure to disturbing content can change the mindsets and perspectives of its audience, leading to inappropriate humor and language.
- Self-destructive habits – alcoholism, drug use, and indiscriminate sexual behavior have reportedly been observed among moderators, presumably as a way of emotionally escaping from their job.
- Skewed beliefs – in some cases, content moderators develop fringe views (e.g., belief in conspiracy theories) that are not supported by hard facts because of constant exposure to such material.
The Cost of Internet Safety
Without a doubt, content moderators serve as the first layer of protection of the general public from disturbing and harmful materials.
Unfortunately, they are not always properly protected from the rigors that come with their profession.
Unlike other workplaces (for example, those in the health sector, law and policing, and journalism) that have more solid guidelines for taking care of the mental health needs of their workforce, the content moderation industry clearly lacks an equivalent system.
An article published by Harvard even notes that companies are very restrictive about letting others investigate their existing procedures and their treatment of these workers.
Not only are there no third parties monitoring the welfare of employees, but people working in the industry are also commonly asked to refrain from talking about their work through non-disclosure agreements.
Fortunately, some companies have also taken the initiative to develop workplace guidelines that can improve the treatment of those in the industry.
Facebook, for example, helped create the Technology Coalition which then designed the Employee Resilience Guidebook, a guide that outlines rules protecting the occupational health and safety of workers reviewing distressing content.
While the guidelines were written for employees dealing with child pornography, the guidebook also contains terms that can apply to other professions that expose workers to distressing imagery and content.
Specifically, the guide includes measures such as mandatory individual and group counseling sessions with a certified trauma specialist, limiting exposure to disturbing content to four hours, giving employees the choice to opt out of viewing specific disturbing content, encouraging them to switch to other projects as a form of relief, and giving them enough time to take a break and recover from their work.
Protecting the Protectors
While overarching guidelines are already being developed on a global scale, a huge chunk of the responsibility undeniably falls on the shoulders of employers, who are in a better position to observe and improve best practices in this area.
Here at Infinit Care, for example, we follow a tried and tested framework, the Mental Health Continuum, to make sure that every employee working in high-risk professions gets the mental health support they need, wherever they are on the scale – whether they are excelling, surviving, or in crisis. (Click here to know more about the Mental Health Continuum.)
Our Head of Clinical Care Shyne Mangulabnan suggests several ways employers can put this to work.
“Having a counseling professional who can help these employees is essential as well as having a solid support and assessment system for them. For example, surveys given to agents which can be used as a reference for the design of a wellness strategy is a good place to start. Constant monitoring of employees should also be done to make sure that their needs are met.”
On top of that, Mangulabnan also suggests creating proper escalation procedures for concerns relating to the mental health challenges of content moderators.
Proper education of key stakeholders within the company (the human resources team, upper management) about the mental health risks of the job is also necessary, since they are the decision-makers who create the systems that take care of employees.
“It would be best to have an end-to-end solution: an onboarding process that gives candidates the training and education they need to understand the risks and concepts of well-being, round-the-clock onsite and virtual counseling services, community support groups, yoga and meditation activities, and workshops are just some of the many things that employers can initiate to make sure that they give the support that their workforce needs.”
True enough, it is the responsibility of employers to make sure that they ‘protect the protectors’ of the internet.
However, it’s not only content moderators who should be given this kind of support, especially with 43 percent of the global workforce saying that the COVID-19 pandemic has increased the stress they suffer from work.
This story is just the first chapter of a series that will shed light on the professions that need mental health support most in these trying times.
Do you need help on how you can start caring for your employees in this aspect? We’d be more than happy to guide you here at Infinit Care. We are a company that helps other companies provide comprehensive mental health care support to their employees through the use of science-backed methodologies. You can reach out to us here to know more about how we can help.
NEWS
Converge named finalist for NPC’s Privacy Initiative of the Year Award

5:25 p.m. June 11, 2025
Leading fiber broadband and technology provider Converge ICT Solutions Inc. was listed among the finalists for the 2025 Privacy Initiative of the Year at the National Privacy Commission’s (NPC) Privacy Awareness Week (PAW) Awards, in recognition of its innovative program, Project PIGLET (Privacy Integration through Guided Learning of Emerging Technologies).
The program aims to enhance digital literacy and privacy awareness among primary school students, emphasizing the critical importance of safeguarding personal information in today’s increasingly digital world.
“In a world where people are always online, Project PIGLET is important for teaching kids about privacy and data protection. We are proud that the NPC sees our work in helping build a safer and smarter digital community from the ground up. We remain committed to continuing this movement so that protecting privacy becomes a lifelong habit for all,” said Converge Corporate Compliance and Data Protection Officer Atty. Laurice Esteban-Tuason.
Every year, the NPC recognizes stakeholders for their compliance with the Data Privacy Act of 2012 (DPA) through the PAW Awards and inspires privacy advocates to deepen their commitment to data protection.
Under Project PIGLET, Converge – with the help of its Corporate Governance and Data Privacy (CGDP) Group – hosts engaging storytelling sessions in primary schools, where students, teachers, and parents learn about data protection through the adventures of ‘Astro Kids’ in the ‘Internet Universe.’
The narrative highlights the dangers of sharing personal information with deceptive online entities under the guise of friendship.
With the guidance of Captain Conrad, the Astro Kids impart crucial lessons on vigilance in cyberspace and encourage young participants to report suspected incidents to their guardians.
Previously, the company visited Francisco Legaspi Memorial School in Pasig and Anunas Elementary School in Angeles, Pampanga for Project PIGLET, engaging pupils in Grades 2 to 6 in age-appropriate discussions on digital literacy and responsible online behavior.
Converge intends to expand the information drive by introducing new approaches and engaging more students across all academic levels in an effort to broaden the campaign’s reach throughout the country. ###
NEWS
8 steps to secure industrial enterprises

5:20 p.m. June 10, 2025
Industrial sectors such as power and utilities, energy and chemicals, metals and mining, and critical manufacturing are becoming increasingly vulnerable to cyber threats. In fact, industrial enterprises experienced more incidents than any other sector, with a 25.7% share in 2024, according to the Kaspersky MDR team.
The importance of cyber resilience in these industries cannot be overstated, as cyberattacks can lead to operational disruptions, financial losses and compromised safety. Yet, according to the World Economic Forum, only 19% of cyber leaders feel confident that their organizations are cyber resilient.
Those fears are rooted in the knowledge that threat levels are rising everywhere. Global analyst and advisory firm Omdia found that 80% of manufacturing firms experienced a notable increase in overall security incidents or breaches last year, but only 45% are adequately prepared when it comes to cybersecurity.
Why cyber resilience matters
One of the most critical aspects of cyber resilience is maintaining business continuity. Cyberattacks have the potential to cripple operations, causing significant delays and financial setbacks. The effects of a cyberattack can be felt far and wide, from power outages to safety incidents and environmental emergencies. However, organizations that prioritize cyber resilience can quickly recover from incidents, minimizing downtime and ensuring essential functions remain operational. Proactive business continuity planning is key, as it enables companies to prepare for potential cyber threats and ensure that disruptions do not lead to prolonged or catastrophic consequences.
Another hugely important reason for greater cyber resilience is its role in protecting sensitive data and preserving an organization’s reputation. Industrial enterprises manage vast amounts of sensitive data, making them prime targets for cybercriminals. A successful attack can lead to data breaches, intellectual property theft and significant reputational damage. Additionally, many industries must comply with stringent data protection regulations, and so a comprehensive cyber resilience strategy helps organizations stay compliant and avoid costly legal consequences.
Industrial control systems are particularly vulnerable, as they form the backbone of essential industrial processes. Cyber resilience ensures these systems remain secure, reliable and functional even when faced with persistent threats. Additionally, as industrial enterprises increasingly integrate connected products and digital technologies, the need to protect these interconnected systems from cyberattacks becomes even more pressing.
Financial loss, though, is arguably the greatest concern when it comes to cyber threats. A single cyberattack can result in substantial financial repercussions, including direct losses from theft, recovery costs, regulatory fines and lost business opportunities. A well-structured cybersecurity strategy can lead to lower insurance premiums by demonstrating a proactive approach to cyber risk mitigation. Additionally, organizations that invest in cyber resilience are better equipped to optimize their operations, ensuring that productivity and efficiency are maintained even in the face of emerging cyber threats.
Kaspersky is on the front line, protecting more than 1,000 industrial customers, and has extensive experience in helping industrial organizations adopt international standards and best practices. Calling on this expertise, Kaspersky has defined the following eight strategic steps that apply universally to automation systems:
- Inventory: Asset Management
Begin by building or updating your asset inventory. Account for systems, software, hardware, network segments, conduits, communication paths and devices to understand what must be secured. If you can’t monitor a part of your infrastructure – or aren’t even aware it exists and could be attacked – you can’t protect it. This comprehensive inventory ensures all valuable assets are secured.
- Assess: Detailed Risk Assessment
Conduct a detailed risk assessment to understand the current risk level within your organization, considering potential threat vectors and existing or planned countermeasures. This assessment helps prioritize investments and prevent potentially catastrophic disruptions.
- Secure: Essential Security
Implement essential security measures, such as endpoint protection, to safeguard operations. This involves creating security baselines aimed at maintaining and protecting operational OT system integrity while detecting, blocking and remediating cyber threats.
- Detect: Threat and Anomaly Detection
Implement threat and anomaly detection to identify threats early and understand how attacks develop, enabling quick responses to avoid disruption and continually strengthen your security posture.
- Audit: Security Audits and Compliance
Conduct regular security audits and focus on compliance to build a realistic picture of your organizational cybersecurity. These systematic evaluations ensure alignment with criteria and benchmarks, improving adherence to best practices and resulting in robust systems.
- Enhance: Zones and Conduits
Enhance your network architecture by organizing and protecting it through zones and conduits. Zones group networks, devices and services based on function and criticality, while conduits represent communication paths that unite zones or connect them to external networks. (A minimal sketch of this idea follows the list below.)
- Monitor: Mature Security Operations
Develop a mature Security Operations Center (SOC) with proactive and contextual analysis capabilities to manage complex attacks. Continually evolve your SOC capability with threat intelligence and incident response features to swiftly investigate, contain and mitigate threats.
- Prepare: Fault Tolerance and Readiness
Guarantee fault tolerance by stress-testing your infrastructure through exercises that simulate large-scale cyberattacks. This preparation ensures that your industrial control systems can withstand and recover from cyber incidents without compromising operational continuity. People are an organization’s greatest asset, but they are also a point of potential vulnerability, so employees should be trained on a regular basis.
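To illustrate the zones-and-conduits step in more concrete terms, here is a minimal, hypothetical sketch of how declared zones and sanctioned conduits could be checked against observed network flows. The zone names, data structures, and flows are purely illustrative assumptions, not Kaspersky's methodology or any real tooling.

```python
# Hypothetical sketch: declare zones and sanctioned conduits, then flag
# observed traffic that bypasses them or involves unknown assets.
from dataclasses import dataclass

@dataclass(frozen=True)
class Conduit:
    src: str  # originating zone
    dst: str  # destination zone

# Zones grouped by function and criticality (illustrative names only).
zones = {"enterprise_it", "dmz", "scada", "field_devices"}

# Only these communication paths are sanctioned.
allowed_conduits = {
    Conduit("enterprise_it", "dmz"),
    Conduit("dmz", "scada"),
    Conduit("scada", "field_devices"),
}

# Observed flows, e.g. exported from firewall or monitoring logs.
observed_flows = [
    Conduit("enterprise_it", "dmz"),
    Conduit("enterprise_it", "field_devices"),  # bypasses the DMZ and SCADA zones
    Conduit("vendor_laptop", "scada"),          # asset missing from the inventory
]

for flow in observed_flows:
    if flow.src not in zones or flow.dst not in zones:
        print(f"Unknown zone in flow (inventory gap): {flow.src} -> {flow.dst}")
    elif flow not in allowed_conduits:
        print(f"Undeclared conduit, investigate: {flow.src} -> {flow.dst}")
```

A check like this also ties back to the inventory step: any flow touching a zone or device that is not in the declared model points to an asset that is not yet being monitored.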
NEWS
Kollab lauded as The Best Small Workplace in the Philippines by Great Place to Work®

8:44 p.m. June 9, 2025
Kollab is shaping the future of Filipino IT talent through a workplace culture that continues to set the standard.
The premier digital transformation advisor has once again been named the #1 Best Workplace in the Philippines among small enterprises (30-99 employees) by Great Place to Work®, earning the distinction for the second straight year and securing a spot as the #4 Best Workplace™ in Asia.
In the Philippines, Kollab topped the Top 10 list after 99% of its employees said the company was a great place to work, a stark contrast to the national average of 65%.
Jonathan Ty, Kollab’s CCO and Head of Business Development, shares, “At Kollab, culture isn’t just a buzzword—it’s how we operate. We believe that embracing a people-first approach is the key to creating the next generation of Filipino tech leaders. We’ve seen the immense potential of Filipinos in driving tech innovation, and we cultivate that with a collaborative culture that balances positive growth results with workplace flexibility and employee wellness.”
Kollab’s success lies in its deeply collaborative culture. During its recent acquisition of local AI firm Senti AI, teams were directly involved in shaping strategy and integrating cultures – a move that reinforced Kollab’s pro-employee initiatives. Employees also get to lead projects and receive constant feedback, creating a continuous loop of growth and innovation. The company also promotes a remote-first work arrangement, mental health breaks, no-meeting days, learning stipends, and a peer-run training program.
Kollab’s collaborative workplace environment has driven the company’s growth in Southeast Asia as it now serves over 1,100 organizations in the region. It has expanded its portfolio to include more complex tech solutions in cloud computing, AI, and cybersecurity. Kollab also launched Managed Security Services to help enterprises implement continuous threat exposure management and real-time protection and response.
Kollab plans to double its tech talent pool and strengthen its employee empowerment initiatives in the coming years. It has launched AI and cybersecurity bootcamps, invested in professional certifications, and formed partnerships with leading technology firms in line with its pursuit of building a future-ready workforce.
For more information about Kollab and the company’s people-first initiatives, visit https://www.kollab.com/ and follow its LinkedIn page at https://www.linkedin.com/company/kollabph/.