There's a dark side to the internet, and some humans have to go deep into it to spare the rest of us.
Child porn, bestiality, violent murders -- that's just some of what content moderators are purging from the internet. But who is taking care of them?
Two Microsoft employees, who filed a lawsuit in December, say they viewed this content as part of their jobs. The men, who were part of the company's online abuse team, allege that they weren't warned about the dangers associated with the role and weren't given adequate counseling. As a result, according to the suit, they suffer from PTSD. One alleges he even has trouble interacting with his own son.
Microsoft disputes their claims -- but the lawsuit's very existence is an unsettling reminder that artificial intelligence and algorithms aren't enough to monitor the web. Humans still play a role, and their involvement has lingering effects -- the extent of which is still unknown.
"It's the dirty little secret, if you will, of the industry," Sarah T. Roberts, assistant professor of information studies at UCLA, told CNNTech. "[Tech companies] like the idea of people thinking that everything is based on algorithms."
In fact, humans are very much needed to identify and flag inappropriate content. At Microsoft, that meant reviewing material on properties like Bing and Xbox -- but moderators are needed virtually anywhere people can post content.
New technology inevitably creates fresh ways for people to upload and create content. And the rise of live streaming video adds a complicated new element: "It's virtually impossible" for technology to moderate live content, said Roberts, who has been studying commercial content moderation since 2010.
It's difficult to know the extent of the content moderation industry, according to Roberts, since job titles vary across companies, some employees are under nondisclosure agreements, much of the work is outsourced and there tends to be a high turnover rate. But she estimates that there are thousands of people involved in moderating online content, either currently or in past roles.
Stefania Pifer, a psychotherapist who focuses specifically on trauma, cofounded The Workplace Wellness Project in 2011 to provide firms with consistent programs to monitor the wellbeing of employees in these jobs.
"It does impact us to bear witness to the pain and suffering of other people because we're humans and we're wired to feel empathy, most of us," said Pifer, who used to work at Google (GOOGL).
She calls the type of work she does "wellness and resiliency counseling" and said there has been an increased demand for her services over the past two years.
Pifer and cofounder Naheed Sheikh customize their approach depending on the specific situation. Generally, they do individual and group sessions once or twice a month with the content moderators, as well as training sessions with managers so they know how to support their employees.
"It's not really part of mainstream culture for people to talk about their feelings," Pifer, 36, said. How people are impacted by their work is "so highly variable. What are your coping strategies, your resiliency capabilities, your own histories of trauma?"
CNNTech contacted a number of companies for details on their content moderators and policies. Facebook (FB), Instagram, Google (GOOG), YouTube and Twitter (TWTR) would not speak on the record. Reddit provided a generic statement about the overall makeup of its team.
Microsoft (MSFT) has been considered a leader in detecting child porn, opening up its PhotoDNA software to other companies for free to help them keep that imagery from being uploaded to their sites.
The men, who left Microsoft on disability in 2013 and 2015, allege that the wellness program in place during their tenure was not sufficient. They were told to take a smoke break, play video games as a distraction, or leave work early when feeling distressed, according to the suit.
Microsoft sent CNNTech its current policy, which includes mandatory group and one-on-one meetings with a psychologist, tools that reduce the realism of the imagery and limits on how long each employee can do this work per day.
It said its wellness program is "a process, always learning and applying the newest research about what we can do to help support our employees even more."
"Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services," said a Microsoft spokesperson in a statement sent to CNNTech.
The Technology Coalition, which provides tech companies with best practices for handling abuse, offers only vague recommendations for how to care for these employees.
Wells, an attorney representing the two men, told CNNMoney that things like mandatory job rotations, weekly meetings with a psychologist, and support for moderators' spouses are necessities.
The lawsuit seeks damages for the employees, as well as policy changes.
"I can only imagine that this lawsuit will serve as a wake-up call," added Roberts.