Our two-year investigation suggests that the tech giant Meta is struggling to prevent criminals from using its platforms to buy and sell children for sex.
Content
warning – the following article contains descriptions of child sexual abuse,
exploitation and trafficking
Maya
Jones* was only 13 when she first walked through the door of Courtney’s House,
a drop-in centre for victims of child sex trafficking in Washington DC. “She
was so young, but she was already so broken by what she’d been through,” says
Tina Frundt, the founder of Courtney’s House. Frundt, one of Washington DC’s
most prominent specialists in countering child trafficking, has worked with
hundreds of young people who have suffered terrible exploitation at the hands
of adults, but when Maya eventually opened up about what she had been through,
Frundt was shaken.
Maya
told Frundt that when she was 12, she had started receiving direct messages on
Instagram from a man she didn’t know. She said the man, who was 28, told her
she was really pretty. According to Frundt, Maya told her that after she
started chatting with the man, he asked her to send him naked photos. She told
Frundt that he said he would pay her $40 for each one. He seemed kind and he
kept giving Maya compliments, which made her feel special. She decided to meet
him in person.
Then
came his next request: “Can you help me make some money?” According to Frundt,
Maya explained that the man asked her to pose naked for photos, and to give him
her Instagram password so that he could upload the photos to her profile.
Frundt says Maya told her that the man, who was now calling himself a pimp, was
using her Instagram profile to advertise her for sex. Before long, sex buyers
started sending direct messages to her account, wanting to make a date. Maya
told Frundt that she had watched, frozen, what was taking place on her account,
as the pimp negotiated prices and logistics for meetings in motels around DC.
She didn’t know how to say no to this adult who had been so nice to her. Maya
told Frundt that she hated having sex with these strangers but wanted to keep
the pimp happy.
One
morning three months after she first met the man, Frundt says that Maya was
found by a passerby lying crumpled on a street in south-east DC, half-naked and
confused. The night before, Maya told her, a sex buyer had taken her somewhere
against her will, and she later recalled being gang-raped there for hours
before being dumped on the street. “She was traumatised, and blamed herself for
what happened. I had to work with her a lot to help her realise this was not
her fault,” said Frundt when we visited Courtney’s House last summer.
Frundt,
who has helped hundreds of children like Maya since she opened Courtney’s House
in 2008, says that the first thing she now does when a young person is referred
to her is to ask for their Instagram handle. Other social media platforms are
also used to exploit the young people in her care, but she says Instagram is
the one that comes up most often.
In the
20 years since the birth of social media, child sexual exploitation has become
one of the biggest challenges facing tech companies. According to the United
Nations Office on Drugs and Crime (UNODC), the internet is used by human
traffickers as “digital hunting fields”, allowing them access to both customers
and potential victims, with children being targeted by traffickers on social
media platforms. The biggest of these, Facebook, is owned by Meta, the tech
giant whose platforms, which also include Instagram, are used by more than 3
billion people worldwide. In 2020, according to a report by US-based
not-for-profit the Human Trafficking Institute, Facebook was the platform most used by sex traffickers to groom and recruit children, featuring in 65% of the 105 federal child sex trafficking cases analysed that year. The HTI analysis
ranked Instagram second most prevalent, with Snapchat third.
Grooming
and child sex trafficking, though often researched and discussed together, are
distinct acts. “Grooming” refers to the period of manipulation of a victim
prior to their exploitation for sex or for other purposes. “Child sex trafficking”
is the sexual exploitation of a child specifically as part of a commercial
transaction. When the pimp was flattering and chatting with Maya, he was
grooming her; when he was selling her to other adults for sex, he was
trafficking.
Though
people often think of “trafficking” as the movement of victims across or within
borders, under international law the term refers to the use of force, fraud or
coercion to obtain labour or to buy and sell non-consensual sex
acts, whether or not travel is involved. Because, under international law,
children cannot legally consent to any kind of sex act, anyone who profits from
or pays for a sex act from a child – including profiting from or paying for
photographs depicting sexual exploitation – is considered a human trafficker.
Meta has
numerous policies in place to try to prevent sex trafficking on its platforms.
“It’s very important to me that everything we build is safe and good for kids,”
Mark Zuckerberg, Meta’s founder, wrote in a memo to staff in 2021. In a
statement responding to a detailed list of the allegations in this piece, a
Meta spokesperson said: “The exploitation of children is a horrific crime – we
don’t allow it and we work aggressively to fight it on and off our platforms.
We proactively aid law enforcement in arresting and prosecuting the criminals
who perpetrate these grotesque offences. When we are made aware that a victim
is in harm’s way, and we have data that could help save a life, we process an
emergency request immediately.” The statement cited the group director of
intelligence at the charity Stop the Traffik, a former deputy director of the UK's Serious Organised Crime Agency, who has said "millions are safer and traffickers are increasingly frustrated" because of the charity's work with Meta.
But over
the past two years, through interviews, survivor testimonies, US court
documents and human trafficking reporting data, we have heard repeated claims
that Facebook and Instagram have become major sales platforms for child
trafficking. We have interviewed more than 70 sources, including survivors and
their relatives, prosecutors, child protection professionals and content
moderators across the US in order to understand how sex traffickers are using
Facebook and Instagram, and why Meta is able to deny legal responsibility for
the trafficking that takes place on its platforms.
While
Meta says it is doing all it can, we have seen evidence that suggests it is
failing to report or even detect the full extent of what is happening, and many
of those we interviewed said they felt powerless to get the company to act.
The survivors
Courtney’s
House sits on a quiet residential street on the outskirts of Washington DC.
Inside, Frundt and her team have tried to make the modest two-storey house feel
like a family home, with comfortable sofas and photos on the mantelpiece.
Frundt, who was herself trafficked as a child in the 1980s and 90s, is now one
of Washington DC’s most experienced and respected anti-trafficking advocates.
Warm and ferociously protective of the children in her care, she is contracted
by the city’s child protection services to identify trafficked children going
through the court system, and she regularly attends court hearings for the
youth in her care. She also helps train the FBI and local law enforcement
sex-trafficking units on how to spot traffickers on online platforms, including
Instagram. “When I was trafficked long ago I was advertised in the classified
sections of freesheet newspapers,” Frundt told us. “Now my youth here are
trafficked on Instagram. It’s exactly the same business model but you just
don’t have to pay to place an ad.”
The
children who are referred to Frundt, usually by the police or social services,
have been sexually exploited and controlled: by a boyfriend, a pimp, a family
member. Some of them are as young as nine. Almost without exception, they have
childhoods scarred by sexual abuse, poverty and violence. This makes them
perfect targets for sexual predators. “They are all looking for love and
affirmation and a sense that they mean something,” said Frundt.
Almost
all the young people who come to Courtney’s House are children of colour. They
are, Frundt said, battling stereotypes that pressure them to become sexualised
too early and make them vulnerable to traffickers. A 2017 study by the
Georgetown Law Center on Poverty and Inequality found that adults typically
regard Black girls as less innocent and more knowledgeable about sex than their
white peers. The same study showed that Black girls are often perceived to be
older than they are.
Most of
the time, Frundt says, the children who come to Courtney’s House are still
being trafficked when they walk through the door. Even in cases where they have
escaped their exploiters, she said, explicit videos and photos of them often
continue to circulate online. Traffickers will lock victims out of their
accounts, preventing them from taking down images posted to their profiles.
When we
asked Frundt if she could show us examples of young people in her care who she
says are currently being trafficked on Instagram, she pulled out her phone and
scrolled through post after post of explicit images and videos of girls as
young as 14 or 15. Most of the photos and videos seemed to have been taken by
someone else. Frundt said that these posts were being used as a way of
advertising the girls for potential sex buyers, who would send a direct message
to buy explicit content or to arrange a meet up.
At one
point, our conversation was interrupted by the arrival of five teenage girls.
They had come back from school, and they gathered around the kitchen table,
chatting and playing music on their phones while Frundt served them casserole.
After they had eaten, we asked if we could talk to them about their
experiences: had any of them been sexually exploited on social media or had
explicit videos or pictures posted of them?
They
glanced at each other and burst out laughing. Yes, they said, of course. All
the time. One girl said she felt that “nobody at Instagram cares, they don’t
care what’s posted. They don’t care shit about us.”
Frundt
claims that she is constantly asking Instagram to close accounts and take down
exploitative content of kids in her care. “I even have law enforcement calling
me up asking, ‘Tina, can you get Instagram to do something?’. If I can’t get
Instagram to act, what hope is there for anyone else?”
When we
put these concerns to Meta, a spokesperson said: “We take all allegations and
reports of content involving children extremely seriously and have diligently
responded to requests from Courtney’s House. Our ability to remove content or
delete accounts requires sufficient information to determine that the content
or user violates our policies.”
Frundt
says that in 2020 and 2021 she had discussions with Instagram about conducting
staff training to help prevent child trafficking on its platforms. She says the
training didn’t go ahead as, after a long back and forth, on a video call
Instagram executives said that they wouldn’t pay Frundt her standard fee of
$3,000, instead allegedly offering $300. When we put this to Meta, they did not
deny it.
The court documents and the prosecutors
What
makes social media platforms so powerful as a tool for traffickers – far more
powerful than the back pages of a newspaper in which Frundt was advertised as a
teenager – is the way that they make it possible to identify and cultivate
relationships with both victims and potential sex buyers. Traffickers can
advertise and negotiate deals by using different features of the same platform:
sellers sometimes post publicly about the girls they have available, and then
switch to private direct messages to discuss prices and locations with buyers.
US court
documents provide a graphic insight into how these platforms can be used. In
one case prosecuted in Arizona in 2019, Mauro Veliz, a 31-year-old who was
convicted of conspiracy to commit sex trafficking of a child, exchanged
messages on Facebook Messenger with Miesha Tolliver, who also received jail
time for sex trafficking. Tolliver told Veliz that she had one girl available
for sex, and photographs of two more, before saying that the girls were aged
17, 16 and 14.
Veliz:
“How much is it for all of them?”
Tolliver:
“The 14 [year-old] will cost the most … a couple of hundred for her but [$] 150
for the rest”
The
14-year-old, Tolliver told Veliz, was “new to the sex game”.
Tolliver:
“The 1 on the right … is 16 with a fat ass ... the other [is] 15 with huge
tits”
The
court transcripts then state that multiple sexually explicit images of the
girls were sent to Veliz.
Tolliver:
“do you want me to bring 1 of the girls with me so you guys can fuck?”
[ ... ]
Veliz:
“is your girl nervous? Or have you told her yet?”
Tolliver:
“… shes still young and doesn’t understand how ppl like it”
Tolliver
and Veliz exchanged more messages, arranging for Veliz to meet the girl in a
hotel in California two days later.
The final
message submitted to the court was from Veliz to Tolliver. “We’re finished
she’s in the restroom,” it said.
Luke
Goldworm, a former assistant district attorney in Boston, Massachusetts, who
has investigated and prosecuted human trafficking cases for years, says that he
has encountered numerous exchanges like this one. From 2019 until he left the
job in October 2022, he said, his department’s caseload of child-trafficking
crimes on social media platforms increased by about 30% each year. “We’re seeing
more and more people with significant criminal records move into this area.
It’s incredibly lucrative,” he said. A trafficker can make up to $1,000 a
night. Many of the victims he saw were just 11 or 12, he said, and most of them
were Black, Latinx or LGBTQI+.
According
to Goldworm, while his investigations involved every social media platform,
Meta platforms were the ones he encountered most often. Six other prosecutors
in several different states told us that, in their experience, Facebook and
Instagram are being widely used to groom and traffic children. Five
of these prosecutors spoke of their anger over what they felt were Meta’s
unnecessary delays in complying with judge-signed warrants and subpoenas needed
to gather evidence on sex trafficking cases. “We get a higher rate of rejected
warrants from Facebook than any other electronic service provider,” claimed
Gary Ernsdorff, senior deputy prosecuting attorney for King County, Washington
state. “What I find frustrating is that the exchange can delay rescuing a
victim by a month.”
Three of
these prosecutors described experiences where they say the company would cite
technicalities, picking faults with wording and format, and slowing down
investigations. In response, the company said that these claims were “false”,
adding that between January and June last year, it “provided data in nearly 88%
of requests from the US government”.
The responsibility for reporting
Meta
acknowledges that human traffickers use its platforms, but insists that it is
doing everything in its power to stop them. By law, the company is required to
report any child sexual abuse imagery shared over its platforms to the National
Center for Missing & Exploited Children (NCMEC), which receives federal
funding to act as a nationwide clearing house for leads about child abuse. Meta
is a major funder of NCMEC, and holds a seat on the organisation's board.
From
January to September 2022, Facebook reported more than 73.3m pieces of content
under “child nudity and physical abuse” and “child sexual exploitation” and
Instagram reported 6.1m. “Meta leads the industry in using the most
sophisticated technology to detect both known and previously unknown child
exploitation content,” said a company spokesperson. Of the 34m pieces of child
sexual exploitation content removed from Facebook and Instagram in the final
three months of 2022, 98% was detected by Meta itself.
But the
vast majority of the content that Meta reports falls under child sexual abuse
material (CSAM) – which includes photos and videos of pornographic content –
rather than sex trafficking. Unlike with child sexual abuse imagery, there is
no legal requirement to report child sex trafficking, so NCMEC must rely on all
social media companies to be proactive in searching for and reporting it. This
legal inconsistency – the fact that child sexual abuse imagery must be
reported, but reporting child sex trafficking is not legally required – is a
major problem, says Staca Shehan, vice-president of the analytical services
division at NCMEC. “It’s concerning across the board how little trafficking is
being reported,” Shehan says. Social media companies “are prioritising what’s
[legally] required”.
“I think
everyone could do more,” Shehan says. “The volume of child sexual abuse
material (CSAM) and volume of trafficking [being reported] is like apples and
oranges.” According to Shehan, one further reason for this disparity, beyond
the differing legal requirements, is technological. “Child sexual abuse
material is that much easier to detect. There are so many technology tools that
have been developed that allow for the automated detection of that crime.”
An NCMEC
spokesperson told us that if social media companies are not reporting child sex
trafficking, it allows this crime to thrive online. Reporting trafficking, they
emphasised, is crucial for rescuing victims and punishing offenders.
Between
2009 and 2019, Meta reported just three cases as suspected child sex
trafficking in the US to NCMEC, according to records disclosed in a subpoena
request seen by the Guardian.
A
spokesperson for NCMEC confirmed this figure, but clarified that a number of
child trafficking cases during the same time period were reported by Meta under
other “incident types”, such as child pornography or enticement. “I think one
of the things to be aware of is that there's sort of a singular tag
that’s used for reporting,” Antigone Davis, head of global safety at Meta,
emphasised to us in a recent interview. “And so just because something isn’t
tagged as sex trafficking doesn’t mean that it isn’t being reported.”
A Meta
spokesperson claimed that over the past decade, the company had reported “tens
of thousands of accounts which violated our policies against child sex
trafficking and commercial child sexual abuse material to NCMEC.” When we put
these claims to NCMEC, it said that it had not received “tens of thousands” of
reports of child trafficking from Meta, but had received that number related to
child abuse imagery.
Hany
Farid is a professor at the University of California, Berkeley, who helped
invent the PhotoDNA technology that Meta uses to identify harmful content. He
believes Meta, which is currently valued at more than $500bn, could do more to
combat child trafficking. It could, for instance, be investing more to develop
better tools to “flag suspicious words and phrases on unencrypted parts of the
platform – including coded language around grooming,” he said. “This is,
fundamentally, not a technological problem, but one of corporate priorities.”
(There is a separate debate about how to handle encryption. Meta’s plans to
encrypt direct messages on Facebook Messenger and Instagram have recently drawn
criticism from law enforcement agencies, including the FBI and Interpol.)
In
response to Farid’s claims and further questions from the Guardian, Meta did
not specify how much money it has invested in technologies to detect child sex
trafficking, but said that it had “focused on using AI and machine learning on
non-private, unencrypted parts of its platforms to identify harmful content and
accounts and make it easier for people to report messages to the company so we
can take action, including referrals to law enforcement”. Davis also emphasised
that Meta constantly works with partners to improve its anti-trafficking
safeguards. For instance, she mentioned that “we’ve been able to identify the
kinds of searches that people do when they’re searching for trafficking
content, so that when people search for that, we will pop up with information
to divert them or to let them know that what they’re doing is illegal
activity”.
These
efforts have failed to satisfy some of Meta’s own investors. In March, several
pension and investment funds that own Meta stock launched legal action against
the company in Delaware over its alleged failure to act on “systemic evidence”
that its platforms are facilitating sex trafficking and child sexual
exploitation. By offering insufficient explanation of how it is tackling these
crimes, the complaint says, the board has failed to protect the interests of
the company. Meta has rejected the basis for the lawsuit. “Our goal is to
prevent people who seek to exploit others from using our platform,” the company
said.
The moderators
As well
as software, Meta uses teams of human moderators to identify cases of child
grooming and sex trafficking. Until recently, Anna Walker* worked the night
shift in an office of a Meta subcontractor. She would start each shift filled
with dread. “We were just, like, shoved in a dark room to look at the stuff,”
she said.
Walker’s
job was to review interactions between adults and children on Facebook
Messenger and Instagram direct messenger that had been flagged as suspicious by
Meta’s AI software. Walker claims she and her team struggled to keep pace with
the huge backlog of cases. She says she saw cases of adults grooming children
and then making plans to meet them for sex, as well as discussions about
payment in exchange for sex.
Walker’s
managers would pass on such cases to Meta to decide if action should be taken
against the user. In some cases, Walker claims: “Months would pass and then the
automatic bot would send me an email saying it was closing this case, because
nobody’s taken action on it.” She added: “I would cry to my manager about [the
children I saw] and how I want to help. But it felt like nobody would pay
attention to these horrible things.”
We
talked to six other moderators who worked for companies that Meta subcontracted
between 2016 and 2022. All made claims similar to Walker's. Their efforts to flag
and escalate possible child trafficking on Meta platforms often went nowhere,
they said. “On one post I reviewed, there was a picture of this girl that
looked about 12, wearing the smallest lingerie you could imagine,” said one
former moderator. “It listed prices for different things explicitly, like, a
blowjob is this much. It was obvious that it was trafficking,” she told us. She
claims that her supervisor later told her no further action had been taken in
this case.
When we
put these claims to Meta, a spokesperson said that moderators such as Walker do
not typically get feedback on whether their flagged content has been escalated.
They stressed that if a moderator does not hear back about a flagged case, that
does not mean no action has been taken.
Five of
the moderators claimed that it was harder to get cases escalated or content
taken down if it was posted on closed Facebook groups or Facebook Messenger.
Meta “would be less stringent about something taking place behind ‘closed
doors’,” claimed one team leader. “With Messenger, we really couldn’t make any
moves unless the language and content was really obvious. If it was four guys
who trusted each other and it was in a group it could just live on for ever.”
Meta said these allegations "appear to be misleading and inaccurate", adding that it uses technology to find child sexualisation content in private Facebook groups and on Messenger.
In 2021,
former Facebook employee and whistleblower Frances Haugen leaked internal
documents that seem to support the moderators’ claims. These documents, which
numbered thousands of pages, detailed how the company managed harmful content.
In one memo from the Haugen leak, the company states that “Messenger groups
with less than 32 people should be treated with a full expectation of privacy”.
Matias
Cruz*, who worked as a content moderator from 2018 to 2020, reviewing
Spanish-language posts on Facebook, believes that the criteria that Meta was using to recognise trafficking were too narrow to keep up with traffickers, who
would constantly switch codewords to avoid detection. According to Cruz,
traffickers would say: “‘I have this cabra [Spanish for goat] for sale,’ and
it’d be some really ridiculous price. Sometimes they would just outright say
[the price] for a night or two, or ‘an hour’.” It was obvious what was going
on, said Cruz, but “the managers would claim it was too vague, so in the end
they would just leave it up”.
Cruz and
three other moderators we spoke to claimed that in examples like this, where
their managers felt there was insufficient evidence to escalate the case,
moderators could receive lower accuracy scores, which in turn would affect
their performance assessments. “We would take negative hits on their accuracy
scores to try to get some help to these people,” Cruz said.
The limits of the law
While
the law requires Meta to report any child exploitation imagery detected on its
platforms, the company is not legally responsible for crimes that occur on its
platforms, because of a law created almost three decades ago, in the early days
of the internet. In 1996, the US Congress passed the Communications Decency
Act, which was primarily intended to ensure online pornographic content was
regulated. But section 230 of the act states that providers of “interactive
computer services” – which includes the owners of social media platforms and
website hosts – should not be treated as the publisher of material posted by
users. This section was included in the act to ensure the free flow of
information while protecting the growing tech industry from being crushed by
litigation.
Whereas
a newspaper, say, must legally defend what it publishes, section 230 means that
a company like Meta, which hosts the content of others, may not be held liable
for what appears on its platforms. Section 230 therefore positions internet
service providers as fundamentally neutral: offering forums in which illegal,
harmful or false content may be posted and circulated, but ultimately not
responsible for that content. Since the passing of the act, tech companies such
as Meta have argued successfully in courts across the US that section 230
provides them with complete immunity from prosecution for any illegal content
published on their platforms, as long as they are unaware of that content’s
existence.
The
debate around section 230 has become highly polarised. Those who want section
230 amended say that the legal safe harbour it has provided for internet
companies means they have no incentive to root out illegal content on their
sites. In an op-ed published in the Wall Street Journal in January, President
Biden spoke out in favour of the section’s reform. “I’ve long said we must
fundamentally reform section 230,” he wrote, calling for “bipartisan action by
Congress to hold big tech accountable.”
However,
tech companies, along with internet freedom groups, argue that changes to
section 230 could lead to censorship and an erosion of privacy, particularly
for private, encrypted content. These arguments over section 230 are being put
to the test in a landmark case that has reached the US supreme court, which
focuses on how far YouTube can be considered culpable for the videos it
recommends to its users. A ruling is due by the end of June.
The consequences
Kyle
Robinson is one year into serving a 10-year sentence at a federal prison in
Massachusetts for sex trafficking two teenagers, one only 14 years old. We
spoke to him in January over the muffled line of the prison’s payphone, our
conversation interrupted by prison staff monitoring the call. Referring to
himself as a pimp, Robinson described how he sought out damaged girls from care
homes and on social media as a way to make money.
Instagram,
he said, was his platform of choice. “I find the girls that have pride in
themselves, but maybe don’t have the confidence, the self-esteem,” he claimed.
“I make her feel special. I give her validation, social skills, her
‘hotential’, if you know what I mean.”
Once he
had identified his targets, Robinson claimed that he would “coach” them and
advertise them on their Instagram accounts and his own. He would talk to
potential buyers through direct messages, offering to send video snippets of
the girls in return for “a small deposit” – about $20 – so that the buyers
could see what they would be getting. If a buyer decided to meet a girl, he
would pay her the rest of the money later, via CashApp, he said. Robinson would
then take most of that money.
To crack
down on such cases of child sexual exploitation, last June Meta announced new
policies including age verification software that will require users under 18
to provide proof of age by uploading an ID, recording a video selfie, or
asking mutual friends on Facebook to confirm their age. When we asked Tina
Frundt about these new measures, she was sceptical. The kids she works with had
already found workarounds; a 14-year-old, for example, might use a video selfie
made by her 18-year-old friend, and pretend that it’s her own.
Even after
children have been referred to Courtney’s House, they continue to be vulnerable
to traffickers. One night in June 2021, Frundt says she got a call from Maya,
telling her she had arrived home safe. Frundt was relieved: she knew that Maya
had spent the evening with a 43-year-old man who had been contacting her on
Instagram.
Frundt
says that Maya, now 15, was in a fragile state: over the previous few months,
her mental health had been in sharp decline and she had told Frundt she’d been
feeling suicidal. Photos and explicit videos taken by a pimp showing her having
sex were being circulated and sold on Instagram. Sex buyers were contacting her
relentlessly through her direct messages. “She didn’t know how to make it stop
or how to say no,” Frundt recalled.
That
night, on the phone, Frundt told Maya that she loved her and that they would
talk in the morning. “That’s the last time I ever spoke to her,” said Frundt.
The older man had given Maya drugs. When Maya’s mother went to wake her
daughter the next morning, she found her dead.
A
picture of Maya that still hangs on the wall of Courtney’s House shows a
baby-faced teenage girl with brown curls and a huge smile. Two years after her
death, Frundt continues to grieve for her caring “girly girl” who loved makeup,
board games and dancing to her favourite Megan Thee Stallion songs. “Losing one
of our youth, it changes you for ever. You can never forgive yourself,” she
said.
Before
Maya died, Frundt claims she spoke to Instagram on a video call, asking them to
remove the exploitative content her trafficker had circulated. Frundt says that
when Maya died, the videos of her being exploited were still on the platform.
In July
2021, a representative from an anti-trafficking organisation sent an email to
Instagram’s head of youth policy, informing her of Maya’s death. Frundt was
copied in on the email. It asked why Meta’s tools designed to detect grooming
had not flagged a 43-year-old man contacting a young girl. Four days later, the
company sent a brief reply. If Instagram was provided with details about the
alleged trafficker’s account, it would investigate.
But
Frundt says that it was too late. “She had already passed,” she says. “They
could have done something to help her but they didn’t. She was gone.”
Names
marked with an asterisk have been changed to preserve anonymity.
How
Facebook and Instagram became marketplaces for child sex trafficking. By Katie
McQue and Mei-Ling McNamara. The Guardian, April 27, 2023.
Annie
McAdams represents clients who claim Meta’s products connect vulnerable people
with sex buyers
On 14
March 2022, Annie McAdams, a personal injury lawyer running a small firm in
Houston, Texas, filed a civil suit on behalf of one of her clients. The
plaintiff was a 23-year-old woman, who had endured years of sexual exploitation
at the hands of a convicted trafficker. The defendant was one of the most
powerful technology companies in the world.
Contained
within McAdams’s federal suit was a series of allegations that Meta – the owner
of Facebook and Instagram, which are used by more than 3 billion people every
day – had knowingly created a breeding ground for human trafficking and was
actively facilitating the buying and selling of people for sex online.
The
lawsuit alleges that the company’s products – particularly Instagram – connect vulnerable victims with human traffickers and sex buyers, and provide
traffickers with the means to groom those victims. It says that human
trafficking victims are regularly posted on Instagram and sold for sex against
their will and claims that the company has failed to take adequate steps to
stop this.
In the
court documents, the plaintiff – who we are calling Shawna – says she was 18
when she was first contacted on Instagram by a man she didn’t know. She claims
that the man – referred to as RL in the court papers – sent her messages on her
public profile and on Instagram’s direct-messaging service and that this
campaign of grooming led to her agreeing to meet him in person. Two days after
their first meeting, she claims that RL began to sell her to sex buyers on
Instagram.
She
claims that RL posted explicit pictures of her on Instagram along with emojis
such as dollar signs, crowns and roses, widely recognised by law enforcement
and trafficking experts as indicators of commercial sex adverts.
“[Meta
Inc] knew that the use of these codes were blatant red flags … and were
actually sex trafficking advertisements designed to sell her for sex, but
[Meta] did nothing to remove or prevent those repeated posts, despite having
the ability to do so,” the court papers say.
Shawna
alleges that over the course of a year she was sold on Instagram to multiple
sex buyers. She says she was threatened with homelessness or violence by RL if
she refused to fulfil her “quota” of sex acts.
She went
on to testify against RL in a federal criminal trial in Texas and he was
subsequently sentenced to 40 years for sex trafficking.
However,
the lawsuit claims that at the time it was filed with the court in Houston, Meta had not removed the trafficker's Instagram account.
McAdams
claims that despite repeated attempts by Meta to get the case dismissed,
Shawna, who is seeking damages from the company, is now on the brink of taking
her civil claim against Meta further through the US court system than any other
case has managed. She believes that there are now no serious legal obstacles
between her case and bringing Meta before a jury in 2024 to face allegations
that it played an integral role in the trafficking of her client.
A
spokesperson for Meta said that Meta prohibits sex trafficking on its platforms
“in no uncertain terms … we vigorously deny the claims made against Meta in
this suit.”
It is
not the first time that Meta – in either of its guises as Meta Platforms Inc or
Facebook Inc – has faced lawsuits containing similar allegations. Yet in the two
decades since it was launched by Mark Zuckerberg from a Harvard dorm, his
company – which rebranded from Facebook to Meta in 2021 – like other technology
companies with servers based in the US, has never faced prosecution for illegal
and harmful content and activities on its platforms.
For
decades, social media companies have sheltered behind an obscure clause in the
1996 Communications Decency Act – section 230 – under which
technology companies are not legally responsible for crimes that occur on their
platforms. Section 230 states that providers of “interactive computer services”
– which includes the owners of social media platforms and website hosts –
should not be treated as the publisher of material posted by users.
Since
the act was passed, tech companies such as Meta have argued successfully in US
courts that section 230 provides them with immunity from prosecution for any
illegal content published on their platforms, as long as they are unaware of
that content’s existence, building a fortress of legal precedent.
Section
230 does not shield online platforms from federal criminal charges if they are
seen as responsible for facilitating trafficking. And a recent amendment to
section 230 – known as the Fosta-Sesta package – means that companies can be
held liable under state and civil laws but must be shown to have knowingly
assisted or facilitated sex trafficking.
Other
cases have attempted to swerve section 230, but in her federal suit, McAdams is
instead tackling it head-on, arguing that it has been misunderstood and was
never intended to protect a social media company which, she claims, knowingly
allows crimes against children to occur on its platforms.
“The problem is not section 230,” McAdams
says. “The problem is 20 years of bad precedent and the court’s
misinterpretation of 230. In no place does it say that there should be
immunity. There is a big difference between immunity and no liability.”
McAdams’s
decision to tackle the interpretation of liability under section 230 comes as
changing legal winds across the US challenge the lack of accountability granted
to tech companies.
The
debate around section 230 has become highly polarised. Those who want the
clause amended say that the legal safe haven it has provided for internet
companies means they have no incentive to root out illegal content on their
sites.
Others
warn that amending section 230 would curb free speech and dismantle democratic
values online. Some sex worker groups also warn that undermining section 230
would harm their business and make them unsafe.
McAdams
says that the only way to stop social media platforms being used as online
marketplaces for sex trafficking is through the courts. Along with Shawna’s
case, McAdams has several similar suits filed against Meta across the US in
which other plaintiffs allege that Meta enabled, facilitated and profited from
their sex trafficking.
“After
years of being silenced, I hope my clients will have their day in court,” she
says. “Their bravery and resilience has started something that could finally
see the internet become a safer place for children and open the door for other
survivors to be heard. I have many, many more victims waiting to have their
cases reviewed. This is just the start.”
A Meta
spokesperson said: “Sex trafficking is abhorrent … we cooperate with law
enforcement so they can find and prosecute the criminals who commit these
heinous acts, and we use technology to help keep this abuse off our platforms.”
“Our
goal is to prevent people who seek to exploit others from using our platform,
and we work closely with anti-trafficking experts and safety organisations
around the world to inform these efforts. We will continue to join with others
across society in the fight against sex trafficking and the predators who
engage in it.”
The lawyer whose sex trafficking case against Instagram could spell trouble for big tech. By Mei-Ling McNamara. The Guardian, May 10, 2023.