How is it even possible that Twitter can't keep child porn off of its site?

Adobe Stock photo from the search term “the Dark Web,” used to illustrate the shocking lack of safety against sexual predators that children face on Twitter.

Content moderators banned the hashtag #savethechildren, but allowed a known pedophile hashtag to remain

By Tatiana Prophet

The key to understanding Twitter is to realize that it’s a den of child exploitation, and that the company’s default position is that any explicit material entering its revenue stream does so unknowingly. But there is evidence going back 10 years that the company is, at a minimum, guilty of placing children at the bottom of its priority list.

As of December 2020, Twitter had received the lowest overall rating from the Canadian Centre for Child Protection® – lower than other platforms including Bing, Facebook, Pornhub, and XVideos – for its Child Sexual Abuse Material (CSAM) reporting structure.

As for public tips, Twitter has a huge delay in reporting to law enforcement.

Meanwhile, content moderators pounce on tweets that question the shifting “science” of the Covid vaccine. The hashtag #savethechildren has been banned because it is allegedly associated with the “QAnon conspiracy theory.”

Yet — unbelievably — the hashtag #megalinks has been used unfettered to attract pedophiles to each other and to CSAM content.

To take one prominent example, searching the hashtag #megalinks on Twitter brings up users commenting on CSAM, openly soliciting for and offering to trade CSAM, and encouraging communication by direct message where CSAM can be illegally distributed and exchanged.

A search of the hashtag #megalinks on Twitter also brings up promoted tweets, or advertisements, displayed intermixed with tweets that include the hashtag and overwhelmingly pertain to CSAM. One example showed advertising featuring content relating to children that Twitter placed in its own search results for the #megalinks hashtag, alongside a solicitation for CSAM that used “c*p” to mean “child porn.”

That right there should tell us all that something is amiss, especially given that the pre-Musk employee count swelled from 5,500 to 7,500 between 2020 and 2021, ostensibly to invest in audience growth. But shares fell in 2020 after the growth that did materialize did not meet expectations. So what in the world were they doing over there?

If that many employees cannot promptly remove videos that depict sex acts involving children, what else are they fumbling, especially on matters of life and death?

And for that matter, why can’t Twitter do both: provide accurate fact-checking and keep CSAM off its platform?

Consider one of the most high-profile recent cases of shocking inaction by Twitter: In 2017, two 13-year-old boys were lured via Snapchat into sending nude photos of themselves to predators posing as two girls from their school. The predators immediately turned to blackmail, threatening to share the original images with their parents, school and church congregation — unless the boys engaged in sexual acts with each other and filmed them. Two years later, according to a lawsuit on behalf of the boys, the content was republished in a compilation on Twitter.

In January 2020, according to the suit, both John Doe #1 and John Doe #2 (plaintiffs) became aware that they were recognized by classmates and that the video was being circulated widely throughout the school. John Doe #2 refused to attend school for several weeks. John Doe #1 became suicidal. On Jan. 21 he filed a complaint with Twitter, had to prove that he was a minor by uploading his driver’s license, and was assigned a case number. The content remained up, and Twitter took no action to permanently suspend (as is policy) the accounts that had shared the CSAM. John Doe #1’s mother made another complaint with the same result.

But the video was racking up views and Twitter still took no action. The lawsuit continues:

On January 28, 2020, John Doe #1 responded to Twitter’s message as follows: What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down. We have a case number with the [Law Enforcement Agency] for these videos and this incident. Please remove this video ASAP and any videos linked to this one. There is a problem with these videos and they are going against my legal rights and they are again at (sic) the law to be on the internet. (capitalized emphasis in original)

124. Twitter ignored John Doe #1’s final plea and the illegal videos depicting CSAM remained live on Twitter. Just two days after John Doe #1 first contacted Twitter, the videos had accrued over 167,000 views and 2,223 retweets.
— Forbes article

On Friday, Nov. 18, Forbes magazine published a piece on the persistent problem.

“Twitter has long struggled to police the tidal wave of child sexual abuse material in its so-called ‘public town square,’” wrote Alexandra S. Levine and Thomas Brewster. “Those shortcomings last year landed the company at the center of a significant child safety lawsuit which new CEO Elon Musk must soon wrangle. Yet his turbulent takeover of the platform has left a skeleton crew to deal with Twitter’s enormous problem with illegal content of kids—one that experts fear is about to get a whole lot worse.”

Notice how the language in their lead paragraph draws a correlation between the total number of employees and Twitter’s “enormous problem with illegal content of kids.” Yet Twitter’s massive injection of 2,000 employees between 2020 and 2021 was for “audience growth,” which failed to deliver the desired results.

In early November, Elon Musk made a deep cut to the Twitter workforce, instructing workers in an evening email not to return to the office in the morning and to await notice of their job status. Then last week, Musk issued his now-famous ultimatum that hours would be long, difficult and “hardcore,” whereupon many more employees quit.

On Monday Nov. 21, Musk announced that Twitter was done with layoffs for now and would start rehiring soon, according to The Verge.

The John Doe lawsuit points to immediate changes Musk could make to curb, and ultimately eradicate, the exploitation of children on Twitter.

“A recent report by the Canadian Centre for Child Protection® found that Twitter’s platform made it “extremely difficult” to report CSAM, specifically:

• Twitter does not allow users to report a tweet for CSAM “through the easily accessible report function”; one “must first locate the child sexual exploitation report form.”

• Twitter does not allow people to “report an image or video sent within a DM on Twitter as CSAM. The child sexual exploitation report form will not accept the URL from an image within a DM.”

• Twitter requires an email address for submitting CSAM reports.

• Even though tweets can be viewed without being logged in, Twitter requires a person to be logged in (and therefore have a Twitter account) in order to report CSAM.

• [As mentioned above]: Twitter received the lowest overall rating from the Canadian Centre for Child Protection® – lower than other platforms including Bing, Facebook, Pornhub, and XVideos – for its CSAM reporting structure.

Further reading: Amicus brief from NCMEC. Plaintiffs allege facts plausibly showing Twitter knowingly participated in a sextortion venture (page 36).
