TikTok is accused of sharing graphic videos related to serial killers with its young users while encouraging them to buy guns and body armor, a recent investigation has suggested.
In an eye-opening report conducted by online publication RawStory, investigators created an account for a fictional 13-year-old to browse the contents of the video-sharing app, which provides an endless stream of user-uploaded videos tailored to a user's interests.
After only a few hours of surfing the social-networking site, however, the simulated teen was bombarded with disturbing content.
Within twelve hours of opening the account, RawStory's recommended content quickly devolved from innocent videos about law enforcement to clips promoting firearms, body armor, and rifle mounts that improve the accuracy of weapons when fired.
What's more, the Chinese-owned app even provided the fictitious teen with links to websites where such items are sold.
The app then suggested a slew of clips about serial killers to the imaginary adolescent, with one recommended account detailing the graphic murder of 14-year-old Konerak Sinthasomphone at the hands of notorious necrophiliac and sex offender Jeffrey Dahmer.
After the initial half-day period, the content generated by the app grew even more concerning, with suggested videos in which young users detailed their failed suicide attempts - one such video showing a young girl in what appeared to be a hospital after the fact.
'Protecting minors is vitally important,' a spokeswoman for TikTok urged users, critics and concerned family members last month, after a Wall Street Journal report revealed the social media app served drug and bondage videos to teenage accounts.
When confronted with the Journal's report, the spokesperson said that the paper's investigation 'in no way represents the behavior and viewing experience of a real person.'
'TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens,' she then added.
TikTok, owned by Beijing-based internet technology company ByteDance, is an increasingly popular platform that provides a stream of user-uploaded videos to viewers - who in large part are US teens.
When users click on certain clips, the app responds by recommending a constant stream of additional, similar videos, tailored by TikTok's algorithm, and designed to keep users on the app.
And while it offers largely innocuous content, such as trendy memes and silly dance fads, the site can quickly send users down dark, toxic rabbit holes if they show interest in certain videos, as the probes conducted by RawStory and the Journal show.
What's more, the publications' in-depth exposés also seem to show that with each slew of recommended clips, the content being shown to these theoretically underage users grew more and more extreme.
RawStory's simulated 13-year-old, for instance, initially lingered on videos of police, service members and hunting.
But within two hours of the account's creation, TikTok flooded the manufactured user with an assortment of off-color hunting videos, with one jokingly suggesting a hunter shoot a neighbor's dog and an Amish man.
Within three hours, TikTok recommended 'flexible' rifle armor - an advanced type of lightweight tactical gear used by SWAT members and servicemen to protect wearers from high-caliber bullets.
After five hours, TikTok suggested a video that prominently featured an iron-sight Fast Mount from the weapons gear company Unity Tactical. The mount is attached to a rifle to improve the wielder's aim and accuracy while shooting.
Unity Tactical's website adds that the mount is helpful 'especially while wearing tactical gear, night vision goggles, gas masks, helmets, and plate carriers.'
Both accounts promoting the tactical gear linked to websites where the items were sold.
A few hours later, at 10 pm - when most kids would be getting ready to hit the hay - TikTok offered up a not-so-healthy portion of videos about serial killers.
Upon clicking the profile, RawStory's now-likely restless adolescent came across graphic descriptions of murders committed by Dahmer, including the killing of Sinthasomphone - whom Dahmer molested, and who was found naked in the street by police, before Dahmer drilled a hole in his skull and filled it with acid while the boy was still alive.
The account also offered a detailed video study of another of Dahmer's victims, 18-year-old Steven Hicks, who was 'dismembered and disposed of… in the woods behind his [Dahmer's] parents' home.'
The details of the murder - his first - are likely not something a parent would want their 13-year-old to view.
Dahmer masturbated over Hicks' corpse after bludgeoning him to death with a ten-pound barbell, and dissected his body in his basement.
Then, weeks later, the killer unearthed Hicks' remains from his shallow grave behind Dahmer's parents' home, and pared the flesh from the bones.
He then dissolved the flesh in acid before flushing the solution down the toilet and crushing Hicks' bones with a sledgehammer into dust.
Notably, many videos appearing on the social media site seem to violate TikTok's own Community Guidelines.
TikTok ostensibly prohibits content that 'promotes, normalizes, or glorifies extreme violence or suffering' and the 'depiction, promotion, or trade of firearms, ammunition, firearm accessories, or explosive weapons.'
But RawStory's probe surfaced all four categories of these prohibited weapons products - with the app going so far as to shill the dangerous gear as well.
Moreover, the probe revealed that when young users on the networking site show interest in content depicting soldiers or toy weapons, TikTok hits unsuspecting users with videos of people firing real weapons, and links them to sites where they can purchase the guns themselves.
RawStory's 13-year-old user initially clicked on an assortment of military videos after the account was created, and within 12 hours was shown content advertising firearms accessories and body armor.
TikTok's owner offers a different, censored version of its app in China which has more restrictive rules.
The version provided to Americans is banned in China.
TikTok declined RawStory's request for comment on the results of the eyebrow-raising probe.
Meanwhile, TikTok is already facing heat from Congress over the social media giant's child safety practices - or lack thereof.
The company's head of public policy testified today in a Senate consumer protection subcommittee hearing on social media and child safety.
'The problem is clear: Big Tech preys on children and teens to make more money,' Sen. Edward Markey, D-Mass., said at the hearing in the nation's capital.
'Recent revelations about harm to kids online show that Big Tech is facing its Big Tobacco moment—a moment of reckoning,' subcommittee chairman Sen. Richard Blumenthal said last week in a statement.
'We need to understand the impact of popular platforms like Snapchat, TikTok, and YouTube on children and what companies can do better to keep them safe.'
This article has been adapted from its original source.
© Associated Newspapers Ltd.