According to a study, YouTube's recommendations send violent gun videos to 9-year-olds

When social media researchers at a nonprofit sought to study the link between YouTube videos and gun violence, they set up accounts on the platform that mimicked the behavior of typical American boys.
The accounts simulated two nine-year-olds who liked video games, especially first-person shooter games. The only difference between them: one clicked on the videos YouTube recommended, and the other ignored them.
The account that clicked on YouTube's recommendations was soon flooded with graphic videos about school shootings, tactical gun training videos, and how-to guides for converting firearms to fully automatic weapons.
One video showed an elementary school-age girl firing a handgun; another showed a shooter firing rounds at a fake skull covered in lifelike blood and brains. Many of the videos violate YouTube's own policies against graphic or violent content.
According to the study, YouTube is failing to stem the flow of disturbing videos that could traumatize vulnerable children or send them down dark paths toward extremism and violence, despite its policies and content moderation efforts.
“One of the most popular pastimes for children is playing video games. ‘Call of Duty' can be played without ending up at a gun store, but YouTube is taking them there,” said Katie Paul, director of the Tech Transparency Project, the research group that published its findings about YouTube on Tuesday. “It's not the kids, it's not the video games. It's the algorithms.”
Over a single month, the account that followed YouTube's recommendations was served 382 different firearms-related videos, roughly 12 per day. The account that ignored YouTube's suggestions received only 34 videos about guns.
The researchers also set up accounts mimicking 14-year-old boys who liked video games; those accounts were likewise served recommendations dominated by guns and violence.
One of the videos recommended to the accounts was titled “How a Switch Works on a Glock (Educational Purposes Only).” YouTube eventually removed the video for violating its rules, but two weeks later a nearly identical video with a slightly different title appeared, and it remains available.
Messages seeking comment from YouTube were not immediately returned on Tuesday. The platform's executives have said that identifying and removing harmful content is a top priority, as is protecting its youngest users.
YouTube requires users under 17 to get a parent's permission before using the site, and accounts for users under 13 are linked to a parent's account.
Along with TikTok, the video-sharing platform is one of the most popular with children and teenagers. Both sites have been criticized in the past for hosting, and in some cases promoting, videos that encourage self-harm, eating disorders, and gun violence. Critics of social media have also pointed to the links between social media, extremism, and real-world violence.
Many recent mass shootings were carried out by people who used social media and video-streaming platforms to glorify violence or even livestream their attacks. In posts on YouTube, the perpetrator of the 2018 shooting at a Florida school that killed 17 people wrote, “I wanna kill people,” “I'm going to be a professional school shooter,” and “I have no problem shooting a girl in the chest.”
The neo-Nazi gunman who killed eight people earlier this month at a shopping center in the Dallas-Fort Worth area also had a YouTube account featuring videos about assembling firearms, the serial killer Jeffrey Dahmer, and a scene from a TV drama depicting a school shooting.
YouTube has already removed some of the videos the Tech Transparency Project researchers found, but in other cases the content remains available. Paul said the findings show that greater investment in content moderation is needed, noting that many big tech companies rely on automated systems to detect and remove content that violates their rules.
Without government oversight, social media companies can target young users with potentially harmful content, said Shelby Knox, campaign director of the parental advocacy group Parents Together.
Knox's organization has criticized platforms including YouTube, Instagram, and TikTok for making it easy for children and teenagers to find content about suicide, guns, violence, and drugs.
Responding to a report earlier this year that found TikTok was promoting harmful content to teenagers, Knox said that “Big Tech platforms like TikTok have repeatedly chosen their profits, their stockholders, and their companies over children's health, safety, and even lives.”
TikTok has defended the site and its policies, which prohibit users younger than 13. It also has rules against videos that encourage harmful behavior, and users who search for content on topics such as eating disorders are automatically shown a prompt offering mental health resources.
