Something Was Wrong XX
[0] Wondery Plus subscribers can listen to Something Was Wrong early and ad-free right now.
[1] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[2] Hi friends, it's Dr. Aidan Hirschfield, media psychologist and researcher from the Media Psyched podcast.
[3] I'm guest hosting this episode of Broken Cycle Media's new series of educational episodes called Data Points.
[4] These special episodes will include educational information, statistics, and support on different topics that are important to our community.
[5] Thank you so much for listening.
[6] Digital violence is an umbrella term to describe an array of harmful, digitally based behaviors that can lead to other forms of digital abuse or harassment.
[7] This includes, but is not limited to, the non-consensual distribution of intimate images, such as revenge porn; the publishing of private information about someone online with malicious intent, also known as doxing; cyberstalking or cyberbullying; impersonation, such as deepfakes; and even extortion, commonly known as blackmail.
[8] Other forms of digital abuse include grooming, a term used broadly to describe the tactics abusers deploy through the internet to sexually exploit minors.
[9] As the nonprofit organization Thorn puts it, grooming can happen quickly or over time, but at its core, it's a process of exploiting trust to shift expectations of what safe behavior is and leveraging fear and shame to keep a child silent.
[10] It is a difficult but important reality to face so that we can take steps to stop it.
[11] So although these forms of violence may differ in the ways they are inflicted, they often share common themes.
[12] Rates of cyber and digital abuse are substantial.
[13] Many of our listeners have likely been victims of digital violence in their lifetime or know someone currently affected by it.
[14] A study conducted by the Pew Research Center in 2021 found that two-thirds of adults under age 30 have been harassed online, with 18- to 29-year-olds most likely to experience the more severe forms of abuse.
[15] Pew Research has also found that nearly half of U.S. teens, ages 13 to 17, report experiencing at least one of six cyberbullying behaviors, including offensive name-calling, the spreading of false rumors about them, receiving explicit images they didn't ask for, online stalking, physical threats of violence, and having explicit photos of them shared without their consent.
[16] Studies have established that digital abuse in a relationship rarely happens in isolation.
[17] For instance, research funded by the U.S. Department of Justice found that 84% of cyber dating abuse victims also reported psychological dating abuse, 52% reported physical dating violence, and 33% reported sexual coercion.
[18] Digital violence has grown since the pandemic and now impacts most people who exist online.
[19] A 2021 research study conducted by Thorn found that nearly half of all kids online have been approached by someone who, they thought, was attempting to, quote, befriend and manipulate them.
[20] In 2023, the National Center for Missing and Exploited Children received more than 30 million reports of child pornography, including possession, manufacture, and distribution, and over 180,000 reports of online enticement of children for sexual acts.
[21] This violence is also frequently gender- and race-based and disproportionately affects marginalized groups.
[22] People who identify or present as women and LGBTQ+ people are particularly vulnerable to digital abuse.
[23] This includes LGBTQ+ adults, roughly 7 in 10 of whom have encountered some form of online harassment.
[24] Meanwhile, more than 50% of LGBTQ+ adults have been targeted as victims of more severe forms of digital abuse.
[25] When you add AI into the mix, things get even trickier.
[26] While the term AI does come from science fiction, its current real -world implications are far from fiction.
[27] In the tech industry, AI is typically referred to as generative AI.
[28] Think of generative AI as a highly skilled parrot.
[29] It can mimic complex patterns, but does not truly understand the content it creates.
[30] Generative AI operates by digesting large datasets and predicting what comes next.
[31] It cannot grasp complex human experiences or perform tasks it hasn't been specifically programmed to handle.
[32] Generative AI is not independently intelligent and lacks the ability both to think critically and to be creative.
[33] AI is now critically involved in how content spreads online and the information we use to make decisions.
[34] Think about social media algorithms or tools that create fake content or news and use automated online bots to spread disinformation and propaganda rampantly.
[35] The same technology can be misused to harass or intimidate people, possibly even perpetuating digital violence and abuse at scale.
[37] Deepfake technology has exploded in popularity.
[38] Deepfakes can be defined as synthetic media, typically videos or images, in which a person's likeness is digitally altered or entirely generated using artificial intelligence techniques, making it appear as though they are doing or saying something they did not.
[39] This technology can convincingly mimic a person's facial expressions, voice, and movements, often leading to realistic but fabricated content.
[40] The most comprehensive study was conducted by a company called Deeptrace in 2019, which found that 96% of deepfakes were pornographic.
[41] They also found that there are dedicated deepfake pornography websites that garner hundreds of millions of views.
[42] Combine this data with that on the pervasiveness of child pornography, and you have a recipe for digital child abuse at a massive scale.
[43] Looking ahead, the implications of AI for digital violence are significant.
[44] As AI gets more advanced, people will likely continue to find new ways to misuse it.
[45] For example, generative AI could lead to even more convincing deepfakes, making it harder to tell what's real.
[46] It also becomes easier for cybercriminals and abusers to disguise themselves online and cover their tracks.
[47] This raises the question of how we enforce legislation at both a micro and a macro level.
[48] Plus, with automated harassment becoming more common, like using online bots, this issue will only grow.
[49] As AI becomes a more significant part of social media and communication platforms, developers and policymakers must prioritize user safety and ethical considerations.
[50] It's important that legislation is enacted and action is taken to address these evolving issues.
[51] Relatedly, most victims in the U.S. are critical of how social media companies address online harassment.
[52] However, there is still much debate about what should be done to combat digital violence and abuse online, and the policies used to hold individuals accountable vary drastically across sites.
[53] The U.S. government has taken several measures to address digital abuse, violence, cyberbullying, and harassment through legislation, law enforcement initiatives, educational programs, and partnerships with various organizations.
[54] For instance, the Children's Online Privacy Protection Act, or COPPA, enacted in 1998, imposes specific requirements on operators of websites or online services directed to children under 13 years of age.
[55] There have been many bills introduced in 2023 that relate to online exploitation and abuse of children or to child safety more broadly.
[56] One of these critical pieces of legislation, the Revising Existing Procedures On Reporting via Technology Act, or the REPORT Act, makes statutory changes related to the reporting of crimes involving the online sexual exploitation of children, increases penalties for non-compliant providers, and limits liability for vendors and for self-reporting.
[57] On a global scale, the Digital Services Act from the European Union took effect in 2022.
[58] Its main goal is to prevent illegal and harmful activities online and the spread of disinformation.
[59] The law also ensures user safety and holds online platforms accountable for harmful content.
[60] While some of these laws are beginning to address AI, the real challenge is creating legislation flexible enough to deal with these fast-changing technologies and the rate at which abusive content can proliferate.
[61] As of now, 28 states have enacted AI-related legislation, and 18 states adopted AI-related resolutions or enacted legislation in 2023, according to the National Conference of State Legislatures.
[62] Currently, there are many states with pending legislation designed to protect citizens from the harm of AI.
[63] However, there are no comprehensive federal regulations or legislation in place as of now.
[64] Still, research funded by the U.S. Department of Justice found that most victims of digital abuse don't seek help or make a report.
[65] We also know that survivors face victim blaming and shaming due to societal rape culture, internalized misogyny, racism, transphobia, and homophobia.
[66] This is why few victims of digital violence abuse or bullying speak up or seek help.
[67] This ultimately means that social media and online platforms are rarely held accountable for allowing this abuse on their sites, and, in effect, many social media platforms are making a profit off of it happening.
[68] What else can we do to help victims of digital violence and abuse?
[69] Here are some strategies that could help.
[70] First, raising your and your family's education and awareness about digital abuse and the misuse of AI will empower you to recognize and address these issues.
[71] Consider some digital literacy programs that teach everyone, not just children, how to critically navigate digital technologies.
[72] Also consider petitioning tech companies to design safer platforms, with features such as better reporting systems, AI content moderation to catch abuse faster, and more user-friendly privacy settings.
[73] Finally, consider reporting digital violence, abuse, or cyberbullying on social media platforms or websites to the companies that own them and to the National Center for Missing and Exploited Children, as well as reporting child sexual abuse material and other cybercrime directly to the FBI using the Internet Crime Complaint Center, or IC3, website.
[74] You can also get involved in legislation by contacting your legislator to ask what they are doing.
[75] You can also attend hearings, testify for bills, or ask local news outlets to cover the issue.
[76] I want to mention again that the internet, like AI, is a tool.
[77] It can be wielded as a weapon or an incredibly powerful ally in the war against digital abuse.
[78] When we use the internet to connect victims, share resources, and hold abusers accountable, we can combat digital violence at scale, too.
[79] Just as advancements in deepfakes come out, so do new technologies aimed at identifying and counteracting them.
[80] Ultimately, the more we talk about digital violence and abuse, the less able it is to operate in a protected vacuum.
[81] If you believe you or a loved one has been a victim of digital violence, or if you're looking for more resources or support regarding digital abuse, you may find the following organizations helpful.
[82] The National Domestic Violence Hotline has resources for victims and their allies, including a free 24/7 hotline, live chat, and texting support.
[83] They also have informative research on digital abuse and harassment, as well as guides on internet safety.
[84] The nonprofit Right To Be has extensive resources on its website, including educational trainings and a self-care guide for those who experience online harassment.
[85] The nonprofit organization Thorn has resources for parents and a community for young individuals to find the resources necessary to navigate sexual exploration and risky encounters in the digital era.
[86] The National Center for Missing and Exploited Children's website has many resources for both adults and children, including a service called Take It Down, which helps people remove online nude, partially nude, or sexually explicit photos and videos taken before they were 18.
[87] For more information about the nonprofit organizations mentioned prior, please visit the episode notes. For a more comprehensive list of organizations that are working to help eradicate digital violence or abuse, please visit somethingwaswrong.com/resources. Many of the amazing groups listed on the website are only able to exist because of the community's help and support from people like you. If you would like to find out more information about volunteer opportunities, please feel free to visit the resources page as well and reach out directly to the organization of your choice.
[88] My name is Dr. Aidan Hirschfield.
[89] My pronouns are he/him or they/them.
[90] And you can find me on Instagram at Dr. Aidan Hirschfield, and my podcast, Media Psyched, on Spotify or Apple.
[91] You can also follow or connect with me on LinkedIn.
[92] Thank you so much for listening and learning with us.
[93] If you like Something Was Wrong, you can listen early and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.
[94] Prime members can listen ad -free on Amazon Music.
[95] Before you go, tell us about yourself by filling out a short survey at wondery.com/survey.