Deepfake of principal's voice is the latest case of AI being used for harm

Author: Stellar Standpoint news portal | Source: health | 2024-04-30 13:00:38

The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deep-fake technology, experts say.

“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Here’s what to know about some of the latest uses of AI to cause harm:

AI HAS BECOME VERY ACCESSIBLE

Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon, as is the ability of such content to spread quickly on social media.

The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI, which can create hyper-realistic new images, videos and audio clips. The technology has become cheaper and easier to use in recent years, lowering the barrier for anyone with an internet connection.
