‘A gut punch’: Character.AI criticised over ‘horrific’ Brianna Ghey and Molly Russell chatbots

The NSPCC is warning an AI company that allowed users to create chatbots imitating murdered teenager Brianna Ghey and her mother pursued “growth and profit at the expense of safety and decency”.

Character.AI, which last week was accused of “manipulating” a teenage boy into taking his own life, also allowed users to create chatbots imitating teenager Molly Russell.

Molly took her own life aged 14 in November 2017 after viewing posts related to suicide, depression and anxiety online.

The chatbots were discovered during an investigation by The Telegraph newspaper.

“This is yet another example of how manipulative and dangerous the online world can be for young people,” said Esther Ghey, the mother of Brianna Ghey, who called on those in power to “protect children” from “such a rapidly changing digital world”.

According to the report, a Character.AI bot using a slight misspelling of Molly’s name and her photo told users it was an “expert on the final years of Molly’s life”.

“It’s a gut punch to see Character.AI show a total lack of responsibility and it vividly underscores why stronger regulation of both AI and user generated platforms cannot come soon enough,” said Andy Burrows, who runs the Molly Rose Foundation, a charity set up by the teenager’s family and friends in the wake of her death.

The NSPCC has now called on the government to implement its “promised AI safety regulation” and ensure the “principles of safety by design and child protection are at its heart”.

“It is appalling that these horrific chatbots were able to be created and shows a clear failure by Character.AI to have basic moderation in place on its service,” said Richard Collard, associate head of child safety online policy at the charity.

Character.AI told Sky News the characters were user-created and removed as soon as the company was notified.

“Character.AI takes safety on our platform seriously and moderates Characters both proactively and in response to user reports,” said a company spokesperson.

“We have a dedicated Trust & Safety team that reviews reports and takes action in accordance with our policies.

“We also do proactive detection and moderation in a number of ways, including by using industry-standard blocklists and custom blocklists that we regularly expand. We are constantly evolving and refining our safety practices to help prioritise our community’s safety.”

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.

This post appeared first on sky.com
