Suicide accusations against Meta and Pinterest!

Today, social media has become one of the most important parts of daily life, so much so that children as young as 6 to 8 now use it. Although social media makes communication easier, it can have serious negative consequences, especially for users at an early age.

Meta and Pinterest hosted suicide-related content, court says

14-year-old Molly Russell took her own life in November 2017. The final hearing of the long-running court process was held on Friday, and the Meta and Pinterest platforms were at the center of it. The court concluded that her suicidal thoughts had been reinforced by her use of social media.

Russell, who started using social media after the age of 12, led what those close to her described as an ordinary childhood. However, the period leading up to her suicide at the age of 14 turned out to have a very different background, and her social media searches revealed it.

Two weeks after Russell’s death, her father found a Pinterest email titled “Depression Pins You May Like” in her junk folder and, on her Instagram account, dozens of disturbing pictures and quotes, including “Who loves a suicidal girl?”

Although the court did not find Meta or Pinterest directly culpable for the incident, it considered the situation dangerous enough that Elizabeth Lagone, Meta’s head of health and well-being policy, appeared in court and testified on the matter.


The family’s lawyer, Oliver Sanders, accused Meta, saying “Instagram is literally giving Molly ideas.” Pinterest’s head of community operations, Jud Hoffman, acknowledged that the platform was not safe when Russell used it, said he deeply regretted the material she saw, and apologized to the family. He added that the company has been working hard to remove harmful content since 2018.
