A video of a 12-year-old girl's suicide which she recorded live on her mobile phone has been viewed by millions of people online, prompting calls for action to remove the footage.
As viewers watched her live online, Katelyn Nicole Davis filmed herself as she claimed that she had been a victim of sexual abuse by a family member. Then she took her own life.
The youngster used a livestreaming app which allowed her to broadcast herself using her mobile phone.
But following her death, the video of her suicide went viral after it was copied from the feed on Live.me, a social media app similar to Facebook Live, and duplicated.
The full 42-minute video of the girl preparing for and carrying out her suicide was available on YouTube and Facebook for several days but was removed when both companies were contacted by The Independent.
The YouTube video had received nearly 40,000 views before it was taken down.
But other sites continue to show it, and some profit from advertising each time it is viewed.
Officers from Polk County Police were only tipped off to Katelyn's suicide by another force thousands of miles away in California.
After arriving at her home in Cedarwood, Georgia they rushed her to hospital but she was pronounced dead on arrival.
After receiving hundreds of complaints from members of the public, some as far away as the UK, Polk County Police have urged people to take the video down or refrain from viewing it.
"There have been numerous videos and posts on various internet websites that are referencing this case," they said on Facebook. "We are making a specific request that anyone who has any knowledge, videos, or comments regarding this case, please keep this information off of the internet.
"Out of respect for the family of the departed and for the deceased themselves, we respectfully request that the citizenry of Polk County and whoever might view or receive this message please help us out with this request."
But the force has no legal powers to get the video taken down, although some websites have agreed to its requests to remove it.
"We want it down as much as anyone for the family and it may be harmful to other kids," Police Chief Kenny Dodd told Fox 5 News.
"We contacted some of the sites. They asked if they had to take it down, and by law they don't.
"But it's just the common decent thing to do in my opinion."
Charities have also joined calls for the videos' removal.
A spokeswoman for the NSPCC told The Independent: "These videos must be taken down immediately and we are pressing social media sites to get this done as soon as possible. In this instance warnings of graphic content do not go far enough.
"Every child should be safe to use the internet without seeing harmful content, and children who are contemplating self-harm or suicide should be directed towards support and help rather than graphic and distressing content.
"This video highlights the urgent need for the law to protect children from unsuitable and harmful content, including violence and self-harm, through removing or blocking content and online age verification measures.
"Social media companies have a further responsibility to strengthen their links with emergency services so that young people can report triggering content efficiently and with the knowledge that this will be acted upon."
The charity said it was campaigning for the Digital Economy Bill to set out minimum standards that all social media companies operating in the UK must follow.
Dr Marc Bush, Chief Policy Advisor at YoungMinds, told The Independent that the video was "truly shocking and disturbing" and would "understandably cause concern among young people and the wider public".
He said: "The legalities and accountability surrounding social media platforms enabling the sharing of disturbing and triggering content such as this can be complex. However, we believe that there is a moral, if not clear legal, responsibility on industry to weed out videos such as this and close them down before they can cause any further harm."
This is not the first time tech companies have been criticised for allowing inappropriate footage to be broadcast via their platforms.
In April last year a teenager was accused of livestreaming her friend's rape using Twitter's streaming app, Periscope.
Marina Lonina, 18, from Ohio pleaded not guilty to rape and kidnap but prosecutor Ron O'Brien said she had got "caught up in the likes" and was "giggling and laughing" in the footage.
A spokeswoman for Facebook said they were currently investigating the matter.
A spokeswoman for YouTube thanked The Independent for flagging the video and referred to the site's community guidelines, which "prohibit violent or gory content posted in a shocking, sensational or disrespectful manner".
The guidelines, which do not specifically refer to suicide, say if a video is "graphic" it is only permitted on the site when "supported by appropriate educational or documentary information".
The Independent also contacted Live.me for comment but it had not responded at the time of publication.