Malaysian police are investigating the case of a teenager believed to have jumped to her death after asking her social media followers to vote on whether she should kill herself.
The 16-year-old girl, who was not named, had run a poll on photo-sharing app Instagram with the question "Really Important, Help Me Choose D/L", hours before jumping off the roof of a building in Sarawak, in eastern Malaysia, on Monday, district police chief Aidil Bolhassan told Reuters.
The 'D/L' stood for 'Death/Life', and the poll showed that 69% of the girl's followers chose 'D', he said.
"We are conducting a post-mortem to determine whether there were other factors in her death," he said, adding that the girl had a history of depression.
Instagram reviewed the teenager's account and found that the online poll, which ran over a 24-hour period, ended with 88% of votes for 'L', said Wong Ching Yee, Instagram's head of communications in the Asia-Pacific.
Aidil, however, said that the poll's numbers may have changed after news of the girl's death spread.
The case sparked concern among Malaysian lawmakers, who called for a wider probe.
Ramkarpal Singh, a lawyer and member of parliament, said that those who voted for the teenager to die could be guilty of abetting suicide.
"Would the girl still be alive today if the majority of netizens on her Instagram account discouraged her from taking her own life?" he said in a statement.
"Would she have heeded the advice of netizens to seek professional help had they done so?"
Youth and Sports Minister Syed Saddiq Syed Abdul Rahman also called for a probe, saying that rising suicide rates and mental health issues among young people needed to be taken seriously.
Under Malaysian law, anyone convicted of abetting the suicide of a minor could face the death penalty or up to 20 years' jail and a fine.
Instagram extended its sympathies to the teenager's family, and said the company had a responsibility to make its users feel safe and supported.
"As part of our own efforts, we urge everyone to use our reporting tools and to contact emergency services if they see any behavior that puts people's safety at risk," Wong said.
In February, Instagram banned graphic images and content related to self-harm from its platform, citing a need to keep vulnerable users safe.
The changes followed pressure from the parents of a British teenager who died by suicide in 2017; they believed that viewing Instagram accounts related to self-harm and depression had contributed to their daughter's death.