Molly, from Harrow, north-west London, is known to have viewed material linked to anxiety, depression, self-harm and suicide before ending her life in November 2017, prompting her family to campaign for better internet safety.
Previous hearings have heard how the 14-year-old had engaged with tens of thousands of social media posts in the six months before she died, including content which “raised concerns”.
The inquest into her death was delayed in March after thousands of pages of new evidence about her internet history were submitted.
Senior employees from social media giants Meta, the parent company of Instagram and Facebook, and Pinterest, are due to give evidence in person at the inquest.
Legislation is ‘vital’
Ms Donelan told Sky News that social media firms’ efforts to protect children had been “inconsistent” and that they had not always been held to account, which was why legislation was “vital” and would give the watchdog Ofcom powers to impose multi-billion pound fines of up to 10 per cent of global turnover.
“They do need to be doing more which is why we’re going to legislate to ensure that we do, so we have not only the legal powers that we do so then we can whack them with massive, massive fines which then is a big deterrent and ensures that they act in a responsible way from the off.”
Speaking on Times Radio, she added: “We’ve seen the devastating consequences of when that doesn’t happen and some of the especially social media companies are not held to account on this.”
She did not detail how the legislation will be amended to protect free speech. Senior Tory backbenchers have raised concerns that measures to tackle legal but harmful content could allow “woke” tech firms to remove controversial content.
“There is an element of the bill that is in relation to free speech for adults and we are looking at that and we will be changing [that] to make sure we get the balance right,” said Ms Donelan.