Sir Nick Clegg has welcomed calls for Facebook and other social media platforms to be more closely regulated by governments.
The former deputy prime minister, who joined the tech giant last October after losing his seat as a Lib Dem MP, told an audience in Berlin that online companies need more help to police what is featured on their sites - notably harmful content and political advertising.
In a wide-ranging speech that also addressed claims Russia used Facebook to influence the outcome of the Brexit vote three years ago, the 52-year-old said the social network plans to set up an independent oversight board, to which people can appeal against content decisions made by its own moderators. But he admitted the company needed help.
Sir Nick argued: "Whether the rules Facebook has created, and the systems it uses to enforce them, work, and whether they tread the right side of the line in terms of what is and isn't allowed, will of course be the subject of much debate, and rightly so. We will get some things right and some wrong.
"But it would be a much easier task, as well as a more democratically sound one, if some of the sensitive decisions we have to make were instead taken by people who are democratically accountable to the people at large, rather than by a private company.
"After all, why should a private company decide who is or isn't a legitimate participant in an election?"
Sir Nick - who also used his speech to argue against proposals to break Facebook up by splitting its main website from other apps it owns, such as Instagram and WhatsApp - added: "We need governments and policymakers to engage with issues like these and help set the parameters for us."
The speech in Germany came five months after Sir Nick - the vice president of global affairs and communications at Facebook - said his employer was entering a phase of "reform, responsibility and change", amid demands it "purge" its platforms of dangerous content such as that related to suicide and self-harm.
Back in January, the father of Molly Russell, who took her own life in November 2017 at the age of 14, accused social media platforms of playing a part in his daughter's death after she was found to have viewed material related to anxiety, self-harm and suicide on apps including Facebook-owned Instagram.
Health Secretary Matt Hancock said he had written to a number of internet companies to remind them of their duty to act, and Sir Nick acknowledged at the time that Facebook had to address the issue.
But a month after Instagram promised to remove such graphic images, a Sky News investigation found a number of disturbing videos and pictures still featured on the social media site.
Since then, the UK government has unveiled plans that could hold social media bosses personally liable when users come to harm as a result of content on their platforms.
Facebook boss Mark Zuckerberg has indicated that he would be happy for governments and regulators to play "a more active role" in policing the internet and the standards of big online companies - notably in helping to identify harmful content and political advertising.