Pump.fun, a meme coin creation platform that has been a hub for Solana token launches in 2024, faced a severe challenge this week when its newly launched video feature was exploited to upload child sexual abuse material (CSAM). The incident, confirmed by Pump.fun’s developers, has raised alarms about moderation and safeguarding on such platforms.
Exploitation of the New Video Feature
On Wednesday, Pump.fun enabled users to attach tokenized videos to their meme coin creations, which had previously been limited to images and GIFs. The new functionality spurred a wave of quirky, edgy meme coins, but at least one user exploited it to upload CSAM.
A user quickly flagged the video, prompting an immediate response from Pump.fun. Alon, the platform’s pseudonymous co-founder, confirmed that the content was removed “shortly” after being reported. “Any forms of illegal content, especially those affecting children, have never been and will never be welcome on Pump.fun,” Alon said.
The Fight Against Exploitative Content
Pump.fun is not the first platform to grapple with CSAM. The sharing of such material online is reportedly at an all-time high. Last year, Twitter suspended 12.4 million accounts for violating its policies on child sexual exploitation. Platforms like Snapchat and Twitter have become major arenas for such content, with Snapchat accounting for 44% of all flagged content reported to police in one UK study.
The NSPCC, a leading UK children’s charity, emphasized the impact that sharing CSAM has on victims. “The sharing of child sexual abuse material online is at record levels and has a devastating impact on young victims, putting them at risk of further abuse and trauma,” an NSPCC spokesperson said.
Legal and Moderation Challenges
Under Section 230 of the Communications Decency Act (CDA), online platforms are generally not liable for content posted by their users, but they can be held accountable if they fail to remove illegal content promptly once notified. Andrew Rossow, a digital media attorney, underscored the importance of proactive content moderation, suggesting that platforms like Pump.fun establish robust reporting mechanisms and clear removal policies.
Pump.fun already has a token report feature and a support Telegram chat to help users report illegal or harmful content. The Twitter user who initially reported the CSAM confirmed that the platform handled the situation professionally and took swift action to resolve the issue.
Ensuring Safety Moving Forward
Alon reiterated Pump.fun’s commitment to making the platform safe for users: “We will continue to take steps to make it more difficult for malicious actors to spread such content,” he said. The platform’s moderation team has successfully dealt with inappropriate content in the past, but this latest incident highlights the ongoing challenges that arise when providing open-access tools for content creation.
If the creators of the illegal tokens are identified, they could face serious criminal charges and civil liability, Rossow said.
Pump.fun’s swift response has drawn praise, but the incident is a stark reminder of the risks inherent in open online platforms and of the vigilant moderation required to combat the spread of harmful material.