London – The social media platform X, formerly known as Twitter and owned by Elon Musk, has committed to accelerating its response to flagged hate and terror content within the United Kingdom. These new pledges, announced Friday, follow persistent pressure from the UK's communications regulator, Ofcom. The core of the agreement centres on tighter deadlines for assessing reported illegal material and blocking access to accounts linked with proscribed terrorist organisations.
Under the new framework, X has promised to assess "at least 85 percent" of reported illegal hate and terror content within 48 hours. The platform will also aim to review suspected illegal terrorist and hate content within 24 hours on average. Beyond these speedier response times, X has agreed to block UK access to accounts identified as being operated by, or on behalf of, terrorist organisations proscribed in the UK.
Ofcom has also stipulated that X will engage with external experts to refine its systems for handling user reports of illegal hate and terror content. This move addresses concerns raised by civil society groups regarding the clarity and effectiveness of current reporting mechanisms. The platform is set to submit quarterly performance data to Ofcom for the next 12 months, providing a window into its adherence to these commitments.
While Ofcom's Online Safety Group Director, Oliver Griffiths, described the commitments as "a step forward," he underscored that "there’s a lot more to do." He indicated that the regulator is actively challenging platforms to tackle these issues with greater resolve.
Separately, Ofcom's investigation into X's artificial intelligence tool, Grok, concerning allegations of its use in generating sexualised imagery, remains ongoing.
Danny Stone, chief executive of the Antisemitism Policy Trust, offered a mixed reaction, welcoming the agreement as a "good start" while noting that X is still "failing in so many regards" when it comes to addressing overt racism on its platform. The heightened regulatory attention, and the commitments that followed, appear to have been partly spurred by a recent rise in hate crimes targeting the UK's Jewish community.
These developments arrive as social media platforms face increasing expectations to manage dangerous content, particularly in the wake of rising hate crimes. The pressure on X and other major online services to implement robust systems for combating illegal material has been building, with regulators seeking more tangible assurances of user protection.