Legal Firestorm Erupts Over Unconsented Authorial Mimicry
Grammarly, the writing-assistant service now operating under the Superhuman umbrella, has abruptly pulled its "Expert Review" AI feature following a wave of criticism and a federal class-action lawsuit. The tool, which offered editing suggestions purportedly styled after renowned writers and academics, stands accused of systematically appropriating individuals' identities for commercial gain without their permission. CEO Shishir Mehrotra admitted the company "fell short" and said that "scrutiny improves our products," a statement that has done little to quell the growing unease surrounding the commercial exploitation of creative personas.

Journalist Julia Angwin, a prominent critic of privacy erosion, spearheaded the legal challenge, filing suit in New York. Angwin, representing herself and others, alleges that Grammarly and Superhuman engaged in the "misappropriation of the names and identities of hundreds of journalists, authors, writers, and editors" to generate profits. The case highlights a broader, unsettled debate in the tech industry over attribution, licensing, and the ethical boundaries of generative AI. The feature, a paid add-on beyond a free trial, cost users approximately $12 a month, underscoring the commercial imperative behind the controversial function.

Voices Silenced, Then Loudly Heard
The backlash was swift and vociferous. Writers and journalists, many of whom were unknowingly impersonated, expressed outrage. Wes Fenlon, a gaming journalist, described the tool's opt-out mechanism as "laughably inadequate," bordering on identity theft and profiting from unearned credibility. Kara Swisher, another prominent tech journalist, did not mince words, labeling the company "rapacious information and identity thieves" and asserting, "you suck." The controversy ignited conversations about the commodification of individual voices, with critics framing the incident not as a simple product misstep but as a crucial test case for how AI companies will navigate questions of identity and authorship.

Grammarly's retreat from the "Expert Review" feature underscores a larger trend: the limitations and ethical quandaries inherent in large language models when they move beyond abstract text generation to mimicking specific, real-world individuals. The company stated it plans to "reimagine the feature" with a focus on giving experts "real control over how they want to be represented, or not represented at all." This suggests a future iteration may incorporate explicit consent and control mechanisms, a stark contrast to the initial opt-out approach that sparked such fierce opposition.

Background: The Erosion of Authorship in the Age of AI
The "Expert Review" feature's existence, even briefly, points to a burgeoning market in which AI is increasingly used to replicate or simulate established authorial styles. Reports indicate the tool generated feedback in the vein of identifiable scholars, including the late historian David Abulafia. The practice raises significant ethical concerns, particularly within academic circles, and has prompted calls for stricter guidelines and transparency from AI developers.
The episode unfolds against a backdrop of increasing regulatory scrutiny, with frameworks such as the EU's AI regulations poised to impose compliance costs and necessitate feature redesigns. For working writers, the incident serves as a stark reminder that their voice is their equity, and it underscores the need for clear contractual clauses prohibiting the training of AI on their work, the simulation of their style, or the use of their name in AI features without express, written consent. Identifying oneself as a creator in this emerging landscape increasingly means navigating a delicate balance between leveraging new technologies and safeguarding one's intellectual property and personal identity.