A folk musician became a target for AI fakes and a copyright troll

Photo: Musician Murphy Campbell in a black-and-white photo that appears to have been taken with an old-school large-format film camera.
Public domain songs, such as the 1870s classic "In the Pines," have become tools for digital scammers, with folk artist Murphy Campbell falling victim to the practice. Unknown perpetrators used recordings from her YouTube channel to create AI covers, which they then unlawfully published on streaming services under her name. The situation worsened when YouTube's algorithms accepted Content ID claims against her own performances of traditional songs, diverting advertising revenue to a so-called copyright troll.

Campbell's case exposes critical loopholes in the global music distribution system and in the verification of AI-generated content. Although the distributor Vydia ultimately withdrew the claims and blocked the scammer, the incident demonstrates how easily automated systems can be manipulated at the expense of independent creators. Spotify is currently testing a feature for manual song approval before tracks appear on an artist's profile; however, industry skepticism remains high.

For users and musicians worldwide, this is a wake-up call: in the AI era, protecting digital identity and intellectual property requires far more robust mechanisms than those currently employed by tech giants. This is no longer just a matter of piracy, but of the systemic takeover of artistic output by anonymous actors exploiting gaps in copyright law.
In January 2026, folk artist Murphy Campbell made a discovery that marked the beginning of one of the most bizarre disputes in the era of generative artificial intelligence. On her official Spotify profile, songs appeared that she had never uploaded. Although the compositions were based on her original recordings, the sound of the vocals raised immediate suspicion. As it turned out, unknown perpetrators downloaded her performances from YouTube, processed them using AI models to create so-called "AI covers," and then published them on streaming platforms under her own name.
An analysis of the track "Four Marys" by two independent AI detectors confirmed the artist's fears: both tools indicated a high probability that the track had been generated by algorithms. Campbell's case is not merely an incident of identity theft; it is a symptom of a deep dysfunction in copyright protection systems in an era where technology has outpaced both legislation and the verification mechanisms of major tech platforms.
Identity for sale and leaky verification systems
The process of removing illegal content proved to be an ordeal for Campbell. The artist admitted she had to become a "nuisance" to force the platforms to react. While the tracks disappeared from YouTube Music and Apple Music, the battle with Spotify was much more difficult. Even after intervention, at least one of the fake tracks remained available on the service, albeit under a new, alternative artist profile with the same name. In this way, multiple "Murphy Campbells" began to function within the streaming ecosystem, misleading listeners and diluting the true artist's personal brand.

Spotify currently declares it is testing a new solution that would allow creators to manually approve songs before they appear on their public profile. However, Campbell remains skeptical. Her experience with tech giants suggests that promises made to musicians rarely translate into real protection of their interests. The problem is so serious because the "checks and balances" mechanisms the artist believed in proved to be illusory in practice, allowing anyone to distribute counterfeit content almost instantly.
Absurd claims to the public domain
Just when it seemed the AI situation was under control, Campbell became the target of so-called copyright trolling. On the day a Rolling Stone article was published about her problems, a user appearing as Murphy Rider uploaded a series of videos to YouTube via the distributor Vydia. These videos, although unlisted, were used to file Content ID claims against Campbell's original materials. The artist received a notification that she must now share revenue with the alleged owner of the rights to the song "Darling Corey."
The situation borders on the absurd because the songs Campbell performs, including the classic "In the Pines" (widely known as "Where Did You Sleep Last Night" from Nirvana's rendition), are in the public domain. Some of them date back to the 1870s. Despite this, YouTube's automated system accepted the claims, treating "Murphy Rider" as entitled to collect royalties on work that belongs to the shared cultural heritage and is not subject to exclusive copyright.
- Murphy Campbell: Victim of AI voice cloning and unlawful financial claims.
- Vydia: A distributor that processed over 6,000,000 Content ID submissions, of which 0.02% were found to be erroneous.
- Public domain: Traditional folk songs that have become tools in the hands of copyright trolls.
- Content ID: YouTube's system that automatically redirects advertising profits to claiming entities.

Responsibility dispersed in digital chaos
A spokesperson for Vydia, Roy LaManna, reported that the claims against Campbell have been withdrawn and the account of the user responsible for the incident has been blocked. LaManna defends the system's effectiveness, calling an error margin of 0.02% "incredible" by industry standards. At the same time, the company firmly distances itself from any connection to Timeless IR or the individuals behind the earlier AI covers, calling the timing of the two attacks coincidental. The wave of criticism against Vydia nevertheless took drastic forms, including criminal threats that forced the company to evacuate its offices.
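A 0.02% error rate sounds vanishingly small, but applied to the volume Vydia itself cites, it still implies a meaningful number of misapplied claims. A quick back-of-the-envelope check (using only the figures quoted in this article; the resulting count is an estimate, not a number Vydia has published):

```python
# Rough estimate of erroneous claims implied by Vydia's own figures:
# 6,000,000 Content ID submissions at a 0.02% error rate.
submissions = 6_000_000
error_rate = 0.0002  # 0.02% expressed as a fraction

erroneous = submissions * error_rate
print(f"~{erroneous:,.0f} erroneous claims")  # → ~1,200 erroneous claims
```

Roughly 1,200 mistaken claims, each of which, as Campbell's case shows, can redirect a real artist's revenue until the affected creator notices and fights back.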
Murphy Campbell rightly observes that the blame cannot be attributed to a single entity. The problem lies in the architecture of the modern music market, where generative artificial intelligence, mass digital distribution, and complex copyright law create a toxic mix. Systems like Content ID, designed to protect major labels, become weapons in the hands of scammers, and independent artists are left alone to fight algorithms. "It goes much deeper than we think," Campbell concludes, pointing to the systemic failure of platforms that prioritize scale over reliable content verification.
The case of Murphy Campbell is a harbinger of a new era of conflict in the creative industry. If streaming platforms and distributors do not introduce radical changes in how they verify identity and intellectual property, authentic creators will be pushed out by "digital zombies"—AI-generated content that not only copies their style but also takes over their sources of income. The industry must move away from a reactive "take-down after the fact" model toward proactive protection; otherwise, the public domain and the work of independent musicians will become free prey for technological opportunists.