
Perplexity's "Incognito Mode" is a "sham," lawsuit says



Every query directed to the Perplexity search engine, even in Incognito mode, is sent, along with full chat history and personal data, directly to Google and Meta, according to a class-action lawsuit filed against the tech startup. The complaint calls the company's privacy promises "fiction" and says built-in advertising trackers, such as the Meta Pixel and Google Ads, function as digital wiretaps. The scale of the alleged violations is immense, because users often entrust AI systems with sensitive data, from financial and tax information to intimate health details. The lawsuit says Perplexity actively encourages users to upload medical documents, which are then shared with advertising giants along with email addresses and identifiers that allow personalization. Significantly, the issue affects both free users and paid subscribers.

For creators and professionals who use AI in their daily work, this is a clear warning: private modes in generative tools may not provide real protection. Prompts containing trade secrets or client data leaking to external advertising networks is becoming a real legal and reputational threat. In the era of the AI arms race, protecting intellectual property and private data now requires users to be far more skeptical of software providers' assurances.

Modern AI search technology
AI search engines promise a new quality of interaction, but serious privacy violations may be lurking behind the scenes.

The Illusion of Anonymity in Incognito Mode

For many users, **Incognito Mode** seemed to guarantee that their queries would not be linked to their identity or saved in their history. The lawsuit, however, claims the mode is a "sham" that does nothing to prevent conversations from being shared with **Meta** and **Google**. Even paid subscribers who activated the feature allegedly had their content transmitted along with email addresses and other identifiers that allow direct personalization. Analysis conducted with browser developer tools reportedly showed that **Perplexity** transmits not only the initial prompts but also the system-generated follow-up questions the user clicks on. For individuals without an account, the situation is reportedly even worse: their conversations are allegedly shared via a unique URL, giving third parties, including **Meta** and **Google**, full insight into the entire chat history.
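The kind of transmission such developer-tools analysis surfaces can be sketched in Python. A Meta Pixel fires a GET request to the well-known `facebook.com/tr` endpoint whose `dl` parameter carries the full page URL, so any query text embedded in that URL travels with it. The helper function, pixel ID, and search domain below are invented for illustration, not taken from the lawsuit:

```python
from urllib.parse import urlencode, parse_qs, urlparse

def build_pixel_url(pixel_id: str, page_url: str, event: str = "PageView") -> str:
    """Construct the kind of GET request a Meta Pixel fires.

    The endpoint (facebook.com/tr) and the `dl` (document location)
    parameter are standard pixel behavior; the pixel_id and page URL
    used here are hypothetical. Because the full page URL travels in
    `dl`, any query text embedded in that URL travels with it.
    """
    params = {"id": pixel_id, "ev": event, "dl": page_url}
    return "https://www.facebook.com/tr?" + urlencode(params)

# Hypothetical example: a search page whose URL embeds the user's query.
url = build_pixel_url(
    "123456789",
    "https://example-ai-search.test/search?q=liver+cancer+treatment",
)
qs = parse_qs(urlparse(url).query)
print(qs["dl"][0])  # the full page URL, query text included
```

This is exactly the sort of request that becomes visible in the Network tab of a browser's developer tools, which is how the analyses cited in the lawsuit were reportedly performed.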

The Mechanism of Silent Surveillance

Central to the case are hidden advertising trackers, which the lawsuit calls "browser wiretapping technology." Tools such as the **Facebook Meta Pixel**, **Google Ads**, and **Google DoubleClick** were identified in the **Perplexity** search engine's code. Furthermore, the company allegedly uses **Meta's Conversions API**, which the lawsuit characterizes as a workaround for users who deliberately block tracking via standard pixels.
  • Facebook Meta Pixel: tracks user interactions and sends conversion data to Meta's advertising systems.
  • Google Ads and DoubleClick: integrated systems that allow profiling users based on their queries in the AI search engine.
  • Conversions API: a Meta tool that sends data directly from the server, bypassing browser-level blocking.
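The server-side channel the last bullet describes can be sketched as follows. The payload shape follows Meta's documented server-events format (a `data` list of events with `event_name`, `user_data`, and `custom_data`, POSTed to `graph.facebook.com/<pixel_id>/events`), but the helper function and all field values are invented for illustration:

```python
import json
import time

def build_capi_payload(event_name: str, email_hash: str, custom: dict) -> dict:
    """Sketch of a Meta Conversions API server event.

    The structure mirrors Meta's documented server-events format;
    the values are hypothetical. Because this payload is sent
    server-to-server, browser-side tracker blockers never see it.
    """
    return {
        "data": [
            {
                "event_name": event_name,
                "event_time": int(time.time()),
                "action_source": "website",
                "user_data": {"em": [email_hash]},  # hashed email identifier
                "custom_data": custom,
            }
        ]
    }

payload = build_capi_payload(
    "Search", "deadbeef0123", {"search_string": "tax filing question"}
)
print(json.dumps(payload, indent=2))
```

The design point is what makes this channel controversial: since the event originates on the operator's server rather than in the user's browser, no extension, setting, or content blocker on the user's machine can intercept it.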
The scale of the problem is enormous because the **Perplexity** system is designed to encourage deep interaction. When a user asks about serious issues, e.g., "What is the best treatment for liver cancer?", the AI suggests uploading test results, scans, or treatment plans. If such data reaches **Google** and **Meta** systems, it can become the basis for serving advertisements that the lawsuit describes as "overwhelming, disturbing, and often even physically harmful."
Digital data and its protection
Private user data is transmitted using trackers that many were unaware existed.

Legal Consequences and Lack of Transparency

The class action lawsuit covers the period from **December 7, 2022**, to **February 4, 2026**. The plaintiffs seek not only damages, which could exceed **$5,000** per violation, but also an injunction to stop the practices and disgorgement of profits obtained through the unlawful use of data. Significantly, the lawsuit does not cover subscribers of the **Perplexity Pro** and **Perplexity Max** plans, as the plaintiff did not use those tiers and cannot represent their interests.

One of the most serious allegations concerns the deliberate concealment of the privacy policy. Unlike competitors such as **Google** or **Bing**, **Perplexity** does not place a link to its policy on the homepage. A user must actively search for it, and even then will not learn which specific trackers are in use. Instead, the company states only that it does not honor "Do Not Track" signals and warns that attempts to block tracking tools may degrade the quality of the service.

Giants Rebuff Allegations

The reaction of the accused companies is predictable, though it sheds light on systemic responsibility in the tech ecosystem. A **Google** spokesperson stated that companies using its tools are responsible for informing users about data collection. The spokesperson also noted that, by default, data sent to **Google Analytics** does not identify individuals, and that the company has strict rules against advertising based on sensitive information. Nevertheless, the evidence presented in the lawsuit suggests these barriers are porous. If AI becomes our confidant in financial, tax, or health matters, protection standards must go beyond generic declarations in the terms of service. The plaintiff, identified as **John Doe**, states in the lawsuit that had he known full transcripts of his conversations were being transmitted to **Meta** and **Google**, he would never have entrusted the platform with his family's financial data.

**Perplexity's** current predicament is a warning sign for the entire AI industry. The business model of free access in exchange for data, which dominated the Web 2.0 era, is becoming extremely risky in the age of artificial intelligence: users enter into far more intimate and detailed relationships with language models than with a traditional search engine. If "incognito mode" in AI tools turns out to be merely a marketing facade, it could trigger a lasting crisis of confidence that even the most advanced model updates will not fix. The verdict in this case will likely define new standards of transparency that the AI industry will no longer be able to ignore.
Source: Ars Technica
