PSA: Anyone with a link can view your Granola notes by default

Every note created in the Granola app is, by default, viewable by anyone who obtains its generated link. Although the makers of this popular AI notepad promote it as "private by design," shared notes require no login: an outside party can see the content, the author's name, and the creation date. The platform also uses individual users' data to train its AI models unless they manually opt out in the settings.

The issue primarily affects people who use Granola to transcribe and summarize business meetings. Full audio recordings are not stored, but the AI generates detailed bullet points and quotes from them, and under the default settings these become public the moment a link leaks. Only Enterprise customers are automatically excluded from having their data used for model training.

For users everywhere, this is a clear signal that the convenience of automated summaries carries the risk of losing control over sensitive information. Using AI notepad tools safely today requires an immediate audit of privacy settings (Default link sharing) and deliberate management of data-processing consents. Privacy in the AI era is no longer the default; it is a feature we must switch on ourselves.
In a world dominated by online meetings, automatic note-taking tools have become a lifesaver for many professionals. However, recent reports about the Granola app, which positions itself as an intelligent AI notepad for people working "back-to-back," cast a shadow over corporate data privacy. The platform's default settings can expose sensitive content to the public, a fact many users are unaware of.
The problem concerns the fundamental sharing mechanism. Although Granola promotes itself with the slogan "private by default," the technical reality looks different. Every note generated by the system has a unique link which—according to factory settings—allows the content to be viewed by anyone who comes into possession of it. The lack of a login or authorization requirement means that accidentally pasting a link in the wrong place can result in an immediate leak of business details.
The Illusion of Privacy in the AI Cloud
The Granola operating mechanism is based on calendar integration and capturing audio from meetings. Subsequently, artificial intelligence processes this data, creating a clear bulleted list that the user can edit or ask the AI assistant about. However, tests have shown that opening a note link in a browser's incognito window provides access to the full summary content, information about the author, and the document's creation date. Worse still, although the full transcript is theoretically reserved for collaborators within the app, outsiders can gain insight into snippets of quotes assigned to specific points in the note.
Such an architecture raises serious concerns in the corporate sector. According to unofficial reports, at least one large company has barred its senior management from using Granola precisely because of these security gaps. The problem is not new: warnings about unindexed but publicly accessible links appeared on LinkedIn as early as last year, pointing out that an accidental URL leak is tantamount to making the data public.
Your Data as Fuel for AI Models
The issue of note visibility is just the tip of the iceberg. Another controversial aspect is the use of data for training purposes. Granola admits that it may use "anonymized data" to improve its AI models. While Enterprise customers are excluded from this process by default, users of all other plans automatically participate in algorithm training unless they manually find and uncheck the appropriate option in the settings.
The company is trying to reassure users, declaring that data is not passed to external giants such as OpenAI or Anthropic if this function remains enabled. The infrastructure is based on the Amazon Web Services (AWS) cloud located in the USA, and data is said to be encrypted both at rest and during transmission. It is worth noting, however, that while Granola does not store raw audio from meetings, transcripts and finished notes remain in the cloud—representing the most condensed and valuable knowledge about a company's projects.
How to Secure Your Meetings?
For those who wish to continue working with the tool, changing the default configuration is key. To do this, go to the Settings section via the profile icon, and then in the Default link sharing tab, change the status from "Anyone with the link" to "Only my company" or the most restrictive "Private." Only such action guarantees that the content will not be read by someone outside the organization.
- Link Verification: Check if previously generated notes are circulating online as public links.
- Disabling AI Training: In the settings menu, find the option "Use my data to improve models for everyone" and uncheck it.
- Folder Management: Full access to transcripts depends on folder permissions within the desktop application—it is worth regularly auditing the list of collaborators.
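The first item on the checklist, verifying whether old note links are still publicly readable, can be partly automated. Below is a minimal Python sketch using only the standard library; the note URLs are hypothetical placeholders, and the status-code interpretation is an assumption rather than documented Granola behavior. The idea mirrors the incognito-window test described earlier: a request sent without cookies or credentials that still returns content means the link is effectively public.

```python
import urllib.request
import urllib.error

def classify_status(status: int) -> str:
    """Map an HTTP status from an unauthenticated request to an access level.

    Assumption: a 200 means the note rendered without login; 401/403 mean
    authentication or permissions are enforced; 404 means the link was
    revoked or deleted. Anything else needs a manual look.
    """
    if status == 200:
        return "public"
    if status in (401, 403):
        return "private"
    if status == 404:
        return "gone"
    return "unknown"

def audit_link(url: str, timeout: float = 10.0) -> str:
    """Fetch `url` with no cookies or credentials, like an incognito window."""
    req = urllib.request.Request(url, headers={"User-Agent": "link-audit/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as err:
        # Error responses (401, 403, 404, ...) arrive here, not in the try body.
        return classify_status(err.code)
    except urllib.error.URLError:
        return "unreachable"

if __name__ == "__main__":
    # Replace with links collected from your own shared notes.
    note_links = [
        "https://example.com/hypothetical-granola-note-1",
        "https://example.com/hypothetical-granola-note-2",
    ]
    for url in note_links:
        print(f"{audit_link(url):<12} {url}")
```

Any URL the script reports as "public" should be switched to "Only my company" or "Private" in the app, since a 200 response to a credential-free request means anyone holding the link can read the summary.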
In an era of growing popularity for meeting assistants, the Granola case serves as a significant warning. The convenience of automation often goes hand-in-hand with aggressive default settings aimed at building a data-sharing ecosystem or accelerating AI model development at the expense of user privacy. The technology industry must understand that in the productivity-tools segment, "privacy" cannot be just a marketing slogan; it must stem from secure software architecture from the very first launch.