Adobe’s AI image generator can now be trained on your own art

Photo: An illustrated image representing Adobe's Firefly Custom Models public beta.
Adobe has released Firefly Custom Models in public beta: AI image generators that can mimic specific artistic styles and character designs. The tool lets creators and brands train a model on their own assets, ensuring visual consistency in generated images, from line weight to color palettes and character features. It is aimed at teams producing large volumes of content, eliminating the need to start from scratch on each project. Models remain private by default, meaning the images used to train them will not feed into Firefly's general models. Adobe frames custom models as a natural extension of its strategy to position Firefly as an ethical alternative to competing tools that may have been trained on protected works. However, the company does not disclose whether any mechanisms prevent training a model on someone else's work. Users must confirm they hold the appropriate rights, but there are no clear safeguards against copyright infringement, a key gap in Adobe's approach.
Adobe has just opened a public beta of a feature that fundamentally changes how creators can work with generative artificial intelligence. Firefly Custom Models is not another generic image creation app — it's a tool that learns directly from your portfolio, understands your aesthetic, and can replicate it at industrial scale. For the creative industry, where visual consistency and originality are currency, this is a change that could disrupt existing workflows in agencies, design studios, and among independent artists.
Before Adobe opened the public beta, Firefly Custom Models were tested by select users at Adobe Max last year. Now anyone can feed the model their own images and watch the AI learn their style, from brush thickness through color palettes to the details characteristic of their characters or photography. It sounds like a solution the creative side of the internet has been waiting years for, but it raises plenty of questions that Adobe doesn't want to discuss openly.
How Firefly model personalization works
The mechanics are surprisingly simple: you gather your best work, upload it to Adobe's platform, and the model begins analyzing what makes it distinctive. The algorithm doesn't copy images; instead, it extracts patterns: which colors dominate your work, how you construct compositions, which features recur in your characters, and what your lighting signature is. Adobe claims Custom Models can preserve brush thickness, color palettes, lighting, and character traits, essentially everything that constitutes an artist's DNA.
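Adobe has not published how Custom Models actually extract these patterns, so any concrete code here is purely illustrative. As a toy sketch of the simplest kind of "pattern extraction" mentioned above, which colors dominate a body of work, the following assumes images are already decoded into lists of (r, g, b) tuples and computes a coarse dominant palette:

```python
# Toy illustration only: this is NOT Adobe's method, just a minimal
# example of extracting one "style pattern" (dominant colors) from
# raw pixel data using only the standard library.
from collections import Counter

def quantize(pixel, step=64):
    """Snap an (r, g, b) pixel onto a coarse grid so near-identical
    shades are counted as the same color."""
    return tuple((channel // step) * step for channel in pixel)

def dominant_palette(pixels, top_n=3):
    """Return the top_n most frequent quantized colors in a list of
    (r, g, b) tuples, most common first."""
    counts = Counter(quantize(p) for p in pixels)
    return [color for color, _ in counts.most_common(top_n)]

# A fake "artwork": mostly teal pixels with some orange accents
# and a few near-white highlights.
artwork = (
    [(20, 150, 160)] * 80   # teal
    + [(240, 120, 40)] * 15  # orange
    + [(250, 250, 250)] * 5  # highlight
)

print(dominant_palette(artwork, top_n=2))
# -> [(0, 128, 128), (192, 64, 0)]
```

A real style model would learn far richer statistics (composition, line weight, lighting) from thousands of images, but the principle is the same: summarize recurring regularities rather than store the images themselves.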
This is an important distinction from general Firefly models or competing tools like Midjourney or DALL-E. When you ask a standard AI generator for a "cybernetic girl in cyberpunk style," you get an interpretation of cyberpunk averaged from billions of images across the internet. When you feed a Custom Model your own work and ask for a "cybernetic girl," you get a cybernetic girl that looks like your cybernetic girl. That is a fundamental difference.
In practice, this means a team working on an animated series can train a model on its character concepts, then generate visually consistent variants of a character for different scenes. An illustrator working on commission can train a model on their portfolio and generate variations for clients who want more of the same style. A product photographer can train a model on their photos and generate product mockups in their characteristic aesthetic.
Data security and model privacy
Adobe emphasizes that Custom Models are private by default — images you use to train the model don't feed into general Firefly models. This is important because it contrasts with what other companies have done (and continue to do). OpenAI, Midjourney, and other competitors have trained their models on artists' work without permission for years, leading to a series of class action lawsuits. Adobe, having experienced public backlash from artists, is carefully building an image as a company that respects creators' rights.
But here's the problem Adobe carefully avoids in its press communications: the platform has no built-in verification system to ensure the person training the model actually owns rights to the images used. Adobe requires confirmation that you have "necessary rights and permissions" and that your use of Custom Models won't violate "copyright, intellectual property, image, or privacy rights of others." This sounds reasonable, but it's weak protection. Nothing prevents someone from training a Custom Model on another artist's work, and Adobe — as it admitted in conversation with The Verge — currently has no mechanisms to prevent this.
The Verge's editorial team asked Adobe whether there are any technical measures in place to prevent training models on others' work without consent. The answer was evasive. This suggests Adobe relies on user honesty and potential complaints — a model deployed in most SaaS platforms, but insufficient in the context of art and intellectual property rights.
Workflows for creative teams
For creative agencies and large studios, this tool could be a game-changer. Imagine an advertising agency working on a campaign for a major brand. Historically, if the brand wanted 50 ad variants with the same visual character, the agency had to have graphic designers create each variant by hand or bring in freelancers. Now it can train a Custom Model on the approved artistic direction and generate the variants within hours.
Adobe says Custom Models help with large-scale production without losing what makes work unique. This is the key promise — because until now, when creative professionals experimented with general AI generators, the result was predictable: generic, colorless, devoid of personality. Custom Models are supposed to change that.
However, there's a dark side: if everyone in the industry starts using Custom Models, we might observe a paradox — everyone will generate images faster, but they'll all look more similar because everyone will rely on their model instead of manual creation. Artists who learn to work with AI might gain, but those who won't or can't might find themselves on the industry's periphery.
Ethics of training models on others' work
This is the elephant in the room Adobe doesn't want to discuss publicly. The history of AI in recent years is a series of scandals involving models being trained on artists' work without consent. Artists sued OpenAI, Midjourney, and Stability AI, claiming their work was used to train algorithms that could subsequently replicate their style. Adobe, having its own experience with lawsuits (Adobe Stock has always been a subject of copyright discussions), is trying to be more careful.
But Firefly Custom Models create a new scenario: instead of Adobe training a model on your work without consent, now you can train a model on someone else's work. This is technically more democratic, but ethically more chaotic. Artist A can train a Custom Model on Artist B's work, then sell images generated by that model as their own. Adobe can claim it's not their responsibility — that it's a problem between Artist A and Artist B — but the platform enables it.
Many artists familiar with AI's history in the industry will be suspicious of this tool. Some will see it as another step toward devaluing manual creative work. Others — particularly those interested in experimenting with technology — may embrace it enthusiastically. Polarization around this tool will likely be vocal.
Implications for Polish creators and the local market
In Poland, where the creative market is smaller than in the USA or Western Europe, but developing dynamically, Firefly Custom Models could have interesting consequences. On one hand, Polish advertising agencies and design studios could produce content for international clients faster. On the other hand, the tool could accelerate the trend where smaller agencies are displaced by larger ones that have resources to invest in training and optimizing Custom Models.
Polish freelancers — illustrators, character designers, product photographers — should be particularly aware of the risks. If your work is available online (on Behance, Dribbble, Artstation, or even Instagram), theoretically someone could download it and train a Custom Model on it. Adobe has no mechanism to prevent this. This means your style could be replicated by someone else, and you'll have no control over it.
Competition and Adobe's position in the AI market
Adobe has a clear strategy: position Firefly as an ethical alternative to competitors. While Midjourney and OpenAI are accused of training models on artists' work without consent, Adobe says: "Our model is trained on licensed and public domain content." This is PR, but there's also some truth to it — Adobe has access to vast libraries of licensed content through Adobe Stock and partnerships.
Firefly Custom Models are another step in this strategy: by giving artists control over their models, Adobe builds an image as a company that "listens" to artists. Meanwhile, competitors still struggle with lawsuits and negative PR. This is a smart business move, even if fundamental ethical issues remain unresolved.
However, Adobe must be careful. If Custom Models are widely used to train on others' work without consent — and nothing suggests they won't be — Adobe could find itself in a similar situation to OpenAI. Artists could sue not just Custom Models users, but Adobe itself for providing a tool that facilitates copyright infringement.
Practical applications and real potential
Social media content production is an obvious use case: a small business can train a model on its aesthetic and generate visually consistent content for Instagram or TikTok. Creating concepts for games or films is another: a team can experiment with character variants without hiring additional artists. Product photography and mockups are a third: an e-commerce shop can generate product photos in different settings without hiring a photographer.
But there are also limitations. Custom Models are only as good as the data they're trained on. If you train a model on 20 photos, the result will be poor. Adobe doesn't say how many images are ideal, but the industry standard for custom models typically runs from hundreds to several thousand. This means artists with small portfolios may struggle.
Additionally, Custom Models still have problems all AI generators have: they can produce strange artifacts, anatomically incorrect characters, or generate something only superficially similar to what you wanted. They're a tool to speed up work, not a replacement for human creative decisions.
The future of personalization in AI
Firefly Custom Models are a symptom of a larger trend: AI personalization is becoming the norm. In the coming years, we can expect every serious AI tool to offer the option to train on your own data. OpenAI is doing this with GPT, Anthropic is experimenting with Claude, and every AI startup wants to offer custom models.
The problem is that this personalization creates new ethical and legal challenges. Who is responsible when a Custom Model trained on others' work produces images that infringe copyright? The platform? The user? Both? The law has no clear answer, and tech companies are waiting for courts to decide.
Adobe, with the largest access to creative content in the world (through Adobe Stock, Creative Cloud, and partnerships), is in a unique position to shape how AI personalization will work in the future. If Adobe sets standards — such as requiring copyright verification or blocking training on protected work — others might follow. If not, it could be open season for infringing artists' rights through Custom Models.
Firefly Custom Models is a tool with enormous potential for creators who want to work faster and more efficiently. But it's also a tool that can be abused, and Adobe hasn't done enough to prevent that abuse. For artists and creators interested in this tool, the advice is simple: use it on your own work, but be aware that others may try to use it on your work without your consent. For Adobe, the challenge is to build artist trust while not limiting the tool's potential. It's a tightrope Adobe will have to walk with great caution.









