Canvas · May 4, 2026 · 9 min read

The Smiling Dog in the Flames: When AI Startups Appropriate the Art That Describes Them


In Brief

What happened: KC Green, creator of the iconic "This is fine" meme, has accused AI startup Artisan of using his artwork without permission in a subway advertising campaign promoting their AI sales assistant "Ava."

Why it matters: The case crystallizes a deeper irony: an AI company appropriating art about denial and catastrophe to sell automation tools, while the artist weighs legal action over work he says was stolen "like AI steals."

What's next: Green is seeking legal representation, while Artisan claims to have scheduled a conversation with him. The incident arrives as questions about AI training data, artist compensation, and corporate appropriation of internet culture intensify across the industry.

For those tracking how creative labor, AI ethics, and corporate culture collide in real time, the conversations happening at Human x AI Europe on May 19 in Vienna couldn't be more timely.

The Scene

There's a particular kind of cultural vertigo that arrives when an image escapes its creator entirely. The smiling dog sitting in flames, saying "This is fine," has become one of the most durable visual metaphors of the past decade. It appears in group chats during crises, on protest signs, in corporate presentations about "resilience." The image has traveled so far from its origin that many people who use it couldn't name its creator.

His name is KC Green. And according to TechCrunch, he's now watching an AI startup use his work to sell automation software in subway stations, without his permission.

A photograph circulating on Bluesky shows an advertisement in what appears to be a subway station. The familiar dog sits surrounded by flames. But the speech bubble has been altered: instead of "This is fine," the dog now says "my pipeline is on fire." An overlaid message urges passersby to "Hire Ava the AI BDR."

Ava is the product of Artisan, a startup that builds AI-powered business development representatives (BDRs are salespeople who focus on outbound prospecting and lead qualification). The company has courted attention before with provocative advertising, including billboards urging businesses to "Stop hiring humans."

Green's response on Bluesky was direct: he's "been getting more folks telling me about this" and "it's not anything i agreed to." The art, he wrote, has "been stolen like AI steals." He encouraged followers to "please vandalize it if and when you see it."

The Irony Writes Itself

Sit with this for a moment. An AI company that has previously run advertisements telling businesses to stop hiring humans has now appropriated, without permission, artwork created by a human artist. The artwork in question depicts a character in denial about a catastrophe unfolding around them. The company is using this image to sell AI tools that automate human labor.

The layers here resist easy summary. This isn't just a copyright dispute. It's a cultural diagnostic.

The "This is fine" comic first appeared in Green's webcomic "Gunshow" in 2013. It has since become a kind of universal shorthand for willful denial in the face of obvious disaster. That the image would be appropriated by an AI company to sell automation tools feels less like coincidence and more like the culture commenting on itself.

When TechCrunch contacted Artisan about the advertisement, the company responded: "We have a lot of respect for KC Green and his work, and we're reaching out to him directly." A follow-up email indicated they had scheduled time to speak with him.

The Texture of Appropriation

What makes this case worth examining isn't just the legal question of whether Artisan had the right to use Green's work. It's the texture of how appropriation happens in the AI era.

Green told TechCrunch he will be "looking into representation, as I feel I have to." But he also expressed exhaustion at the prospect: it "takes the wind out of my sails" to take "time out of my life to try my hand at the American court system instead of putting that back into what I am passionate about, which is drawing comics and stories."

This is the asymmetry that defines so many disputes between individual creators and well-funded companies. The artist must divert energy from creation to litigation. The company can absorb legal costs as a line item. Even when creators win, they often lose time, momentum, and the particular kind of attention that makes creative work possible.

Green added: "These no-thought A.I. losers aren't untouchable and memes just don't come out of thin air."

That last phrase deserves attention. Memes don't come out of thin air. They emerge from specific creative acts by specific people. The "This is fine" dog exists because KC Green drew it, wrote it, published it. The fact that it subsequently became a meme doesn't erase that origin. It doesn't transform the work into public domain material available for commercial appropriation.

Precedent and Pattern

Green is far from the first artist to see meme-able work used in ways he finds objectionable. TechCrunch notes the case of Matt Furie, creator of Pepe the Frog, who sued Infowars for using his character in a poster. Furie and Infowars eventually settled.

These cases share a common structure: an artist creates something that achieves viral distribution, and that distribution is then treated as implicit permission for commercial use. The logic seems to be that if something is everywhere, it belongs to everyone. But this conflates cultural ubiquity with legal availability.

The AI industry has its own version of this logic. Training data scraped from the internet is often treated as fair game precisely because it's publicly accessible. Ongoing legal battles over whether AI companies can train on copyrighted material without permission remain unresolved. The New York Times has sued OpenAI. Getty Images has sued Stability AI. Thousands of authors have signed letters urging AI makers to stop using their books without consent.

What makes the Artisan case slightly different is that this isn't about training data. This appears to be straightforward commercial use of a recognizable artwork in advertising. The questions are more traditional: Did Artisan have permission? If not, what are the consequences?

The Broader Climate

This incident arrives at a moment when the AI industry's relationship with creative labor is under intense scrutiny. Recent TechCrunch coverage notes that AI-generated actors and scripts are now ineligible for Oscars. Spotify has introduced verified artist badges to help distinguish humans from AI. The cultural institutions are beginning to draw lines.

Meanwhile, startups like Bria are attempting to build image-generating AI trained strictly on licensed images, with revenue-sharing models for contributing artists. Whether these approaches can compete with models trained on scraped data remains an open question.

The Artisan case sits at the intersection of these tensions. A company selling AI automation tools has apparently appropriated human-created art to do so. The artist is now considering legal action. The company says it has "a lot of respect" for the artist's work.

Respect, in this context, would have meant asking permission first.

What Gets Naturalized

Pay attention to what's being naturalized here. The assumption that viral images are available for commercial use. The expectation that artists should be grateful for "exposure." The framing of legal action as an unfortunate distraction from creative work, rather than a necessary defense of creative rights.

These assumptions benefit companies and burden creators. They treat the internet's distribution mechanisms as a kind of laundering process, transforming owned work into ownerless material.

Green's response cuts through this: "memes just don't come out of thin air." Someone made this. Someone owns this. The fact that millions of people have shared it doesn't change that.

The smiling dog in the flames has always been about denial. About sitting in catastrophe and pretending everything is fine. That an AI company would appropriate this image to sell automation tools feels less like marketing and more like confession.

Frequently Asked Questions

Q: What is the "This is fine" meme and who created it?

A: The "This is fine" meme features an anthropomorphic dog sitting calmly in a burning room. It was created by artist KC Green and first appeared in his webcomic "Gunshow" in 2013.

Q: What did Artisan allegedly do with KC Green's artwork?

A: According to TechCrunch, Artisan used Green's "This is fine" artwork in a subway advertisement promoting their AI sales assistant "Ava," modifying the speech bubble to read "my pipeline is on fire" without obtaining the artist's permission.

Q: What is Artisan's AI product being advertised?

A: Artisan sells Ava, an AI-powered BDR (Business Development Representative), which is an automated sales tool designed to handle outbound prospecting and lead qualification tasks typically performed by human salespeople.

Q: What legal options does KC Green have?

A: Green told TechCrunch he is "looking into representation" for potential legal action. Similar cases, such as Matt Furie's lawsuit against Infowars over Pepe the Frog, have resulted in settlements.

Q: How has Artisan responded to the allegations?

A: Artisan told TechCrunch they "have a lot of respect for KC Green and his work" and claimed to have scheduled time to speak with him directly about the matter.

Q: Why does this case matter for the broader AI industry?

A: The incident highlights ongoing tensions between AI companies and creative labor, arriving as legal battles over AI training data and artist compensation intensify across the industry, with institutions like the Academy Awards now excluding AI-generated content.
