AI Cloned a $15M App in 10 Mins

A new AI tool just replicated a venture-backed app from a few screenshots in under 10 minutes. This is how 'vibe coding' is making million-dollar software ideas dangerously easy to build.


The $15 Million Dare

Fifteen million dollars in venture funding buys a lot of mystique around a startup, especially one like Cluely, a polished voice‑note app that turns rambling audio into clean summaries. That price tag signals complexity: custom UI, real‑time transcription, AI summarization, user accounts, billing, and the invisible glue that keeps it all running on actual phones. For years, cloning that stack meant a small team, a few months, and a budget that did not fit in a YouTube description.

Riley Brown, a YouTube creator and founder of Vibecode, decided to treat that as a dare. In a video titled “Cloning 15 million dollar app in 10 min,” he sets a blunt goal: build a functional Cluely clone, from interface to database to subscriptions, in about the time it takes to microwave dinner. No hand‑coding, no Xcode, just prompts and screenshots.

The claim borders on trolling: replicate the perceived value of a $15 million product in “5–10 minutes” using Claude 4.5 Opus inside Vibecode. Brown feeds the AI four screenshots of Cluely’s UI, uploads the logo, and asks it to “build this app in its entirety” after searching for Cluely online. Minutes later, he taps through a generated app that looks uncannily like the original.

What makes the stunt more than UI cosplay is the scope of what he asks the system to build. Brown prompts Vibecode to wire up:

- Voice recording and playback
- Full transcript plus AI summary views
- User authentication and a database
- RevenueCat paywalls and subscriptions
- App Store deployment via his Apple developer account

The result is not a pixel‑perfect legal clone or a production‑hardened backend. But as Brown records notes, sees transcripts and summaries appear, hits a paywall after four entries, and flips a $5 subscription switch in RevenueCat, the point lands. When a solo creator can reassemble the visible shape and core flows of a funded app in a single take, the line between “complex software product” and “quickly reproducible pattern” starts to blur.

Meet Vibecode: The App That Builds Apps

Illustration: Meet Vibecode: The App That Builds Apps

Vibecode pitches itself as an app that builds apps, a kind of vibe coding environment for people who think in screenshots and sentences instead of syntax. Rather than a web IDE, it runs as a native mobile app, so the entire development loop lives on your phone. Its promise: describe an idea, tap a few buttons, and ship something that looks and behaves like a real product.

Positioned against no‑code tools like Bubble or Glide, Vibecode leans hard on large language models instead of visual flowcharts. The company markets it as “the first ever mobile app” that turns natural language into production‑ready iOS builds. Where traditional no‑code still requires logic diagrams and manual styling, Vibecode hands most of that to AI.

At the center of that stack sits Claude 4.5 Opus, Anthropic’s flagship model. Riley Brown runs “Claude Code with Opus 4.5” inside Vibecode, asking it to interpret four Cluely screenshots, search the web, and “build this app in its entirety.” Opus handles UI layout, navigation, data models, and even backend hooks from a single, dense prompt.

Vibecode wraps that model in a guided pipeline with modes like Plan and Build, but Brown mostly treats it as a conversational engineer. He uploads assets, tweaks prompts, and reruns generations until the clone feels close enough to the original. The AI interprets vague constraints like “no real‑time voice” or “show this paywall on the fifth note” and rewires the app accordingly.

User experience looks less like coding and more like directing. A non‑developer can:

- Describe the app’s purpose and core flows in plain English
- Upload reference images and logos
- Specify rules for storage, auth, and payments

From there, Vibecode compiles everything into a tappable prototype that runs locally on the phone. Brown records a note, watches a transcript and AI summary appear, then immediately layers in a database, authentication, and a RevenueCat paywall through follow‑up prompts. All of it happens from a handheld device, compressing what would normally be a multi‑tool, multi‑week stack into a single chat‑driven interface.

From Screenshot to Live UI in One Prompt

Riley Brown starts by handing Claude 4.5 Opus four screenshots of Cluely and a single sprawling instruction. He tells Claude to analyze every screen, search for “Cluely” on the internet for extra context, and “build this app in its entirety” inside Vibecode. That one prompt effectively becomes a spec document, UX brief, and engineering ticket rolled into a single sentence.

Claude responds by generating a working clone that looks uncannily close to the real thing. Brown pulls up a side‑by‑side: Cluely’s production app on one phone, the AI‑generated version on another. Colors, layout, typography, and the central Start Recording call‑to‑action all line up closely enough that you could mistake the clone for a slightly older build of the original.

Visual fidelity comes from Claude treating the screenshots as design ground truth. It copies the structure of the home screen, the note list, and the detail view that shows transcript alongside an AI‑generated summary. Even micro‑interactions like tapping a big circular record button and seeing a live “recording” state appear feel deliberately reconstructed rather than approximated.

Brown then tightens the mimicry by feeding Claude more assets. He uploads the official Cluely logo and a dedicated “recording” screen, tells the model that the UI should match this while audio is capturing, and asks it to integrate the branding everywhere. Within minutes, the clone swaps its generic styling for Cluely’s actual logo and a more polished recording state UI.

That iteration loop stays entirely visual and conversational. Brown does not touch a design tool or a single line of Swift or Kotlin; he just drags in images and refines the prompt. Each run of Claude inside Vibecode regenerates the interface so the logo, color palette, and state‑specific layouts converge on the reference app.

For anyone trying to understand how this scales beyond one YouTube stunt, Vibecode – AI Mobile App Builder lays out the pitch clearly. Upload screenshots, describe behavior in natural language, and let an AI model treat your references as a live Figma file and a product spec combined.

Weaving the Digital Brain: Adding Backend & Auth

Riley Brown’s experiment stops being a UI magic trick the moment he types a new prompt: wire up a backend. After playing with the Cluely clone’s front end, he asks Claude 4.5 Opus inside Vibecode to “store all of this in a database” and “have authentication so users can sign in.” No schema diagrams, no ORM boilerplate, just a natural‑language request that assumes a full stack will appear on demand.

Claude obliges. Vibecode spins for a moment, then reports it has “set up” the backend, quietly generating a data model for accounts, meetings, and notes, plus an auth flow. Brown does not touch an SDK, pick a cloud provider, or configure OAuth; the system infers everything from the earlier UI and copy.

That single prompt effectively replaces what many teams would call phase two of a product build. Instead of separate tickets for:

- User registration and login
- Session handling and permissions
- Database tables for users, meetings, and voice notes

Brown compresses all of it into one sentence and a progress spinner. Full‑stack stops being a skill set and becomes a checkbox.
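The video never shows the generated schema itself, but the prompt implies a data model where authenticated users own meetings, and each meeting carries a transcript plus an AI summary. A minimal, hypothetical sketch in TypeScript (the types, field names, and in-memory store here are illustrative, not Vibecode's actual output) might look like:

```typescript
// Illustrative sketch of the data model the prompt implies:
// users own meetings; each meeting holds a transcript and AI summary.
// All names here are invented for illustration.

interface User {
  id: string;
  displayName: string;
}

interface Meeting {
  id: string;
  userId: string;    // foreign key back to the authenticated user
  transcript: string;
  aiSummary: string;
  createdAt: number; // epoch millis
}

// Minimal in-memory store standing in for the generated backend.
const users: User[] = [];
const meetings: Meeting[] = [];

function signUp(displayName: string): User {
  const user = { id: `u${users.length + 1}`, displayName };
  users.push(user);
  return user;
}

function saveMeeting(userId: string, transcript: string, aiSummary: string): Meeting {
  const meeting = {
    id: `m${meetings.length + 1}`,
    userId,
    transcript,
    aiSummary,
    createdAt: Date.now(),
  };
  meetings.push(meeting);
  return meeting;
}

// Mirrors Brown's test: sign up as "Chris", record one note, and the
// backend view shows a meeting row tied back to that user.
const chris = signUp("Chris");
saveMeeting(chris.id, "hello, this is a test, my name is Riley", "Short test note.");
console.log(meetings.filter((m) => m.userId === chris.id).length); // 1
```

The "rows, IDs, timestamps" Brown later inspects in Vibecode's backend view are exactly this kind of structure, just persisted server-side instead of in memory.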

Verification matters more than vibes, so he immediately stress‑tests the claim. He signs up for an account in the generated app, typing a short intro — “hello, this is a test, my name is Riley” — and creates a profile where the display name reads “Chris.” The app accepts the signup without drama, behaving like any standard Firebase‑style flow.

Then he jumps into Vibecode’s backend view, the part traditional no‑code tools usually bury under dashboards and schema editors. Two user accounts appear, including the “Chris” test profile, along with a single “meeting” record corresponding to the voice‑note session he just ran. Rows, IDs, timestamps: the boring but essential proof that this is not a front‑end toy.

He records another short note, generating a transcript and AI summary again, and refreshes the data view. New records populate under meetings or notes, tied back to his authenticated user. What usually demands a backend engineer, a database migration, and a staging environment now arrives as a side effect of saying “store everything in the database” out loud.

The Monetization Switch: Payments With a Sentence

Illustration: The Monetization Switch: Payments With a Sentence

Monetization arrives as casually as a throwaway sentence. After standing up UI, backend, and auth, Riley Brown adds a business model by typing a single line into Vibecode’s Claude 4.5 Opus prompt: allow users a few free notes, then lock things behind a subscription. No pricing tables, SDK docs, or Xcode gymnastics—just business logic in plain English.

The rule itself sounds like product management shorthand, not code: “Allow each user to create four notes. And then if there are four notes in their database, please show a paywall instead of allowing them to record. So when they press record on their fifth note, it should show this paywall.” Vibecode parses that as a gating condition tied directly to the app’s database state.

Under the hood, that one instruction wires up a classic freemium funnel. Every account gets four free recordings that still run through the AI transcription and summary pipeline. On attempt number five, the record button stops behaving like a recorder and starts behaving like a revenue trigger, routing users to a paywall screen instead.
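The gating rule itself is a one-line condition. A hedged TypeScript sketch (function and type names here are invented; this is the logic the prompt describes, not Vibecode's generated code) shows how little "business logic" the paywall actually requires:

```typescript
// Hypothetical sketch of the freemium gate Brown's prompt describes:
// four free notes per user, then the record button routes to a paywall.

const FREE_NOTE_LIMIT = 4;

type RecordAction = "start_recording" | "show_paywall";

function decideRecordAction(noteCount: number, isSubscribed: boolean): RecordAction {
  // Subscribers always record; free users are gated once the database
  // already holds FREE_NOTE_LIMIT notes for their account.
  if (isSubscribed || noteCount < FREE_NOTE_LIMIT) {
    return "start_recording";
  }
  return "show_paywall";
}

// Notes 1-4 record normally; the fifth attempt hits the paywall.
console.log(decideRecordAction(3, false)); // "start_recording"
console.log(decideRecordAction(4, false)); // "show_paywall"
console.log(decideRecordAction(4, true));  // "start_recording"
```

The hard part was never this conditional; it was the billing plumbing behind the `show_paywall` branch, which is exactly what the RevenueCat integration absorbs.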

Rather than bolting on Stripe or wrestling with StoreKit, Vibecode leans on RevenueCat. Brown taps “finish setup,” and the platform automatically spins up a new RevenueCat project dedicated to this Cluely clone. No API keys pasted, no entitlements configured by hand, no platform‑specific billing code exposed.

The demo jumps straight from that click to a live paywall inside the app. When Brown hits record after burning through the four free notes, a subscription screen appears with a “Subscribe now” option and a “Test valid purchase” button. That test purchase runs through RevenueCat’s sandbox pipeline, mimicking a real App Store transaction.

Proof that this is not just a mocked‑up flow arrives in a browser tab. Brown logs into revenuecat.com, navigates to the auto‑generated project labeled “Clo,” and opens the dashboard. Sitting there: a single active subscription for $5, tied to the test user.

That $5 line item closes the loop. A prompt described the paywall rule, Vibecode wired the app to RevenueCat, and a test transaction surfaced in a third‑party billing backend without any manual configuration. In under a minute, a throwaway sentence became a working monetization switch for a supposedly $15 million product.

One-Click Deployment: AI to App Store

Riley Brown ends the demo by hitting the button that used to separate hobby projects from real products: publish. Inside Vibecode, the “publish to the app store” option sits behind three dots, a single tap that kicks off what used to be a days‑long gauntlet of Xcode archives, provisioning profiles, and metadata forms.

Instead of wrestling with certificates, Vibecode wires directly into the creator’s Apple Developer account. Brown taps through a short “next” flow, and Vibecode handles building the binary, signing it with the right credentials, and packaging the app for App Store Connect, all from his phone.

That connection matters because Apple’s pipeline remains notoriously brittle for newcomers. Vibecode abstracts away steps that normally demand:

- A paid $99/year developer membership
- Correct bundle IDs and signing identities
- Manual uploads through Xcode or Transporter

By folding deployment into the same chat‑like interface that handled UI, backend, auth, and RevenueCat payments, Vibecode compresses the entire lifecycle into an AI‑driven workflow. Idea, prototype, paywall, and App Store submission all happen in a single session, powered by Claude 4.5 Opus under the hood.

For Brown, that means the cloned Cluely‑style voice app moves from screenshots to a live build on its way to Apple’s review queue in under 10 minutes. For everyone else, it reframes “shipping an app” as just another prompt, not a separate engineering specialty.

This kind of one‑click deployment tilts the playing field for non‑developers. Students, solo creators, and small businesses can push native apps toward the App Store without touching Xcode or hiring a mobile engineer, as long as they can describe what they want.

Vibecode already pitches this from its own iOS listing, Vibecode – AI App Builder on the App Store, as “turn your app idea into reality in minutes.” Brown’s demo shows that promise now extends all the way to Apple’s front door.

What 'Vibe Coding' Means for Developers

Vibe coding describes a shift from typing syntax into an editor to describing outcomes to an AI collaborator that handles the boilerplate. Instead of specifying view hierarchies, API routes, and schema migrations, you say, “Clone this app from these screenshots, store everything in a database, add auth and a paywall,” and the system infers the rest. The “vibe” is the product’s feel and behavior, not the specific implementation details.

An emerging ecosystem is racing to own this layer. Vibecode targets mobile apps with Claude 4.5 Opus, promising native builds from prompts and assets. Google is pushing a similar idea with Vibe Code on top of Gemini, while tools like Anima already turn static website designs into working React or HTML/CSS, effectively vibe coding for the web.

These systems all share a pattern: they treat UI, backend, and deployment as parameters you describe rather than code you write. In Riley Brown’s demo, a few sentences handle:

- UI cloning from four Cluely screenshots
- Database and authentication setup
- RevenueCat paywall logic and App Store deployment

That used to mean days of React Native, Firebase, Stripe, and Xcode configuration.
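The "parameters you describe" framing can be made concrete as data. The spec shape below is entirely hypothetical (no tool in this article consumes this format), but it captures what the demo's prompts collectively encode:

```typescript
// Hypothetical app spec as data, illustrating "describe, don't code".
// Field names and the AppSpec shape are invented for illustration.

interface AppSpec {
  referenceScreenshots: number;
  features: string[];
  freeNoteLimit: number;
  paymentsProvider: string;
  deployTarget: string;
}

const cluelyCloneSpec: AppSpec = {
  referenceScreenshots: 4,
  features: ["voice recording", "transcript", "ai summary", "auth", "paywall"],
  freeNoteLimit: 4,
  paymentsProvider: "RevenueCat",
  deployTarget: "App Store",
};

// A vibe-coding builder's job is to expand a spec like this into
// screens, database tables, and billing hooks.
console.log(cluelyCloneSpec.features.length); // 5
```

Seen this way, Brown's prompts are just an informal serialization of a spec like this one, with the AI responsible for everything below that line of abstraction.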

Developers do not disappear in this world; their job description mutates. Instead of hand‑crafting every screen, they orchestrate AI agents, define architectures, and encode constraints in language. Good “vibe coders” will know how to specify data models, edge cases, and failure modes so the AI does not happily ship something insecure, unscalable, or illegal.

Prompt engineering becomes a subset of software design, not a party trick. You still need to reason about rate limits, offline behavior, PII handling, and multi‑tenant schemas, but now you express those concerns as precise instructions. The skill ceiling shifts from memorizing framework APIs to modeling systems and debugging AI‑generated behavior.

Traditional development does not end; it becomes the escape hatch. When vibe coding hits a hard problem—custom audio DSP, low‑latency multiplayer, hairy legacy integrations—someone still has to drop into Swift, Kotlin, or Rust. For many CRUD‑heavy products, though, vibe‑first workflows will be the default and manual coding the exception.

The Cloning Economy: When Your UI Isn't a Moat

Illustration: The Cloning Economy: When Your UI Isn't a Moat

Polished mobile UI just got demoted to a line item. When Riley Brown can point Claude 4.5 Opus at four Cluely screenshots and get a working clone in roughly 10 minutes, the frontend stops looking like a moat and starts looking like table stakes. Screens, gradients, and button microcopy now live in the same commodity bucket as stock icons and Tailwind templates.

Legal frameworks have not caught up to this reality. Copyright protects specific assets, not broad “look and feel,” and trade dress cases usually hinge on consumer confusion, not clone‑wars between startups. When an AI rebuilds your flows from public screenshots and marketing pages, you end up in a gray zone that sits somewhere between competitive analysis and industrial‑scale plagiarism.

Ethically, vibe coding makes copying feel trivial and deniable. A founder can say, “I just described an app like X” while their prompt history shows they uploaded a rival’s logo, color palette, and onboarding sequence. That ease changes the risk calculus for early‑stage teams who might copy first and lawyer up later.

So value shifts to what Claude cannot scrape from a landing page. Durable differentiation moves toward proprietary data, defensible communities, and deep integrations that require hard‑won relationships or infrastructure. A Cluely‑style app with exclusive enterprise datasets or custom speech models keeps an edge long after its UI pattern hits the AI commons.

Startups now need moats that look less like Dribbble shots and more like systems. That can mean:

- Privileged access to data or APIs
- Network effects and creator ecosystems
- Deep‑tech components like custom inference pipelines or on‑device models
- Brand trust and distribution partnerships

Speed to market effectively rounds to zero when a solo creator can spin up frontend, backend, auth, payments, and App Store deployment with a few prompts. The new playbook assumes every obvious UX pattern has a half‑life of days before an AI can reinstantiate it for anyone who asks. Surviving that environment means building advantages that cannot be cloned from four screenshots and a logo upload.

Reality Check: Hype vs. Production-Ready

Reality in a 10‑minute YouTube video moves at 2x speed. Riley Brown’s demo shows Claude 4.5 Opus inside Vibecode apparently cloning Cluely in a single take, but anyone who has shipped an app knows there were invisible reps: discarded prompts, miswired buttons, and model hallucinations cut on the editing floor.

AI app builders follow a brutal 80/20 rule. Vibecode and Claude get you maybe 80% of the way in 20% of the time: screens scaffolded, API calls sketched, auth wired, payments stubbed. The remaining 20%—the part investors and users actually feel—still demands slow, manual work.

That last mile includes:

- Scaling beyond a few hundred users
- Locking down auth, data access, and secrets
- Handling offline states, retries, and flaky networks
- QA across devices, OS versions, and locales

None of that shows up in a 10‑minute montage. The video never touches App Store Review edge cases, like microphone permission copy, data retention policies for voice recordings, or whether AI summaries count as “user-generated content” that needs reporting tools. A single rejection can cost more time than the entire “build” segment.

Beyond launch, real apps need an ecosystem. There is no scene where Brown sets up analytics, error tracking, or feature flags. No segment on customer support workflows, SLAs, or what happens when RevenueCat misfires and a paying user hits a broken paywall on their fifth note.

Maintenance turns quick wins into grind. Apple and Google ship OS updates yearly; SDKs deprecate; privacy rules shift. Someone has to keep the Claude prompts, Vibecode project, and third‑party integrations in sync with a moving platform target.

Framed correctly, Vibecode looks less like a developer replacement and more like the most aggressive prototyping engine the industry has seen. You can validate an idea, test pricing, or demo an investor‑ready build in a day instead of a quarter. For a deeper look at this shift, What is Vibe Coding? How To Vibe Your App to Life – Replit Blog sketches how prompt‑driven workflows are reshaping early‑stage product work.

Your $15M Idea Is Now a Weekend Project

Vibe coding lands like a psychological grenade in founder and VC circles. When a solo creator can clone a $15 million-funded app like Cluely in roughly a weekend, the story investors tell themselves about “defensible” software starts to wobble. Capital has to shift from funding basic CRUD plus auth to funding distribution, data moats, and genuinely hard problems.

Founders feel that shift first. If your pitch boils down to “mobile app with login, subscriptions, and AI summaries,” Riley Brown just showed that Claude 4.5 Opus and Vibecode can knock that out from screenshots and a paragraph of instructions. You are no longer raising to build; you are raising to differentiate.

For VCs, this forces a new filter. Check-writing for:

- Thin wrappers around foundation models
- Simple SaaS dashboards
- Single-feature mobile utilities

starts to look reckless when those products compress into a prompt and a weekend. Attention flows to proprietary data, deep integrations, and regulated workflows where "clone from screenshots" does not get you past compliance.

On the flip side, the solo creator just got superpowers. A single person with Vibecode, a Stripe or RevenueCat account, and an Anthropic API key can now ship a credible SaaS: native app, backend, auth, paywall, and App Store deploy. The barrier to testing a niche—say, a journaling app for firefighters or a coaching tool for violin teachers—drops to nearly zero dollars and a few late nights.

Fast-forward a few years and software creation starts to look like document editing. You describe your product in a spec-like prompt: user journeys, pricing tiers, data rules, integrations. An AI stack scaffolds not only the app, but also analytics, onboarding flows, and customer support macros. Entire microbusinesses spin up around hyper-specific workflows that never penciled out under traditional dev costs.

So the question is not whether AI can clone a $15 million app. The question is what you do when that fact is boring. Dust off the app idea in your notes, open a vibe coding tool, and see how far a single weekend—and a few ruthless prompts—can actually take you.

Frequently Asked Questions

What is Vibecode?

Vibecode is a mobile app builder that uses AI models like Anthropic's Claude to turn natural language prompts and images into functional native mobile applications, including backend, payments, and App Store deployment.

What is 'vibe coding'?

Vibe coding is a development style where a user describes the desired outcome in natural language, and an AI assistant generates, edits, or refactors the code to match that 'vibe' or intent, abstracting away the need for traditional coding.

Can this AI process build a production-ready app?

The demo shows it can generate a feature-complete app with a UI, backend, and payments incredibly fast. However, production readiness involves more complex factors like security, scalability, rigorous QA, and legal compliance, which are not fully addressed.

What AI model was used in the video?

The creator specifically used 'Claude Code with Opus 4.5', Anthropic's most powerful model at the time, integrated within the Vibecode.dev platform to build the app.

Tags

#Vibecode, #Claude, #AI Development, #No-Code, #App Cloning
