Your Platform's Security Badge Might Be Decorative

A vibe-coding platform with a potentially fabricated ISO 27001 certificate just left every pre-November 2025 project — source code, database credentials, AI chat history — readable by any free account for 48 days. And counting.

Prologue

On Monday, a security researcher created a free account on Lovable, the AI app builder valued at $6.6 billion, and pulled the full source tree of an active admin panel belonging to a real Danish nonprofit. No exploitation required. Five API calls. The source contained hardcoded Supabase credentials — including the service role key, which bypasses every database security policy. From there, the researcher queried the live database and extracted real names, job titles, and LinkedIn profiles of professionals from Accenture Denmark and Copenhagen Business School.

This isn't a sophisticated attack. It's a Broken Object Level Authorization (BOLA) flaw — the number one risk in the OWASP API Security Top 10. Lovable's API checks whether you're logged in. It never checks whether the project you're requesting is yours.
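
The missing control fits in a single conditional. Here is a minimal sketch, in Python, of the difference between the check Lovable's API performs and the one it skipped. All names here (`Project`, `fetch_project`, `requester_id`) are hypothetical illustrations, not Lovable's actual code:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Project:
    id: str
    owner_id: str
    visibility: str  # "public" or "private"

def fetch_project(project: Project, requester_id: Optional[str]) -> Tuple[int, str]:
    """Handle a project fetch.

    Authentication asks "who are you?"; authorization asks "is this
    object yours?". A BOLA flaw stops after the first question.
    """
    if requester_id is None:
        # The only check a BOLA-vulnerable endpoint performs.
        return 401, "authentication required"
    # The object-level check that was missing for pre-existing projects:
    if project.visibility != "public" and project.owner_id != requester_id:
        return 403, "forbidden"
    return 200, f"source tree for project {project.id}"
```

A handler with only the first branch will happily serve any project to any logged-in user — which is exactly the behavior the researcher observed.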

The researcher reported this on March 3 via HackerOne. Lovable triaged it. They shipped ownership checks for new projects and left all existing projects open. A project created last week returns a 403 Forbidden response. A project from October 2025, still actively maintained, returns everything.

Forty-eight days later, the HackerOne report was still open. When the researcher filed a second report documenting additional affected endpoints, Lovable marked it as a duplicate and closed it.

The fix is behind a paywall

Lovable recommends setting your project to "private." But that's a paid-tier feature.

If you're on the free plan — the same plan the researcher used to access other people's projects — you cannot protect your own. The company's initial response characterized this as "intentional behavior" tied to how public projects have always worked. They later acknowledged the confusion and pointed out that HackerOne did not escalate the report internally.

Three different explanations in 24 hours. None of them involved taking responsibility.

The Delve connection

Here's where this story gets layered.

Lovable held an ISO 27001:2022 certificate. That certificate was issued through Gradient Certification, via Delve — a YC-backed compliance automation startup that was exposed earlier this year for systematically fabricating audit reports. 494 reports. 58 companies named. Lovable was reportedly Delve's highest-profile customer, the name dropped in virtually every sales call.

Now, I want to be precise: a fraudulent audit didn't write Lovable's broken API. Correlation is not causation. But it is signal.

A real ISO 27001 implementation — one with functioning access controls, vulnerability management processes, and a secure development lifecycle — would have caught a BOLA flaw before it reached production. These aren't exotic controls. They're Annex A basics: A.8.3 (information access restriction), A.8.8 (management of technical vulnerabilities), A.8.25 through A.8.28 (secure development lifecycle). If these controls had been genuinely implemented and tested by a legitimate auditor, we probably wouldn't be having this conversation.

I know what a genuine implementation looks like because I've done it. On my first certification engagement, the client's CEO told me something I've never forgotten: "We don't want to slap lipstick on a pig." That gave me the mandate I needed. We weren't building a paper trail. We were building security.

One of the first things I found was that their laptop fleet was managed by an external IT provider that hadn't applied any meaningful hardening. No baseline. No configuration management. Just devices in the wild. Before anything else moved forward, we implemented Implementation Group 1 (IG1) of the CIS Controls across the fleet. That's what a real implementation looks like — you pull the thread on every control and find out whether someone actually did the work.

Instead, with Lovable, we have a company whose compliance certification may have been stamped from a template before any evidence was reviewed, running a platform with the top vulnerability in the OWASP API Security Top 10, left unpatched for seven weeks after responsible disclosure.

These aren't two separate stories. They're the same organizational posture toward security, expressed in two different ways.

What this means if you're a founder

If your team used Lovable before November 2025, treat this as a breach. Rotate your Supabase service role keys first — those bypass row-level security entirely. Then rotate every database credential, every API key you pasted into a Lovable chat while debugging, and every third-party token in your source. Stripe, SendGrid, OpenAI, auth providers — all of it.

Open your project URL in an incognito window. If you can see code or chat history, so can everyone else.

But the larger point is one of due diligence. Not just with Lovable — with any platform that holds your source code, your credentials, and your customer data.

When a vendor shows you a compliance badge, don't stop there. Ask them who conducted the audit. Ask whether the certification body is accredited. Ask to see the scope — if it excludes the systems that process your data, the badge is decorative. Ask them to walk you through their controls, not their trust page. Anyone can build a trust page. Not everyone can explain what happens when a vulnerability is reported and how long it takes to fix.

Lovable's publicly documented answer to that last question is: at least 48 days, and only for new customers.

The real cost

There's a growing conversation about what some people are calling the "vibe coding tax" — the hidden costs of building fast on AI platforms without understanding what you're giving up in the process. I think that framing is useful, but incomplete.

The cost isn't that vibe coding is inherently insecure. Code is code, regardless of who or what wrote it. The cost is platform trust. When you build on a platform, you inherit that platform's security posture. Their access controls become your access controls. Their vulnerability management cadence becomes your exposure window. Their response to a security researcher becomes your response to your own customers.

If you wouldn't accept a 48-day response time from your own engineering team, you have no reason to accept it from the platform that holds your entire codebase.

The badge on the wall didn't protect anyone. The controls behind it would have. And in this case, it's not clear that those controls ever existed.
