Analytics Checklist for AI-Built Apps

Track what users do

When you vibe code analytics with tools like Cursor, Lovable, Bolt, v0, or Claude Code, the generated code often works in development but misses critical production requirements. This checklist helps you catch what AI missed before you ship.

Danger Zone

moderate risk

Bad analytics don't just waste time — they convince you to make the wrong decisions with confidence

Adding analytics looks simple — drop in a tracking code, watch numbers go up. But meaningful analytics means knowing which events matter, tracking them consistently, filtering out bots and test traffic, respecting privacy laws across different countries, and making sure the numbers you see actually reflect what users are doing. Bad tracking is worse than no tracking because you'll make confident decisions based on garbage data.

Failure scenario

You launch and start tracking everything — page views, clicks, sign-ups. Three months in, you notice conversions are dropping. You redesign the whole onboarding flow based on your analytics. Conversions drop further. Turns out 40% of your "users" were actually bots, your "sign-up" event fired twice for real users, and your most engaged customers were filtered out because they use privacy tools that block your analytics.

Common mistakes

  • Tracking events inconsistently (sometimes fires, sometimes doesn't) so the data is unreliable
  • No way to filter out your own testing, bots, or spam traffic from real users
  • Tracking personally identifying information (names, emails) without proper consent or encryption
  • Events named so vaguely ("button_click") that you can't tell what they mean six months later
  • Cookie banners that don't actually stop tracking when someone says no

Time to break: Immediately — but you won't notice until you make a bad decision based on bad data

The checks below assume you're building on a managed analytics service.

Audit Prompts

Copy these into your AI coding assistant to check your implementation.

Are you tracking bot and test traffic as real users?
data
Look at our analytics data and setup. Is there a way to filter out our own team's testing? Are there obvious signs of bot traffic (same exact patterns, inhuman timing)? Do we have a test environment that sends data to a separate analytics project? Can we exclude traffic from specific IP addresses or development domains?

If 30% of your traffic is bots and dev testing, every decision you make is based on data that's 30% wrong. You might optimize for robots.
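A simple first line of defense is a client-side guard that decides whether an event should be sent at all. This is a minimal sketch, not a complete bot-detection system: the hostname list, the team-member flag, and the bot user-agent patterns are all illustrative assumptions you'd adapt to your own setup (most providers also offer server-side IP filtering on top of this).

```typescript
// Decide whether an analytics event should be sent at all.
// Hostnames, patterns, and the team-member flag are illustrative.

interface TrafficContext {
  hostname: string;      // e.g. window.location.hostname
  userAgent: string;     // e.g. navigator.userAgent
  isTeamMember: boolean; // e.g. a flag set after internal login
}

const DEV_HOSTNAMES = ["localhost", "127.0.0.1", "staging.example.com"];
const BOT_PATTERN = /bot|crawler|spider|headless|lighthouse/i;

function shouldTrack(ctx: TrafficContext): boolean {
  if (DEV_HOSTNAMES.includes(ctx.hostname)) return false; // dev/test traffic
  if (BOT_PATTERN.test(ctx.userAgent)) return false;      // obvious bots
  if (ctx.isTeamMember) return false;                     // your own team
  return true;
}
```

Filtering at send time keeps junk out of the dataset permanently; filtering in the dashboard afterward only hides it, and only if you remember to.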

Does your tracking respect privacy laws?
security
Check our analytics against privacy requirements. Are we asking for consent before tracking in Europe (GDPR)? Does our cookie banner actually stop analytics when someone says no, or just hide and keep tracking? Are we collecting any personally identifying info (emails, names, IP addresses)? Do we have a privacy policy that mentions our analytics provider?

GDPR fines can reach €20 million or 4% of global annual revenue, whichever is higher. Even small sites need consent for most analytics tools, and ignoring it isn't just illegal — it's alienating privacy-conscious users.
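The "banner that hides but keeps tracking" failure usually happens because the analytics script loads unconditionally and the banner only toggles a cookie. A sketch of the correct shape: hold events until the user decides, then send or discard. The `send` callback here is a hypothetical stand-in for your provider's capture call, not a real API.

```typescript
// Consent-gated tracking: queue events until the user opts in,
// drop them entirely if they opt out. `send` is a placeholder for
// your analytics provider's capture function.

type AnalyticsEvent = { name: string; props?: Record<string, unknown> };

class ConsentGate {
  private consent: "unknown" | "granted" | "denied" = "unknown";
  private queue: AnalyticsEvent[] = [];

  constructor(private send: (e: AnalyticsEvent) => void) {}

  track(e: AnalyticsEvent): void {
    if (this.consent === "granted") this.send(e);
    else if (this.consent === "unknown") this.queue.push(e); // hold, don't send
    // "denied": drop silently — saying no must actually stop tracking
  }

  grant(): void {
    this.consent = "granted";
    this.queue.forEach((e) => this.send(e));
    this.queue = [];
  }

  deny(): void {
    this.consent = "denied";
    this.queue = []; // discard anything queued before the choice
  }
}
```

Wire `grant()` and `deny()` to the banner's buttons; nothing leaves the browser until one of them runs.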

Will you understand your data six months from now?
data
Review how we've named and organized our analytics events. Are event names descriptive ("checkout_completed") or vague ("click", "action")? Do similar events follow a consistent naming pattern? Is there documentation explaining what each event means and when it fires? Are we tracking the context (which page, which plan, which feature)?

Three months from now when you're trying to understand why conversions dropped, cryptic event names like "btn_1" or "action_complete" will be useless. You'll be guessing what you were tracking.
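One way to enforce consistent naming is to define the full event catalog as a type, so a typo or an ad-hoc event fails at compile time instead of polluting your data. The event names and properties below are illustrative, and the `track` function just returns its payload so the shape is easy to inspect; in real code it would forward to your provider.

```typescript
// The event catalog is the single source of truth: every event name
// and its required context properties are declared in one place.
// Names and properties here are illustrative assumptions.

type AnalyticsEvents = {
  signup_completed: { plan: "free" | "pro" };
  checkout_completed: { plan: "free" | "pro"; amountCents: number };
  feature_enabled: { feature: string; page: string };
};

function track<K extends keyof AnalyticsEvents>(
  name: K,
  props: AnalyticsEvents[K],
): { name: K; props: AnalyticsEvents[K] } {
  // In real code: forward to your analytics provider here.
  return { name, props };
}

// track("btn_1", {})  would be rejected by the compiler (unknown event)
// track("checkout_completed", { plan: "pro", amountCents: 2900 })  compiles
```

The type doubles as documentation: the answer to "what does this event mean and what context does it carry?" lives in the code, not in someone's memory.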

Are critical events actually being tracked reliably?
reliability
Test our most important analytics events. Do they fire every time they should (not just sometimes)? Do they fire exactly once (not twice or three times on a single action)? If someone has an ad blocker or strict privacy settings, what still gets tracked? Are events sent even if someone closes the page immediately?

If your "purchase completed" event only fires 80% of the time, you'll think sales are dropping when they're not. Or you'll count the same sale twice and think business is better than it is.

Smart Move

Use a service

Basic analytics is easy to implement yourself, but the value comes from analysis tools, privacy compliance, reliable tracking infrastructure, and filtering out noise. Services handle the hard parts and let you focus on using the data instead of babysitting it.

PostHog

Product analytics with session replay (see exactly what users did) and feature flags — open source option available

1 million events per month free

Plausible

Privacy-first web analytics — no cookie banner needed, GDPR-compliant by default

Free for open source projects, $9/mo for small sites

Mixpanel

Event-based analytics designed for product teams — great for tracking feature usage and funnels

100,000 monthly tracked users free

Amplitude

Similar to Mixpanel with more advanced analysis tools — popular with growth teams

10 million events per month free

Tradeoffs

Free tiers cover most side projects, but you'll pay as you grow. Privacy-focused tools like Plausible give you less data but fewer legal headaches. Product tools like PostHog cost more but help you understand user behavior deeply.

Did you know?

Companies with strong product analytics are 2.5x more likely to exceed revenue targets — but 60% of tracked events are never looked at again, creating noise instead of insight.

Source: Product-Led Alliance 2023 State of Product Analytics Report
