Why we built Level

Most performance teams stitch ad reports together with spreadsheets and prayer. Here's the cleaner path.

If you have ever owned the weekly performance report at an agency, you know the ritual. It is Monday morning. You open one browser window, then a second, then a third. You log into Google Ads, then into Meta Ads Manager, then into whatever the platform-of-the-week happens to be. You export a CSV from each, you paste them into a shared spreadsheet, you hand-align the columns because the platforms cannot agree on the spelling of "spend", and somewhere around the third coffee you start the pivot table. By the time the analyst has a defensible cross-channel rollup, half a Monday is gone — and the only thing the client will ever see is a single chart that took six hours to assemble.

We built Level because we kept finding ourselves on the receiving end of that ritual, and we ran out of patience for it. This post is about why the problem is harder than it looks, what we mean when we say "normalized", and what a typical Monday looks like once the spreadsheet is gone.

The Monday ritual, in detail

The thing nobody tells you about cross-platform reporting is that the actual reporting work — the part that gets the agency credit — is maybe twenty percent of the time. The other eighty percent is plumbing.

A typical performance team running, say, twenty clients across two ad platforms will do all of the following before they can even start thinking about narrative (a minimal sketch of the rename and FX passes follows the list):

  • Pull twenty raw exports per platform per week. That is forty CSV files for two platforms, every Monday, on a recurring calendar block that nobody on the team actually wants to own.
  • Reconcile the column names. Google Ads exports a column called Cost. Meta calls the same number Amount spent. If you forget the rename, your weekly spend total is silently wrong and nobody catches it until the client invoice arrives.
  • Reconcile the currencies. The Meta ad account billed in USD and the Google Ads account billed in EUR will sit in the same row of your spreadsheet unless somebody, at some point, picked an FX rate and wrote it down. Most teams pick last Friday's rate and quietly hope.
  • Reconcile the attribution windows. Meta defaults to 7-day click + 1-day view. Google defaults to data-driven, which is a moving target. If the client pays you on CPA, the column called "Conversions" in two different exports is not the same number, and nobody is calling that out in the deck.
  • Build the deck. Pivot table → screenshots → slide deck. Every cell is a place a manual mistake can hide.
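
To make that plumbing concrete, here is a minimal sketch of the rename and FX passes in Python with pandas. The file names, column names, and rate are hypothetical stand-ins rather than anyone's real exports; a real template repeats this for every client, every Monday.

    import pandas as pd

    # Hypothetical weekly exports; a real Monday has forty of these.
    google = pd.read_csv("google_ads_weekly.csv")  # spend column is "Cost", billed in EUR
    meta = pd.read_csv("meta_ads_weekly.csv")      # spend column is "Amount spent", billed in USD

    # The rename pass: skip this line and the rollup is silently wrong.
    meta = meta.rename(columns={"Amount spent": "Cost"})

    # The FX pass: one hand-picked rate for the whole week.
    EUR_PER_USD = 0.92  # "last Friday's rate", written down somewhere
    meta["Cost"] = meta["Cost"] * EUR_PER_USD

    combined = pd.concat([google, meta], ignore_index=True)
    print(combined["Cost"].sum())  # the one number the client will ever see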

How ad platforms make this harder than it looks

Performance reporting is hard not because the math is hard. The math is rounding. It is hard because the four big ad platforms, taken together, behave like four different databases that have never met each other, written by four teams that have never agreed on a single naming convention, served over four APIs that have each made their own opinionated choices about what "yesterday" means.

Three concrete examples that an analyst will hit within their first hour.

Naming

Google Ads emits a column called cost. Meta emits the same value as amount_spent. TikTok, when we ship that integration, will emit it as total_cost. None of these are wrong — they are just different. If you write a spreadsheet template that reads "Cost" from every export, you will silently drop your Meta spend, and nobody will tell you. There is no error message. There is just a lower number on Tuesday's deck and an awkward email on Wednesday.
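
A toy version of that silent failure, with made-up rows: the lookup never raises on the Meta rows, it just defaults to zero, and the total comes out plausible but low.

    # Hypothetical rows from two exports; spend lives under two different keys.
    rows = [
        {"platform": "google", "Cost": 1200.0},
        {"platform": "meta", "Amount spent": 800.0},
    ]

    # A template that reads "Cost" everywhere: no exception, no warning,
    # just a total that is quietly missing the Meta spend.
    total = sum(row.get("Cost", 0.0) for row in rows)
    print(total)  # 1200.0, not the 2000.0 actually spent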

Currency

Meta returns a currency field on the account level. Google Ads returns a currency_code field on the customer level. Both fields can lie if the account billing currency was changed mid-quarter — the platform will keep returning the new currency on rows that were originally invoiced in the old one. Most teams deal with this the way you deal with most data quality issues in a spreadsheet: they don't, and then quarterly variance reviews get weird.
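
A toy illustration of the hazard, with made-up rows and dates: stamping today's account-level currency onto historical rows mislabels everything invoiced before the switch, which is exactly what the spreadsheet shortcut does.

    # Hypothetical account: billed in EUR through May, switched to USD in June.
    account = {"id": "act_123", "currency": "USD"}  # what the API reports today

    spend_rows = [
        {"date": "2024-05-20", "spend": 500.0},  # actually invoiced in EUR
        {"date": "2024-06-10", "spend": 500.0},  # invoiced in USD
    ]

    # The spreadsheet shortcut: stamp today's currency on every row.
    for row in spend_rows:
        row["currency"] = account["currency"]  # the May row is now mislabeled

    # The safer shape is to record the currency per row at ingest time,
    # while it is still knowable, instead of re-deriving it from the account.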

Attribution

The phrase "conversions last week" means at least four things. It means "conversions attributed to clicks in the last seven days under the platform's default model." It means "conversions reported within the platform's own attribution window, which may extend up to ninety days after the click." It means "conversions counted by view-through, which is a thing on Meta and not really a thing on search." And it means "the model the platform happens to be running this week, which Google quietly changed in 2023 and 2024." Comparing the resulting columns in a single weekly chart is a small act of faith.
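
To see how one event log turns into several different "Conversions" columns, here is a toy counter over made-up events; the only thing that changes between the two printed numbers is the attribution rule.

    from datetime import date, timedelta

    # Toy event log: each conversion remembers the touch that preceded it.
    conversions = [
        {"touch": "click", "touch_date": date(2024, 6, 3), "conv_date": date(2024, 6, 5)},
        {"touch": "view",  "touch_date": date(2024, 6, 4), "conv_date": date(2024, 6, 4)},
        {"touch": "click", "touch_date": date(2024, 6, 1), "conv_date": date(2024, 6, 10)},
        {"touch": "click", "touch_date": date(2024, 5, 25), "conv_date": date(2024, 6, 12)},
    ]

    def count(events, click_days, view_days=None):
        """Count conversions landing inside the given attribution windows.

        view_days=None means view-through is not counted at all, which is
        roughly the search-side behavior described above.
        """
        n = 0
        for e in events:
            if e["touch"] == "view":
                if view_days is None:
                    continue
                window = view_days
            else:
                window = click_days
            if e["conv_date"] - e["touch_date"] <= timedelta(days=window):
                n += 1
        return n

    print(count(conversions, click_days=7, view_days=1))  # 2  (7-day click + 1-day view)
    print(count(conversions, click_days=30))              # 3  (30-day click only)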

What "normalized" actually means

Our position is straightforward: the messy reconciliation should happen exactly once, by a team that cares about getting it right, before the data ever touches a chart. That is what Level does at ingest time. Three concrete things change.

Naming becomes one schema. Every platform's cost, spend, amount_spent, or total_cost lands in a single field — same name, same units, same meaning — across every account in your workspace. The chart you build on top stops caring which platform a row came from.
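
As a sketch of what "one schema" means at ingest (the field names here are hypothetical; Level's internal schema is not public), every platform's spend key collapses to a single canonical one before anything downstream reads it:

    # Hypothetical per-platform field maps, per the names above.
    SPEND_FIELD = {
        "google_ads": "cost",
        "meta": "amount_spent",
        "tiktok": "total_cost",  # future integration
    }

    def normalize(platform: str, raw: dict) -> dict:
        """Map one raw platform row onto the canonical schema."""
        return {
            "platform": platform,
            "spend": float(raw[SPEND_FIELD[platform]]),  # one name, one meaning
        }

    print(normalize("google_ads", {"cost": "1200.00"}))
    print(normalize("meta", {"amount_spent": "800.00"}))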

Currency is preserved, not collapsed. Each ad account keeps its own billing currency through the pipeline. When you build a multi-currency report — say, a global agency rolling up EUR clients alongside USD clients — Level applies the FX rate at query time, against the date of the spend, using a pinned rate source you can audit. No mystery conversions hiding inside a CSV.
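
A sketch of query-time conversion under those rules, with made-up rates: each row converts against the rate for its own spend date, from a rate table you could pin and audit.

    from datetime import date

    # Made-up pinned rate table: (from, to, spend date) -> rate.
    FX = {
        ("USD", "EUR", date(2024, 6, 3)): 0.921,
        ("USD", "EUR", date(2024, 6, 4)): 0.924,
    }

    rows = [
        {"spend": 100.0, "currency": "USD", "date": date(2024, 6, 3)},
        {"spend": 250.0, "currency": "EUR", "date": date(2024, 6, 4)},
    ]

    def in_report_currency(row, report_currency="EUR"):
        if row["currency"] == report_currency:
            return row["spend"]
        # Rate keyed to the date of the spend, not "last Friday".
        return row["spend"] * FX[(row["currency"], report_currency, row["date"])]

    print(round(sum(in_report_currency(r) for r in rows), 2))  # 342.1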

Attribution becomes a workspace decision, not a platform default. You pick the attribution rule once, in your workspace settings — say, 7-day click for everything — and Level rewrites every conversion column in every chart against that rule, regardless of the platform's own defaults. If you want to change the rule next quarter, every historical chart updates with you.
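
One way to picture "a workspace decision" (a hypothetical sketch of the idea, not Level's actual storage): keep conversions under every window at ingest, and let the chart layer select by the workspace rule, so flipping the rule re-reads history instead of re-fetching it.

    # Hypothetical stored row: conversions kept under every window at ingest.
    row = {
        "date": "2024-06-03",
        "conversions": {"7d_click": 14, "1d_view": 3, "30d_click": 19},
    }

    WORKSPACE_RULE = "7d_click"  # set once in workspace settings

    def chart_conversions(row, rule=None):
        # Every chart reads through this one selector; change the
        # workspace rule and every historical chart re-reads with it.
        return row["conversions"][rule or WORKSPACE_RULE]

    print(chart_conversions(row))                    # 14
    print(chart_conversions(row, rule="30d_click"))  # 19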

A typical Monday, after Level

Here is what we wanted Monday to feel like, and here is what it actually looks like for the agencies running on Level today.

You open one tab. You sign into Level. You see the workspace for the brand or client you're reporting on this week, with every connected ad account listed on the left and a status indicator on each one showing the last hourly sync. Every account is green; the latest pull was within the hour.

You click into the Weekly summary report you saved last quarter. The numbers are already there — spend by channel, clicks, conversions, CPA, ROAS — across every account in this client's workspace, normalized to your chosen attribution window and the client's reporting currency. There is no CSV, no pivot table, no rename pass.

You spend the next twenty minutes doing the part of the job you were hired to do: looking at the chart, noticing what's interesting this week, and writing the narrative. The deliverable to the client is a shared link, read-only, no Level account required, with your agency's logo on it. If the client wants the same view as a recurring email, you click "Schedule" once and that's a permanent thing now. If the client wants a PDF for the quarterly board pack, you click "Export" and it's a PDF. The whole exercise is one screen.

What's next

We are shipping this small on purpose. The two integrations that are live on day one, Google Ads and Meta, cover the overwhelming majority of agency spend, and getting them right is more valuable than shipping four half-built connectors that all need babysitting. TikTok Ads and LinkedIn Ads are the next two integrations on the roadmap; if those are blockers for your team, tell us and we will share where they sit in the queue.

The product roadmap beyond integrations is roughly: deeper formula metrics in the report builder, native white-label client portals, scheduled exports to Slack and Google Drive, and a much richer alerts surface so the team can stop checking the dashboard manually for anomalies. Everything we ship sits on top of the same normalized event store described above, which means the work we did on day one keeps paying off.

For pricing and the detailed feature breakdown, see /pricing. For a deeper sense of how platforms differ on metric definitions, the GA4 measurement protocol reference is a good primary source. And if you have ever lost a Monday to CSV alignment — we built this for you.