GA4 includes a native export to BigQuery — Google's cloud data warehouse — that ships raw, unsampled event data directly to a queryable database you control. It's available to every GA4 property, costs nothing beyond standard cloud storage fees, and takes about 15 minutes to turn on. Most marketing teams have never touched it. That's leaving a significant amount of analytical capability on the table, and in many cases it's the difference between reports that answer real questions and reports that raise more questions than they resolve.
What Is GA4 BigQuery Export?
GA4 BigQuery export is a native integration that sends a copy of your raw event data from Google Analytics 4 to a BigQuery project in Google Cloud on a daily or streaming basis. Every event your GA4 property collects — page views, purchases, form submissions, custom events — lands in BigQuery as a row in a structured table, along with all its parameters, device data, geo data, and traffic source attribution.
This is not a summary export or an aggregated report. It's the same raw event-level data GA4 collects before it applies any sampling, thresholding, or aggregation for the interface. According to Google's developer documentation, the export includes every field the GA4 data collection layer captures — many of which aren't surfaced in standard GA4 reports at all.
Why GA4's Built-In Reports Hit Walls
GA4's interface is a powerful starting point, but it has three constraints that limit what you can do with the data as your property and your questions grow more sophisticated.
Sampling. GA4's standard reports are generally unsampled, but Explorations apply sampling once a query exceeds its event quota — roughly 10 million events per query for standard properties and 1 billion for Analytics 360. For high-traffic properties, or any exploration that spans a long date range, the numbers you're seeing may be estimates, not counts. BigQuery data is never sampled.
Thresholding. GA4 withholds rows from reports when the data could allow identification of individual users — a privacy protection feature that activates based on demographic and interest dimensions. The data exists; GA4 just won't show it to you in the interface. In BigQuery, you have access to the full dataset.
Data retention limits. GA4's retention settings apply specifically to Exploration reports — the custom, ad-hoc analyses you build in the Explore section. Standard reports (Acquisition, Engagement, Monetization, etc.) are not subject to the same retention window. But if Explorations are part of how your team does analysis, the 14-month cap is a real constraint: once that window closes, you can't query that event-level data in a custom exploration anymore. In BigQuery, you control retention — your historical data persists for as long as you choose to store it.
What BigQuery Export Unlocks
With raw GA4 data in BigQuery, the analytical capabilities expand substantially. Here's what becomes possible that isn't in the GA4 interface:
Custom attribution models. GA4's attribution is limited to its built-in models — last click, first click, data-driven. In BigQuery, you can build your own attribution logic using SQL: time-decay models with custom half-lives, position-based models weighted to your business, or entirely custom paths that credit specific channel combinations. If GA4's data-driven attribution doesn't match how your marketing team thinks about credit, BigQuery is where you fix that.
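As a sketch of how that looks in practice, the query below implements a simple position-based (40/20/40) model over `session_start` events. The weights, the date range, and the project/dataset placeholder are all illustrative, and the `collected_traffic_source` field assumes a newer export schema; adapt each of these to your own property and model.

```sql
-- Sketch: position-based (40/20/40) attribution across each user's
-- session-level traffic sources. Weights and placeholders are
-- illustrative; a production model would also restrict `touches`
-- to users who actually converted.
WITH touches AS (
  SELECT
    user_pseudo_id,
    COALESCE(collected_traffic_source.manual_source, '(direct)') AS source,
    ROW_NUMBER() OVER (PARTITION BY user_pseudo_id ORDER BY event_timestamp) AS touch_num,
    COUNT(*)     OVER (PARTITION BY user_pseudo_id) AS total_touches
  FROM `your-project.analytics_XXXXXXX.events_*`
  WHERE event_name = 'session_start'
    AND _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
)
SELECT
  source,
  ROUND(SUM(CASE
    WHEN total_touches = 1 THEN 1.0          -- single-touch path: full credit
    WHEN total_touches = 2 THEN 0.5          -- two touches: split evenly
    WHEN touch_num = 1 THEN 0.4              -- first touch
    WHEN touch_num = total_touches THEN 0.4  -- last touch
    ELSE 0.2 / (total_touches - 2)           -- middle touches share 20%
  END), 2) AS attributed_credit
FROM touches
GROUP BY source
ORDER BY attributed_credit DESC
```

Swapping the CASE expression is all it takes to move to a time-decay or fully custom weighting scheme.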
Multi-session user journeys. GA4's user explorer shows individual user paths, but it's cumbersome for population-level analysis. In BigQuery, you can query across all user pseudo IDs to find patterns: what sequence of touchpoints precedes a purchase, how many sessions a typical conversion path contains, and which content is visited by users who eventually convert versus those who don't. This analysis is nearly impossible in the GA4 interface and straightforward in SQL.
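A sketch of one such population-level question, assuming the standard export schema and a placeholder project/dataset: how many sessions do users have before their first purchase?

```sql
-- Sketch: distribution of session counts before each user's first purchase.
WITH first_purchase AS (
  SELECT user_pseudo_id, MIN(event_timestamp) AS purchase_ts
  FROM `your-project.analytics_XXXXXXX.events_*`
  WHERE event_name = 'purchase'
  GROUP BY user_pseudo_id
),
per_user AS (
  SELECT
    e.user_pseudo_id,
    COUNT(*) AS sessions_to_convert  -- session_start events up to the first purchase
  FROM `your-project.analytics_XXXXXXX.events_*` e
  JOIN first_purchase fp
    ON e.user_pseudo_id = fp.user_pseudo_id
  WHERE e.event_name = 'session_start'
    AND e.event_timestamp <= fp.purchase_ts
  GROUP BY e.user_pseudo_id
)
SELECT sessions_to_convert, COUNT(*) AS users
FROM per_user
GROUP BY sessions_to_convert
ORDER BY sessions_to_convert
```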
Joins with your own data. BigQuery is a general-purpose data warehouse. Your GA4 event data can be joined with CRM data, ad spend data, email engagement data, or any other dataset you bring in. The result is analysis that GA4 alone can never produce: cost-per-acquisition by traffic source using actual CRM revenue, email-to-purchase attribution across channels, or lifetime value segmented by acquisition campaign.
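A sketch of what such a join can look like. The `crm.customers` table, its columns, and the assumption that you send your CRM identifier to GA4 as `user_id` are all hypothetical; substitute your own dataset and join key.

```sql
-- Sketch: CRM revenue by GA4 first-touch source. The CRM table and
-- its columns are hypothetical; the join assumes your CRM ID is sent
-- to GA4 as user_id.
WITH ga_users AS (
  SELECT DISTINCT user_id, traffic_source.source AS first_source
  FROM `your-project.analytics_XXXXXXX.events_*`
  WHERE user_id IS NOT NULL
    AND _TABLE_SUFFIX BETWEEN '20240101' AND '20240331'
)
SELECT
  g.first_source,
  COUNT(*) AS customers,
  SUM(c.lifetime_revenue) AS revenue
FROM ga_users g
JOIN `your-project.crm.customers` c
  ON g.user_id = c.analytics_user_id
GROUP BY g.first_source
ORDER BY revenue DESC
```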
Unlimited historical retention. Export once, keep forever. Once your GA4 data is in BigQuery, it doesn't expire. Every year you delay turning on the export is a year of historical data you can never recover.
Who Should Use GA4 BigQuery Export
BigQuery export is not necessary for every property — and it's worth being clear about who benefits most. Enabling it without a plan for how to use the data adds infrastructure complexity without analytical payoff.
You should enable GA4 BigQuery export if any of the following apply:
- Your property collects more than 500,000 events per day and you've noticed sampling warnings in Explorations
- You need to join GA4 behavior data with CRM, ad spend, or other external data sources
- You need more than 14 months of historical event data for trend analysis, cohorts, or forecasting in a custom report format
- You want to build custom attribution models that go beyond GA4's built-in options
- Your stakeholders need complex, cross-channel reports that GA4's interface can't produce
- You're building Looker Studio dashboards that are hitting GA4 API quota limits
You probably don't need BigQuery export (yet) if you're a small or early-stage property under 100,000 daily events, your questions are fully answered by GA4's standard reports and Explorations, and you don't have a developer or analyst who will actively query the data. Turning it on is low-risk — storage costs are minimal at small scale — but the value comes from using the data, not just collecting it.
How to Set Up GA4 BigQuery Export
The full GA4 BigQuery export setup guide is in Google's documentation. Here's the practical overview:
Step 1: Create a Google Cloud Project
If you don't already have one, go to the Google Cloud Console and create a new project. This is the container for your BigQuery data. Name it something descriptive — your company name or website domain works well. Enable billing on the project (required for BigQuery, though costs are minimal under the free tier).
Step 2: Enable the BigQuery API
Inside your Cloud project, navigate to APIs & Services and enable the BigQuery API. This takes about 30 seconds and is required before you can link GA4.
Step 3: Link GA4 to BigQuery
In your GA4 property, go to Admin → Product Links → BigQuery Links. Click Link and select your Google Cloud project. You'll need Editor or higher access to the GA4 property and Owner access to the BigQuery project. If your Google Analytics account and Google Cloud project are under the same Google account, this is usually already satisfied.
Step 4: Configure the Export
Choose your export type. Daily export sends a full day's data once per day — sufficient for most use cases and the cheapest option. Streaming export sends events continuously with a lag of a few minutes — useful if you need near-real-time data in BigQuery, but it costs more due to BigQuery's streaming insert pricing. For most teams, daily export is the right starting point.
Select which events to export. You can export all events or filter to specific event types. Unless you have a specific reason to exclude events, export everything — filtering can always be applied in SQL later, and it's much harder to recover excluded data.
Step 5: Verify the Export
After 24–48 hours, your BigQuery project will contain a dataset named `analytics_<property ID>` (matching your GA4 property ID), with one date-sharded table per day in the format `events_YYYYMMDD`. Run a simple query to confirm data is flowing:
```sql
SELECT
  event_name,
  COUNT(*) AS event_count
FROM `your-project.analytics_XXXXXXX.events_*`
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY event_name
ORDER BY event_count DESC
```
If you see your expected events (page_view, session_start, purchase, etc.) with roughly the right counts, your export is working correctly. Minor discrepancies between BigQuery and the GA4 interface are normal — they typically stem from GA4's interface-level processing and thresholding.
Understanding the GA4 BigQuery Schema
GA4's BigQuery tables use a nested, repeated structure that's different from a flat relational database table. Each row is one event, and event parameters are stored as a nested array called event_params. This means standard SQL joins and filters require the UNNEST() function to access parameter values.
The key fields you'll use most often:
| Field | What it contains |
|---|---|
| `event_name` | The name of the event (`page_view`, `purchase`, `form_submit`, etc.) |
| `event_timestamp` | Unix microsecond timestamp of when the event fired |
| `user_pseudo_id` | Anonymized identifier for the user/device (persists across sessions) |
| `event_params` | Nested array of all event parameters (`page_location`, `ga_session_id`, custom params, etc.) |
| `user_properties` | Nested array of user-scoped properties set via your tagging or SDK |
| `traffic_source` | The user's first-touch attribution: source, medium, campaign name |
| `device` | Device category, operating system, browser |
| `geo` | Country, region, city |
The nested structure takes getting used to, but it's actually more flexible than a flat schema — every custom parameter you send with an event is available in event_params, so your measurement plan's custom dimensions are all queryable without needing to register them separately.
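For example, pulling `page_location` out of `event_params` uses the standard subquery-over-`UNNEST` idiom (project/dataset placeholders assumed, as in the verification query above):

```sql
-- Sketch: extract one parameter per event from the nested
-- event_params array via a subquery over UNNEST.
SELECT
  TIMESTAMP_MICROS(event_timestamp) AS event_time,
  (SELECT value.string_value
   FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_location
FROM `your-project.analytics_XXXXXXX.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = '20240115'
LIMIT 100
```

The same subquery pattern works for any parameter; switch to `value.int_value` or `value.double_value` for numeric parameters.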
Common GA4 BigQuery Export Mistakes
Waiting until you need it to turn it on. The export is not retroactive. BigQuery only receives data from the day you enable the link forward. If you turn it on after 18 months of GA4 data, those first 18 months are gone from BigQuery permanently. Enable it now — even if you have no immediate use case — so the data is there when you need it.
Exporting without a data governance plan. BigQuery gives you raw, user-level data. Depending on your data privacy obligations under GDPR or CCPA, this data may need to be handled differently than aggregated GA4 reports. Make sure your legal or compliance team is aware of the BigQuery export and that access controls are in place before sharing the dataset broadly.
Querying events_* without date partitioning. BigQuery charges for the data scanned per query. Running a query against the full events_* wildcard table on a large property can scan years of data and generate a significant bill. Always filter by _TABLE_SUFFIX (for wildcard queries) or event_date (for partitioned tables) to limit the scan to the date range you actually need.
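The difference in scan cost comes down to one WHERE clause (placeholders assumed):

```sql
-- Costly: no suffix filter, so BigQuery scans every daily table.
-- SELECT COUNT(*) FROM `your-project.analytics_XXXXXXX.events_*`

-- Better: restrict the wildcard scan to the dates you need.
SELECT event_date, COUNT(*) AS events
FROM `your-project.analytics_XXXXXXX.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
GROUP BY event_date
ORDER BY event_date
```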
Not connecting BigQuery to a reporting layer. BigQuery is a query engine, not a dashboard. If your stakeholders can't write SQL, the raw data stays unused. Connect BigQuery to Looker Studio, Tableau, or another BI tool so the output of your SQL queries becomes accessible to the people making decisions. The investment in turning on the export only pays off if people can actually consume the data.
Ignoring the GA4 implementation quality underneath. BigQuery amplifies whatever is in your GA4 property — good tracking and bad. If your GA4 implementation has missing events, inconsistent event names, or parameters that aren't populated correctly, those problems are just as present in BigQuery, and now you have more surface area to notice them. A clean measurement plan and a validated GA4 implementation are prerequisites for BigQuery analysis that produces reliable results.
Frequently Asked Questions
Is the GA4 BigQuery export free?
The export itself is free — there's no additional cost to link GA4 to BigQuery and send your event data. You pay only for what you store and query in BigQuery. Google Cloud's free tier covers the first 10 GB of storage and the first 1 TB of queries per month, which is enough for most small and mid-size properties to get started at no cost.
What data does GA4 export to BigQuery?
GA4 exports raw, event-level data to BigQuery — every event your property collects, with all associated parameters, user pseudo IDs, device data, geo data, and traffic source information. This is the unprocessed data before GA4 applies sampling, thresholding, or aggregation. Each day's data lands in its own date-sharded table (events_YYYYMMDD).
How is BigQuery data different from GA4's interface reports?
GA4 interface reports apply sampling for high-traffic properties and thresholding to protect user privacy. GA4's retention settings also limit Exploration reports to a 14-month event-level window — once that closes, you can't run custom explorations against that historical data. BigQuery data is unsampled, unthresholded, and retained for as long as you choose to store it. The raw event tables also include parameters that aren't surfaced in standard GA4 reports, giving you more dimensions to analyze.
Do I need to know SQL to use the BigQuery export?
You need basic SQL to query BigQuery directly. However, many teams use Looker Studio connected to BigQuery as their reporting layer — which requires no SQL. If your analyst or developer handles the query layer, stakeholders can consume the output in dashboards without writing a single line of SQL.
Is there a limit on how much data GA4 exports?
Standard GA4 properties have a daily BigQuery export limit of 1 million events. If your property consistently exceeds this limit, the daily export will be paused. Properties approaching this threshold should consider GA4 360 or evaluate whether all tracked events are necessary — this is another reason a well-scoped measurement plan matters before implementation.
Want to Actually Use Your GA4 Data?
From BigQuery setup to custom dashboards and attribution models — we build the analytics infrastructure that turns your GA4 data into decisions.
Get a Free Assessment