Getting Realtime Data out of GA4 with BigQuery

... Tomorrow, you're always a day away

In the bygone days of Universal Analytics (UA), data was updated within a few hours in Google's flawed, but familiar, web interface. By 2pm, you could typically see everything that had happened that morning. This was very handy, especially for testing and seeing the immediate impact of any website changes.

While Google Analytics 4 (GA4) brings huge benefits, it takes 24 hours or more for data to appear in the front-end GA4 web interface.

If all you're doing is testing, then there's the 'Realtime' report section, but you'd better work quickly: events only stay visible for 30 minutes before they are hidden away until the following day.

There is, however, a way to extract GA4 data that's only an hour or two old. The setup requires patience and some technical know-how, but the results are even better than what you could get out of UA.

 

A small query for BigQuery

BigQuery is Google's data warehouse on Google Cloud. With BigQuery, you can store data in all sorts of different formats in one place and analyze it later.

With the introduction of GA4, you can now automate exports directly to BigQuery. This is a hugely useful development, and something we use ourselves for marketing attribution. Better still, exporting and storing the GA4 data is free.

GA4 data can be exported to BigQuery throughout the day, so it's only a couple of hours behind. That's very useful once your test events have fallen outside the 30-minute "realtime" window.
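Once the export is running (the setup steps are below), you can check exactly how far behind it is by pulling the most recent event timestamp from the intraday table. A minimal sketch; the project, property ID, and date in the table name are placeholders you'd swap for your own:

-- Check how far behind the intraday export is running
SELECT
  TIMESTAMP_MICROS(MAX(event_timestamp)) AS latest_event
FROM
  -- Replace with your own table name
  `your-gcp-project.analytics_123456789.events_intraday_20240326`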

Diving into BigQuery may seem daunting, but you don’t need to fully master — or even fully understand — BigQuery to get useful data out.

The steps to extract GA4 data from BigQuery are:

1. Set up a Google Cloud Platform Account

You'll need a GCP account with a valid billing account. You won't be charged for the exports, and it's unlikely you'll use enough capacity to incur costs. Nonetheless, you'll need a payment method set up.

2. Link GA4 to BigQuery

This is pretty easy, but you do need to follow the instructions. Like me, you may prefer non-Google instructions.

3. Wait 24 hours

Data takes a full day to populate after setup.

4. Query the data

Now comes the fun bit: you can finally query the data. Push the “+” button to open a new query tab, paste in the query below, change the table name and date, then hit ‘Run’.

 

Screenshot: the example query pasted into the BigQuery console

 

For easy reference, we’ve included the plain text below. Replace the table name with your own: it's built from your GCP project, your analytics property ID, and the date (today's date if you're querying today's data). The example filters to just the ‘purchase’ event, but any event can be used; if you query something high-volume like “page_view”, expect the results to be very long. The same goes for the fields: only four appear in the query, but there are many others you can include.

-- Handy query to filter BigQuery to the events you want
SELECT
  event_name,
  event_params,
  user_properties,
  traffic_source
FROM
  -- Replace the example table name below
  `your-gcp-project.analytics_123456789.events_intraday_20240326`
WHERE
  -- Replace 'purchase' with whatever event you're looking for
  event_name IN ('purchase')
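The nested columns take a little more work. event_params, for instance, is an array of key/value records, so individual parameters have to be unnested before they're readable. A sketch of one common pattern; 'page_location' and 'ga_session_id' are standard GA4 parameters, though they won't necessarily be present on every event:

-- Pull individual parameters out of the event_params array
SELECT
  event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location,
  (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS ga_session_id
FROM
  -- Replace the example table name below
  `your-gcp-project.analytics_123456789.events_intraday_20240326`
WHERE
  event_name IN ('purchase')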
 

Export away

After running the query, you’ll see a preview table under ‘Query results’. You can ‘Save Results’ and look through the data in a text editor, Excel, Sheets, R, or Python.
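If the results are too large to save comfortably through the UI, BigQuery can also write query output straight to a Cloud Storage bucket with an EXPORT DATA statement. A minimal sketch, assuming you've already created a bucket (the 'your-bucket' name is a placeholder, and CSV export only works with the flat, non-nested columns):

-- Export query results to Cloud Storage as CSV
EXPORT DATA OPTIONS (
  uri = 'gs://your-bucket/ga4/purchases-*.csv',  -- placeholder bucket; the single '*' wildcard is required
  format = 'CSV',
  header = true,
  overwrite = true
) AS
SELECT
  event_name,
  event_timestamp,
  user_pseudo_id
FROM
  `your-gcp-project.analytics_123456789.events_intraday_20240326`
WHERE
  event_name IN ('purchase')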

The free tier of BigQuery is generous: there is no charge for the first terabyte of data queried each month.

What’s even better is that the exported data is unsampled, meaning you can see and analyze every event.