BigQuery + GA4: Exit Pages Report

Remember the exit pages report from Universal Analytics?

It was a default report found under the Behavior heading, in Site Content, where this table was displayed.

It was a report of the main exit pages of our website; it sits on the other side of the entry pages (landing pages) report.

As its name indicated, it showed the main URLs through which users left our website.

(Screenshot: UA - Exit pages)

The purpose of this query is to identify the most common exit pages on a website, which can provide valuable information about user behavior, potential friction points in the website design or user flow, and opportunities to improve user retention or optimize conversions.

We also provide a description of what the query does in case you want to understand it, but you can skip the explanation and copy and paste the query directly.

This BigQuery query is designed to identify the exit pages of user sessions on a website using Google Analytics 4 data exported to BigQuery. Here's the breakdown of what the query does:

  • Subquery "prep":
      • Selects user identifiers ('user_pseudo_id'), session identifiers ('session_id'), and the URLs of the pages ('page') where a page view event occurred.
      • Picks up the 'event_timestamp' of each page view event.
      • Filters to include only events that are page views (event_name = 'page_view').
      • Sorts the results by 'event_timestamp' to get a time-ordered sequence of events.
  • Subquery "prep_exit":
      • Uses the temporary table 'prep' to select the same fields.
      • Applies the window function 'FIRST_VALUE' to get the last page viewed ('exit_page') in each session per user, ordering the events by 'event_timestamp' in descending order, so the page with the latest timestamp is taken as the exit page (see the sketch after this list).
  • Main query:
      • Selects the page ('exit_page') only if it matches the last page viewed in the session (indicating that it is the exit page).
      • Counts the number of unique sessions ('exits') that ended at each specific exit page, using 'COUNT(DISTINCT ...)' on the concatenation of 'user_pseudo_id' and 'session_id'.
      • Groups the results by 'exit_page'.
      • Filters to include only those rows where 'exit_page' is not null, ensuring only exit pages are counted.
      • Sorts the results by the number of exits in descending order.
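
If you are curious about how 'FIRST_VALUE' behaves here, below is a minimal, self-contained sketch using hypothetical inline data (not a real GA4 export): because the window orders by 'event_timestamp' in descending order, every row in a session receives the page with the latest timestamp.


WITH toy AS (
  -- Hypothetical session with three page views (illustrative data only)
  SELECT 'u1' AS user_pseudo_id, 1 AS session_id, '/home' AS page, 100 AS event_timestamp UNION ALL
  SELECT 'u1', 1, '/pricing', 200 UNION ALL
  SELECT 'u1', 1, '/checkout', 300
)
SELECT
  page,
  event_timestamp,
  -- Returns '/checkout' on every row, since it has the latest timestamp in the session
  FIRST_VALUE(page) OVER (PARTITION BY user_pseudo_id, session_id ORDER BY event_timestamp DESC) AS exit_page
FROM
  toy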

Next we give you the query you have to use in BigQuery on your website's GA4 dataset in order to see this type of report.

Remember that to do this you don't need to know BigQuery; just copy and paste what follows.


WITH prep AS (
  SELECT
    user_pseudo_id,
    (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS session_id,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page,
    event_timestamp
  FROM
    -- In events_2023* you can put a specific date: _20231001 (October 1, 2023), _202310* (all of October 2023), _2023* (all of 2023 so far)...
    `<project>.<dataset>.events_2023*`
  WHERE
    event_name = 'page_view'
  ORDER BY
    event_timestamp
),
prep_exit AS (
  SELECT
    user_pseudo_id,
    session_id,
    page,
    event_timestamp,
    -- CAST is needed because CONCAT only accepts string arguments
    FIRST_VALUE(CONCAT(page, CAST(event_timestamp AS STRING))) OVER (PARTITION BY user_pseudo_id, session_id ORDER BY event_timestamp DESC) AS exit_page
  FROM
    prep
  ORDER BY
    event_timestamp
)
SELECT
  CASE
    WHEN CONCAT(page, CAST(event_timestamp AS STRING)) = exit_page THEN page
    ELSE NULL
  END AS exit_page,
  COUNT(DISTINCT CONCAT(user_pseudo_id, CAST(session_id AS STRING))) AS exits
FROM
  prep_exit
GROUP BY
  exit_page
HAVING
  exit_page IS NOT NULL
ORDER BY
  exits DESC
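
Note that '<project>' and '<dataset>' are placeholders you must replace with your own values. As a purely hypothetical example, for a project called 'my-project' whose GA4 export dataset is 'analytics_123456', the FROM clause would look like this:


FROM
  -- Hypothetical project and dataset names; replace them with your own GA4 export
  `my-project.analytics_123456.events_2023*`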

If you want a "closed" date range, for example from August 8 to September 23, 2023.

Take this snippet from the query above: 


FROM
  -- Here you put the name of your GA4 dataset. In events_2023* you can put a specific date: _20231001 (October 1, 2023), _202310* (all of October 2023), _2023* (all of 2023 so far)...
  `<project>.<dataset>.events_2023*`
WHERE
  event_name = 'page_view'

And you replace it with: 


FROM
  `<project>.<dataset>.events_*`
WHERE
  event_name = 'page_view'
  AND _TABLE_SUFFIX BETWEEN '20230808'
  AND '20230923'

And that's it!
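
As a variation that is not part of the original report, if you prefer a rolling window (for example, the last 30 full days) instead of fixed dates, you can compute the table suffixes dynamically:


FROM
  `<project>.<dataset>.events_*`
WHERE
  event_name = 'page_view'
  -- Sketch of a rolling 30-day window ending yesterday; adjust the interval as needed
  AND _TABLE_SUFFIX BETWEEN FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
  AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))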

When you run the query, you will get a table like this:

(Screenshot: BQ - Query Results)

In the table we can already see the main exit pages of our website and the number of times that users have exited through them in the selected period.

This is the end of today's article. I hope you find it useful.

