BigQuery + GA4: Page Navigation Report

If you've spent some time in GA4, you will have already noticed that certain dimensions and metrics that existed in Universal Analytics are missing in GA4. One example is the Navigation Summary report, in which we chose a URL of our website and saw, as percentages, the path of the previous page and the path of the next page:

UA - Navigation Summary

Neither of the two dimensions you see, Previous Page Path and Next Page Path, exists in GA4. However, this navigation report from Universal Analytics has always been very useful when we want to understand general navigation behavior around a specific piece of content. It helps us understand the user flow and improve the browsing experience or the content strategy of the website.

In this post we are going to explain how to get this report from BigQuery.

We don't want to scare you. If you haven't yet dared to write a single query in BigQuery, even though you've already set up the connection between GA4 and BigQuery, don't run away: in this post we make it easy for you. You will only have to copy and paste the query we give you and change a couple of things.

Yes, we are going to explain what each part of the query does in case you are interested, but if you are not, just scroll down to where the query is and copy it.

This BigQuery query uses Google Analytics 4 data to analyze user navigation on a website, specifically around a given page URL. It is performed in several stages:

1. "prep" sub-query:


  • Selects 'user_pseudo_id' (an anonymous identifier for the user), 'session_id' (the identifier of a user's session), 'page' (the URL of the page viewed), and 'event_timestamp' (the time at which the page view occurred).
  • Filters to include only events that are pageviews (event_name = 'page_view').
  • Reads the data from the event tables of the Google Analytics 4 dataset for a specific date range (in this case, the year 2023).


2. "prep_navigation" sub-query:


  • Uses the temporary table 'prep' to get the sequence of pages visited in each user session.
  • Applies the window functions LAG and LEAD to get the previous page ('previous_page') and the next page ('next_page') for each page view, ordered by 'event_timestamp' ascending. The window is partitioned by 'user_pseudo_id' and 'session_id', which means the sequence of pages is specific to each session of a user (a toy example after this walkthrough shows how LAG and LEAD behave).

3. Main query:

  • Replace null previous and next pages with '(entrance)' and '(exit)' respectively, which indicates that if there is no previous page, the page in question is the entrance to the site, and if there is no next page, it is the exit.
  • Count the number of unique sessions ('count') where the specific page has been visited, using 'COUNT(DISTINCT ..)' on the concatenation of 'user_pseudo_id' and 'session_id'.
  • Filter to display information only for the URL of the specific page you want to query.
  • Aggregate the results by 'previous_page', 'page' and 'next_page'.
  • Filter to ensure that the page is not equal to 'previous_page' or 'next_page' to not count self-references.
  • Sort the results by 'count' in descending order.
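
Before jumping to the full query, here is a minimal, self-contained sketch with invented page paths (you can run it as-is in BigQuery) showing how LAG and LEAD build the previous_page and next_page columns within each user session:

with toy_pageviews as (
  select * from unnest([
    struct('user_a' as user_pseudo_id, 1 as session_id, '/home' as page, 1 as event_timestamp),
    ('user_a', 1, '/blog', 2),
    ('user_a', 1, '/contact', 3),
    ('user_b', 7, '/home', 1),
    ('user_b', 7, '/pricing', 2)]))

select
  user_pseudo_id,
  session_id,
  -- previous page viewed in the same session (null for the first page view)
  lag(page, 1) over (partition by user_pseudo_id, session_id order by event_timestamp asc) as previous_page,
  page,
  -- next page viewed in the same session (null for the last page view)
  lead(page, 1) over (partition by user_pseudo_id, session_id order by event_timestamp asc) as next_page
from
  toy_pageviews
order by
  user_pseudo_id, session_id, event_timestamp

For user_a's session, '/blog' gets '/home' as previous_page and '/contact' as next_page, while '/home' has a null previous_page (later shown as '(entrance)') and '/contact' a null next_page (shown as '(exit)').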

Here is the query you need:


with prep as (
  select
    user_pseudo_id,
    (select value.int_value from unnest(event_params) where key = 'ga_session_id') as session_id,
    (select value.string_value from unnest(event_params) where key = 'page_location') as page,
    event_timestamp
  from
    -- Here you put the name of your GA4 dataset. In events_2023* you can put a specific date:
    -- _20231001 (October 1, 2023), _202310* (all of October 2023), _2023* (all of 2023 so far)...
    `tests-bigquery-351807.analytics_313378230.events_2023*`
  where
    event_name = 'page_view'),

prep_navigation as (
  select
    user_pseudo_id,
    session_id,
    lag(page, 1) over (partition by user_pseudo_id, session_id order by event_timestamp asc) as previous_page,
    page,
    lead(page, 1) over (partition by user_pseudo_id, session_id order by event_timestamp asc) as next_page,
    event_timestamp
  from
    prep)

select
  ifnull(previous_page, '(entrance)') as previous_page,
  page,
  ifnull(next_page, '(exit)') as next_page,
  count(distinct concat(user_pseudo_id, session_id)) as count
from
  prep_navigation
where
  -- Copy and paste below the url of the page you want to consult.
  page = "https://www.hikeproject.com/como-visualizar-porcentajes-en-un-scorecard-de-data-studio/"
group by
  previous_page,
  page,
  next_page
having
  page != previous_page
  and page != next_page
order by
  count desc

 

If you want a "closed" date range, for example from September 2 to October 15, take this snippet from the query above:

from
  -- Here you put the name of your GA4 dataset. In events_2023* you can put a specific date:
  -- _20231001 (October 1, 2023), _202310* (all of October 2023), _2023* (all of 2023 so far)...
  `tests-bigquery-351807.analytics_313378230.events_2023*`
where
  event_name = 'page_view'),

And you replace it with: 

from
  `tests-bigquery-351807.analytics_313378230.events_*`
where
  event_name = 'page_view'
  and _TABLE_SUFFIX between '20230902' and '20231015'),
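
And if you prefer a rolling window instead of fixed dates (a small variation on the same idea, not part of the original report), _TABLE_SUFFIX can be compared against dates computed at query time; for example, the last 30 full days up to yesterday:

from
  `tests-bigquery-351807.analytics_313378230.events_*`
where
  event_name = 'page_view'
  -- keep only the daily tables from 30 days ago up to yesterday (suffixes are dates formatted as YYYYMMDD)
  and _TABLE_SUFFIX between format_date('%Y%m%d', date_sub(current_date(), interval 30 day))
                        and format_date('%Y%m%d', date_sub(current_date(), interval 1 day)),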

And that's it!

When you run the query you would get a table like this: 

BQ - Query Results-1

The central column corresponds to the URL selected for analysis (Page); the column to its left shows the previous page and the column to the right, the next page.


This way we can see, for the post https://www.hikeproject.com/como-visualizar-porcentajes-en-un-scorecard-de-data-studio/, which pages sessions came from and which pages they went to next in the period from September 2 to October 15.
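
The Universal Analytics report showed these paths as percentages rather than raw counts. If you want the same view, here is a minimal sketch of how the final SELECT could be adapted (the prep and prep_navigation parts stay exactly as above); each row's session count is divided by the total for the page with a window function:

select
  ifnull(previous_page, '(entrance)') as previous_page,
  page,
  ifnull(next_page, '(exit)') as next_page,
  count(distinct concat(user_pseudo_id, session_id)) as sessions,
  -- share of this previous/next combination over all rows returned for the page
  round(100 * count(distinct concat(user_pseudo_id, session_id))
      / sum(count(distinct concat(user_pseudo_id, session_id))) over (), 1) as pct_sessions
from
  prep_navigation
where
  page = "https://www.hikeproject.com/como-visualizar-porcentajes-en-un-scorecard-de-data-studio/"
group by
  previous_page,
  page,
  next_page
having
  page != previous_page
  and page != next_page
order by
  sessions desc

Note that, because the previous and next page are grouped together, the percentage refers to each previous/next combination, whereas the original UA report showed separate percentages for previous pages and for next pages.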

This type of report is useful not only for analyzing a specific piece of content on your website; it is also very handy for analyzing user behavior within a process or task. On a flight booking site, for example, you can analyze what users do after performing a search: what percentage of them return to the home page, perhaps to run another search, and what percentage move on to the next screen to choose fares.

It is also very useful for analyzing the home page of the website, especially for sites whose home page serves more than one goal. Going back to the flight booking example: searching for flights, checking in for a flight, finding information about a flight already purchased, contacting support about an issue or question...

Were you using the navigation reports in Universal Analytics, and were you missing them in GA4?


 

 

 
