Chris Green

Jun 28, 2021 · 2 min

Do Lookup Variables in Google Tag Manager Cause Performance Issues?

Updated: Jul 5, 2021

Lookup tables in Google Tag Manager are really versatile and can solve a lot of potential problems.

One question I've had, though, is what the impact of lookup tables is if they get too large - do they slow a webpage down? The short answer is yes - but, as far as this test (of one) shows, only by a very small amount.

There are a number of different considerations here, but I wanted to document my test for you.

Testing lookups and their impact on Core Web Vitals

The method is really simple: create three different-sized lookup tables in JSON format (800, 1,600 and 3,200 rows, plus a zero-row setup), import each into Google Tag Manager and then test.
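If you want to build the test tables the same way, here's a minimal sketch of a generator. It's not my exact script - the variable shape mirrors what a GTM container export typically uses for lookup tables (type "smm"), and the row keys/values, variable names and {{Page Path}} input are purely illustrative. Always base your real file on an export from your own container.

```typescript
// Sketch: generate a GTM lookup table variable of a given size, in the
// shape a container export typically uses (type "smm"). Row content,
// names and the {{Page Path}} input are placeholders.

interface GtmParameter {
  type: string;
  key?: string;
  value?: string;
  list?: GtmParameter[];
  map?: GtmParameter[];
}

interface GtmVariable {
  name: string;
  type: string; // "smm" is how lookup table variables appear in exports
  parameter: GtmParameter[];
}

function generateLookupVariable(name: string, rows: number): GtmVariable {
  const mapEntries: GtmParameter[] = [];
  for (let i = 0; i < rows; i++) {
    mapEntries.push({
      type: "MAP",
      map: [
        { type: "TEMPLATE", key: "key", value: `/page-${i}/` },   // input to match
        { type: "TEMPLATE", key: "value", value: `Output ${i}` }, // value returned
      ],
    });
  }
  return {
    name,
    type: "smm",
    parameter: [
      { type: "TEMPLATE", key: "input", value: "{{Page Path}}" },
      { type: "LIST", key: "map", list: mapEntries },
    ],
  };
}

// The four setups used in this test: a zero-row control plus 800, 1,600 and 3,200 rows.
const sizes = [0, 800, 1600, 3200];
const variables = sizes.map((n) => generateLookupVariable(`Lookup - ${n} rows`, n));
console.log(JSON.stringify(variables[3], null, 2));
```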

To keep things simple I've used Google PageSpeed Insights and tested each setup three times to take an average. Not the most rigorous of processes, but the only variable (that I could control for) was the number of lookups.
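If you'd rather script the repeated runs than click through the PageSpeed Insights UI, something like the sketch below would do it. It assumes the PSI v5 API and the Lighthouse audit keys "interactive" and "speed-index"; the API key and URL are placeholders.

```typescript
// Sketch: run PageSpeed Insights a few times for one URL and average
// Time to Interactive and Speed Index. API key and target URL are placeholders.

const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";
const API_KEY = "YOUR_API_KEY";           // placeholder
const TARGET_URL = "https://example.com"; // placeholder

async function runOnce(url: string): Promise<{ tti: number; si: number }> {
  const query = new URLSearchParams({ url, strategy: "mobile", key: API_KEY });
  const res = await fetch(`${PSI_ENDPOINT}?${query}`);
  const data = await res.json();
  const audits = data.lighthouseResult.audits;
  return {
    tti: audits["interactive"].numericValue / 1000, // ms -> s
    si: audits["speed-index"].numericValue / 1000,
  };
}

async function averageOverRuns(url: string, runs = 3): Promise<void> {
  let ttiTotal = 0;
  let siTotal = 0;
  for (let i = 0; i < runs; i++) {
    const { tti, si } = await runOnce(url);
    ttiTotal += tti;
    siTotal += si;
  }
  console.log(`TTI avg: ${(ttiTotal / runs).toFixed(2)}s`);
  console.log(`SI avg:  ${(siTotal / runs).toFixed(2)}s`);
}

averageOverRuns(TARGET_URL).catch(console.error);
```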

It's also worth adding that the GTM container these are being uploaded into has fewer than 20 tags in it - pretty lightweight, with very little to conflict with the lookup additions. If you're already running a very heavy GTM setup, you may find the impact from this kind of test is more acute.

How big is the impact?

What we're seeing here is not wholly unexpected given what we're adding to the container, but the change was nowhere near the level I was anticipating.

Impact of lookup tables on Core Web Vitals

Time to Interactive (TTI) was 8.3s on the control (zero lookups) and 8.6s on the 3,200-row test - 0.3s slower. Speed Index (SI) took a similar hit, also coming in 0.3s slower. The larger JS bundle increases parse and execution time slightly.

How bad you think this is (or how much impact it'll cause) will vary depending on what you're looking to achieve. If you're focused on scoring as highly as possible on Core Web Vitals, this increase may be unacceptable. That said, the trade-off might be worth it.

If you're using Google Tag Manager to re-write title tags to improve them for SEO, for example, then the 0.3s loss in TTI and SI could be worth it if it increased rankings and therefore traffic.
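For context, the logic behind that kind of title rewrite is tiny - the cost comes from the size of the lookup table feeding it, not the tag itself. Here's a rough sketch of what a Custom HTML tag might do; in GTM you'd interpolate the lookup variable directly (something like {{SEO Title Lookup}}, a name I've made up here), whereas this snippet takes it as a plain string so it stands alone.

```typescript
// Sketch of the logic a Custom HTML tag might run to rewrite the page
// title from a lookup table output. The variable name is illustrative.

function rewriteTitle(lookupResult: string | undefined): void {
  // Only overwrite when the lookup actually matched something,
  // so pages without a row keep their original title.
  if (lookupResult && lookupResult.trim().length > 0) {
    document.title = lookupResult;
  }
}

// In a Custom HTML tag this would come from the lookup variable,
// e.g. rewriteTitle("{{SEO Title Lookup}}");
rewriteTitle("Blue Widgets | Example Store");
```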

Should you be using large lookup tables in Google Tag Manager?

Google Tag Manager was not happy with me when I loaded in a 3,200-row lookup table. In fact, editing that variable through the UI was almost impossible because it was too large. Instead, I had to edit the JSON file and then import it into my container - that worked fine, and testing the impact in GTM was fine too.
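If you go the import route, the edit amounts to splicing the big variable into an exported container file and re-importing it through the admin UI. A rough sketch, assuming the export keeps variables under containerVersion.variable (as a typical export does) - the file names and the "./generate-lookup" module (the generator sketched earlier) are placeholders:

```typescript
// Sketch: add a generated lookup variable to an exported container file
// so it can be re-imported through the GTM admin UI. File names and the
// local "generate-lookup" module are placeholders.
import { readFileSync, writeFileSync } from "fs";
import { generateLookupVariable } from "./generate-lookup";

const exportPath = "GTM-XXXXXXX_workspace.json"; // exported from GTM (placeholder name)
const container = JSON.parse(readFileSync(exportPath, "utf8"));

// Add the large lookup table alongside the container's existing variables.
container.containerVersion.variable = container.containerVersion.variable ?? [];
container.containerVersion.variable.push(generateLookupVariable("Lookup - 3200 rows", 3200));

writeFileSync("GTM-XXXXXXX_modified.json", JSON.stringify(container, null, 2));
```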

So it's doable, just not ideal.

Simo Ahava created another, more sophisticated method that uses the GTM API. I haven't tested it myself, but it looks like a better workaround if you're really interested.
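Since I haven't run his version, treat the following as an illustration only of the general idea - creating the variable through the Tag Manager API v2 rather than importing a file. It assumes an OAuth access token with edit access and that lookup tables use the "smm" variable type seen in container exports; the account/container/workspace IDs and token are placeholders.

```typescript
// Sketch: create a lookup table variable via the Tag Manager API v2.
// Token, IDs and the single example row are placeholders.
const ACCESS_TOKEN = "ya29.placeholder";
const WORKSPACE_PATH = "accounts/123/containers/456/workspaces/7"; // placeholder IDs

async function createLookupVariable(): Promise<void> {
  const variable = {
    name: "Lookup - 3200 rows (API)",
    type: "smm",
    parameter: [
      { type: "TEMPLATE", key: "input", value: "{{Page Path}}" },
      {
        type: "LIST",
        key: "map",
        list: [
          {
            type: "MAP",
            map: [
              { type: "TEMPLATE", key: "key", value: "/example/" },
              { type: "TEMPLATE", key: "value", value: "Example output" },
            ],
          },
          // ...generate the remaining rows programmatically, as in the earlier sketch
        ],
      },
    ],
  };

  const res = await fetch(
    `https://tagmanager.googleapis.com/tagmanager/v2/${WORKSPACE_PATH}/variables`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(variable),
    }
  );
  console.log(await res.json());
}

createLookupVariable().catch(console.error);
```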

Speaking from experience, if someone is considering adding a ~3,000-row lookup table to GTM, they are either a) out of "ideal options" and just need to get the job done, b) some kind of maverick, or c) unable to resist playing. Whichever of these best fits you, just be careful and test everything.

Also, drop me a DM and let me know how you get on!
