We were plagued by CLS problems on our TutKit.com portal for almost three months. We spent a lot of time and energy identifying and fixing these Cumulative Layout Shift issues. In this field report, I would like to give you an insight into how we went about it.
Introduction: Why CLS is a problem
Cumulative Layout Shift (CLS) is one of the three central metrics of Google's Core Web Vitals. These metrics are crucial for evaluating the user-friendliness of websites and influence the ranking in search results. CLS measures the visual stability of a page and shows how often unexpected layout shifts occur while a page is loading.
What does CLS mean in concrete terms?
When elements on a website suddenly change position during loading, it can be frustrating for users. You have probably experienced this: page content jumps because an ad is suddenly displayed. Or a button you are about to click suddenly moves, and you end up activating a different link instead. Such unexpected layout shifts are disruptive and have a negative impact on the user experience.
The score of a layout shift is calculated from the size of the shifted element relative to the viewport and the distance it moves within the viewport. Google considers CLS values below 0.1 to be good, values between 0.1 and 0.25 to be in need of improvement, and anything above that to be poor.
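For reference, here is the underlying formula with a short worked example, as documented on web.dev (the numbers are from that documentation, not from our portal):

```latex
% Score of a single layout shift (Layout Instability API / web.dev definition):
\text{layout shift score} = \text{impact fraction} \times \text{distance fraction}

% Worked example from web.dev: an element filling half the viewport moves down
% by 25% of the viewport height. The union of its old and new position covers
% 75% of the viewport, and the largest move distance is 25% of the viewport:
0.75 \times 0.25 = 0.1875

% CLS itself is the largest sum of such scores within one "session window"
% (a burst of shifts lasting at most 5 seconds, with gaps under 1 second).
```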
Here is our chart from the Search Console. Yellow are the values between 0.1 and 0.25 and red are the pages that have values above this.
The classic causes of high CLS values on most websites are images that are too large, in outdated image formats (JPG/PNG) and/or without fixed dimensions. If images do not have defined height and width attributes, they can cause layout shifts: the browser can only reserve the right amount of space while the image is loading if it knows the aspect ratio (width/height); without these attributes, the content below jumps once the image arrives. Dynamic content such as ads, pop-ups or embedded content that appears later during loading can also make the layout jump, and these shifts add up. The slow loading of web fonts can affect the layout as well, when a system font is rendered first and later replaced by the web font. This can cause text to reflow, for example growing a paragraph from six to seven lines and pushing the subsequent content downward. Since each font has unique character dimensions, switching fonts can lead to different line breaks and unexpected layout changes.
Although we had excellent PageSpeed scores at TutKit.com, from mid-December 2024 onward we simply weren't passing the Core Web Vitals for most pages. It just looked crazy: a mobile PageSpeed score of 95 or 97, but failed Core Web Vitals. The problem was CLS. Since CLS takes into account not only the first viewport but also interactions within the page, the cause was hard to identify. In this report, we share our experience of how we identified and successfully solved the problems.
Root cause analysis: Why did we experience CLS?
In the past few weeks, some SEO experts I follow on LinkedIn reported sudden CLS warnings showing up in Google Search Console since December 2024 and January 2025. This could be related to the December 2024 Core Update that Google rolled out in mid-December. Websites that previously had green CLS values were suddenly classified as problematic, just like ours.
One possible reason is an adjustment to Google's rating system for user signals. There are indications that interactions within pages now have a greater influence on the rating. As a result, Core Web Vitals values can deteriorate even without technical changes: what was previously okay was suddenly causing problems. In our case, given the large number of different page types, sections and functions, quite a few areas contributed to the CLS problems:
- Google AdSense: Delayed loading of ads caused postponements.
- Tables of contents: Dynamic reloading via JavaScript caused layout jumps.
- Images without width/height specifications: Suddenly appearing images changed the structure of the page.
- Web fonts: Late loading of the fonts led to unexpected breaks.
- FAQ modules: Questions opened by default caused unpredictable layout shifts.
- Slider & accordion tabs: Content was only rendered and moved later.
- Google Reviews: Externally reloaded content affected the page structure.
These problems were not easy to detect, as lab tests such as Lighthouse and PageSpeed Insights did not show any noticeable CLS values.
The difficult path to detecting hidden layout shifts
Why were CLS problems so difficult to identify?
- Lighthouse and PageSpeed Insights showed optimal values because they only measure the first loading process and do not simulate real user scenarios. They test the page in a laboratory environment with stable networks and ideal loading conditions. As a result, many layout shifts that occur during real interactions are not recognized.
- Google Search Console identified CLS as a problem because it analyzes real user data (field data). This data comes from real users with different devices, networks and usage patterns, which revealed problems that did not occur in lab tests.
- The CrUX API and BigQuery do not provide real-time data, only values aggregated over the last 28 days. Improvements therefore only become visible with a delay, and an immediate evaluation was not possible.
Search Console named the URL types with CLS problems, which was a good first pointer that the Core Web Vitals were not being passed, even though the usual checks with PageSpeed Insights or Lighthouse did not show any excessive CLS values. We had to find our own method to make the CLS errors visible.
Analysis with Chrome DevTools → Performance tab
We went through the entire website page type by page type, reloading it section by section and checking for layout shifts. We did not use Lighthouse or PageSpeed Insights for this, but the Performance tab in Chrome DevTools.
Particularly problematic elements were marked and repeatedly tested with different throttling settings. By artificially throttling the internet speed to Slow 4G, we were able to better simulate delays when loading individual elements and reproduce layout shifts. Here in the picture we see a layout shift that became visible with network and CPU throttling:
We logged all CLS issues and were able to identify the problematic website elements. We included not only the page types flagged as problematic in Search Console but all page types in our tests, in order to solve the issue sustainably in case the requirements for website operators become even stricter in the future.
Simple, quick tests can be performed with DevTools → Performance by reloading the page with F5 at different scroll positions; a harder, more intensive test uses a full performance recording.
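Alongside the Performance tab, a small snippet pasted into the DevTools console can log every layout shift live while you click and scroll through a page. This was not part of our documented workflow, just a complementary sketch based on the standard Layout Instability API; note that the running total here is a plain sum, not the windowed CLS metric:

```js
// Paste into the DevTools console: logs each layout shift and the nodes that moved.
let shiftSum = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.hadRecentInput) continue;          // shifts right after user input don't count
    shiftSum += entry.value;
    console.log(
      `Shift of ${entry.value.toFixed(4)} (running total ${shiftSum.toFixed(4)})`,
      entry.sources?.map((source) => source.node) // the elements responsible for the shift
    );
  }
}).observe({ type: 'layout-shift', buffered: true });
```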
Some layout shifts are almost impossible to notice in normal tests. For example, page type A showed no layout shifts on its own, and neither did page type B. Only switching back and forth between the two pages revealed the CLS problem, which could not be identified when testing the pages in isolation.
Therefore, always test complete user flows and not just isolated pages in order to identify hidden layout shifts. The analysis of actual user behavior often reveals errors that standardized tests overlook. This allows potential problems to be identified at an early stage and performance to be kept stable.
Tests with different screen resolutions:
By default, Lighthouse and PageSpeed Insights test with a fixed screen size. In Google Search Console, however, we noticed from the problematic URL types that some CLS problems only occurred for certain user groups, e.g. visitors from Sweden with certain mobile devices.
We also conducted mobile tests with different resolutions to identify problems that only occurred in real usage scenarios. To do this, we exported the resolution data from Google Analytics and ran the tests for 200 different mobile resolutions, so that problems that only became visible at a specific resolution would show up. To give you a feel for how wide the range of resolutions in use on a website really is: the export from Google Analytics showed 4,083 different resolutions across desktop, tablet and mobile within a four-week period. We sorted these by number of users. Even at position 200 we still had 88 users who visited our site at a resolution of 412x778 during those four weeks.
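We ran these resolution tests manually. For readers who want to automate a similar sweep, a hypothetical sketch with Puppeteer could look like the following; the URL, the resolution list and the five-second settle time are placeholders, and the summed value is only a rough proxy for the windowed CLS metric:

```js
// Hypothetical automation sketch: open each viewport size and sum layout shifts.
const puppeteer = require('puppeteer');

// In practice this list would come from the Google Analytics resolution export.
const resolutions = [
  { width: 412, height: 915 },
  { width: 390, height: 844 },
  { width: 412, height: 778 },
];

(async () => {
  const browser = await puppeteer.launch();
  for (const { width, height } of resolutions) {
    const page = await browser.newPage();
    await page.setViewport({ width, height, isMobile: true });
    await page.goto('https://www.tutkit.com/', { waitUntil: 'networkidle0' });
    const shiftSum = await page.evaluate(
      () =>
        new Promise((resolve) => {
          let total = 0;
          new PerformanceObserver((list) => {
            for (const entry of list.getEntries()) {
              if (!entry.hadRecentInput) total += entry.value;
            }
          }).observe({ type: 'layout-shift', buffered: true });
          setTimeout(() => resolve(total), 5000); // let ads, fonts and embeds settle
        })
    );
    console.log(`${width}x${height}: summed layout shifts ${shiftSum.toFixed(3)}`);
    await page.close();
  }
  await browser.close();
})();
```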
Google has also recently added an option that lets you calibrate the throttling of your PC or notebook to the performance of various mobile devices.
This systematic approach helped us to identify the exact problem areas and implement targeted fixes. It is important to understand that certain CLS problems only occur under specific conditions, yet they can drag down the overall rating of the site.
Our solutions for the identified CLS issues
Once we had identified the causes of the layout shifts, we implemented targeted measures to resolve them:
Defining image dimensions
Problem: Images without defined width and height specifications led to unexpected layout changes.
Here, for example, you can see the mobile PageSpeed result of 92 for performance without any recognizable CLS problems. With our tests in the DevTools Performance tab, we were then able to reproduce the problem that occurs in real user behavior: the CLS value was 0.22 ... clearly too high.
Solution: Some of the images were still missing width and height attributes. We created a query for all images in the portal and automatically inserted this information into the code. The frontend now knows the size of an image before it loads and reserves the necessary space in the layout for its aspect ratio, so nothing jumps when the image appears. A nice side effect: the mobile PageSpeed score improved as a result, too.
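The markup change itself is simple. A minimal before/after sketch (the file name and dimensions are placeholders, not our production code) looks like this, combined with a responsive rule so the attributes only define the reserved aspect ratio:

```html
<!-- Before: no dimensions, the browser cannot reserve space and content jumps -->
<img src="/images/course-cover.avif" alt="Course cover">

<!-- After: width/height give the aspect ratio, so the slot is reserved before the file arrives -->
<img src="/images/course-cover.avif" alt="Course cover" width="1200" height="675">

<style>
  /* Keeps the image responsive: the attributes only determine the reserved aspect ratio */
  img { max-width: 100%; height: auto; }
</style>
```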
Here is the same URL after merging our CLS fix in an identical test with PageSpeed Insights and DevTools:
Optimizing Web Fonts
Problem: Late loading of fonts led to line breaks and layout shifts. In particular, the system default font (here: Arial) was rendered first and then replaced by the Noto web font. We use the Noto web font in our portal. Google had the font designed to support all existing writing systems, from Latin characters to Cyrillic and Arabic script to more exotic character sets such as Khmer or Tamil. This makes the font ideal for a multilingual platform like TutKit.com, where content is provided for a wide variety of target groups. A major advantage of Noto is its broad language support - but this was also a technical challenge. Because Noto covers so many writing systems, the font family is extremely extensive.
By default, more characters are often loaded than are actually needed. This can lead to longer loading times, especially for mobile users, which has a negative impact on the user experience and SEO.
Solution: We checked which font weights from 100 to 900 were being loaded and compared them with what the user interface actually needs, which allowed us to reduce the number of font weights. In addition, the font files were trimmed down to the fonts that were really required. In the end, only two font files with a total size of 105 kB are loaded.
We also no longer load entire font files for the individual languages, but only subsets of the fonts. Only when our language switcher is opened are the additional subsets loaded: the special characters for the Nordic and Slavic languages as well as the character sets for Greek, Cyrillic, Japanese and Korean. With two exceptions, these subsets are extremely small in file size.
This has enabled us to shorten the loading time and ensure that texts are displayed correctly right from the start without visible changes from system font to web font.
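The article doesn't show our final CSS, but the principle can be sketched with @font-face and unicode-range; the file names, weight ranges and unicode ranges below are illustrative, not our production values. The browser only downloads a subset once the page actually contains characters from its range:

```css
/* Latin subset: requested on effectively every page */
@font-face {
  font-family: "Noto Sans";
  src: url("/fonts/noto-sans-latin.woff2") format("woff2");
  font-weight: 400 700;   /* one variable-font file instead of many single weights */
  font-display: swap;     /* show a fallback immediately instead of blocking rendering */
  unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+2000-206F, U+20AC;
}

/* Cyrillic subset: only downloaded when matching characters appear in the DOM */
@font-face {
  font-family: "Noto Sans";
  src: url("/fonts/noto-sans-cyrillic.woff2") format("woff2");
  font-weight: 400 700;
  font-display: swap;
  unicode-range: U+0400-045F, U+0490-0491, U+04B0-04B1, U+2116;
}
```

Note that font-display: swap on its own still allows a visible switch from the fallback to the web font; to avoid that, the critical subset can additionally be preloaded or font-display: optional used. Which combination a site chooses is a trade-off, and this sketch is not a claim about our exact setup.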
Further adjustments for CLS fixing
We also tackled the other CLS problems we identified directly. These included, for example:
Google AdSense optimization
Problem: Delayed loading of ads led to layout shifts.
Solution: Adjust AdSense settings to use static placeholders that reserve the required space when the page loads.
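A reserved slot can be as simple as a wrapper with a minimum height around the standard AdSense ins element. This is a generic sketch of the pattern, not necessarily our exact configuration; the publisher and slot IDs are placeholders, and the min-height has to match the tallest format the slot actually serves:

```html
<style>
  /* Reserve the space before the ad script injects anything */
  .ad-slot { min-height: 280px; }
</style>

<div class="ad-slot">
  <ins class="adsbygoogle"
       style="display:block"
       data-ad-client="ca-pub-XXXXXXXXXXXXXXXX"
       data-ad-slot="XXXXXXXXXX"
       data-ad-format="rectangle"></ins>
</div>
```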
Conversion of the tables of contents from JavaScript to PHP
Problem: Dynamic reloading via JavaScript caused layout jumps.
Solution: Render the tables of contents in blog articles on the server side with PHP to ensure that they are available immediately when the page is loaded and do not cause any subsequent shifts.
Customization of the FAQ modules
Problem: Questions opened by default caused unpredictable layout shifts.
Solution: Removal of the function where questions were open by default to avoid unexpected layout changes.
Optimization of slider and accordion elements
Problem: Content was only rendered afterwards, which led to layout shifts.
Solution: Preloading the elements and setting fixed heights for sliders to ensure a stable layout during loading.
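A minimal illustration of the idea (the class names and pixel value are made up for this example): the container gets its final footprint in CSS before the slider script initializes, so later slides cannot change its height.

```css
/* Reserve the slider's final height before its JavaScript takes over */
.testimonial-slider {
  min-height: 420px;   /* matches the tallest slide at this breakpoint */
}

/* Only the first slide is visible in the initial render; the rest stay hidden
   until the slider script activates them, so nothing is pushed around. */
.testimonial-slider .slide:not(:first-child) {
  display: none;
}
```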
Improving the integration of Google Reviews
Problem: Externally reloaded content influenced the page structure and led to layout shifts.
Solution: Use of lazy loading with fixed placeholders to ensure that the required space is already reserved when the page loads.
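The pattern can be sketched as a fixed-height placeholder plus an IntersectionObserver that injects the third-party widget only when the slot scrolls into view; the class name, height and widget URL below are placeholders, not the actual Google Reviews embed:

```html
<div class="reviews-slot" style="min-height: 360px">
  <!-- the external reviews markup is injected here once the slot is visible -->
</div>

<script>
  const slot = document.querySelector('.reviews-slot');
  new IntersectionObserver((entries, observer) => {
    if (entries[0].isIntersecting) {
      const script = document.createElement('script');
      script.src = 'https://example.com/reviews-widget.js'; // placeholder embed URL
      document.body.appendChild(script);
      observer.disconnect(); // load the widget only once
    }
  }).observe(slot);
</script>
```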
Since I didn't capture separate screenshots for every fix, here is an excerpt from JIRA, our project management tool, filtered to the CLS-related tasks.
Initially, we created a ticket for each anomaly and triggered the validation in Search Console after the merge. But that wasn't enough, because further problems kept being identified, so in the last step we bundled the CLS problems into three tickets (List of CLS problems) until they were all fixed. The number of tickets and the range of the ticket numbers alone show how long and intensively the CLS problems kept us busy. While the Heise news portal was able to solve its CLS problems - caused by the consent cookie banner - with a single line of code, we had to work on a few more lines of code.
After implementing these measures, we realized that the improvements were not immediately visible in Google Search Console. The validation process was reported to take up to 28 days. This is because Search Console is based on real user data that is aggregated over a 28-day period. Therefore, it can take up to four weeks for the effects of the changes to be fully reflected. In our case, the first improvements became visible after a few days.
Have you done your homework?
One of the SEO experts who shared his CLS issues on LinkedIn expressed his appreciation to all the teams who fulfill the Core Web Vitals in their projects: "I admire all the teams that fulfill the Core Web Vitals! Because I can't do it myself..."
The recognition is justified, because meeting the Core Web Vitals on large websites requires a strong focus on technology, page speed and user experience. While I'm reporting here on how we solved the CLS problems at our company, the large number of findings could give the impression that we already had underlying technical issues. That's not the case at all. With the roll-out of our multilingual version, we were confronted with many new challenges and therefore spent many months on CSS and JS refactoring, reducing database queries and other PageSpeed improvements such as TTFB optimization. Today, PageSpeed Insights shows a mobile PageSpeed score of 95 for our homepage. There are only a few recommendations left in the report that contain further potential for improvement.
Or here is a subpage that also had CLS problems recently. Take a look at the few recommendations that PageSpeed Insights gives us. The second point about image elements doesn't even concern the main image of the page, but two arrows that serve as navigation elements. (Good thing I wrote this post - I've now created a ticket so that we add the dimensions for these too and the frontend knows even sooner what size is intended for them.)
Despite this already clean foundation, CLS problems arose on TutKit.com. Our luck was that we had already done most of our homework on the PageSpeed optimization standards in 2024. It's a different story for the LinkedIn SEO expert who honestly admitted his own issues with the Core Web Vitals. I checked his website. The font files alone that are loaded on page load are immense. Compared to TutKit.com, many times more files and a far larger total size are loaded, even though his homepage is smaller than ours. See here:
In fact, we have developed an awareness that PageSpeed and CLS problems are closely linked to a suboptimal font policy.
If the page is tested with PageSpeed Insights, the values are disastrous. Google's recommendations are correspondingly long (in the image on the right):
From a technical performance point of view, this website really has a lot of catching up to do. In the spot checks I carried out with our test approach, there were CLS problems in several places on the subpages. And finally, a banner popped up that caused an additional layout shift.
Even if developers focus on the CLS problems in isolation, they will not be able to avoid implementing the standards for modern websites here too, i.e. solving the problems with fonts, using modern image formats such as AVIF, refactoring the CSS and JS files, consistently lazy-loading images below the fold, etc. Without these fundamental performance optimizations, a quick CLS fix won't be enough or won't last.
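For the image side of that list, the standard pattern looks roughly like this (the paths are placeholders): the picture element offers AVIF and WebP with a JPG fallback, while width/height and loading="lazy" keep below-the-fold images from shifting the layout or weighing down the initial load.

```html
<picture>
  <source srcset="/images/teaser.avif" type="image/avif">
  <source srcset="/images/teaser.webp" type="image/webp">
  <img src="/images/teaser.jpg" alt="Teaser"
       width="1280" height="720" loading="lazy">
</picture>
```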
How many more recommendations do you receive from PageSpeed Insights? Have you already done your homework?
Digression: PageSpeed 100 and still failed Core Web Vitals
Everyone in Germany knows the Karrierebibel, a career bible with good tips on job applications and working life. Last year the site was relaunched, which in my opinion also involved technical optimization. And indeed, I found outstandingly good PageSpeed values: the homepage scores 99 for mobile and 100 for desktop. Respect.
From a design point of view, however, I still can't get used to the house font: Arial. It doesn't work for me at all. It reminds me directly of government mail. But Karrierebibel has an advantage: no font files are loaded at all. The website simply declares Arial in its source code. Since Arial is pre-installed on practically every system in Germany anyway, the site spends no time downloading font files. In this way, it directly avoids one of the classic causes of PageSpeed problems: loading web fonts.
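In CSS terms, Karrierebibel's approach boils down to a system font stack with no @font-face rule at all - a sketch of the idea, not their actual stylesheet:

```css
/* No @font-face, no download: the stack resolves to fonts already on the device */
body {
  font-family: Arial, "Helvetica Neue", Helvetica, sans-serif;
}
```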
But: PageSpeed and CLS should always be viewed holistically, which is why it is important to systematically consider the different resolutions. The CrUX dashboard shows a lot of CLS problems for desktop views.
If I test a subpage with PageSpeed Insights, I get a PageSpeed score of 100 for desktop, but the Core Web Vitals still do not pass. This is also possible, as you can see from this example.
Therefore, use various testing tools to find the errors that are currently causing you to fail important metrics. By failing the Core Web Vitals, you are also missing an important ranking signal for Google. Perhaps there is a connection between missing certain ranking signals and metrics and the drop in online visibility on Google. In the image you can see how visibility dropped from over 64 to the 30 range within a year. The reasons for this are certainly not only to be found in the Core Web Vitals. But it is always better to have done all your homework. We cannot influence the external conditions, but we can influence what happens on our own website.
Conclusion: Our findings
It is possible to tackle CLS in isolation if optimizations for a high PageSpeed score have already been made in advance. Nevertheless, the impact remains huge if the Core Web Vitals are not passed because of CLS problems.
The impact that failing the Core Web Vitals can have becomes clear when we look at the following context. The CrUX board for our portal clearly shows how increasing problems with CLS have accumulated in mobile usage since October 2024. The image shows the CLS values for phone devices alone.
Here, the breakdown by phone, desktop and tablet shows how usage on TutKit.com is distributed:
It's not the case that we gained more desktop users. No, we simply lost some of our mobile users. Daily clicks were in the five-digit range. To be honest, that really hurts. Now we have finally solved the CWV problems and the Search Console only shows good values for the URLs:
So we're very excited to see if clicks are now recovering and we can see a gain in mobile usage in Search Console. I will publish an update to this post in a few months.
The latest changes show that CLS optimization remains a challenging task. Our key learnings are:
- Consider CLS issues early in development
- Lab data is not enough - real user data is crucial
- CLS has a huge impact on the Core Web Vitals as a ranking signal and therefore on visibility on Google
In the long term, a systematic approach helps to minimize CLS problems and ensure an optimal user experience - because this, too, is part of SEO.
If you have problems with CLS in particular or performance in general, please get in touch with us. We have developed a deep understanding of this and can certainly help you.