4eck Media

Efficient database optimization & reduction of database queries

July 17, 2024

Over the past few months, we have been working quite a lot on efficient database optimization and the reduction of database queries. The background to this was that the introduction of multilingualism at TutKit.com, currently in 23 languages, meant that the databases had taken on completely new dimensions and the demands on performance and scalability had increased. So we undertook a sprint to optimize the database and reduce database queries with quite astonishing results.

We went through the various page types (category pages, product pages, blog pages, ...) and systematically optimized and reduced the database queries. To name just two examples:

  • We reduced category pages from 916 database queries to 16 and the server response time from 177 ms to 15 ms.
  • We reduced product pages from 2,064 database queries to 157 and the server response time from 412 ms to 68 ms.

That means we reduced database queries by 92 to 98% while improving speed enormously: the optimized pages need only 8 to 17% of the previous server time. What a result!

Here is a before-and-after comparison from our Laravel project, with the debug bar switched on, for the same page type. It was a product page containing a lot of content:

Database optimization and reduction of database queries

Over 2,000 queries is, admittedly, a brutal number. I'll explain in another blog post why we had such high values, when it comes to working with other agencies, mediocre know-how and technical debt. Our backend developer took on this page and achieved the following result after just 1.5 days of optimization:

Database optimization and reduction of database queries: the result

Since this was the product page type, the optimization affected a very large number of pages on our site. These pages in particular should perform well for the sake of the user experience and be attractive to Google so that they rank higher. We are extremely satisfied with the result.

Good reasons for database optimization and query reduction

Database optimization is an essential part of software development and refactoring that offers a variety of benefits. Developers and companies managing large projects such as online stores or portals should take this practice seriously for the following reasons:

  • Improved performance and speed: Optimized databases enable faster queries and shorter loading times. This is particularly important for applications with high data volumes or high user activity, as fast response times significantly improve the user experience.
  • Scalability: A well-optimized database can better handle increasing amounts of data and a growing number of simultaneous users. This ensures that the application remains efficient even with increasing data volumes.
  • Resource efficiency: By reducing unnecessary queries and improving the database structure, fewer computing resources are required. This leads to lower operating costs and more efficient use of existing hardware.
  • Reliability and stability: Optimizations help to prevent database bottlenecks and failures. This is particularly important for business-critical applications where downtime can lead to significant financial losses.
  • Better user experience: Users expect fast and smooth interactions with applications. Optimized databases help ensure that users are not frustrated by slow load times or delayed responses, increasing customer satisfaction and retention.
  • Competitive advantage: In a competitive market, the performance of an application can be a decisive factor. Companies that invest in database optimization can set themselves apart from the competition with superior application performance, partly because Google simply loves performant websites.
  • Error and problem reduction: By identifying and fixing inefficient queries and structures, potential sources of errors can be eliminated. This leads to a more stable application and reduces maintenance costs.
  • Cost savings: Optimizations can reduce the need for additional hardware and resources, which can lead to significant cost savings in the long run. More efficient database operations also mean less spending on cloud services and hosting.

Overall, database optimization significantly helps to improve the performance, reliability and scalability of applications and projects, which ultimately leads to a better user experience, higher search engine rankings and lower operating costs.

So how exactly did we go about it? Here is a summary!

Debugging and profiling tools in programming languages

The work is mainly done with debugging and profiling tools, which are essential for developers to identify and fix problems and to optimize the performance of applications. These tools provide insight into various aspects of code execution and help developers ensure that their applications run efficiently and correctly. One of the tools we use in our Laravel project TutKit.com is Debugbar.

Important functions of debugging and profiling tools

  • Query monitoring: Tools like Debugbar in Laravel log all database queries executed during a request and show details, bindings and execution times of the queries. This helps to identify slow or redundant queries that can be optimized.
  • Performance profiling: These tools provide a breakdown of the execution time of a query, including the time for database operations, view rendering and other processes. This helps to identify performance bottlenecks.
  • Error and exception logging: They capture and log errors and exceptions that occur during code execution and provide detailed stack traces and error messages to help developers quickly identify and resolve issues.
  • Route information: For web applications, these tools often display information about the current route, including route parameters and middleware, which is useful for troubleshooting routing issues.
  • View rendering: They show which views have been rendered and how long this took, which helps to optimize view execution.
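Since the original screenshots are not reproduced here, a minimal sketch of what query monitoring does under the hood may help. This Python/sqlite3 snippet is only an illustration (not our Laravel code): it logs every statement a connection executes via a trace callback, which is essentially what Debugbar's query log provides for a request.

```python
import sqlite3

# In-memory database purely for illustration
conn = sqlite3.connect(":memory:")
executed = []  # every SQL statement the connection runs lands here
conn.set_trace_callback(executed.append)

conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO products (name) VALUES (?)", [("A",), ("B",)])
rows = conn.execute("SELECT name FROM products ORDER BY id").fetchall()

# The log now shows how many statements one "request" actually issued
print(len(executed), [r[0] for r in rows])
```

Counting the entries in such a log per page request is exactly how you spot redundant or repeated queries.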

General use case

In any programming language, the workflow for using a debugging and profiling tool typically includes the following steps:

  1. Installation and configuration: Setting up the tool in your development environment.
  2. Execution monitoring: Running your application and using the tool to monitor various aspects of its execution.
  3. Data analysis: Reviewing the logged data to identify slow queries, performance bottlenecks and errors.
  4. Optimization and debugging: Making the necessary changes to optimize performance and fix identified issues.
  5. Functional testing and bug fixing: After optimization, checking that all necessary content and functions have been retained.

Example: Laravel Debugbar

In our Laravel project, we use Debugbar to monitor and optimize our application:

  1. Show queries: Debugbar displays all executed queries, their execution time and parameters.
  2. Identify slow queries: We look for queries with long execution times and optimize them.
  3. Check number of queries: We make sure that there are no unnecessary repeated queries that could indicate an N+1 problem.
  4. Performance profiling: The timeline feature helps us see where execution time is being spent so we can optimize accordingly.

Detailed analysis of database queries and schemas

In the next step, we analyzed similar database queries, identified slow and time-consuming queries, and checked database tables for missing indexes and unnecessary default columns. This comprehensive analysis enabled us to optimize database performance significantly.

In-depth debugging and database performance optimization

When debugging, our tool (such as Debugbar) shows which files and lines each query originated from. However, with over 40 files involved, including vendor files, identifying the root cause can be challenging. This is how we proceed:

  1. Thorough debugging: We use our experience and detailed debugging to quickly pinpoint the problem, even if it's not immediately obvious. We analyze why the problem occurred and review both the code and framework inefficiencies.
  2. Code and framework analysis: Sometimes well-written code can cause performance issues if the framework is not sufficiently optimized. Conversely, a good framework can suffer from poorly written code or non-compliance with programming standards.

Common problems in databases

The common problems in most databases are:

  • Repetitive queries: These can occur due to overlooked optimizations, inefficient programming practices, or unintended loops in the application logic.
  • Slow queries: Often the result of missing indexes, inefficient query structures or complex joins that can be simplified.
  • Schema design: During the project planning phase, essential elements such as indexes may be overlooked, leading to performance bottlenecks later on.
  • N+1 query problem: This occurs when the application executes additional queries for related data, which could be avoided by eager loading.
  • Memory leaks: Inefficient memory usage, such as not freeing resources or excessive use of global variables, can degrade performance over time.
  • Contention issues: Improper handling of parallel processes can lead to race conditions, deadlocks or other synchronization issues that degrade performance.
  • Network latency: Excessive or poorly managed network requests can slow down the application, especially in distributed systems.

We are confident that we can solve these problems efficiently due to our deep expertise and years of experience. This expertise allows us to quickly identify and resolve issues, ensuring optimal performance for our projects and yours.

Reducing database queries to improve performance

Repeatedly querying the database for the same data can lead to unnecessary load and slower performance. Each database call consumes resources and time. Here is an example solution:

A repeated database query is optimized

Example of performance improvement by reducing redundant queries

The performance improvement in the updated code results from the reduction of redundant database queries. By storing the result of the translationQuality relationship query in a variable and reusing it, the code avoids multiple database calls for the same data. This optimization reduces the number of database interactions, which improves the response time and efficiency of the edit method.
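The same idea can be shown as a minimal Python/sqlite3 sketch. The `translation_quality` function and table below are illustrative stand-ins for the Laravel relationship, not our actual code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE translation_qualities (id INTEGER PRIMARY KEY, level TEXT);
    INSERT INTO translation_qualities VALUES (1, 'native');
""")
calls = []
conn.set_trace_callback(calls.append)

def translation_quality():
    # Stands in for an ORM relationship that hits the database on every access
    return conn.execute(
        "SELECT level FROM translation_qualities WHERE id = 1"
    ).fetchone()[0]

# Before: every access repeats the identical query
before = [translation_quality() for _ in range(3)]
queries_before = len(calls)

# After: query once, store the result in a variable, reuse it
calls.clear()
quality = translation_quality()
after = [quality for _ in range(3)]
queries_after = len(calls)

print(queries_before, queries_after)  # 3 versus 1
```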

Avoiding repeated queries for the same data

Sometimes querying the database directly with raw queries can result in fewer and more efficient queries than using ORM methods. This approach allows for more precise control of the query and optimizations such as selective joins and filtering. By building the query manually, we can eliminate unnecessary overhead and ensure that only the data needed is retrieved in a single, optimized query, resulting in better performance and faster execution times.

Direct database queries compared to the ORM method

This performance rework significantly improves efficiency by converting multiple database queries into a single, optimized query. Instead of retrieving all packets and then filtering them in memory, the new approach uses a single, well-designed database join and conditional query to retrieve only the required records. This not only minimizes the amount of data transferred, but also reduces processing overhead, resulting in faster execution and optimized data retrieval.
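The pattern can be sketched generically. In this Python/sqlite3 illustration (the table names are invented), the in-memory filtering approach and the single conditional JOIN return identical data, but the JOIN does it in one query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE packets (id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE items (id INTEGER PRIMARY KEY, packet_id INTEGER, price REAL);
    INSERT INTO packets VALUES (1, 'active'), (2, 'archived'), (3, 'active');
    INSERT INTO items VALUES (1, 1, 9.99), (2, 2, 4.99), (3, 3, 19.99);
""")

# Before: fetch every packet and item, then filter and match in application memory
all_packets = conn.execute("SELECT id, status FROM packets").fetchall()
all_items = conn.execute("SELECT packet_id, price FROM items").fetchall()
in_memory = [
    price
    for pid, status in all_packets if status == 'active'
    for packet_id, price in all_items if packet_id == pid
]

# After: one conditional JOIN returns only the records we actually need
single_query = [
    price for (price,) in conn.execute("""
        SELECT i.price FROM packets p
        JOIN items i ON i.packet_id = p.id
        WHERE p.status = 'active'
        ORDER BY i.id
    """)
]

print(in_memory == single_query, single_query)
```

The larger the tables, the more the in-memory variant pays in transferred data and processing time, while the JOIN's cost stays close to the size of the result set.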

Performance update through database optimization

Implementation of a database index to increase performance

Implementing database indexes is a powerful technique for improving page speed. An index significantly reduces the time it takes to retrieve data by allowing the database to locate and access records more efficiently. This optimization is especially important for tables with large data volumes and frequent read operations. By indexing important columns, such as those used in WHERE clauses or JOIN operations, we can drastically reduce query execution time, resulting in faster page load times and a more responsive user experience. Well-designed indexes ensure that our application can scale effectively while maintaining optimal performance, ultimately improving overall page speed.
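A small, self-contained illustration (Python/sqlite3 with an invented table) shows how an index on a WHERE-clause column changes the query plan from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO orders (customer_id) VALUES (?)",
    [(i % 500,) for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = ?"

# Without an index, the WHERE clause forces a full table scan
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# Index the column used in the WHERE clause ...
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# ... and the planner switches to an index search
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[-1][-1])  # e.g. "SCAN orders"
print(plan_after[-1][-1])   # e.g. "SEARCH orders USING ... idx_orders_customer ..."
```

The same principle applies to MariaDB, PostgreSQL and MySQL, where `EXPLAIN` serves the same diagnostic purpose.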

Database index

Efficient and fast task completion

We emphasize efficiency and speed in our workflow, which allows us to complete tasks in a timely manner. By using optimized queries, minimizing redundant data processing and applying our extensive experience, we streamline our development process. This focus on performance ensures that we deliver high-quality results within a short timeframe, meet project deadlines and exceed our clients' expectations.

Our expertise is not limited to a specific programming language. We apply these principles across different technologies and frameworks, adapting our approach to the requirements of each project. Whether it's PHP, MySQL, Python, JavaScript or any other language, our commitment to efficiency and speed remains constant, enabling us to deliver exceptional performance and rapid task completion in any development environment.

Calculating the impact of reducing database queries in sprints

Reducing the number of database queries during a sprint can significantly improve the overall performance and efficiency of an application. To calculate the impact, we first identify all database interactions in the code, focusing particularly on frequent queries and those in critical paths.

First, we measure the execution time and resource usage of the existing queries before optimization. To do this, we use profiling tools to capture the latency and load of each database interaction. By aggregating these metrics, we can create a performance baseline.
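As a minimal illustration of building such a baseline (Python/sqlite3 with an invented schema; real profiling would use tools like Debugbar), one can average a query's execution time over several runs:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)", [("x",)] * 1000)

def measure(sql, runs=50):
    """Return the average execution time of a query in milliseconds."""
    start = time.perf_counter()
    for _ in range(runs):
        conn.execute(sql).fetchall()
    return (time.perf_counter() - start) / runs * 1000

# The pre-optimization value becomes the baseline that later
# refactored queries are compared against
baseline_ms = measure("SELECT * FROM events")
print(f"baseline: {baseline_ms:.3f} ms per query")
```

Recording the same measurement after each optimization round turns "it feels faster" into a quantified improvement.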

We then implement optimized queries and refactor the code to minimize redundant queries. This could include consolidating multiple queries into a single, more efficient query, using joins instead of separate queries, or utilizing caching strategies to avoid unnecessary database accesses.

After optimization, we again measure the execution time and resource usage of the refactored queries. By comparing these post-optimization metrics with our baseline, we can quantify the improvements. Key success indicators include reduced query count, reduced execution time, reduced database load and improved response times.

In addition, we evaluate the broader impact on application performance. This includes monitoring user experience metrics such as page load times and system responsiveness. By documenting these improvements, we provide a clear, data-driven justification for the optimization efforts.

Overall, this systematic approach to reducing database queries not only helps speed up individual sprints, but also contributes to the long-term scalability and maintainability of the application.

Does your database need optimization?

If your project has special scalability requirements, for example because you are planning multilingualism and an international roll-out, or you know that traffic peaks (e.g. from TV advertising) are coming soon, prepare your databases for these loads. Especially if the initial setup of your site already feels very slow, your databases and database queries are probably not optimal and will not withstand an onslaught of many simultaneous visitors. In the worst case, this leads to a server crash.

The easiest way is to use a debug tool to measure for yourself how many queries are required to load a page. Another indication that server response times, and therefore database queries, are taking too long can be found in the Search Console under Settings => Crawling statistics. An average response time of over 1,100 ms is almost a disaster.

Page response time in the Search Console

In my opinion, the good values are under 400 ms. John Müller from Google recommended between 100 and 500 ms in a video call.

Crawling statistics with server response time

The best average server response time I've seen so far was 212 ms in the Search Console crawling statistics for an online store containing text, images, products, etc.

Database optimization agency: Costs for the reduction of database queries

We charge an hourly rate of 240 euros net for this highly specialized service. We can work with various database systems such as MariaDB, MongoDB, PostgreSQL or MySQL.

How long it takes to optimize databases per page type in your project always depends on the server and hosting as well as the framework used. Do we have to clone everything or can we already work in a prepared test environment? Do we have SSH access or not? If everything is already set up, two full working days per page type is a good rule of thumb. If the server setup still needs to be prepared, the time is added accordingly.

After a briefing and an initial analysis, we can provide an initial estimate of the actual time required.

If you are also interested in reducing database queries and the associated increase in server-side performance, please get in touch with us.