Monitoring Postgres Database Performance Issues Using pgAnalyze

Parul Abrol
6 min read · Dec 24, 2020


In this post, we take a look at how to monitor and analyze Postgres performance issues.

Introduction

PostgreSQL is one of the most popular open-source databases in the world, with successful deployments in mission-critical environments across many domains. There are many reasons why PostgreSQL can slow down, and identifying those reasons is a difficult task. Engineers typically face these kinds of problems:

  • Manually identifying badly written queries with bad joins, logic, etc. that take a lot of CPU and memory.
  • Continuous monitoring requires a lot of effort.
  • Comparative analysis from past dates is very time-consuming.
  • Identifying improper logical and physical design.
  • Effectively monitoring queries.

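To make the manual approach concrete: without a tool, identifying expensive queries usually means querying the pg_stat_statements view by hand. A hedged sketch (column names vary by Postgres version; total_exec_time and mean_exec_time are the Postgres 13+ names, older versions use total_time and mean_time):

```sql
-- Top 5 queries by cumulative execution time (Postgres 13+ column names).
SELECT query,
       calls,
       total_exec_time,
       mean_exec_time,
       100.0 * total_exec_time / SUM(total_exec_time) OVER () AS pct_of_runtime
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 5;
```

This is exactly the kind of query that has to be re-run and compared by hand over time, which is the continuous-monitoring burden described above.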
To overcome the above-mentioned challenges and mitigate the risks, pganalyze is the solution. pganalyze is PostgreSQL performance monitoring software that has been tailor-made to improve visibility into queries. This tool can be used to help discover why a query is running slowly. pgAnalyze constantly monitors business databases to provide insight into the current state of the database and how that compares to its past state.

Use Cases

Since pgAnalyze is so versatile, here are a few examples of how it can be used:

  • When production database access is required to debug performance problems, such access is restricted and sometimes unavailable due to the risk of accessing customer data.
  • Historic statistics data and details on specific slow queries are not available.
  • When you want to set up monitoring tools on-premise and don’t want to expose data outside.

How does pgAnalyze help with monitoring?

  • Per-query statistics: Access statistics on a per-query basis and catch slow queries before they become big problems. pganalyze lets you sort all queries on your database by different characteristics, such as “% of all runtime”, “average runtime”, or “calls per minute”.
pgAnalyze Query Statistics
  • Visualization of query plans: pganalyze visualizes EXPLAIN output in a convenient tree view. For each node, we have specific information highlighted, that shows what’s most important for that plan.
Query Visualization plans
  • Postgres index check: pganalyze highlights why a slow query is slow and points out if an index might be missing.
pgAnalyze Index Suggestion
  • Database Visualization and Sophisticated Dashboards: The pgAnalyze dashboards allow teams to efficiently get insights into database performance and highlight areas for improvement. The graphs let you drill down on interesting findings and help uncover root causes in minutes.
pgAnalyze Dashboards
  • Analyze meaningful trends and historic query performance: pgAnalyze lets you monitor your database growth and table sizes, gives insights into your query performance history from up to 60 days and shows CPU utilization over time.
pgAnalyze Trends Dashboard
  • Referencing of schema data: Access relevant schema data for a table that a Sequential Scan node has scanned and review which indices exist on the table. Quickly find out when the last ANALYZE activity happened on a table, to understand whether statistics are out of date.
pgAnalyze Schema Data
  • Plan grouping: pgAnalyze automatically groups similar query plans and highlights plans that differ significantly.
pgAnalyze Plan Grouping
  • Tuning recommendations: You can set up automated checks that analyze your Postgres configuration and suggest optimizations. pganalyze also detects per-table configuration, for example for table-specific autovacuum settings.
pgAnalyze Tuning Recommendations
  • Option to run On-Premise: If you choose to run pganalyze on-premise it will run in a docker container behind your firewall, on your servers. No data will be sent outside your systems so you can always keep control of your data.
pgAnalyze On-Premise
  • Store statistics in compliance with local policies: Some countries, for certain kinds of databases, require you to store all data within their borders. When you use pganalyze on-premise, it allows you to retain statistics data in the correct region of the world to comply with local policies.

Steps to configure pgAnalyze

You need to follow these steps to set up pganalyze locally:

Step 1: Create a separate monitoring user on your PostgreSQL database for pganalyze

Run the following to create a new user, whilst connected as a PostgreSQL superuser, replacing mypassword with one of your choosing:

CREATE USER pganalyze WITH PASSWORD 'mypassword' CONNECTION LIMIT 5;
GRANT pg_monitor TO pganalyze;

CREATE SCHEMA pganalyze;
GRANT USAGE ON SCHEMA pganalyze TO pganalyze;
REVOKE ALL ON SCHEMA public FROM pganalyze;

CREATE OR REPLACE FUNCTION pganalyze.get_stat_replication() RETURNS SETOF pg_stat_replication AS
$$
  /* pganalyze-collector */ SELECT * FROM pg_catalog.pg_stat_replication;
$$ LANGUAGE sql VOLATILE SECURITY DEFINER;
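As an optional sanity check (not part of the official setup steps), you can confirm that the new monitoring user can actually call the helper function before moving on:

```sql
-- Connect as the pganalyze user (or SET ROLE pganalyze) and run:
SELECT * FROM pganalyze.get_stat_replication();
-- An empty result set is normal when no replicas are attached;
-- a permission error means the grants above did not take effect.
```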

Step 2: Enable pg_stat_statements in your Postgres container configuration

While launching Postgres, set shared_preload_libraries = 'pg_stat_statements' in the configuration file at /var/lib/postgresql/data/postgresql.conf.
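If editing the file inside the container is inconvenient, the same setting can be persisted from SQL instead (a sketch; either way, shared_preload_libraries only takes effect after a full Postgres restart, so restart the container afterwards):

```sql
-- Persist the setting without editing postgresql.conf by hand,
-- then restart the Postgres container for it to take effect.
ALTER SYSTEM SET shared_preload_libraries = 'pg_stat_statements';

-- After the restart, verify that it took effect:
SHOW shared_preload_libraries;
```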

Step 3: Verify that pg_stat_statements returns data

As a superuser, run the following statements:

CREATE EXTENSION IF NOT EXISTS pg_stat_statements;

SELECT calls, query FROM pg_stat_statements LIMIT 1;

If you have configured your database correctly, this will return a result like this:

 calls | query
-------+---------------------------------
     8 | SELECT * FROM t WHERE field = ?
(1 row)
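If the view returns old or noisy data accumulated before your setup was finished, the counters can be cleared to start from a clean baseline (by default this requires superuser privileges; a hedged sketch):

```sql
-- Discard all accumulated statistics; counters start from zero afterwards.
SELECT pg_stat_statements_reset();
```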

Step 4: Install the collector

  • Downloading the collector image

Pull the Docker image like this:

docker pull quay.io/pganalyze/collector:stable

  • Configuring the collector with Docker

Set up a pgAnalyze account and go to this link

pgAnalyze waiting for a test response

Here we can see that pgAnalyze is waiting for a test response. Execute the following command to send a test signal to pgAnalyze:

docker run --rm --name pganalyze-setup -e DB_URL=postgres://pganalyze:mypassword@172.17.0.2:5432/database -e PGA_API_KEY=PRJPF7SB4BCIKBEN quay.io/pganalyze/collector:stable test

Fill in the values step-by-step:

DB_URL=postgres://user:password@IPAddress:port/database-name

PGA_API_KEY=your pganalyze API key

After executing this command, a test signal is sent to pgAnalyze and the following screen will appear:

pgAnalyze test connection was successful

Since our test connection was successful, we can now connect pgAnalyze by executing the following command:

docker run --rm --name pganalyze-setup -e DB_URL=postgres://pganalyze:mypassword@172.17.0.2:5432/database -e PGA_API_KEY=PRJPF7SB4BCIKBEN quay.io/pganalyze/collector:stable

and click on Finish Setup & Go To Dashboard. The following screen will be visible after 10–20 minutes:

pgAnalyze Dashboard

Conclusion

We believe pgAnalyze is the best way to keep track of what's going on with your Postgres database. It's the easiest way to monitor your database, focused solely on PostgreSQL, and provides tailored log filters and alerts out of the box.
