PostgreSQL Logs: Logging Setup and Troubleshooting

As with other relational database management systems (RDBMS), PostgreSQL provides a log of activity and error messages. Logs can be daunting: they contain many lines of information and can therefore leave us confused.

However, logs are there to help you solve errors efficiently. If something goes wrong with your app, it’s vital you’ve correctly set up and managed your logs. Confusion will only delay the fix.

The key with logs is to have access to sufficient lines of activity or history to fix an issue. However, there shouldn’t be too much to go through, as this will only slow you down. Therefore, a logs management strategy is important.

In this article, we first look at why logging is important. Next, we explain the parameters used to customize logs in PostgreSQL. We then describe how to record queries in PostgreSQL and finally recommend a tool for managing PostgreSQL logs.

Why Is Logging Necessary?

Primarily, logging enables database administrators (DBAs) to identify and resolve problems. In simple terms, the log is the place one visits when more detail is necessary to resolve an issue or to monitor a new change. It shows activity along with a timestamp, which is useful in maintaining your application’s database.

From a security perspective, logs act as a red flag when an issue occurs. Reviewing logs may help you identify malicious attacks on your system. However, given the large amount of information in the logs, it’s impractical to sift through every line. Therefore, it’s advisable to use a log management system to better organize and set up your logs.

Postgres Log Parameters

As mentioned, it’s vital you have enough logs to solve an issue but not too much, or it’ll slow your investigation down. You can achieve this balance by fully understanding Postgres log parameters. These parameters help you customize the logs so they best apply to your scenario.

The parameters can be set in the postgresql.conf file or on the server command line. Some of the parameters used in log management are listed below.
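
For instance, the same setting can be turned on either by editing the file or by passing a -c option when the server starts. A minimal sketch of both approaches (the data directory path here is illustrative):

# In postgresql.conf:
logging_collector = on

# Or on the server command line at startup:
$ pg_ctl start -D /var/lib/postgresql/9.3/main -o "-c logging_collector=on"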

Parameter application_name

The application_name can be any string of fewer than 64 characters. It’s displayed in the pg_stat_activity view and included in CSV log entries, and it can be included in regular log entries as well. Please note the application name should only contain printable ASCII characters; any other characters will be replaced by question marks.
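
As a quick illustration, you can label a session and then find it in pg_stat_activity; the name reporting_app below is an arbitrary example:

-- Label the current session (the name is an arbitrary example):
SET application_name = 'reporting_app';

-- The label then shows up in the application_name column:
SELECT pid, application_name, state FROM pg_stat_activity;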

Parameter log_checkpoints

When this parameter is set to true, checkpoints and restart points are also recorded in the server log.
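
With the parameter enabled, checkpoint activity shows up in the server log as lines roughly like these (the figures are illustrative):

LOG:  checkpoint starting: time
LOG:  checkpoint complete: wrote 412 buffers (2.5%); 0 transaction log file(s) added, 0 removed, 0 recycled; write=41.2 s, sync=0.1 s, total=41.4 s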

Parameter logging_collector

PostgreSQL file-based logging is only enabled when this parameter is set to on and the log collector is running. The log collector is a background process that captures log messages sent to stderr (the standard error stream) and redirects them into log files. This approach is often better than Syslog because some types of messages can’t appear in Syslog output. For example, if an error occurs while running a script such as archive_command, its error messages would otherwise never reach the log.

Parameter log_directory

If the log collector is enabled, this parameter determines the directory where log files are created. By default, the value of this parameter is pg_log, which can be an absolute path or a path relative to the cluster’s data directory.

Parameter log_destination

The PostgreSQL log management system allows users to send logs to several destinations: stderr, csvlog, Syslog, and (on Windows only) eventlog. For example, if we set this parameter to csvlog, the logs are saved in comma-separated format. Several destinations can be given, separated by commas. The default value of this parameter is stderr.
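
For example, to keep a plain-text stream for quick inspection while also producing CSV files for machine parsing, you could combine destinations like this (csvlog additionally requires the logging_collector described above):

log_destination = 'stderr,csvlog'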

Parameter log_filename

This parameter determines the name of the file created for logging purposes. Since log file names typically change over time, the value is treated as a pattern in which strftime-style %-escapes (such as %Y for the year) are expanded to produce the actual file name.
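
For example, the pattern used later in this article expands as follows (the timestamp is illustrative):

log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
# A file created on April 13, 2016 at 20:30:22 would be named:
#   postgresql-2016-04-13_203022.log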

How to Log Queries

It’s easy to temporarily enable logging in PostgreSQL by changing some configuration settings and restarting the server. Below, however, we’ll show how to configure Postgres to create permanent, rotating log files for all sessions and connections.

Step 1: Find Configuration Files

If you’re not sure of the postgresql.conf configuration file’s location, the easiest way to find it is to connect with the Postgres client (psql) and issue the SHOW config_file command:

$ psql -U postgres -c 'SHOW config_file'
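
The output looks something like this; the exact path depends on your platform and version (the path below is typical of a Debian/Ubuntu package installation):

               config_file
------------------------------------------
 /etc/postgresql/9.3/main/postgresql.conf
(1 row)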

Step 2: Find the Data Directory Path

The next step is to verify the data directory path of the Postgres installation using another SHOW command:

$ psql -U postgres -c 'SHOW data_directory'
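
On the example system used later in this article, the output would look like this:

        data_directory
------------------------------
 /var/lib/postgresql/9.3/main
(1 row)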

In some installations, the configuration file and the data directory are on the same path, while in others (as in this example), they’re different. In either case, copy this data directory path for later use.

Step 3: Configure PostgreSQL to Generate Log Output

Open the postgresql.conf file you found in step one and scroll down to the ERROR REPORTING AND LOGGING section. The most important of these settings are log_destination and logging_collector. The following are the recommended settings, though you can adjust them for your own needs at any time:

#------------------------------------------------------------------------------
# ERROR REPORTING AND LOGGING
#------------------------------------------------------------------------------

# - Where to Log -

log_destination = 'csvlog'        # Valid values are combinations of
                                  # stderr, csvlog, syslog, and eventlog,
                                  # depending on platform. csvlog
                                  # requires logging_collector to be on.

# This is used when logging to stderr:
logging_collector = on            # Enable capturing of stderr and csvlog
                                  # into log files. Required to be on for
                                  # csvlogs.
                                  # (change requires restart)

# These are only used if logging_collector is on:
log_directory = 'pg_log'          # directory where log files are written,
                                  # can be absolute or relative to PGDATA
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'   # log file name pattern

Here, we instruct PostgreSQL to generate logs in CSV format and output them to the pg_log directory (within the data directory). We’ve also set the log_filename parameter to a pattern that includes the log file’s creation timestamp.
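
The settings above control where log output goes; to record the queries themselves, PostgreSQL also provides the log_statement parameter, which accepts none, ddl, mod, or all. A minimal addition to the same section of postgresql.conf:

# Log every SQL statement. Use 'mod' to log only data-modifying
# statements, or 'ddl' to log only schema changes:
log_statement = 'all'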

Step 4: Restart the PostgreSQL Service

Now restart the PostgreSQL service for these settings to take effect (changing logging_collector, in particular, requires a restart). Performing a Postgres restart differs from system to system, but the command for UNIX-like systems typically looks like this:

$ service postgresql restart
 * Restarting PostgreSQL 9.3 database server    [ OK ]

Step 5: Check the Log Generation

After the restart, logging should begin immediately. To verify, navigate to the pg_log directory inside the Postgres data directory. Remember, we already retrieved the data directory path in step two; just append /pg_log to it to get the log directory and change into it:

$ cd /var/lib/postgresql/9.3/main/pg_log

List the files. You should see that a log file was created during the service restart:

$ ls -l
-rw------- 1 postgres postgres 935 Apr 13 20:30 postgresql-2016-04-13_203022.csv
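
You can peek at the newest entries with tail. Each CSV line contains a fixed set of columns (timestamp, user name, database name, process ID, message, and so on); the line below is abridged and purely illustrative:

$ tail -n 1 postgresql-2016-04-13_203022.csv
2016-04-13 20:30:24.358 UTC,"postgres","postgres",5113,"[local]",...,LOG,00000,"database system is ready to accept connections",...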

Loggly: Log Analysis and Log Monitoring Solution

As mentioned, logging enables us to identify problems and resolve them efficiently. Postgres allows you to use parameters to customize your logs, and a good log setup is important for resolving issues quickly and correctly. To help you do this, I suggest trying SolarWinds® Loggly®, which provides a level of control for managing and accessing logs from any server and database type (including PostgreSQL).


Loggly works well for collecting all-encompassing information that doesn’t require a high level of sorting/analysis, and it handles weblogs, server logs, and other types of information nicely. To try Loggly for free, click here.

This post was written by Daniel de Oliveira. Daniel is a full stack developer and analyst, experienced with Java, Angular, and Flutter.