4.7.4 Using CSV-Format Log Output

Including csvlog in the log_destination list provides a convenient way to import log files into a database table. This option emits log lines in comma-separated-values (CSV) format, with these columns: timestamp with milliseconds, user name, database name, process ID, client host:port number, session ID, per-session line number, command tag, session start time, virtual transaction ID, regular transaction ID, error severity, SQLSTATE code, error message, error message detail, hint, internal query that led to the error (if any), character count of the error position therein, error context, user query that led to the error (if any and enabled by log_min_error_statement), character count of the error position therein, location of the error in the PostgreSQL source code (if log_error_verbosity is set to verbose), and application name. Here is a sample table definition for storing CSV-format log output:

CREATE TABLE postgres_log
(
  log_time timestamp(3) with time zone,
  user_name text,
  database_name text,
  process_id integer,
  connection_from text,
  session_id text,
  session_line_num bigint,
  command_tag text,
  session_start_time timestamp with time zone,
  virtual_transaction_id text,
  transaction_id bigint,
  error_severity text,
  sql_state_code text,
  message text,
  detail text,
  hint text,
  internal_query text,
  internal_query_pos integer,
  context text,
  query text,
  query_pos integer,
  location text,
  application_name text,
  PRIMARY KEY (session_id, session_line_num)
);

To import a log file into this table, use the COPY FROM command:

COPY postgres_log FROM '/full/path/to/logfile.csv' WITH csv;
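
Once the data is imported it can be queried like any other table. For example, the following query (the severity filter here is only an illustration, not part of the procedure above) lists all messages of severity ERROR or higher in time order:

SELECT log_time, user_name, database_name, message
  FROM postgres_log
 WHERE error_severity IN ('ERROR', 'FATAL', 'PANIC')
 ORDER BY log_time;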

There are a few things you need to do to simplify importing CSV log files (a sample configuration appears after this list):

  1. Set log_filename and log_rotation_age to provide a consistent, predictable naming scheme for your log files. This lets you predict what the file name will be and know when an individual log file is complete and therefore ready to be imported.
  2. Set log_rotation_size to 0 to disable size-based log rotation, as it makes the log file name difficult to predict.
  3. Set log_truncate_on_rotation to on so that old log data isn't mixed with the new in the same file.
  4. The table definition above includes a primary key specification. This is useful to protect against accidentally importing the same information twice. The COPY command commits all of the data it imports at one time, so any error will cause the entire import to fail. If you import a partial log file and later import the file again when it is complete, the primary key violation will cause the import to fail. Wait until the log is complete and closed before importing. This procedure will also protect against accidentally importing a partial line that hasn't been completely written, which would also cause COPY to fail.
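
Putting steps 1 through 3 together, a postgresql.conf fragment might look like the sketch below; the daily file name pattern and the rotation age are illustrative choices, not requirements:

log_destination = 'csvlog'           # emit CSV-format log lines
logging_collector = on               # required: the log collector process writes the files
log_filename = 'postgresql-%a.log'   # one file per weekday; CSV output gets a .csv name derived from this
log_rotation_age = 1440              # rotate every 24 hours (value is in minutes)
log_rotation_size = 0                # disable size-based rotation (step 2)
log_truncate_on_rotation = on        # overwrite the previous week's file (step 3)

With a scheme like this, a log file whose rotation period has passed is known to be complete and can be imported safely, and the primary key described in step 4 will reject any accidental re-import.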