Posts

Queries on Vacuum

I am (slowly) adding handy PostgreSQL queries to my GitHub, and Vacuum is the newest category. The end goal is to have a compilation of queries for those of us who need to keep an instance healthy. Over the years, I have collected hundreds of various queries and hate hunting them down in my code snippet library. Finally, they will be in one place and easy to search. Please contribute if you have similar or better queries (hint, hint!).
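
To give a taste of what belongs in the Vacuum category, here is a minimal sketch (my own illustration, not necessarily one of the queries in the repo) that checks dead-tuple counts and the last vacuum times per table:

-- A sketch of a vacuum-health check: dead tuples and last (auto)vacuum per table.
-- Illustrative only, not necessarily a query from the repository.
SELECT relname,
       n_live_tup,
       n_dead_tup,
       last_vacuum,
       last_autovacuum
FROM   pg_stat_user_tables
ORDER  BY n_dead_tup DESC
LIMIT  10;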

Incremental Backups in PostgreSQL 17

The old adage that a DBA or SRE is only as good as their last backup is true. PostgreSQL 17 added the ability to combine multiple incremental backups with a full backup to provide a complete data directory to recover a failed system. It is very easy to use. This is a quick example of using incrementals. I recommend watching this video by Robert Haas for more details and some discussion of backup strategies.

Step 1 - Enable WAL Summarization

Incremental backups require Write Ahead Log (WAL) summarization. It is likely not on your instance by default, but it is easy to enable.

demo=# alter system set summarize_wal = 'on';
ALTER SYSTEM
demo=# select pg_reload_conf();
 pg_reload_conf
----------------
 t
(1 row)

Step 2 - Full Backup

The examples below are much simpler than what you will run into in real life. The scenario is that we make a full backup on Sunday and do incrementals the rest of the week. Wednesday through Saturday are omitted below.

stoker@ThinkPa...
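
For context, here is a minimal sketch of the full/incremental/combine cycle. The directory names and days are my own assumptions for illustration, not the post's actual paths:

# Sunday: full backup (paths are illustrative assumptions)
pg_basebackup -D /backups/sun_full

# Monday: incremental backup based on Sunday's manifest
pg_basebackup -D /backups/mon_incr --incremental=/backups/sun_full/backup_manifest

# Tuesday: incremental backup based on Monday's manifest
pg_basebackup -D /backups/tue_incr --incremental=/backups/mon_incr/backup_manifest

# To restore: combine the chain, oldest first, into a complete data directory
pg_combinebackup /backups/sun_full /backups/mon_incr /backups/tue_incr -o /backups/restored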

A Repository Of Handy Database Administration Queries

I have started a repository of handy SQL queries for database administration that I have collected over the years. Finding unused or duplicate indexes, or queries that lack indexes, can speed up response times. Knowing statistics like cache hit rates can aid in planning upgrades. I am currently populating this repo as I run across the queries as part of my 'spring cleaning.' I welcome your participation if you have a handy bit of SQL for an open-source database you'd like to share. The repo is at https://github.com/davestokes/HandySQL, and I will be adding to it over the next few months. If there is a question you would like to see addressed, please don't hesitate to let me know. I will try to accommodate you.
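
To give a flavor of the kind of query the repo collects, here is a minimal sketch (my own illustration, not necessarily an entry in the repo) that looks for indexes that have never been scanned:

-- A sketch: indexes never used since statistics were last reset.
-- Illustrative only; confirm usage history before dropping anything.
SELECT schemaname,
       relname      AS table_name,
       indexrelname AS index_name,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size
FROM   pg_stat_user_indexes
WHERE  idx_scan = 0
ORDER  BY pg_relation_size(indexrelid) DESC;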

Saving The Output From psql

Occasionally, you will need to capture data when working with a database. You can cut-n-paste or use a shell command like script. In typical PostgreSQL fashion, psql has two ways to save your output.

The First Way

You must invoke psql with the --log-file=<filename> or -L <filename> option. The entire session will be recorded.

stoker@ThinkPad:~$ psql --log-file=/tmp/psqllog demo
Password for user stoker:
psql (17.4 (Ubuntu 17.4-1.pgdg24.04+2))
Type "help" for help.
demo=# \d ledger
                   Table "public.ledger"
     Column     |  Type   | Collation | Nullable | Default
----------------+---------+-----------+----------+---------
 id             | integer |           |          |
 tx_id          | integer |        ...
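
The excerpt cuts off before the second method. As a hedged aside (this may or may not be the approach the full post describes), psql's \o meta-command is another common way to capture output: it redirects subsequent query results to a file, and \o with no argument switches back to the terminal.

demo=# \o /tmp/query_output.txt
demo=# select * from ledger limit 5;
demo=# \o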

PostgreSQL's COPY and \COPY

PostgreSQL is equivalent to a Swiss Army Knife in the database world. There are things in PostgreSQL that are very simple to use, while in another database they take many more steps to accomplish. But sometimes, the knife has too many blades, which can cause confusion. This is one of those cases.

COPY and \COPY

I needed a quick dump of a table as part of an ETL (Extract, Transform, and Load - official definition; Exorcism, Trauma, and Lost-sleep - unofficially) to port some data. The COPY command is a quick way to output and write the data to a file. The two copy commands below illustrate adding a delimiter and then writing the output to a file.

demo=# select * from z;
 a | b  |  c
---+----+-----
 1 | 10 | 100
 2 | 20 | 200
 3 | 30 | 300
 4 | 40 | 400
(4 rows)

demo=# copy z to stdout (DELIMITER ',');
1,10,100
2,20,200
3,30,300
4,40,400
demo=# copy z to '/tmp/z' (DELIMITER ',');
COPY 4
demo=#

The trick is that you must...
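
The excerpt is cut off at the key caveat, but the usual distinction (which may be what the post goes on to explain) is that server-side COPY writes files on the database server and requires the appropriate privileges there, while the \copy meta-command runs on the client. A hedged sketch of the client-side counterpart, my illustration rather than the post's:

-- A sketch, not taken from the post: \copy runs on the client side,
-- so the output file lands on the client machine, not the server.
\copy z to '/tmp/z_client.csv' with (format csv, delimiter ',')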

Can Artificial Intelligence Create Better Tables Than You?

Artificial Intelligence is one of those conundrums where the ability to have some tasks handled for you contrasts with a mix of ego and pride that it may be better than you at those tasks. I recently wrote a blog, using another database, about an AI-generated SQL query that was quite sophisticated. But what about asking an AI to create a table from a specification like a DBA/SRE/Analyst might receive? I used Grok and entered the following prompt:

Write the SQL to create a table on a PostgreSQL version 17 server to record data on customers. The data will include a first name, a last name, an address, a birthdate, and a unique UUID primary key. Partition the data by the year of birth. And produce some test data of at least ten records.

I am sure many of us have started large projects given less instruction. Notice: I did not denote the format for the address (US Post Office's format, or UK, or other). Nor did I set a name length. I wanted to see what assumptions were made by t...
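
The excerpt stops before Grok's answer, so as a point of comparison only, here is a minimal sketch of one way a human might satisfy that spec in PostgreSQL 17. This is not the AI-generated output from the post, and note that on a partitioned table the primary key must include the partition key, so a UUID alone cannot be the sole primary key column:

-- A sketch for comparison, not the AI-generated answer from the post.
-- PostgreSQL requires the partition key in the primary key, hence (customer_id, birthdate).
CREATE TABLE customers (
    customer_id uuid NOT NULL DEFAULT gen_random_uuid(),
    first_name  text NOT NULL,
    last_name   text NOT NULL,
    address     text,
    birthdate   date NOT NULL,
    PRIMARY KEY (customer_id, birthdate)
) PARTITION BY RANGE (birthdate);

-- One partition per birth year, e.g. for customers born in 1990:
CREATE TABLE customers_1990 PARTITION OF customers
    FOR VALUES FROM ('1990-01-01') TO ('1991-01-01');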

Use PASSING with JSON_TABLE() To Make Calculations

I ran across this trick while preparing a talk for the Southern California Linux Expo (SCaLE) on using JSON_TABLE(). This is a 'reserve' talk, which means it will be presented IF another speaker drops out. But I thought this was interesting enough to share here. I will make sure the slides are posted online when finished.

Passing Interest

JSON_TABLE() temporarily converts JSON data into a relational table so that the temporary table can be processed by Structured Query Language (SQL) commands. A feature in the PostgreSQL implementation is the PASSING clause. In the example below, you will see PASSING 1500 AS price_limit in red. This declares price_limit as a variable and assigns a value of 1500 to it. Later, in the body of the JSON_TABLE() call, you will see in green is_too_expensive BOOLEAN EXISTS PATH '$.price ? (@ > $price_limit)'. Here a column named is_too_expensive of type BOOLEAN is created, which means we get a true or false answer from the r...
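
The excerpt is cut off before the full example, so here is a minimal sketch of a complete JSON_TABLE() call using PASSING. The JSON data is invented for illustration; this is my own sketch, not the example from the talk:

-- A sketch of PASSING with JSON_TABLE(); the sample JSON is made up for illustration.
SELECT jt.*
FROM JSON_TABLE(
         '[{"item": "laptop", "price": 1999},
           {"item": "mouse",  "price": 25}]'::jsonb,
         '$[*]'
         PASSING 1500 AS price_limit
         COLUMNS (
             item             text    PATH '$.item',
             price            numeric PATH '$.price',
             is_too_expensive boolean EXISTS PATH '$.price ? (@ > $price_limit)'
         )
     ) AS jt;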