- 🚀 Fastest full featured PostgreSQL client for Node.js
- 🚯 1200 LOC - 0 dependencies
- 🏷 ES6 Tagged Template Strings at the core
- 🏄‍♀️ Simple surface API
- 🧪 No tests! 0% coverage = infinite coverage!
Install
$ npm install postgres
Use
import postgres from 'postgres'
const sql = postgres({ ...options }) // will default to the same as psql
const something = await sql`
select name, age from users
`
// something = [{ name: 'Murray', age: 68 }, { name: 'Walter', age: 78 }]
Node pre 13.1.0
const postgres = require('postgres/cjs')
You can use either a postgres:// url connection string or an options object to define your database connection properties.
const sql = postgres('postgres://user:pass@host:port/database', {
host : '', // Postgres ip address or domain name (or use hostname)
port : 5432, // Postgres server port
path : '', // unix socket path (usually /tmp)
database : '', // Database to connect to
username : '', // Database user (or use user)
password : '', // Password of the database user (or use pass)
ssl : false, // True, or an object with options to tls.connect
max : 10, // Max number of connections
timeout : 0, // Idle connection timeout in seconds
types : [], // Custom types, see section below
onconnect : null, // Runs before any queries on each connect
onnotice : console.log // Any NOTICE the db sends will be posted here
})
A query will always return a Promise which resolves to either an array [...] or null, depending on the type of query. Destructuring is great to immediately access the first element.
const [new_user] = await sql`
insert into users (
name, age
) values (
'Murray', 68
)
returning *
`
// new_user = { user_id: 1, name: 'Murray', age: 68 }
Parameters are automatically inferred and handled by Postgres so that SQL injection isn't possible. No special handling is necessary, simply use JS tagged template literals as usual.
let search = 'Mur'
const users = await sql`
select
name,
age
from users
where
name like ${ search + '%' }
`
// users = [{ name: 'Murray', age: 68 }]
If you want to handle rows returned by a query one by one, you can use .stream, which returns a promise that resolves once there are no more rows.
await sql`
select created_at, name from events
`.stream(row => {
// row = { created_at: '2019-11-22T14:22:00Z', name: 'connected' }
})
// No more rows
When you call listen, a dedicated connection will automatically be made to ensure that you receive notifications in realtime. This connection will be used for any further calls to listen.
sql.listen('news', payload => {
const json = JSON.parse(payload)
console.log(json.is) // logs 'news'
})
Notify can be done as usual in SQL, or by using the sql.notify method.
sql.notify('news', JSON.stringify({ no: 'this', is: 'news' }))
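Since pg_notify is a regular Postgres function, the same notification can also be sent with a plain query; the NOTIFY command itself doesn't accept query parameters, but pg_notify does. A sketch using the payload above:
// pg_notify lets the channel and payload be passed as query parameters
await sql`
select pg_notify('news', ${ JSON.stringify({ no: 'this', is: 'news' }) })
`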
Due to the nature of sql and Postgres types, various helpers are available to simplify queries.
Sometimes the number of columns can be quite large, so building the values list with sql.row is shorter than writing every column out by hand.
const data = {
user: {
name: 'Murray',
age: 68
}
}
const [user] = await sql`
insert into users (
name, age
) values ${
sql.row(data.user, 'name', 'age')
}
returning *
`
If you need to insert multiple rows at the same time, it's much faster to do it with a single insert. This is easily done using sql.rows:
const data = {
users: [{
name: 'Murray',
age: 68
}, {
name: 'Walter',
age: 78
}]
}
const users = await sql`
insert into users (
name, age
) values ${
sql.rows(data.users, 'name', 'age')
}
returning *
`
Postgres has a native array type which is similar to js arrays, but Postgres only allows the same type and shape for nested items. The sql.array method automatically infers the item type and translates js arrays into Postgres arrays.
const types = await sql`
insert into types (
integers,
strings,
dates,
buffers,
multi
) values (
${ sql.array([1,2,3,4,5]) },
${ sql.array(['Hello', 'Postgres']) },
${ sql.array([new Date(), new Date(), new Date()]) },
${ sql.array([Buffer.from('Hello'), Buffer.from('Postgres')]) },
${ sql.array([[[1,2],[3,4]],[[5,6],[7,8]]]) }
)
`
JSON values can be inserted by passing them to the sql.json helper:
const body = { hello: 'postgres' }
const [{ body: json }] = await sql`
insert into json (
body
) values (
${ sql.json(body) }
)
returning body
`
// json = { hello: 'postgres' }
Calling begin with a function will return a Promise which resolves with the returned value from the function. The function receives a single argument, an sql instance scoped to the newly created transaction. BEGIN is called automatically, and if the Promise fails ROLLBACK will be called; if it succeeds COMMIT will be called.
const [user, account] = await sql.begin(async sql => {
const [user] = await sql`
insert into users (
name
) values (
'Alice'
)
returning *
`
const [account] = await sql`
insert into accounts (
user_id
) values (
${ user.user_id }
)
returning *
`
return [user, account]
})
It is also possible to use savepoints within a transaction via sql.savepoint:
sql.begin(async sql => {
const [user] = await sql`
insert into users (
name
) values (
'Alice'
)
returning *
`
const [account] = (await sql.savepoint(sql =>
sql`
insert into accounts (
user_id
) values (
${ user.user_id }
)
returning *
`
).catch(err => {
// Account could not be created. ROLLBACK TO SAVEPOINT is called because we caught the rejection.
})) || []
return [user, account]
})
.then(([user, account]) => {
// great success - COMMIT succeeded
})
.catch(() => {
// not so good - ROLLBACK was called
})
Do note that you can often achieve the same result using WITH queries (Common Table Expressions) instead of using transactions.
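As a sketch, the user and account inserts from the transaction example above could be collapsed into a single WITH query (assuming the same users and accounts tables):
// One round trip and no explicit transaction: the CTE inserts the user
// and the outer insert reuses the generated user_id.
const [account] = await sql`
with new_user as (
insert into users (
name
) values (
'Alice'
)
returning user_id
)
insert into accounts (
user_id
)
select user_id from new_user
returning *
`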
You can add ergonomic support for custom types, or simply pass an object with a { type, value } signature that contains the Postgres oid for the type and the correctly serialized value.
Adding query helpers is the recommended approach, which can be done like this:
const sql = postgres({
types: {
rect: {
to : 1337,
from : [1337],
serialize : ({ x, y, width, height }) => [x, y, width, height],
parse : ([x, y, width, height]) => ({ x, y, width, height })
}
}
})
const [custom] = await sql`
insert into rectangles (
name,
rect
) values (
'wat',
${ sql.rect({ x: 13, y: 37, width: 42, height: 80 }) }
)
returning *
`
// custom = { name: 'wat', rect: { x: 13, y: 37, width: 42, height: 80 } }
To ensure proper teardown and cleanup on server restarts, use sql.end({ timeout: null }) before process.exit().
Calling sql.end() will reject new queries and return a Promise which resolves when all queries are finished and the underlying connections are closed. If a timeout is provided, any pending queries will be rejected once the timeout is reached and the connections will be destroyed.
Sample shutdown using Prexit
import prexit from 'prexit'
prexit(async () => {
await sql.end({ timeout: 5 })
await new Promise(r => server.close(r))
})
Errors are all thrown to related queries and never globally. Errors coming from Postgres itself are always in the native Postgres format, and the same goes for any Node.js errors, e.g. coming from the underlying connection.
There are also the following errors specifically for this library.
X (X) is not supported
Whenever a message is received from Postgres which is not supported by this library. Feel free to file an issue if you think something is missing.
Message type X not supported
When using SASL authentication the server responds with a signature at the end of the authentication flow which needs to match the one on the client. This is to avoid man in the middle attacks. If you receive this error the connection was canceled because the server did not reply with the expected signature.
X is a reserved method name
When implementing custom types, the name of the type is used for the method added to the sql object. There are a few reserved method names which can't be used; this is one of them.
Query not called as a tagged template literal
Making queries has to be done using the sql function as a tagged template. This is to ensure parameters are serialized and passed to Postgres as query parameters with correct types and to avoid SQL injection.
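A sketch of what triggers this error versus the correct form:
const name = 'Murray'

// Wrong - calling sql as a regular function is rejected with this error
// sql('select * from users where name = ' + name)

// Right - call sql as a tagged template so the parameter is passed safely
await sql`select * from users where name = ${ name }`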
Auth type X not implemented
Postgres supports many different authentication types. This one is not supported.
write CONNECTION_CLOSED host:port
This error is thrown if the connection was closed without an error. This should not happen during normal operation, so please create an issue if this was unexpected.
write CONNECTION_ENDED host:port
This error is thrown if the user has called sql.end() and performed a query afterwards.
write CONNECTION_DESTROYED host:port
This error is thrown for any queries that were pending when the timeout to sql.end({ timeout: X }) was reached.
nb. You can use onnotice to listen to any Postgres NOTICE sent on connections, but note that it will be called for every single connection to the database.
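A minimal sketch of overriding the default:
const sql = postgres({
// Assumed handler shape: called with each NOTICE message the server sends
onnotice: notice => console.log('postgres notice:', notice)
})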
A really big thank you to @JAForbes who introduced me to Postgres and still holds my hand navigating all the great opportunities we have.