- It is important you add an event listener to the pool to catch errors. Just like other event emitters, if a pool emits
- an error event and no listeners are added node will emit an uncaught error and
- potentially exit.
-
+When a client is sitting idly in the pool it can still emit errors because it is connected to a live backend.
-### `pool.on('remove', (client: Client) => void) => void`
+If the backend goes down or a network partition is encountered all the idle, connected clients in your application will emit an error _through_ the pool's error event emitter.
+
+The error listener is passed the error as the first argument and the client upon which the error occurred as the second argument. The client will be automatically terminated and removed from the pool; it is only passed to the error handler in case you want to inspect it.
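+
+A minimal sketch of attaching that listener (exiting the process, as shown here, is just one possible policy):
+
+```js
+const { Pool } = require('pg')
+const pool = new Pool()
+
+// the pool emits an error on behalf of any idle client
+// when a backend error or network partition occurs
+pool.on('error', (err, client) => {
+  console.error('Unexpected error on idle client', err)
+  process.exit(-1)
+})
+```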
+
+
-
- You must
- always return the client to the pool if you successfully check it out, regardless of whether or not there was an error with the queries you ran on the client. If you don't check in the client your application will leak them and eventually your pool will be empty forever and all future requests to check out a client from the pool will wait forever.
+
+
+ You must always return the client to the pool if you successfully check it out, regardless of whether or not
+ there was an error with the queries you ran on the client.
-
+ If you don't release the client, your application will leak clients and eventually your pool will be empty forever and all
+ future requests to check out a client from the pool will wait forever.
+
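+One pattern that guarantees the client is returned is to release it in a `finally` block; a minimal sketch:
+
+```js
+const { Pool } = require('pg')
+const pool = new Pool()
+
+const client = await pool.connect()
+try {
+  const res = await client.query('SELECT NOW()')
+  console.log(res.rows[0])
+} finally {
+  // runs whether the query above resolved or threw
+  client.release()
+}
+```
+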
### Single query
@@ -114,8 +117,8 @@ const pool = new Pool()
pool
.query('SELECT * FROM users WHERE id = $1', [1])
- .then(res => console.log('user:', res.rows[0]))
- .catch(err =>
+ .then((res) => console.log('user:', res.rows[0]))
+ .catch((err) =>
setImmediate(() => {
throw err
})
@@ -128,14 +131,8 @@ Promises allow us to use `async`/`await` in node v8.0 and above (or earlier if y
const { Pool } = require('pg')
const pool = new Pool()
-;(async () => {
- const { rows } = await pool.query('SELECT * FROM users WHERE id = $1', [1])
- console.log('user:', rows[0])
-})().catch(err =>
- setImmediate(() => {
- throw err
- })
-)
+const { rows } = await pool.query('SELECT * FROM users WHERE id = $1', [1])
+console.log('user:', rows[0])
```
### Shutdown
@@ -146,20 +143,18 @@ To shut down a pool call `pool.end()` on the pool. This will wait for all checke
const { Pool } = require('pg')
const pool = new Pool()
-;(async () => {
- console.log('starting async query')
- const result = await pool.query('SELECT NOW()')
- console.log('async query finished')
+console.log('starting async query')
+const result = await pool.query('SELECT NOW()')
+console.log('async query finished')
- console.log('starting callback query')
- pool.query('SELECT NOW()', (err, res) => {
- console.log('callback query finished')
- })
+console.log('starting callback query')
+pool.query('SELECT NOW()', (err, res) => {
+ console.log('callback query finished')
+})
- console.log('calling end')
- await pool.end()
- console.log('pool has drained')
-})()
+console.log('calling end')
+await pool.end()
+console.log('pool has drained')
```
The output of the above will be:
@@ -173,8 +168,6 @@ callback query finished
pool has drained
```
-
-
- The pool will return errors when attempting to check out a client after you've called `pool.end()` on the pool.
-
-
+
+ The pool will return errors when attempting to check out a client after you've called `pool.end()` on the pool.
+
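+A quick sketch of that behavior (the exact error message may vary between versions):
+
+```js
+await pool.end()
+
+try {
+  await pool.query('SELECT 1')
+} catch (err) {
+  // checking out a client after end() rejects with an error
+  console.error(err.message)
+}
+```
+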
diff --git a/docs/pages/features/2-queries.mdx b/docs/pages/features/queries.mdx
similarity index 100%
rename from docs/pages/features/2-queries.mdx
rename to docs/pages/features/queries.mdx
diff --git a/docs/pages/features/6-ssl.mdx b/docs/pages/features/ssl.mdx
similarity index 100%
rename from docs/pages/features/6-ssl.mdx
rename to docs/pages/features/ssl.mdx
diff --git a/docs/pages/features/4-transactions.mdx b/docs/pages/features/transactions.mdx
similarity index 58%
rename from docs/pages/features/4-transactions.mdx
rename to docs/pages/features/transactions.mdx
index 96ddc764..408db52f 100644
--- a/docs/pages/features/4-transactions.mdx
+++ b/docs/pages/features/transactions.mdx
@@ -1,30 +1,62 @@
---
title: Transactions
-slug: /features/transactions
---
+import { Alert } from '/components/alert.tsx'
+
To execute a transaction with node-postgres you simply execute `BEGIN / COMMIT / ROLLBACK` queries yourself through a client. Because node-postgres strives to be low level and un-opinionated, it doesn't provide any higher level abstractions specifically around transactions.
-
+
You must use the same client instance for all statements within a transaction. PostgreSQL
isolates a transaction to individual clients. This means if you initialize or use transactions with the{' '}
pool.query method you will have problems. Do not use transactions with
the pool.query method.
-
+
## Examples
-### A pooled client with callbacks
+### async/await
+
+Things are considerably more straightforward if you're using async/await:
+
+```js
+const { Pool } = require('pg')
+const pool = new Pool()
+
+// note: we don't try/catch this because if connecting throws an exception
+// we don't need to dispose of the client (it will be undefined)
+const client = await pool.connect()
+
+try {
+ await client.query('BEGIN')
+ const queryText = 'INSERT INTO users(name) VALUES($1) RETURNING id'
+ const res = await client.query(queryText, ['brianc'])
+
+ const insertPhotoText = 'INSERT INTO photos(user_id, photo_url) VALUES ($1, $2)'
+ const insertPhotoValues = [res.rows[0].id, 's3.bucket.foo']
+ await client.query(insertPhotoText, insertPhotoValues)
+ await client.query('COMMIT')
+} catch (e) {
+ await client.query('ROLLBACK')
+ throw e
+} finally {
+ client.release()
+}
+```
+
+### callbacks
+
+node-postgres is a very old library, and it still has an optional callback API. Here's the same example as above, but with callbacks:
```js
const { Pool } = require('pg')
const pool = new Pool()
pool.connect((err, client, done) => {
- const shouldAbort = err => {
+ const shouldAbort = (err) => {
if (err) {
console.error('Error in transaction', err.stack)
- client.query('ROLLBACK', err => {
+ client.query('ROLLBACK', (err) => {
if (err) {
console.error('Error rolling back client', err.stack)
}
@@ -35,7 +67,7 @@ pool.connect((err, client, done) => {
return !!err
}
- client.query('BEGIN', err => {
+ client.query('BEGIN', (err) => {
if (shouldAbort(err)) return
const queryText = 'INSERT INTO users(name) VALUES($1) RETURNING id'
client.query(queryText, ['brianc'], (err, res) => {
@@ -46,7 +78,7 @@ pool.connect((err, client, done) => {
client.query(insertPhotoText, insertPhotoValues, (err, res) => {
if (shouldAbort(err)) return
- client.query('COMMIT', err => {
+ client.query('COMMIT', (err) => {
if (err) {
console.error('Error committing transaction', err.stack)
}
@@ -58,39 +90,4 @@ pool.connect((err, client, done) => {
})
```
-
-
- I omitted any additional libraries from the example for clarity, but if you're using callbacks you'd typically be
- using a flow control library like
- async.
-
-
-
-### A pooled client with async/await
-
-Things are considerably more straightforward if you're using async/await:
-
-```js
-const { Pool } = require('pg')
-const pool = new Pool()
-;(async () => {
- // note: we don't try/catch this because if connecting throws an exception
- // we don't need to dispose of the client (it will be undefined)
- const client = await pool.connect()
-
- try {
- await client.query('BEGIN')
- const queryText = 'INSERT INTO users(name) VALUES($1) RETURNING id'
- const res = await client.query(queryText, ['brianc'])
-
- const insertPhotoText = 'INSERT INTO photos(user_id, photo_url) VALUES ($1, $2)'
- const insertPhotoValues = [res.rows[0].id, 's3.bucket.foo']
- await client.query(insertPhotoText, insertPhotoValues)
- await client.query('COMMIT')
- } catch (e) {
- await client.query('ROLLBACK')
- throw e
- } finally {
- client.release()
- }
-})().catch(e => console.error(e.stack))
-```
+...thank goodness for `async/await`, yeah?
diff --git a/docs/pages/features/5-types.mdx b/docs/pages/features/types.mdx
similarity index 90%
rename from docs/pages/features/5-types.mdx
rename to docs/pages/features/types.mdx
index 929b9718..65c814ba 100644
--- a/docs/pages/features/5-types.mdx
+++ b/docs/pages/features/types.mdx
@@ -1,8 +1,9 @@
---
title: Data Types
-slug: /features/types
---
+import { Alert } from '/components/alert.tsx'
+
PostgreSQL has a rich system of supported [data types](https://www.postgresql.org/docs/9.5/static/datatype.html). node-postgres does its best to support the most common data types out of the box and supplies an extensible type parser to allow for custom type serialization and parsing.
## strings by default
@@ -83,7 +84,7 @@ console.log(result.rows)
psql output:
-```psql
+```
bmc=# select * from dates;
date_col | timestamp_col | timestamptz_col
------------+-------------------------+----------------------------
@@ -95,8 +96,11 @@ node-postgres converts `DATE` and `TIMESTAMP` columns into the **local** time of
_note: I generally use `TIMESTAMPTZ` when storing dates; otherwise, inserting a time from a process in one timezone and reading it out in a process in another timezone can cause unexpected differences in the time._
-
-
- Although PostgreSQL supports microseconds in dates, JavaScript only supports dates to the millisecond precision. Keep this in mind when you send dates to and from PostgreSQL from node: your microseconds will be truncated when converting to a JavaScript date object even if they exist in the database. If you need to preserve them, I recommend using a custom type parser.
+
+
+ Although PostgreSQL supports microseconds in dates, JavaScript only supports dates to the millisecond precision.
+ Keep this in mind when you send dates to and from PostgreSQL from node: your microseconds will be truncated when
+ converting to a JavaScript date object even if they exist in the database. If you need to preserve them, I recommend
+ using a custom type parser.
-
+
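+A sketch of such a parser, returning the raw string instead of a `Date` (`1114` and `1184` are the type OIDs for `TIMESTAMP` and `TIMESTAMPTZ`):
+
+```js
+const types = require('pg').types
+
+// keep timestamp columns as the raw postgres string,
+// preserving microsecond precision
+types.setTypeParser(1114, (val) => val)
+types.setTypeParser(1184, (val) => val)
+```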
diff --git a/docs/pages/guides/_meta.json b/docs/pages/guides/_meta.json
new file mode 100644
index 00000000..3889a099
--- /dev/null
+++ b/docs/pages/guides/_meta.json
@@ -0,0 +1,5 @@
+{
+ "project-structure": "Suggested Code Structure",
+ "async-express": "Express with Async/Await",
+ "upgrading": "Upgrading"
+}
diff --git a/docs/pages/guides/2-async-express.md b/docs/pages/guides/async-express.md
similarity index 97%
rename from docs/pages/guides/2-async-express.md
rename to docs/pages/guides/async-express.md
index 5e64a03e..3be6d955 100644
--- a/docs/pages/guides/2-async-express.md
+++ b/docs/pages/guides/async-express.md
@@ -1,6 +1,5 @@
---
title: Express with async/await
-slug: /guides/async-express
---
My preferred way to use node-postgres (and all async code in node.js) is with `async/await`. I find it makes reasoning about control-flow easier and allows me to write more concise and maintainable code.
@@ -61,7 +60,7 @@ Then in my `routes/index.js` file I'll have something like this which mounts eac
const users = require('./user')
const photos = require('./photos')
-module.exports = app => {
+module.exports = (app) => {
app.use('/users', users)
app.use('/photos', photos)
// etc..
diff --git a/docs/pages/guides/1-project-structure.md b/docs/pages/guides/project-structure.md
similarity index 94%
rename from docs/pages/guides/1-project-structure.md
rename to docs/pages/guides/project-structure.md
index 95d9ea97..742451da 100644
--- a/docs/pages/guides/1-project-structure.md
+++ b/docs/pages/guides/project-structure.md
@@ -1,6 +1,5 @@
---
title: Suggested Project Structure
-slug: /guides/project-structure
---
Whenever I am writing a project & using node-postgres I like to create a file within it and make all interactions with the database go through this file. This serves a few purposes:
@@ -81,7 +80,7 @@ module.exports = {
That was pretty quick! And now all of our queries everywhere in our application are being logged.
-_note: I didn't log the query parameters. Depending on your application you might be storing encrypted passwords or other sensitive information in your database. If you log your query parameters you might accidentally log sensitive information. Every app is different though so do what suits you best!_
+_note: I didn't log the query parameters. Depending on your application you might be storing encrypted passwords or other sensitive information in your database. If you log your query parameters you might accidentally log sensitive information. Every app is different though so do what suits you best!_
Now what if we need to check out a client from the pool to run several queries in a row in a transaction? We can add another method to our `db/index.js` file when we need to do this:
@@ -103,7 +102,7 @@ module.exports = {
pool.connect((err, client, done) => {
callback(err, client, done)
})
- }
+ },
}
```
@@ -152,13 +151,13 @@ module.exports = {
callback(err, client, release)
})
- }
+ },
}
```
Using async/await:
-```
+```js
module.exports = {
async query(text, params) {
const start = Date.now()
@@ -191,7 +190,8 @@ module.exports = {
return release.apply(client)
}
return client
- }
+ },
}
```
+
That should hopefully give us enough diagnostic information to track down any leaks.
diff --git a/docs/pages/guides/3-upgrading.md b/docs/pages/guides/upgrading.md
similarity index 100%
rename from docs/pages/guides/3-upgrading.md
rename to docs/pages/guides/upgrading.md
diff --git a/docs/pages/index.mdx b/docs/pages/index.mdx
index a345900e..234cf11e 100644
--- a/docs/pages/index.mdx
+++ b/docs/pages/index.mdx
@@ -15,13 +15,13 @@ $ npm install pg
node-postgres continued development and support is made possible by the many [supporters](https://github.com/brianc/node-postgres/blob/master/SPONSORS.md) with a special thanks to our featured supporters:
-