docs(connections): clarify maxRetriesPerRequest usage (#3028)
roggervalf authored Jan 25, 2025
1 parent 7e18e19 commit 3709687
Showing 4 changed files with 29 additions and 17 deletions.
3 changes: 0 additions & 3 deletions docs/gitbook/bull/patterns/persistent-connections.md
@@ -27,6 +27,3 @@ A simple Queue instance used for managing the queue such as adding jobs, pausing
For example, say that you are adding jobs to a queue as the result of a call to an HTTP endpoint. The caller of this endpoint cannot wait forever if the connection to Redis happens to be down when this call is made.

Therefore the `maxRetriesPerRequest` setting should either be left at its default (currently 20) or set to another value, perhaps 1, so that the user gets an error quickly and can retry later.



8 changes: 8 additions & 0 deletions docs/gitbook/guide/connections.md
@@ -53,6 +53,14 @@ const mySecondWorker = new Worker('mySecondWorker', async job => {}, {

Note that in the third example, even though the ioredis instance is being reused, the worker will create a duplicated connection that it needs internally to make blocking connections. Consult the [ioredis](https://github.com/luin/ioredis/blob/master/API.md) documentation to learn how to properly create an instance of `IORedis`.

#### `maxRetriesPerRequest`

This setting tells the ioredis client how many times to retry a command that fails before throwing an error. So even if Redis is unreachable or offline, the command will be retried until the situation changes or the maximum number of attempts is reached.

This guarantees that workers will keep processing forever as long as there is a working connection. If you create a Redis client manually, BullMQ will throw an exception if this setting is not set to `null` when the client is passed to worker instances.
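
As a minimal sketch of what this looks like in practice (the connection options and the `my-queue` name below are illustrative, not part of this commit):

```typescript
import { Worker } from 'bullmq';
import IORedis from 'ioredis';

// Workers need maxRetriesPerRequest set to null so that blocking
// commands keep being retried while the connection is down.
const connection = new IORedis({
  host: 'localhost', // hypothetical local Redis instance
  maxRetriesPerRequest: null,
});

const worker = new Worker(
  'my-queue', // hypothetical queue name
  async job => {
    // process job.data here
  },
  { connection },
);
```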

### Queue

Also note that a simple Queue instance used for managing the queue, such as adding jobs, pausing, using getters, etc., usually has different requirements from the worker.

For example, say that you are adding jobs to a queue as the result of a call to an HTTP endpoint in a producer service. The caller of this endpoint cannot wait forever if the connection to Redis happens to be down when the call is made. Therefore the `maxRetriesPerRequest` setting should either be left at its default (currently 20) or set to another value, perhaps 1, so that the user gets an error quickly and can retry later.
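
A minimal sketch of such a producer, assuming a queue named `my-queue` and a hypothetical `enqueuePayload` helper called from the HTTP endpoint:

```typescript
import { Queue } from 'bullmq';
import IORedis from 'ioredis';

// Fail fast on the producer side: retry each command only once so the
// HTTP caller gets an error quickly instead of waiting on a dead connection.
const connection = new IORedis({ maxRetriesPerRequest: 1 });

const queue = new Queue('my-queue', { connection });

// Called from the HTTP endpoint; a Redis connection error surfaces
// quickly and can be reported back to the caller.
export async function enqueuePayload(payload: Record<string, unknown>) {
  return queue.add('process-payload', payload);
}
```
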
4 changes: 4 additions & 0 deletions docs/gitbook/guide/metrics/README.md
@@ -47,3 +47,7 @@ const metrics = await queue.getMetrics('completed');
```

Note that the `getMetrics` method also accepts `start` and `end` arguments (`0` and `-1` by default) that you can use if you want to implement pagination.
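
For instance, a small sketch of fetching the first page of ten data points (the `my-queue` name is illustrative):

```typescript
import { Queue } from 'bullmq';

const queue = new Queue('my-queue');

// Fetch data points 0..9 for completed jobs; request subsequent pages
// with start = 10, 20, and so on.
const metricsPage = await queue.getMetrics('completed', 0, 9);

console.log(metricsPage);
```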

## Read more:

- 💡 [Get Metrics API Reference](https://api.docs.bullmq.io/classes/v5.Queue.html#getMetrics)
31 changes: 17 additions & 14 deletions docs/gitbook/guide/metrics/prometheus.md
@@ -7,31 +7,31 @@ description: How to use the built-in prometheus exporter
BullMQ provides a simple API that can be used to export metrics to Prometheus. You just need to create an endpoint in your web server that calls `exportPrometheusMetrics()` and configure Prometheus to consume from this endpoint. For example, using vanilla Node.js:

```typescript
-import http from "http";
-import { Queue } from "bullmq";
+import http from 'http';
+import { Queue } from 'bullmq';

-const queue = new Queue("my-queue");
+const queue = new Queue('my-queue');

 const server = http.createServer(
   async (req: http.IncomingMessage, res: http.ServerResponse) => {
     try {
-      if (req.url === "/metrics" && req.method === "GET") {
-        const metrics = await queue.exportMetrics();
+      if (req.url === '/metrics' && req.method === 'GET') {
+        const metrics = await queue.exportPrometheusMetrics();

         res.writeHead(200, {
-          "Content-Type": "text/plain",
-          "Content-Length": Buffer.byteLength(metrics),
+          'Content-Type': 'text/plain',
+          'Content-Length': Buffer.byteLength(metrics),
         });
         res.end(metrics);
       } else {
         res.writeHead(404);
-        res.end("Not Found");
+        res.end('Not Found');
       }
     } catch (err: unknown) {
       res.writeHead(500);
-      res.end(`Error: ${err instanceof Error ? err.message : "Unknown error"}`);
+      res.end(`Error: ${err instanceof Error ? err.message : 'Unknown error'}`);
     }
-  }
+  },
 );

 const PORT = process.env.PORT || 3000;
@@ -53,9 +53,9 @@

You will get an output similar to this:
```
# HELP bullmq_job_count Number of jobs in the queue by state
# TYPE bullmq_job_count gauge
bullmq_job_count{queue="my-queue", state="waiting"} 5
bullmq_job_count{queue="my-queue", state="active"} 3
bullmq_job_count{queue="my-queue", state="completed"} 12
bullmq_job_count{queue="my-queue", state="failed"} 2
```

@@ -70,7 +70,7 @@ const queue = new Queue('my-queue');

 app.get('/metrics', async (req, res) => {
   try {
-    const metrics = await queue.exportMetrics();
+    const metrics = await queue.exportPrometheusMetrics();
     res.set('Content-Type', 'text/plain');
     res.send(metrics);
   } catch (err) {
@@ -86,3 +86,6 @@ app.listen(PORT, () => {
});
```

## Read more:

- 💡 [Export Prometheus Metrics API Reference](https://api.docs.bullmq.io/classes/v5.Queue.html#exportPrometheusMetrics)
