diff --git a/docs/child-loggers.md b/docs/child-loggers.md
index 6890e8419..13b6ebc2d 100644
--- a/docs/child-loggers.md
+++ b/docs/child-loggers.md
@@ -71,7 +71,7 @@ $ cat my-log
{"pid":95469,"hostname":"MacBook-Pro-3.home","level":30,"msg":"howdy","time":1459534114473,"a":"property","a":"prop"}
```
-Notice how there are two keys named `a` in the JSON output. The sub-childs properties
+Notice how there are two keys named `a` in the JSON output. The sub-child's properties
appear after the parent child properties.
At some point, the logs will most likely be processed (for instance with a [transport](transports.md)),
@@ -83,7 +83,7 @@ $ cat my-log | node -e "process.stdin.once('data', (line) => console.log(JSON.st
{"pid":95469,"hostname":"MacBook-Pro-3.home","level":30,"msg":"howdy","time":"2016-04-01T18:08:34.473Z","a":"prop"}
```
-Ultimately the conflict is resolved by taking the last value, which aligns with Bunyans child logging
+Ultimately the conflict is resolved by taking the last value, which aligns with Bunyan's child logging
behavior.
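A minimal illustration of that last-value behavior (not part of the patched doc; the log line below is trimmed for brevity):

```js
// JSON.parse keeps the last occurrence of a duplicated key,
// so the sub-child's value wins once the line is parsed.
const line = '{"level":30,"msg":"howdy","a":"property","a":"prop"}'
console.log(JSON.parse(line).a) // => 'prop'
```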
There may be cases where this edge case becomes problematic if a JSON parser with alternative behavior
diff --git a/docs/help.md b/docs/help.md
index 53b0ca4ee..7caf422c4 100644
--- a/docs/help.md
+++ b/docs/help.md
@@ -94,7 +94,7 @@ See [`pino.multistream`](/docs/api.md#pino-multistream).
## Log Filtering
-The Pino philosophy advocates common, pre-existing, system utilities.
+The Pino philosophy advocates common, preexisting system utilities.
Some recommendations in line with this philosophy are:
@@ -153,7 +153,7 @@ for information on this is handled.
## Log levels as labels instead of numbers
-Pino log lines are meant to be parseable. Thus, Pino's default mode of operation
+Pino log lines are meant to be parsable. Thus, Pino's default mode of operation
is to print the level value instead of the string name.
However, you can use the [`formatters`](/docs/api.md#formatters-object) option
with a [`level`](/docs/api.md#level) function to print the string name instead of the level value :
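The doc's own snippet falls outside this hunk; a sketch of what such a formatter looks like, assuming the standard `formatters.level(label, number)` signature:

```js
const pino = require('pino')

const logger = pino({
  formatters: {
    // `label` is the level name ('info', 'warn', ...), `number` its numeric value
    level (label, number) {
      return { level: label }
    }
  }
})

logger.info('hello') // => {"level":"info", ..., "msg":"hello"}
```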
diff --git a/docs/redaction.md b/docs/redaction.md
index 66dcae5c0..9b7e4ff09 100644
--- a/docs/redaction.md
+++ b/docs/redaction.md
@@ -100,7 +100,7 @@ See [pino options in API](/docs/api.md#redact-array-object) for `redact` API det
## Path Syntax
The syntax for paths supplied to the `redact` option conform to the syntax in path lookups
-in standard EcmaScript, with two additions:
+in standard ECMAScript, with two additions:
* paths may start with bracket notation
* paths may contain the asterisk `*` to denote a wildcard
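For example (a sketch; the property names are illustrative, not from the patched doc):

```js
const pino = require('pino')

const logger = pino({
  redact: [
    'path.to.key',            // ordinary dot-notation path
    '["x-api-key"]',          // path starting with bracket notation
    'stuff.thats[*].secret'   // * wildcard: redacts `secret` in every member of `thats`
  ]
})
```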
diff --git a/docs/transports.md b/docs/transports.md
index a47543f89..3dadc3ce9 100644
--- a/docs/transports.md
+++ b/docs/transports.md
@@ -326,7 +326,7 @@ const transport = pino.transport({
pino(transport)
```
-The `options.destination` property may also be a number to represent a filedescriptor. Typically this would be `1` to write to STDOUT or `2` to write to STDERR. If `options.destination` is not set, it defaults to `1` which means logs will be written to STDOUT. If `options.destination` is a string integer, e.g. `'1'`, it will be coerced to a number and used as a file descriptor. If this is not desired, provide a full path, e.g. `/tmp/1`.
+The `options.destination` property may also be a number to represent a file descriptor. Typically this would be `1` to write to STDOUT or `2` to write to STDERR. If `options.destination` is not set, it defaults to `1`, which means logs will be written to STDOUT. If `options.destination` is a string integer, e.g. `'1'`, it will be coerced to a number and used as a file descriptor. If this is not desired, provide a full path, e.g. `/tmp/1`.
The difference between using the `pino/file` transport builtin and using `pino.destination` is that `pino.destination` runs in the main thread, whereas `pino/file` sets up `pino.destination` in a worker thread.
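To make the `options.destination` paragraph concrete, a sketch (assumed, not taken from the surrounding doc) that sends `pino/file` output to STDERR via file descriptor `2`:

```js
const pino = require('pino')

// 'pino/file' runs pino.destination() in a worker thread.
// destination: 2 is the STDERR file descriptor; a string such as '/tmp/app.log'
// (or '/tmp/1', to avoid fd coercion) is treated as a path instead.
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: 2 }
})

pino(transport)
```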
@@ -466,7 +466,7 @@ Given an application `foo` that logs via pino, you would use `pino-applicationin
$ node foo | pino-applicationinsights --key blablabla
```
-For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-applicationinsights#readme)
+For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-applicationinsights#readme)
### pino-azuretable
@@ -478,7 +478,7 @@ Given an application `foo` that logs via pino, you would use `pino-azuretable` l
$ node foo | pino-azuretable --account storageaccount --key blablabla
```
-For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-azuretable#readme)
+For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-azuretable#readme)
### pino-cloudwatch
@@ -514,7 +514,7 @@ Given an application `foo` that logs via pino, you would use `pino-datadog` like
$ node foo | pino-datadog --key blablabla
```
-For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-datadog#readme)
+For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-datadog#readme)
### pino-elasticsearch
@@ -530,9 +530,9 @@ $ node app.js | pino-elasticsearch
Assuming Elasticsearch is running on localhost.
-To connect to an external elasticsearch instance (recommended for production):
+To connect to an external Elasticsearch instance (recommended for production):
-* Check that `network.host` is defined in the `elasticsearch.yml` configuration file. See [elasticsearch Network Settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#common-network-settings) for more details.
+* Check that `network.host` is defined in the `elasticsearch.yml` configuration file. See [Elasticsearch Network Settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#common-network-settings) for more details.
* Launch:
```sh
@@ -594,7 +594,7 @@ $ node index.js | pino-kafka -b 10.10.10.5:9200 -d mytopic
$ node index.js | pino-logdna --key YOUR_INGESTION_KEY
```
-Tags and other metadata can be included using the available command line options. See the [pino-logdna readme](https://github.com/logdna/pino-logdna#options) for a full list.
+Tags and other metadata can be included using the available command line options. See the [pino-logdna README](https://github.com/logdna/pino-logdna#options) for a full list.
### pino-logflare
@@ -631,7 +631,7 @@ A base configuration file can be initialized with:
pino-mq -g
```
-For full documentation of command line switches and configuration see [the `pino-mq` readme](https://github.com/itavy/pino-mq#readme)
+For full documentation of command line switches and configuration see [the `pino-mq` README](https://github.com/itavy/pino-mq#readme)
### pino-loki
@@ -653,7 +653,7 @@ const transport = pino.transport({
pino(transport)
```
-For full documentation and configuration, see the [readme](https://github.com/Julien-R44/pino-loki).
+For full documentation and configuration, see the [README](https://github.com/Julien-R44/pino-loki).
### pino-papertrail
@@ -667,12 +667,12 @@ node yourapp.js | pino-papertrail --host bar.papertrailapp.com --port 12345 --ap
```
-for full documentation of command line switches read [readme](https://github.com/ovhemert/pino-papertrail#readme)
+For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-papertrail#readme)
### pino-pg
[pino-pg](https://www.npmjs.com/package/pino-pg) stores logs into PostgreSQL.
-Full documentation in the [readme](https://github.com/Xstoudi/pino-pg).
+Full documentation in the [README](https://github.com/Xstoudi/pino-pg).
### pino-mysql
@@ -686,7 +686,7 @@ $ node app.js | pino-mysql -c db-configuration.json
`pino-mysql` can extract and save log fields into corresponding database fields
and/or save the entire log stream as a [JSON Data Type][JSONDT].
-For full documentation and command line switches read the [readme][pino-mysql].
+For full documentation and command line switches read the [README][pino-mysql].
[pino-mysql]: https://www.npmjs.com/package/pino-mysql
[MySQL]: https://www.mysql.com/
@@ -714,7 +714,7 @@ $ node app.js | pino-redis -U redis://username:password@localhost:6379
$ node app.js | pino-sentry --dsn=https://******@sentry.io/12345
```
-For full documentation of command line switches see the [pino-sentry readme](https://github.com/aandrewww/pino-sentry/blob/master/README.md).
+For full documentation of command line switches see the [pino-sentry README](https://github.com/aandrewww/pino-sentry/blob/master/README.md).
[pino-sentry]: https://www.npmjs.com/package/pino-sentry
[Sentry]: https://sentry.io/
@@ -896,7 +896,7 @@ like so:
$ node foo | pino-stackdriver --project bar --credentials /credentials.json
```
-For full documentation of command line switches read [readme](https://github.com/ovhemert/pino-stackdriver#readme)
+For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-stackdriver#readme)
### pino-syslog
diff --git a/docs/web.md b/docs/web.md
index 0882c7aeb..adf2ac6d4 100644
--- a/docs/web.md
+++ b/docs/web.md
@@ -64,7 +64,7 @@ app.get('/', function (req, res) {
app.listen(3000)
```
-See the [pino-http readme](https://npm.im/pino-http) for more info.
+See the [pino-http README](https://npm.im/pino-http) for more info.
## Pino with Hapi
@@ -122,7 +122,7 @@ start().catch((err) => {
})
```
-See the [hapi-pino readme](https://npm.im/hapi-pino) for more info.
+See the [hapi-pino README](https://npm.im/hapi-pino) for more info.
## Pino with Restify
@@ -145,7 +145,7 @@ server.get('/', function (req, res) {
server.listen(3000)
```
-See the [restify-pino-logger readme](https://npm.im/restify-pino-logger) for more info.
+See the [restify-pino-logger README](https://npm.im/restify-pino-logger) for more info.
## Pino with Koa
@@ -169,7 +169,7 @@ app.use((ctx) => {
app.listen(3000)
```
-See the [koa-pino-logger readme](https://github.com/pinojs/koa-pino-logger) for more info.
+See the [koa-pino-logger README](https://github.com/pinojs/koa-pino-logger) for more info.
## Pino with Node core `http`
@@ -192,7 +192,7 @@ function handle (req, res) {
server.listen(3000)
```
-See the [pino-http readme](https://npm.im/pino-http) for more info.
+See the [pino-http README](https://npm.im/pino-http) for more info.
@@ -231,7 +231,7 @@ async function bootstrap() {
bootstrap()
```
-See the [nestjs-pino readme](https://npm.im/nestjs-pino) for more info.
+See the [nestjs-pino README](https://npm.im/nestjs-pino) for more info.
@@ -258,4 +258,4 @@ app.use('/', (req) => {
createServer(app).listen(process.env.PORT || 3000)
```
-See the [pino-http readme](https://npm.im/pino-http) for more info.
+See the [pino-http README](https://npm.im/pino-http) for more info.
diff --git a/lib/levels.js b/lib/levels.js
index 4a3af01ea..67e6a99db 100644
--- a/lib/levels.js
+++ b/lib/levels.js
@@ -122,7 +122,7 @@ function isLevelEnabled (logLevel) {
* against the current threshold (`expected`).
*
* @param {SORTING_ORDER} direction comparison direction "ASC" or "DESC"
- * @param {number} current current log level number representatiton
+ * @param {number} current current log level number representation
* @param {number} expected threshold value to compare with
* @returns {boolean}
*/
diff --git a/pino.d.ts b/pino.d.ts
index 5de22014e..41ffdfb4c 100644
--- a/pino.d.ts
+++ b/pino.d.ts
@@ -356,7 +356,7 @@ declare namespace pino {
customLevels?: { [level in CustomLevels]: number };
/**
* Use this option to define custom comparison of log levels.
- * Usefull to compare custom log levels or non-standard level values.
+ * Useful to compare custom log levels or non-standard level values.
* Default: "ASC"
*/
levelComparison?: "ASC" | "DESC" | ((current: number, expected: number) => boolean);
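A brief usage sketch of `levelComparison` (assumptions: the function form returns `true` when a level should be written, and the custom level names and values below are invented for illustration):

```js
const pino = require('pino')

// Syslog-style numbering: lower numbers are more severe, so the default
// ascending comparison is replaced with a descending one.
const logger = pino({
  customLevels: { fatal: 1, error: 3, warn: 4, info: 6, debug: 7 },
  useOnlyCustomLevels: true,
  level: 'info',
  levelComparison: (current, expected) => current <= expected
})

logger.error('written')  // 3 <= 6
logger.debug('dropped')  // 7 > 6
```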
diff --git a/test/fixtures/transport-worker.js b/test/fixtures/transport-worker.js
index b373a8682..16bbefc66 100644
--- a/test/fixtures/transport-worker.js
+++ b/test/fixtures/transport-worker.js
@@ -6,7 +6,7 @@ module.exports = (options) => {
const myTransportStream = new Writable({
autoDestroy: true,
write (chunk, enc, cb) {
- // Bypass console.log() to avoid flakyness
+ // Bypass console.log() to avoid flakiness
fs.writeSync(1, chunk.toString())
cb()
}