Merge branch 'master' into tcp-refactor

commit eab01bf077
Gareth Jones 2017-11-14 08:14:22 +11:00
24 changed files with 4847 additions and 2776 deletions


@@ -12,9 +12,10 @@
"strict": 0,
"import/no-extraneous-dependencies": 1,
"prefer-spread": 0,
"prefer-rest-params": 0
"prefer-rest-params": 0,
"prefer-destructuring": 0
},
"parser-options": {
"parserOptions": {
"ecmaVersion": 6
}
}


@@ -6,9 +6,9 @@
This is a conversion of the [log4js](https://github.com/stritti/log4js)
framework to work with [node](http://nodejs.org). I started out just stripping out the browser-specific code and tidying up some of the javascript to work better in node. It grew from there. Although it's got a similar name to the Java library [log4j](https://logging.apache.org/log4j/2.x/), thinking that it will behave the same way will only bring you sorrow and confusion.
The full documentation is available [here](https://nomiddlename.github.io/log4js-node/).
The full documentation is available [here](https://log4js-node.github.io/log4js-node/).
There have been a few changes between log4js 1.x and 2.x (and 0.x too). You should probably read this [migration guide](https://nomiddlename.github.io/log4js-node/migration-guide.html) if things aren't working.
There have been a few changes between log4js 1.x and 2.x (and 0.x too). You should probably read this [migration guide](https://log4js-node.github.io/log4js-node/migration-guide.html) if things aren't working.
Out of the box it supports the following features:
@@ -63,19 +63,33 @@ Output (in `cheese.log`):
```bash
[2010-01-17 11:43:37.987] [ERROR] cheese - Cheese is too ripe!
[2010-01-17 11:43:37.990] [FATAL] cheese - Cheese was breeding ground for listeria.
```
```
## Note for library makers
If you're writing a library and would like to include support for log4js, without introducing a dependency headache for your users, take a look at [log4js-api](https://github.com/log4js-node/log4js-api).
## Documentation
Available [here](https://nomiddlename.github.io/log4js-node/).
Available [here](https://log4js-node.github.io/log4js-node/).
There's also [an example application](https://github.com/nomiddlename/log4js-example).
There's also [an example application](https://github.com/log4js-node/log4js-example).
## TypeScript
```ts
import { configure, getLogger } from './log4js';
configure('./filename');
const logger = getLogger();
logger.level = 'debug';
logger.debug("Some debug messages");
configure({
appenders: { cheese: { type: 'file', filename: 'cheese.log' } },
categories: { default: { appenders: ['cheese'], level: 'error' } }
});
```
## Contributing
Contributions welcome, but take a look at the [rules](https://github.com/nomiddlename/log4js-node/wiki/Contributing) first.
Contributions welcome, but take a look at the [rules](https://log4js-node.github.io/log4js-node/contrib-guidelines.html) first.
## License


@@ -16,8 +16,9 @@ Any other configuration parameters will be passed to the underlying [streamrolle
* `compress` - `boolean` (default false) - compress the backup files during rolling (backup files will have `.gz` extension)
* `alwaysIncludePattern` - `boolean` (default false) - include the pattern in the name of the current log file as well as the backups.
* `daysToKeep` - `integer` (default 0) - if this value is greater than zero, then files older than that many days will be deleted during log rolling.
* `keepFileExt` - `boolean` (default false) - preserve the file extension when rotating log files (`file.log` becomes `file.2017-05-30.log` instead of `file.log.2017-05-30`).
The `pattern` is used to determine when the current log file should be renamed and a new log file created. For example, with a filename of 'cheese.log', and the default pattern of `.yyyy-MM-dd` - on startup this will result in a file called `cheese.log` being created and written to until the next write after midnight. When this happens, `cheese.log` will be renamed to `cheese.log.2017-04-30` and a new `cheese.log` file created. Note that, unlike the [file appender](file.md) there is no maximum number of backup files and you will have to clean up yourself (or submit a [pull request](contrib-guidelines.md) to add this feature). The appender uses the [date-format](https://github.com/nomiddlename/date-format) library to parse the `pattern`, and any of the valid formats can be used. Also note that there is no timer controlling the log rolling - changes in the pattern are determined on every log write. If no writes occur, then no log rolling will happen. If your application logs infrequently this could result in no log file being written for a particular time period.
The `pattern` is used to determine when the current log file should be renamed and a new log file created. For example, with a filename of 'cheese.log', and the default pattern of `.yyyy-MM-dd` - on startup this will result in a file called `cheese.log` being created and written to until the next write after midnight. When this happens, `cheese.log` will be renamed to `cheese.log.2017-04-30` and a new `cheese.log` file created. The appender uses the [date-format](https://github.com/nomiddlename/date-format) library to parse the `pattern`, and any of the valid formats can be used. Also note that there is no timer controlling the log rolling - changes in the pattern are determined on every log write. If no writes occur, then no log rolling will happen. If your application logs infrequently this could result in no log file being written for a particular time period.
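For reference, a minimal daily-rolling configuration consistent with the options above might look like this (filenames, retention values, and category names are illustrative, not taken from this commit):

```javascript
const log4js = require('log4js');

// 'cheese.log' is written until the first log write after midnight, at which
// point it is renamed to e.g. 'cheese.log.2017-04-30' (per the default
// '.yyyy-MM-dd' pattern) and a fresh 'cheese.log' is created.
log4js.configure({
  appenders: {
    cheese: {
      type: 'dateFile',
      filename: 'cheese.log',
      daysToKeep: 14,     // delete backups older than two weeks during rolling
      keepFileExt: false  // set true for 'cheese.2017-04-30.log' instead
    }
  },
  categories: {
    default: { appenders: ['cheese'], level: 'info' }
  }
});

log4js.getLogger().info('rolled daily - but only when a write occurs');
```

Remember that rolling is checked on write, so an idle application produces no rolled files for quiet periods.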
## Example (default daily log rolling)


@@ -15,6 +15,7 @@ Any other configuration parameters will be passed to the underlying [streamrolle
* `mode`- `integer` (default 0644)
* `flags` - `string` (default 'a')
* `compress` - `boolean` (default false) - compress the backup files during rolling (backup files will have `.gz` extension)
* `keepFileExt` - `boolean` (default false) - preserve the file extension when rotating log files (`file.log` becomes `file.1.log` instead of `file.log.1`)
## Example


@@ -4,7 +4,7 @@ The sync file appender writes log events to a file, the only difference to the n
## Configuration
* `type` - `"file"`
* `type` - `"fileSync"`
* `filename` - `string` - the path of the file where you want your logs written.
* `maxLogSize` - `integer` (optional) - the maximum size (in bytes) for the log file. If not specified, then no log rolling will happen.
* `backups` - `integer` (optional, default value = 5) - the number of old log files to keep during log rolling.
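A minimal `fileSync` configuration sketch, consistent with the options above and with the `test options` test added later in this commit (the filename and sizes are illustrative):

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    sync: {
      type: 'fileSync',             // synchronous writes, unlike 'file'
      filename: 'app-sync.log',
      maxLogSize: 10 * 1024 * 1024, // roll when the file reaches 10MB
      backups: 3                    // keep three rolled files
    }
  },
  categories: {
    default: { appenders: ['sync'], level: 'info' }
  }
});

log4js.getLogger().info('written to disk before this call returns');
```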


@@ -9,8 +9,9 @@ This appender sends log events to a [logstash](https://www.elastic.co/products/l
* `port` - `integer` - port of the logstash server
* `logType` - `string` (optional) - used for the `type` field in the logstash data
* `category` - `string` (optional) - used for the `type` field of the logstash data if `logType` is not defined
* `fields` - `object` (optional) - extra fields to log with each event
* `fields` - `object` (optional) - extra fields to log with each event. User-defined fields can be either a string or a function. Functions will be passed the log event, and should return a string.
* `layout` - (optional, defaults to dummyLayout) - used for the `message` field of the logstash data (see [layouts](layouts.md))
* `args` - (optional, defaults to both) - determines how to log arguments and configuration fields: `direct` logs them as direct properties of the log object, `fields` logs them as child properties of the `fields` property, and `both` logs both.
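The effect of `args` can be sketched as a small stand-alone function mirroring the `checkArgs` logic added in this commit (`shapeLogObject` is a hypothetical helper name for illustration, not part of the appender's API):

```javascript
// Decide whether fields go under a `fields` property, directly on the
// log object, or both - mirrors the appender's checkArgs() logic.
function checkArgs(argsValue, logUnderFields) {
  if (!argsValue || argsValue === 'both') return true;
  if (logUnderFields && argsValue === 'fields') return true;
  if (!logUnderFields && argsValue === 'direct') return true;
  return false;
}

// Hypothetical helper showing how the final logstash object is shaped.
function shapeLogObject(args, base, fields) {
  const logObject = Object.assign({}, base);
  if (checkArgs(args, true)) {
    logObject.fields = fields;        // child properties of `fields`
  }
  if (checkArgs(args, false)) {
    Object.assign(logObject, fields); // direct properties of the log object
  }
  return logObject;
}

const fields = { biscuits: 'digestive' };
const direct = shapeLogObject('direct', { message: 'hi' }, fields);
const under = shapeLogObject('fields', { message: 'hi' }, fields);
const both = shapeLogObject('both', { message: 'hi' }, fields);
// 'direct' puts biscuits at the top level only; 'fields' nests it under
// .fields only; 'both' (the default) does both.
```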
## Example
```javascript
@@ -21,7 +22,10 @@ log4js.configure({
host: 'log.server',
port: '12345',
logType: 'application',
fields: { biscuits: 'digestive', tea: 'tetley' }
fields: { biscuits: 'digestive', tea: 'tetley', user: function(logEvent) {
return AuthLibrary.currentUser();
}
}
}
},
categories: {
@@ -42,6 +46,7 @@ This will result in a JSON message being sent to `log.server:12345` over UDP, wi
'level': 'INFO',
'category': 'default',
'biscuits': 'hobnob',
'user': 'charlie',
'cheese': 'gouda',
'tea': 'tetley'
}


@@ -19,7 +19,7 @@ Then your v2 config should be something like this:
{
appenders: {
out: { type: 'console' },
tasks: {
task: {
type: 'dateFile',
filename: 'logs/task',
pattern: '-dd.log',


@@ -4,7 +4,15 @@ Log4js can load appenders from outside its core set. To add a custom appender, t
## Loading mechanism
When log4js parses your configuration, it loops through the defined appenders. For each one, it will `require` the appender initially using the `type` value prepended with './appenders' as the module identifier - this is to try loading from the core appenders first. If that fails (the module could not be found in the core appenders), then log4js will try to require the module using just the `type` value. If that fails, an error will be raised.
When log4js parses your configuration, it loops through the defined appenders. For each one, it will `require` the appender initially using the `type` value prepended with './appenders' as the module identifier - this is to try loading from the core appenders first. If that fails (the module could not be found in the core appenders), then log4js will try to require the module using variations of the `type` value.
Log4js checks the following places (in this order) for appenders based on the type value:
1. The core appenders: `require('./appenders/' + type)`
2. node_modules: `require(type)`
3. relative to the main file of your application: `require(path.dirname(require.main.filename) + '/' + type)`
4. relative to the process' current working directory: `require(process.cwd() + '/' + type)`
If that fails, an error will be raised.
## Appender Modules


@@ -40,7 +40,8 @@ function fileAppender(file, layout, logSize, numBackups, options, timezoneOffset
// there has to be at least one backup if logSize has been specified
numBackups = numBackups === 0 ? 1 : numBackups;
debug('Creating file appender (',
debug(
'Creating file appender (',
file, ', ',
logSize, ', ',
numBackups, ', ',


@@ -7,6 +7,17 @@ const os = require('os');
const eol = os.EOL || '\n';
function touchFile(file, options) {
// if the file exists, nothing to do
if (fs.existsSync(file)) {
return;
}
// touch the file to apply flags (like w to truncate the file)
const id = fs.openSync(file, options.flags, options.mode);
fs.closeSync(id);
}
class RollingFileSync {
constructor(filename, size, backups, options) {
debug('In RollingFileStream');
@@ -22,16 +33,17 @@ class RollingFileSync {
this.filename = filename;
this.size = size;
this.backups = backups || 1;
this.options = options || { encoding: 'utf8', mode: parseInt('0644', 8), flags: 'a' }; // eslint-disable-line
this.options = options;
this.currentSize = 0;
function currentFileSize(file) {
let fileSize = 0;
try {
fileSize = fs.statSync(file).size;
} catch (e) {
// file does not exist
fs.appendFileSync(filename, '');
touchFile(file, options);
}
return fileSize;
}
@@ -130,8 +142,9 @@ class RollingFileSync {
* has been reached (default 5)
* @param timezoneOffset - optional timezone offset in minutes
* (default system local)
* @param options - passed as is to fs options
*/
function fileAppender(file, layout, logSize, numBackups, timezoneOffset) {
function fileAppender(file, layout, logSize, numBackups, timezoneOffset, options) {
debug('fileSync appender created');
file = path.normalize(file);
numBackups = numBackups === undefined ? 5 : numBackups;
@@ -145,14 +158,13 @@ function fileAppender(file, layout, logSize, numBackups, timezoneOffset) {
stream = new RollingFileSync(
filePath,
fileSize,
numFiles
numFiles,
options
);
} else {
stream = (((f) => {
// create file if it doesn't exist
if (!fs.existsSync(f)) {
fs.appendFileSync(f, '');
}
// touch the file to apply flags (like w to truncate the file)
touchFile(f, options);
return {
write(data) {
@@ -178,12 +190,19 @@ function configure(config, layouts) {
layout = layouts.layout(config.layout.type, config.layout);
}
const options = {
flags: config.flags || 'a',
encoding: config.encoding || 'utf8',
mode: config.mode || 0o644
};
return fileAppender(
config.filename,
layout,
config.maxLogSize,
config.backups,
config.timezoneOffset
config.timezoneOffset,
options
);
}


@@ -8,14 +8,14 @@ const OS = require('os');
const debug = require('debug')('log4js:gelf');
/* eslint no-unused-vars:0 */
const LOG_EMERG = 0; // system is unusable(unused)
const LOG_ALERT = 1; // action must be taken immediately(unused)
const LOG_CRIT = 2; // critical conditions
const LOG_ERROR = 3; // error conditions
const LOG_WARNING = 4; // warning conditions
const LOG_NOTICE = 5; // normal, but significant, condition(unused)
const LOG_INFO = 6; // informational message
const LOG_DEBUG = 7; // debug-level message
const LOG_EMERG = 0; // system is unusable(unused)
const LOG_ALERT = 1; // action must be taken immediately(unused)
const LOG_CRIT = 2; // critical conditions
const LOG_ERROR = 3; // error conditions
const LOG_WARNING = 4; // warning conditions
const LOG_NOTICE = 5; // normal, but significant, condition(unused)
const LOG_INFO = 6; // informational message
const LOG_DEBUG = 7; // debug-level message
/**
* GELF appender that supports sending UDP packets to a GELF compatible server such as Graylog
@@ -113,7 +113,7 @@ function gelfAppender(layout, config, levels) {
const app = (loggingEvent) => {
const message = preparePacket(loggingEvent);
zlib.gzip(new Buffer(JSON.stringify(message)), (err, packet) => {
zlib.gzip(Buffer.from(JSON.stringify(message)), (err, packet) => {
if (err) {
console.error(err.stack);
} else {


@@ -36,11 +36,11 @@ function logFacesAppender(config) {
return function log(event) {
// convert to logFaces compact json format
const lfsEvent = {
a: config.application || '', // application name
t: event.startTime.getTime(), // time stamp
p: event.level.levelStr, // level (priority)
g: event.categoryName, // logger name
m: format(event.data) // message text
a: config.application || '', // application name
t: event.startTime.getTime(), // time stamp
p: event.level.levelStr, // level (priority)
g: event.categoryName, // logger name
m: format(event.data) // message text
};
// add context variables if exist
@@ -52,9 +52,7 @@ function logFacesAppender(config) {
sender.post('', lfsEvent)
.catch((error) => {
if (error.response) {
console.error(
`log4js.logFaces-HTTP Appender error posting to ${config.url}: ${error.response.status} - ${error.response.data}`
);
console.error(`log4js.logFaces-HTTP Appender error posting to ${config.url}: ${error.response.status} - ${error.response.data}`);
return;
}
console.error(`log4js.logFaces-HTTP Appender error: ${error.message}`);
@@ -67,8 +65,9 @@ function configure(config) {
}
function format(logData) {
const data = Array.isArray(logData) ?
logData : Array.prototype.slice.call(arguments);
const data = Array.isArray(logData)
? logData
: Array.prototype.slice.call(arguments);
return util.format.apply(util, wrapErrorsWithInspect(data));
}


@@ -21,7 +21,7 @@ function datagram(config) {
const port = config.port || 55201;
return function (event) {
const buff = new Buffer(JSON.stringify(event));
const buff = Buffer.from(JSON.stringify(event));
sock.send(buff, 0, buff.length, port, host, (err) => {
if (err) {
console.error(`log4js.logFacesUDPAppender error sending to ${host}:${port}, error: `, err);
@@ -46,11 +46,11 @@ function logFacesUDPAppender(config) {
return function log(event) {
// convert to logFaces compact json format
const lfsEvent = {
a: config.application || '', // application name
t: event.startTime.getTime(), // time stamp
p: event.level.levelStr, // level (priority)
g: event.categoryName, // logger name
m: format(event.data) // message text
a: config.application || '', // application name
t: event.startTime.getTime(), // time stamp
p: event.level.levelStr, // level (priority)
g: event.categoryName, // logger name
m: format(event.data) // message text
};
// add context variables if exist
@@ -82,8 +82,9 @@ function wrapErrorsWithInspect(items) {
}
function format(logData) {
const data = Array.isArray(logData) ?
logData : Array.prototype.slice.call(arguments);
const data = Array.isArray(logData)
? logData
: Array.prototype.slice.call(arguments);
return util.format.apply(util, wrapErrorsWithInspect(data));
}


@@ -4,7 +4,7 @@ const dgram = require('dgram');
const util = require('util');
function sendLog(udp, host, port, logObject) {
const buffer = new Buffer(JSON.stringify(logObject));
const buffer = Buffer.from(JSON.stringify(logObject));
/* eslint no-unused-vars:0 */
udp.send(buffer, 0, buffer.length, port, host, (err, bytes) => {
@@ -23,6 +23,22 @@ function logstashUDP(config, layout) {
config.fields = {};
}
function checkArgs(argsValue, logUnderFields) {
if ((!argsValue) || (argsValue === 'both')) {
return true;
}
if (logUnderFields && (argsValue === 'fields')) {
return true;
}
if ((!logUnderFields) && (argsValue === 'direct')) {
return true;
}
return false;
}
function log(loggingEvent) {
/*
https://gist.github.com/jordansissel/2996677
@@ -40,15 +56,17 @@ function logstashUDP(config, layout) {
const fields = {};
Object.keys(config.fields).forEach((key) => {
fields[key] = config.fields[key];
fields[key] = typeof config.fields[key] === 'function' ? config.fields[key](loggingEvent) : config.fields[key];
});
/* eslint no-prototype-builtins:1,no-restricted-syntax:[1, "ForInStatement"] */
if (loggingEvent.data.length > 1) {
const secondEvData = loggingEvent.data[1];
Object.keys(secondEvData).forEach((key) => {
fields[key] = secondEvData[key];
});
if ((secondEvData !== undefined) && (secondEvData !== null)) {
Object.keys(secondEvData).forEach((key) => {
fields[key] = secondEvData[key];
});
}
}
fields.level = loggingEvent.level.levelStr;
fields.category = loggingEvent.categoryName;
@@ -57,13 +75,18 @@ function logstashUDP(config, layout) {
'@version': '1',
'@timestamp': (new Date(loggingEvent.startTime)).toISOString(),
type: type,
message: layout(loggingEvent),
fields: fields
message: layout(loggingEvent)
};
Object.keys(fields).forEach((key) => {
logObject[key] = fields[key];
});
if (checkArgs(config.args, true)) {
logObject.fields = fields;
}
if (checkArgs(config.args, false)) {
Object.keys(fields).forEach((key) => {
logObject[key] = fields[key];
});
}
sendLog(udp, config.host, config.port, logObject);
}


@@ -113,7 +113,7 @@ function smtpAppender(config, layout, subjectLayout) {
}
const appender = (loggingEvent) => {
unsentCount++; // eslint-disable-line no-plusplus
unsentCount++; // eslint-disable-line no-plusplus
logEventBuffer.push(loggingEvent);
if (sendInterval > 0) {
scheduleSend();

package-lock.json (generated): 6691 lines changed; diff suppressed because it is too large.


@@ -1,8 +1,8 @@
{
"name": "log4js",
"version": "2.3.4",
"version": "2.3.12",
"description": "Port of Log4js to work with node.",
"homepage": "https://nomiddlename.github.io/log4js-node/",
"homepage": "https://log4js-node.github.io/log4js-node/",
"keywords": [
"logging",
"log",
@@ -11,13 +11,14 @@
],
"license": "Apache-2.0",
"main": "./lib/log4js",
"types": "./types/log4js.d.ts",
"author": "Gareth Jones <gareth.nomiddlename@gmail.com>",
"repository": {
"type": "git",
"url": "https://github.com/nomiddlename/log4js-node.git"
"url": "https://github.com/log4js-node/log4js-node.git"
},
"bugs": {
"url": "http://github.com/nomiddlename/log4js-node/issues"
"url": "http://github.com/log4js-node/log4js-node/issues"
},
"engines": {
"node": ">=4.0"
@@ -41,20 +42,20 @@
"date-format": "^1.1.0",
"debug": "^2.6.8",
"semver": "^5.3.0",
"streamroller": "^0.5.2"
"streamroller": "^0.6.0"
},
"devDependencies": {
"codecov": "^1.0.1",
"conventional-changelog": "^1.1.4",
"eslint": "^3.19.0",
"eslint-config-airbnb-base": "^11.2.0",
"codecov": "^3.0.0",
"conventional-changelog": "^1.1.6",
"eslint": "^4.10.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-import-resolver-node": "^0.3.1",
"eslint-plugin-import": "^2.6.1",
"husky": "^0.12.0",
"nyc": "^10.3.2",
"eslint-plugin-import": "^2.8.0",
"husky": "^0.14.3",
"nyc": "^11.3.0",
"sandboxed-module": "^2.0.3",
"tap": "^8.0.1",
"validate-commit-msg": "^2.12.2"
"tap": "^10.7.3",
"validate-commit-msg": "^2.14.0"
},
"optionalDependencies": {
"hipchat-notifier": "^1.1.0",


@@ -158,5 +158,31 @@ test('log4js fileSyncAppender', (batch) => {
});
});
batch.test('test options', (t) => {
// using non-standard options
log4js.configure({
appenders: {
sync: {
type: 'fileSync',
filename: 'tmp-options-tests.log',
layout: { type: 'messagePassThrough' },
flags: 'w',
encoding: 'ascii',
mode: 0o666
}
},
categories: {
default: { appenders: ['sync'], level: 'info' }
}
});
const logger = log4js.getLogger();
logger.warn('log message');
fs.readFile('tmp-options-tests.log', 'ascii', (err, contents) => {
t.include(contents, `log message${EOL}`);
t.end();
});
});
batch.end();
});


@@ -104,7 +104,7 @@ test('log4js layouts', (batch) => {
}
}
}),
/at Object\.<anonymous>\s+\((.*)test[\\/]tap[\\/]layouts-test\.js:\d+:\d+\)/,
/at (Test.batch.test|Test.<anonymous>)\s+\((.*)test[\\/]tap[\\/]layouts-test\.js:\d+:\d+\)/,
'regexp did not return a match - should print the stacks of a passed error objects'
);


@@ -120,6 +120,33 @@ test('logstashUDP appender', (batch) => {
t.end();
});
batch.test('configuration can include functions to generate field values at run-time', (t) => {
const setup = setupLogging('myCategory', {
host: '127.0.0.1',
port: 10001,
type: 'logstashUDP',
logType: 'myAppType',
category: 'myLogger',
fields: {
field1: 'value1',
field2: function () {
return 'evaluated at runtime';
}
},
layout: {
type: 'pattern',
pattern: '%m'
}
});
setup.logger.log('trace', 'Log event #1');
const json = JSON.parse(setup.results.buffer.toString());
t.equal(json.fields.field1, 'value1');
t.equal(json.fields.field2, 'evaluated at runtime');
t.end();
});
batch.test('extra fields should be added to the fields structure', (t) => {
const setup = setupLogging('myLogger', {
host: '127.0.0.1',
@@ -143,6 +170,87 @@ test('logstashUDP appender', (batch) => {
t.end();
});
batch.test('use direct args', (t) => {
const setup = setupLogging('myLogger', {
host: '127.0.0.1',
port: 10001,
type: 'logstashUDP',
category: 'myLogger',
args: 'direct',
layout: {
type: 'dummy'
}
});
setup.logger.log('info', 'Log event with fields', { extra1: 'value1', extra2: 'value2' });
const json = JSON.parse(setup.results.buffer.toString());
t.equal(json.extra1, 'value1');
t.equal(json.extra2, 'value2');
t.equal(json.fields, undefined);
t.end();
});
batch.test('use fields args', (t) => {
const setup = setupLogging('myLogger', {
host: '127.0.0.1',
port: 10001,
type: 'logstashUDP',
category: 'myLogger',
args: 'fields',
layout: {
type: 'dummy'
}
});
setup.logger.log('info', 'Log event with fields', { extra1: 'value1', extra2: 'value2' });
const json = JSON.parse(setup.results.buffer.toString());
t.equal(json.extra1, undefined);
t.equal(json.extra2, undefined);
t.equal(json.fields.extra1, 'value1');
t.equal(json.fields.extra2, 'value2');
t.end();
});
batch.test('Send null as argument', (t) => {
const setup = setupLogging('myLogger', {
host: '127.0.0.1',
port: 10001,
type: 'logstashUDP',
category: 'myLogger',
layout: {
type: 'dummy'
}
});
const msg = 'test message with null';
setup.logger.info(msg, null);
const json = JSON.parse(setup.results.buffer.toString());
t.equal(json.message, msg);
t.end();
});
batch.test('Send undefined as argument', (t) => {
const setup = setupLogging('myLogger', {
host: '127.0.0.1',
port: 10001,
type: 'logstashUDP',
category: 'myLogger',
layout: {
type: 'dummy'
}
});
const msg = 'test message with undefined';
setup.logger.info(msg, undefined);
const json = JSON.parse(setup.results.buffer.toString());
t.equal(json.message, msg);
t.end();
});
batch.test('shutdown should close sockets', (t) => {
const setup = setupLogging('myLogger', {
host: '127.0.0.1',


@@ -29,9 +29,9 @@ test('multiprocess appender shutdown (master)', { timeout: 2000 }, (t) => {
t.ok(err, 'we got a connection error');
t.end();
});
}, 500);
}, 250);
});
}, 500);
}, 250);
});
test('multiprocess appender shutdown (worker)', (t) => {


@@ -76,12 +76,12 @@ if (cluster.isMaster) {
});
const anotherLogger = log4js.getLogger('test');
anotherLogger.info('this should now get logged');
}, 1000);
}, 500);
// we have to wait a bit, so that the process.send messages get a chance to propagate
setTimeout(() => {
const events = recorder.replay();
process.send({ type: 'testing', instance: process.env.NODE_APP_INSTANCE, events: events });
cluster.worker.disconnect();
}, 2000);
}, 2500);
}

types/log4js.d.ts (vendored, new file): 452 lines added.

@@ -0,0 +1,452 @@
// Type definitions for log4js
export function getLogger(category?: string): Logger;
export function configure(filename: string): void;
export function configure(config: Configuration): void;
export function addLayout(name: string, config: (a: any) => (logEvent: LoggingEvent) => string): void;
export function connectLogger(logger: Logger, options: { format?: string; level?: string; nolog?: any; }): any; // express.Handler;
export function levels(): Levels;
export function shutdown(cb?: (error: Error) => void): void | null;
export interface BaseLayout {
type: 'basic';
}
export interface ColoredLayout {
type: 'colored' | 'coloured';
}
export interface MessagePassThroughLayout {
type: 'messagePassThrough';
}
export interface DummyLayout {
type: 'dummy';
}
export interface Level {
isEqualTo(other: string): boolean;
isEqualTo(otherLevel: Level): boolean;
isLessThanOrEqualTo(other: string): boolean;
isLessThanOrEqualTo(otherLevel: Level): boolean;
isGreaterThanOrEqualTo(other: string): boolean;
isGreaterThanOrEqualTo(otherLevel: Level): boolean;
}
export interface LoggingEvent {
categoryName: string; // name of category
level: Level; // level of message
data: any[]; // objects to log
startTime: Date;
pid: number;
context: any;
cluster?: {
workerId: number;
worker: number;
};
}
export type Token = ((logEvent: LoggingEvent) => string) | string;
export interface PatternLayout {
type: 'pattern';
// specifier for the output format, using placeholders as described below
pattern: string;
// user-defined tokens to be used in the pattern
tokens?: { [name: string]: Token };
}
export interface CustomLayout {
[key: string]: any;
type: string;
}
export type Layout = BaseLayout | ColoredLayout | MessagePassThroughLayout | DummyLayout | PatternLayout | CustomLayout;
/**
* Category Filter
*
* @see https://log4js-node.github.io/log4js-node/categoryFilter.html
*/
export interface CategoryFilterAppender {
type: "categoryFilter";
// the category (or categories if you provide an array of values) that will be excluded from the appender.
exclude?: string | string[];
// the name of the appender to filter. see https://log4js-node.github.io/log4js-node/layouts.html
appender?: string;
}
/**
* Console Appender
*
* @see https://log4js-node.github.io/log4js-node/console.html
*/
export interface ConsoleAppender {
type: 'console';
// defaults to colouredLayout
layout?: Layout;
}
export interface FileAppender {
type: 'file';
// the path of the file where you want your logs written.
filename: string;
// the maximum size (in bytes) for the log file. If not specified, then no log rolling will happen.
maxLogSize?: number;
// (default value = 5) - the number of old log files to keep during log rolling.
backups?: number;
// defaults to basic layout
layout?: Layout;
numBackups?: number;
compress?: boolean; // compress the backups
// keep the file extension when rotating logs
keepFileExt?: boolean;
encoding?: string;
mode?: number;
flags?: string;
}
export interface SyncfileAppender {
type: 'fileSync';
// the path of the file where you want your logs written.
filename: string;
// the maximum size (in bytes) for the log file. If not specified, then no log rolling will happen.
maxLogSize?: number;
// (default value = 5) - the number of old log files to keep during log rolling.
backups?: number;
// defaults to basic layout
layout?: Layout;
}
export interface DateFileAppender {
type: 'dateFile';
// the path of the file where you want your logs written.
filename: string;
// defaults to basic layout
layout?: Layout;
// defaults to .yyyy-MM-dd - the pattern to use to determine when to roll the logs.
/**
* The following strings are recognised in the pattern:
* - yyyy : the full year, use yy for just the last two digits
* - MM : the month
* - dd : the day of the month
* - hh : the hour of the day (24-hour clock)
* - mm : the minute of the hour
* - ss : seconds
* - SSS : milliseconds (although I'm not sure you'd want to roll your logs every millisecond)
* - O : timezone (capital letter o)
*/
pattern?: string;
// default “utf-8”
encoding?: string;
// default 0644
mode?: number;
// default a
flags?: string;
// compress the backup files during rolling (backup files will have .gz extension)(default false)
compress?: boolean;
// include the pattern in the name of the current log file as well as the backups.(default false)
alwaysIncludePattern?: boolean;
// keep the file extension when rotating logs
keepFileExt?: boolean;
// if this value is greater than zero, then files older than that many days will be deleted during log rolling.(default 0)
daysToKeep?: number;
}
export interface GELFAppender {
'type': 'gelf';
// (defaults to localhost) - the gelf server hostname
host?: string;
// (defaults to 12201) - the port the gelf server is listening on
port?: number;
// (defaults to OS.hostname()) - the hostname used to identify the origin of the log messages.
hostname?: string;
facility?: string;
// fields to be added to each log message; custom fields must start with an underscore.
customFields?: { [field: string]: any };
}
export interface HipchatAppender {
type: 'hipchat';
// User token with notification privileges
hipchat_token: string;
// Room ID or name
hipchat_room: string;
// (defaults to empty string) - a label to say where the message is from
hipchat_from?: string;
// (defaults to false) - make hipchat annoy people
hipchat_notify?: boolean;
// (defaults to api.hipchat.com) - set this if you have your own hipchat server
hipchat_host?: string;
// (defaults to only throwing errors) - implement this function if you want intercept the responses from hipchat
hipchat_response_callback?(err: Error, response: any): any;
// (defaults to messagePassThroughLayout)
layout?: Layout;
}
export interface LogFacesHTTPAppender {
type: 'logFaces-HTTP';
// logFaces receiver servlet URL
url: string;
// (defaults to empty string) - used to identify your applications logs
application?: string;
// (defaults to 5000ms) - the timeout for the HTTP request.
timeout?: number;
}
export interface LogFacesUDPAppender {
type: 'logFaces-UDP';
// (defaults to 127.0.0.1)- hostname or IP address of the logFaces receiver
remoteHost?: string;
// (defaults to 55201) - port the logFaces receiver is listening on
port?: number;
// (defaults to empty string) - used to identify your applications logs
application?: string;
}
export interface LogglyAppender {
type: 'loggly';
// your really long input token
token: string;
// your subdomain
subdomain: string;
// tags to include in every log message
tags?: string[];
}
export interface LogLevelFilterAppender {
type: 'logLevelFilter';
// the name of an appender, defined in the same configuration, that you want to filter
appender: string;
// the minimum level of event to allow through the filter
level: string;
// (defaults to FATAL) - the maximum level of event to allow through the filter
maxLevel?: string;
}
export interface LogstashUDPAppender {
type: 'logstashUDP';
// hostname (or IP-address) of the logstash server
host: string;
// port of the logstash server
port: number;
// used for the type field in the logstash data
logType?: string;
// used for the type field of the logstash data if logType is not defined
category?: string;
// extra fields to log with each event
fields?: { [fieldname: string]: any };
// (defaults to dummyLayout) used for the message field of the logstash data
layout?: Layout;
}
export interface MailgunAppender {
type: 'mailgun';
// your mailgun API key
apiKey: string;
// your domain
domain: string;
from: string;
to: string;
subject: string;
// (defaults to basicLayout)
layout?: Layout;
}
export interface MultiFileAppender {
type: 'multiFile';
// the base part of the generated log filename
base: string;
// the value to use to split files (see below).
property: string;
// the suffix for the generated log filename.
extension: string;
}
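A sketch of a multiFile config, assuming the generated filename is `base` + the log event's value for `property` + `extension` (the `logs/` directory and `categoryName` property are assumptions for illustration):

```typescript
// Sketch: one log file per category. With base 'logs/', property 'categoryName'
// and extension '.log', a logger named 'cheese' would write to 'logs/cheese.log'.
const multiConfig = {
  appenders: {
    multi: { type: 'multiFile', base: 'logs/', property: 'categoryName', extension: '.log' }
  },
  categories: {
    default: { appenders: ['multi'], level: 'info' }
  }
};
// the filename a 'cheese' logger would be expected to produce
const expectedFile = multiConfig.appenders.multi.base + 'cheese' + multiConfig.appenders.multi.extension;
```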
export interface MultiprocessAppender {
type: 'multiprocess';
// controls whether the appender listens for log events sent over the network, or is responsible for serialising events and sending them to a server.
mode: 'master' | 'worker';
  // (only needed if mode == 'master') - the name of the appender to send the log events to
appender?: string;
// (defaults to 5000) - the port to listen on, or send to
loggerPort?: number;
// (defaults to localhost) - the host/IP address to listen on, or send to
loggerHost?: string;
}
export interface RedisAppender {
type: 'redis';
// (defaults to 127.0.0.1) - the location of the redis server
host?: string;
// (defaults to 6379) - the port the redis server is listening on
port?: number;
// password to use when authenticating connection to redis
pass?: string;
// the redis channel that log events will be published to
channel: string;
// (defaults to messagePassThroughLayout) - the layout to use for log events.
layout?: Layout;
}
export interface SlackAppender {
type: 'slack';
// your Slack API token (see the slack and slack-node docs)
token: string;
  // the channel to send log messages to
channel_id: string;
// the icon to use for the message
icon_url?: string;
// the username to display with the message
username: string;
// (defaults to basicLayout) - the layout to use for the message.
layout?: Layout;
}
export interface RecordingAppender {
type: 'recording';
}
export interface SmtpAppender {
type: 'smtp';
// (if not present will use transport field)
SMTP?: {
// (defaults to localhost)
host?: string;
// (defaults to 25)
port?: number;
// authentication details
auth?: {
user: string;
pass: string;
};
};
// (if not present will use SMTP) - see nodemailer docs for transport options
transport?: {
// (defaults to smtp) - the nodemailer transport plugin to use
plugin?: string;
// configuration for the transport plugin
options?: any;
} | string;
// send logs as email attachment
attachment?: {
// (defaults to false)
enable?: boolean;
    // (defaults to 'See logs as attachment') - message to put in body of email
message: string;
// (defaults to default.log) - attachment filename
filename: string;
};
  // integer (defaults to 0) - batch emails and send one email every sendInterval seconds; if 0, each log message sends an email.
sendInterval?: number;
// (defaults to 5) - time in seconds to wait for emails to be sent during shutdown
shutdownTimeout?: number;
// email addresses to send the logs to
recipients: string;
// (defaults to message from first log event in batch) - subject for email
subject?: string;
// who the logs should be sent as
sender?: string;
// (defaults to false) - send the email as HTML instead of plain text
html?: boolean;
// (defaults to basicLayout)
layout?: Layout;
}
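A sketch of a batched smtp appender config matching the interface above; the addresses, subject, and interval are placeholders, not values from the source:

```typescript
// Sketch: batch log emails and send one every 5 minutes (sendInterval > 0
// enables batching; 0 would send one email per log event).
const smtpConfig = {
  type: 'smtp',
  recipients: 'dev-team@example.com', // placeholder address
  sender: 'logs@example.com',         // placeholder address
  subject: 'Application errors',
  sendInterval: 300,                  // seconds between batched emails
  SMTP: { host: 'localhost', port: 25 }
};
```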
export interface StandardErrorAppender {
type: 'stderr';
// (defaults to colouredLayout)
layout?: Layout;
}
export interface StandardOutputAppender {
type: 'stdout';
// (defaults to colouredLayout)
layout?: Layout;
}
export interface CustomAppender {
type: string;
[key: string]: any;
}
export type Appender = CategoryFilterAppender
| ConsoleAppender
| FileAppender
| SyncfileAppender
| DateFileAppender
| GELFAppender
| HipchatAppender
| LogFacesHTTPAppender
| LogFacesUDPAppender
| LogglyAppender
| LogLevelFilterAppender
| LogstashUDPAppender
| MailgunAppender
| MultiFileAppender
| MultiprocessAppender
| RedisAppender
| SlackAppender
| RecordingAppender
| SmtpAppender
| StandardErrorAppender
| StandardOutputAppender
| CustomAppender;
export interface Levels {
[index: string]: {
value: number;
colour: string;
};
}
export interface Configuration {
appenders: { [name: string]: Appender; };
categories: { [name: string]: { appenders: string[]; level: string; } };
pm2?: boolean;
pm2InstanceVar?: string;
levels?: Levels;
disableClustering?: boolean;
}
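A sketch of a Configuration using the optional `levels` field to define a custom level; the level name, numeric value, and colour here are assumptions for illustration (each entry follows the Levels interface's `{ value, colour }` shape):

```typescript
// Sketch: a hypothetical 'shout' level. The value 45000 is an assumption,
// chosen to sit between typical ERROR (40000) and FATAL (50000) values.
const customLevelConfig = {
  appenders: { out: { type: 'stdout' } },
  categories: { default: { appenders: ['out'], level: 'info' } },
  levels: {
    shout: { value: 45000, colour: 'magenta' }
  }
};
```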
export interface Logger {
new(dispatch: Function, name: string): Logger;
level: string;
log(...args: any[]): void;
isLevelEnabled(level?: string): boolean;
isTraceEnabled(): boolean;
isDebugEnabled(): boolean;
isInfoEnabled(): boolean;
isWarnEnabled(): boolean;
isErrorEnabled(): boolean;
isFatalEnabled(): boolean;
_log(level: string, data: any): void;
addContext(key: string, value: any): void;
removeContext(key: string): void;
clearContext(): void;
trace(message: string, ...args: any[]): void;
debug(message: string, ...args: any[]): void;
info(message: string, ...args: any[]): void;
warn(message: string, ...args: any[]): void;
error(message: string, ...args: any[]): void;
fatal(message: string, ...args: any[]): void;
}
types/test.ts Normal file

@ -0,0 +1,110 @@
import * as log4js from './log4js';
log4js.configure('./filename');
const logger1 = log4js.getLogger();
logger1.level = 'debug';
logger1.debug('Some debug messages');
const logger3 = log4js.getLogger('cheese');
logger3.trace('Entering cheese testing');
logger3.debug('Got cheese.');
logger3.info('Cheese is Gouda.');
logger3.warn('Cheese is quite smelly.');
logger3.error('Cheese is too ripe!');
logger3.fatal('Cheese was breeding ground for listeria.');
log4js.configure({
appenders: { cheese: { type: 'file', filename: 'cheese.log' } },
categories: { default: { appenders: ['cheese'], level: 'error' } }
});
log4js.configure({
appenders: {
out: { type: 'file', filename: 'pm2logs.log' }
},
categories: {
default: { appenders: ['out'], level: 'info' }
},
pm2: true,
pm2InstanceVar: 'INSTANCE_ID'
});
log4js.addLayout('json', config => function (logEvent) {
return JSON.stringify(logEvent) + config.separator;
});
log4js.configure({
appenders: {
out: { type: 'stdout', layout: { type: 'json', separator: ',' } }
},
categories: {
default: { appenders: ['out'], level: 'info' }
}
});
log4js.configure({
appenders: {
file: { type: 'dateFile', filename: 'thing.log', pattern: '.mm' }
},
categories: {
default: { appenders: ['file'], level: 'debug' }
}
});
const logger4 = log4js.getLogger('thing');
const logger5 = log4js.getLogger('json-test');
logger5.info('this is just a test');
logger5.error('of a custom appender');
logger5.warn('that outputs json');
log4js.shutdown(() => { });
log4js.configure({
appenders: {
cheeseLogs: { type: 'file', filename: 'cheese.log' },
console: { type: 'console' }
},
categories: {
cheese: { appenders: ['cheeseLogs'], level: 'error' },
another: { appenders: ['console'], level: 'trace' },
default: { appenders: ['console', 'cheeseLogs'], level: 'trace' }
}
});
const logger6 = log4js.getLogger('cheese');
// only errors and above get logged.
const otherLogger = log4js.getLogger();
// this will get coloured output on console, and appear in cheese.log
otherLogger.error('AAArgh! Something went wrong', { some: 'otherObject', useful_for: 'debug purposes' });
otherLogger.log('This should appear as info output');
// these will not appear (logging level beneath error)
logger6.trace('Entering cheese testing');
logger6.debug('Got cheese.');
logger6.info('Cheese is Gouda.');
logger6.log('Something funny about cheese.');
logger6.warn('Cheese is quite smelly.');
// these end up only in cheese.log
logger6.error('Cheese %s is too ripe!', 'gouda');
logger6.fatal('Cheese was breeding ground for listeria.');
// these don't end up in cheese.log, but will appear on the console
const anotherLogger = log4js.getLogger('another');
anotherLogger.debug('Just checking');
// will also go to console and cheese.log, since that's configured for all categories
const pantsLog = log4js.getLogger('pants');
pantsLog.debug('Something for pants');
import { configure, getLogger } from './log4js';
configure('./filename');
const logger2 = getLogger();
logger2.level = 'debug';
logger2.debug('Some debug messages');
configure({
appenders: { cheese: { type: 'file', filename: 'cheese.log' } },
categories: { default: { appenders: ['cheese'], level: 'error' } }
});