chore: resolved conflicts manually

Commit a0749b5d01 by Gareth Jones, 2018-05-30 08:26:32 +10:00
29 changed files with 3762 additions and 3600 deletions

@ -34,7 +34,6 @@ The following appenders are included with log4js. Some require extra dependencie
* [recording](recording.md)
* [redis](redis.md)
* [slack](slack.md)
* [smtp](smtp.md)
* [stderr](stderr.md)
* [stdout](stdout.md)
* [tcp](tcp.md)
@ -49,6 +48,7 @@ The following appenders are supported by log4js, but are no longer distributed w
* [loggly](https://github.com/log4js-node/loggly)
* [logstashUDP](https://github.com/log4js-node/logstashUDP)
* [mailgun](https://github.com/log4js-node/mailgun)
* [smtp](https://github.com/log4js-node/smtp)
For example, if you were previously using the gelf appender (`type: 'gelf'`) then you should add `@log4js-node/gelf` to your dependencies and change the type to `type: '@log4js-node/gelf'`.
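As a concrete sketch of that migration (the host name and appender options here are illustrative, not from the source):

```javascript
// Hypothetical before/after configs for the gelf appender migration.
// 'graylog.example.com' is an illustrative host name.
const before = {
  appenders: { gelf: { type: 'gelf', host: 'graylog.example.com' } },
  categories: { default: { appenders: ['gelf'], level: 'info' } }
};

// After: add @log4js-node/gelf to your package.json dependencies
// and change the appender type to the package name.
const after = {
  appenders: { gelf: { type: '@log4js-node/gelf', host: 'graylog.example.com' } },
  categories: { default: { appenders: ['gelf'], level: 'info' } }
};

console.log(after.appenders.gelf.type); // '@log4js-node/gelf'
```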

@ -11,7 +11,7 @@ There have been a few changes between log4js 1.x and 2.x (and 0.x too). You shou
* coloured console logging to [stdout](stdout.md) or [stderr](stderr.md)
* [file appender](file.md), with configurable log rolling based on file size or [date](dateFile.md)
* [SMTP appender](smtp.md)
* [SMTP appender](https://github.com/log4js-node/smtp)
* [GELF appender](https://github.com/log4js-node/gelf)
* [Loggly appender](https://github.com/log4js-node/loggly)
* [Logstash UDP appender](https://github.com/log4js-node/logstashUDP)

@ -1,6 +1,6 @@
# logstash Appender (HTTP)
The logstash appenders send NDJSON formatted log events to [logstash](https://www.elastic.co/products/logstash) receivers. This appender uses HTTP to send the events (there is another logstash appender that uses [UDP](logstashUDP.md)). You will need to include [axios](https://www.npmjs.com/package/axios) in your dependencies to use this appender.
The logstash appenders send NDJSON formatted log events to [logstash](https://www.elastic.co/products/logstash) receivers. This appender uses HTTP to send the events (there is another logstash appender that uses [UDP](https://github.com/log4js-node/logstashUDP)). You will need to include [axios](https://www.npmjs.com/package/axios) in your dependencies to use this appender.
## Configuration

@ -8,6 +8,7 @@ The multiFile appender can be used to dynamically write logs to multiple files,
* `base` - `string` - the base part of the generated log filename
* `property` - `string` - the value to use to split files (see below).
* `extension` - `string` - the suffix for the generated log filename.
* `timeout` - `integer` - optional activity timeout in ms after which the file will be closed.
All other properties will be passed to the created [file](file.md) appenders. For the property value, `categoryName` is probably the most useful - although you could use `pid` or `level`. If the property is not found on the log event, the appender will look for the value in the context map. If that also fails, the appender silently skips the logging event without raising an error; this allows for dynamic properties which may not exist for all log messages.
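A sketch of a multiFile configuration splitting logs by `categoryName` and using the new optional `timeout` (all values here are illustrative):

```javascript
// Illustrative multiFile config: one log file per logger category.
log4js.configure({
  appenders: {
    multi: {
      type: 'multiFile',
      base: 'logs/',            // generated files land in logs/
      property: 'categoryName', // the value used to split files
      extension: '.log',
      timeout: 5000,            // close a file after 5s of inactivity
      maxLogSize: 10485760      // passed through to the created file appenders
    }
  },
  categories: { default: { appenders: ['multi'], level: 'info' } }
});
// A logger obtained via log4js.getLogger('app') would then write to logs/app.log.
```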

@ -1,96 +0,0 @@
# SMTP Appender
Sends log events as emails. To use this appender you will need to include the [nodemailer](https://www.npmjs.com/package/nodemailer) package in your dependencies. If you use this appender, you should also call `log4js.shutdown` when your application closes so that any remaining emails can be sent. Many of the configuration options below are passed through to nodemailer, so you should read their docs to get the most out of this appender.
## Configuration
* `type` - `smtp`
* `SMTP` - `object` (optional, if not present will use `transport` field)
* `host` - `string` (optional, defaults to `localhost`)
* `port` - `integer` (optional, defaults to `25`)
* `auth` - `object` (optional) - authentication details
* `user` - `string`
* `pass` - `string`
* `transport` - `object` (optional, if not present will use `SMTP`) - see nodemailer docs for transport options
* `plugin` - `string` (optional, defaults to `smtp`) - the nodemailer transport plugin to use
* `options` - `object` - configuration for the transport plugin
* `attachment` - `object` (optional) - send logs as email attachment
* `enable` - `boolean` (optional, defaults to `false`)
* `message` - `string` (optional, defaults to `See logs as attachment`) - message to put in body of email
* `filename` - `string` (optional, defaults to `default.log`) - attachment filename
* `sendInterval` - `integer` (optional, defaults to `0`) - batch emails and send in one email every `sendInterval` seconds, if `0` then every log message will send an email.
* `shutdownTimeout` - `integer` (optional, defaults to `5`) - time in seconds to wait for emails to be sent during shutdown
* `recipients` - `string` - email addresses to send the logs to
* `subject` - `string` (optional, defaults to message from first log event in batch) - subject for email
* `sender` - `string` (optional) - who the logs should be sent as
* `html` - `boolean` (optional, defaults to `false`) - send the email as HTML instead of plain text
* `layout` - `object` (optional, defaults to basicLayout) - see [layouts](layouts.md)
## Example (default config)
```javascript
log4js.configure({
appenders: {
'email': {
type: 'smtp', recipients: 'dev.team@company.name'
}
},
categories: { default: { appenders: [ 'email' ], level: 'error' } }
});
```
This configuration will send an email using the SMTP server running on `localhost:25`, for every log event of level `ERROR` and above. The email will be sent to `dev.team@company.name`; the subject will be the message part of the log event, and the body of the email will be the log event formatted by the basic layout function.
## Example (logs as attachments, batched)
```javascript
log4js.configure({
appenders: {
'email': {
type: 'smtp',
recipients: 'dev.team@company.name',
subject: 'Latest logs',
sender: 'my.application@company.name',
attachment: {
enable: true,
filename: 'latest.log',
message: 'See the attachment for the latest logs'
},
sendInterval: 3600
}
},
categories: { default: { appenders: ['email'], level: 'ERROR' } }
});
```
This configuration will send an email once every hour, with all the log events of level 'ERROR' and above as an attached file.
## Example (custom SMTP host)
```javascript
log4js.configure({
appenders: {
email: {
type: 'smtp', SMTP: { host: 'smtp.company.name', port: 8025 }, recipients: 'dev.team@company.name'
}
},
categories: { default: { appenders: ['email'], level: 'info' } }
});
```
This configuration can also be written as:
```javascript
log4js.configure({
appenders: {
email: {
type: 'smtp',
transport: {
plugin: 'smtp',
options: {
host: 'smtp.company.name',
port: 8025
}
},
recipients: 'dev.team@company.name'
}
},
categories: {
default: { appenders: ['email'], level: 'info' }
}
});
```
A similar config can be used to specify a different transport plugin than `smtp`. See the nodemailer docs for more details.

lib/appenders/adapters.js (new file)

@ -0,0 +1,46 @@
'use strict';
function maxFileSizeUnitTransform(maxLogSize) {
if (typeof maxLogSize === 'number' && Number.isInteger(maxLogSize)) {
return maxLogSize;
}
const units = {
K: 1024,
M: 1024 * 1024,
G: 1024 * 1024 * 1024,
};
const validUnit = Object.keys(units);
const unit = maxLogSize.substr(maxLogSize.length - 1).toLocaleUpperCase();
const value = maxLogSize.substring(0, maxLogSize.length - 1).trim();
if (validUnit.indexOf(unit) < 0 || !Number.isInteger(Number(value))) {
throw Error(`maxLogSize: "${maxLogSize}" is invalid`);
} else {
return value * units[unit];
}
}
function adapter(configAdapter, config) {
const newConfig = Object.assign({}, config);
Object.keys(configAdapter).forEach((key) => {
if (newConfig[key]) {
newConfig[key] = configAdapter[key](config[key]);
}
});
return newConfig;
}
function fileAppenderAdapter(config) {
const configAdapter = {
maxLogSize: maxFileSizeUnitTransform
};
return adapter(configAdapter, config);
}
const adapters = {
file: fileAppenderAdapter,
fileSync: fileAppenderAdapter
};
module.exports.modifyConfig = config => (adapters[config.type] ? adapters[config.type](config) : config);
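Restating the unit transform above as a standalone sketch (the function body is copied from the diff; the sample values are illustrative) to show the expected conversions:

```javascript
// Standalone copy of maxFileSizeUnitTransform from the diff above,
// to illustrate how K/M/G suffixes are converted to byte counts.
function maxFileSizeUnitTransform(maxLogSize) {
  // Plain integers pass through untouched.
  if (typeof maxLogSize === 'number' && Number.isInteger(maxLogSize)) {
    return maxLogSize;
  }
  const units = { K: 1024, M: 1024 * 1024, G: 1024 * 1024 * 1024 };
  // Last character is the unit; everything before it is the number.
  const unit = maxLogSize.substr(maxLogSize.length - 1).toLocaleUpperCase();
  const value = maxLogSize.substring(0, maxLogSize.length - 1).trim();
  if (Object.keys(units).indexOf(unit) < 0 || !Number.isInteger(Number(value))) {
    throw Error(`maxLogSize: "${maxLogSize}" is invalid`);
  }
  return value * units[unit];
}

console.log(maxFileSizeUnitTransform('1K'));  // 1024
console.log(maxFileSizeUnitTransform('10M')); // 10485760
console.log(maxFileSizeUnitTransform(512));   // 512
```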

@ -59,7 +59,13 @@ function fileAppender(file, layout, logSize, numBackups, options, timezoneOffset
writer.closeTheStream(writer.openTheStream.bind(writer));
};
app.sighupHandler = function () {
debug('SIGHUP handler called.');
app.reopen();
};
app.shutdown = function (complete) {
process.removeListener('SIGHUP', app.sighupHandler);
writer.write('', 'utf-8', () => {
writer.end(complete);
});
@ -68,10 +74,7 @@ function fileAppender(file, layout, logSize, numBackups, options, timezoneOffset
// On SIGHUP, close and reopen all files. This allows this appender to work with
// logrotate. Note that if you are using logrotate, you should not set
// `logSize`.
process.on('SIGHUP', () => {
debug('SIGHUP handler called.');
app.reopen();
});
process.on('SIGHUP', app.sighupHandler);
return app;
}

@ -4,6 +4,7 @@ const configuration = require('../configuration');
const clustering = require('../clustering');
const levels = require('../levels');
const layouts = require('../layouts');
const adapters = require('./adapters');
// pre-load the core appenders so that webpack can find them
const coreAppenders = new Map();
@ -56,7 +57,7 @@ const createAppender = (name, config) => {
return clustering.onlyOnMaster(() => {
debug(`calling appenderModule.configure for ${name} / ${appenderConfig.type}`);
return appenderModule.configure(
appenderConfig,
adapters.modifyConfig(appenderConfig),
layouts,
appender => appenders.get(appender),
levels

@ -9,6 +9,25 @@ const findFileKey = (property, event) => event[property] || event.context[proper
module.exports.configure = (config, layouts) => {
debug('Creating a multi-file appender');
const files = new Map();
const timers = new Map();
function checkForTimeout(fileKey) {
const timer = timers.get(fileKey);
const app = files.get(fileKey);
if (timer && app) {
if (Date.now() - timer.lastUsed > timer.timeout) {
debug('%s not used for > %d ms => close', fileKey, timer.timeout);
clearInterval(timer.interval);
timers.delete(fileKey);
files.delete(fileKey);
app.shutdown((err) => {
if (err) {
debug('ignore error on file shutdown: %s', err.message);
}
});
}
}
}
const appender = (logEvent) => {
const fileKey = findFileKey(config.property, logEvent);
@ -21,16 +40,30 @@ module.exports.configure = (config, layouts) => {
config.filename = path.join(config.base, fileKey + config.extension);
file = fileAppender.configure(config, layouts);
files.set(fileKey, file);
if (config.timeout) {
debug('creating new timer');
timers.set(fileKey, {
timeout: config.timeout,
lastUsed: Date.now(),
interval: setInterval(checkForTimeout.bind(null, fileKey), config.timeout)
});
}
} else if (config.timeout) {
timers.get(fileKey).lastUsed = Date.now();
}
file(logEvent);
} else {
debug('No fileKey for logEvent, quietly ignoring this log event');
}
debug('No fileKey for logEvent, quietly ignoring this log event');
};
appender.shutdown = (cb) => {
let shutdownFunctions = files.size;
let error;
timers.forEach((timer) => {
clearInterval(timer.interval);
});
files.forEach((app, fileKey) => {
debug('calling shutdown for ', fileKey);
app.shutdown((err) => {

@ -64,8 +64,21 @@ function logServer(config, actualAppender, levels) {
}
}
function handleError(error) {
const loggingEvent = {
startTime: new Date(),
categoryName: 'log4js',
level: levels.ERROR,
data: ['A worker log process hung up unexpectedly', error],
remoteAddress: clientSocket.remoteAddress,
remotePort: clientSocket.remotePort
};
actualAppender(loggingEvent);
}
clientSocket.on('data', chunkReceived);
clientSocket.on('end', chunkReceived);
clientSocket.on('error', handleError);
});
server.listen(config.loggerPort || 5000, config.loggerHost || 'localhost', function () {

@ -1,5 +1,10 @@
'use strict';
/**
* This appender has been deprecated.
* Updates and bug fixes should be made against https://github.com/log4js-node/smtp
*/
const mailer = require('nodemailer');
const os = require('os');
@ -124,6 +129,9 @@ function smtpAppender(config, layout, subjectLayout) {
appender.shutdown = shutdown;
// trigger a deprecation warning.
appender.deprecated = '@log4js-node/smtp';
return appender;
}

package-lock.json (generated; file diff suppressed because it is too large)

@ -1,6 +1,6 @@
{
"name": "log4js",
"version": "2.5.3",
"version": "2.7.0",
"description": "Port of Log4js to work with node.",
"homepage": "https://log4js-node.github.io/log4js-node/",
"files": [
@ -29,11 +29,12 @@
},
"scripts": {
"clean": "find test -type f ! -name '*.json' ! -name '*.js' ! -name '.eslintrc' -delete && rm *.log",
"prepush": "npm test",
"prepush": "npm test && npm run typings",
"commitmsg": "validate-commit-msg",
"posttest": "npm run clean",
"pretest": "eslint 'lib/**/*.js' 'test/**/*.js'",
"test": "tap 'test/tap/**/*.js'",
"typings": "tsc -p types/tsconfig.json",
"coverage": "tap 'test/tap/**/*.js' --cov",
"codecov": "tap 'test/tap/**/*.js' --cov --coverage-report=lcov && codecov"
},
@ -42,23 +43,24 @@
"lib": "lib"
},
"dependencies": {
"circular-json": "^0.5.1",
"circular-json": "^0.5.4",
"date-format": "^1.2.0",
"debug": "^3.1.0",
"semver": "^5.3.0",
"streamroller": "^0.7.0"
"semver": "^5.5.0",
"streamroller": "0.7.0"
},
"devDependencies": {
"codecov": "^3.0.0",
"conventional-changelog": "^1.1.6",
"eslint": "^4.10.0",
"codecov": "^3.0.2",
"conventional-changelog": "^1.1.24",
"eslint": "^4.19.1",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-import-resolver-node": "^0.3.1",
"eslint-plugin-import": "^2.8.0",
"eslint-plugin-import": "^2.11.0",
"husky": "^0.14.3",
"nyc": "^11.3.0",
"@log4js-node/sandboxed-module": "^2.1.1",
"tap": "^10.7.3",
"nyc": "^11.7.3",
"@log4js-node/sandboxed-module": "^2.1.0",
"tap": "^11.1.5",
"typescript": "^2.8.3",
"validate-commit-msg": "^2.14.0"
},
"optionalDependencies": {

@ -2,8 +2,40 @@
const test = require('tap').test;
const sandbox = require('@log4js-node/sandboxed-module');
const consoleAppender = require('../../lib/appenders/console');
test('log4js console appender', (batch) => {
batch.test('should export a configure function', (t) => {
t.type(consoleAppender.configure, 'function');
t.end();
});
batch.test('should use default layout if none specified', (t) => {
const messages = [];
const fakeConsole = {
log: function (msg) {
messages.push(msg);
}
};
const log4js = sandbox.require(
'../../lib/log4js',
{
globals: {
console: fakeConsole
}
}
);
log4js.configure({
appenders: { console: { type: 'console' } },
categories: { default: { appenders: ['console'], level: 'DEBUG' } }
});
log4js.getLogger().info('blah');
t.match(messages[0], /.*default.*blah/);
t.end();
});
batch.test('should output to console', (t) => {
const messages = [];
const fakeConsole = {

@ -47,3 +47,18 @@ test('file appender SIGHUP', (t) => {
t.end();
}, 100);
});
test('file appender SIGHUP handler leak', (t) => {
const log4js = require('../../lib/log4js');
const initialListeners = process.listenerCount('SIGHUP');
log4js.configure({
appenders: {
file: { type: 'file', filename: 'test.log' }
},
categories: { default: { appenders: ['file'], level: 'info' } }
});
log4js.shutdown(() => {
t.equal(process.listenerCount('SIGHUP'), initialListeners);
t.end();
});
});

@ -110,6 +110,49 @@ test('log4js fileAppender', (batch) => {
}, 100);
});
batch.test('with a max file size in unit mode and no backups', (t) => {
const testFile = path.join(__dirname, 'fa-maxFileSize-unit-test.log');
const logger = log4js.getLogger('max-file-size-unit');
t.tearDown(() => {
removeFile(testFile);
removeFile(`${testFile}.1`);
});
removeFile(testFile);
removeFile(`${testFile}.1`);
// log file of 1K = 1024 bytes maximum, no backups
log4js.configure({
appenders: {
file: {
type: 'file', filename: testFile, maxLogSize: '1K', backups: 0
}
},
categories: {
default: { appenders: ['file'], level: 'debug' }
}
});
const maxLine = 13;
for (let i = 0; i < maxLine; i++) {
logger.info('This is the first log message.');
}
logger.info('This is the second log message.');
// wait for the file system to catch up
setTimeout(() => {
fs.readFile(testFile, 'utf8', (err, fileContents) => {
t.include(fileContents, 'This is the second log message.');
t.equal(fileContents.indexOf('This is the first log message.'), -1);
fs.readdir(__dirname, (e, files) => {
const logFiles = files.filter(file => file.includes('fa-maxFileSize-unit-test.log'));
t.equal(logFiles.length, 2, 'should be 2 files');
t.end();
});
});
}, 100);
});
batch.test('with a max file size and 2 backups', (t) => {
const testFile = path.join(__dirname, 'fa-maxFileSize-with-backups-test.log');
const logger = log4js.getLogger('max-file-size-backups');

@ -84,6 +84,52 @@ test('log4js fileSyncAppender', (batch) => {
t.end();
});
batch.test('with a max file size in unit mode and no backups', (t) => {
const testFile = path.join(__dirname, '/fa-maxFileSize-unit-sync-test.log');
const logger = log4js.getLogger('max-file-size-unit');
remove(testFile);
remove(`${testFile}.1`);
t.tearDown(() => {
remove(testFile);
remove(`${testFile}.1`);
});
// log file of 1K = 1024 bytes maximum, no backups
log4js.configure({
appenders: {
sync: {
type: 'fileSync', filename: testFile, maxLogSize: '1K', backups: 0
}
},
categories: { default: { appenders: ['sync'], level: 'debug' } }
});
const maxLine = 13;
for (let i = 0; i < maxLine; i++) {
logger.info('This is the first log message.');
}
logger.info('This is the second log message.');
t.test('log file should only contain the second message', (assert) => {
fs.readFile(testFile, 'utf8', (err, fileContents) => {
assert.include(fileContents, `This is the second log message.${EOL}`);
assert.equal(fileContents.indexOf('This is the first log message.'), -1);
assert.end();
});
});
t.test('there should be two test files', (assert) => {
fs.readdir(__dirname, (err, files) => {
const logFiles = files.filter(file => file.includes('fa-maxFileSize-unit-sync-test.log'));
assert.equal(logFiles.length, 2);
assert.end();
});
});
t.end();
});
batch.test('with a max file size and 2 backups', (t) => {
const testFile = path.join(__dirname, '/fa-maxFileSize-with-backups-sync-test.log');
const logger = log4js.getLogger('max-file-size-backups');

@ -2,6 +2,7 @@
const test = require('tap').test;
const sandbox = require('@log4js-node/sandboxed-module');
const appender = require('../../lib/appenders/logFaces-HTTP');
function setupLogging(category, options) {
const fakeAxios = {
@ -50,6 +51,11 @@ function setupLogging(category, options) {
}
test('logFaces appender', (batch) => {
batch.test('should export a configure function', (t) => {
t.type(appender.configure, 'function');
t.end();
});
batch.test('when using HTTP receivers', (t) => {
const setup = setupLogging('myCategory', {
application: 'LFS-HTTP',

@ -2,6 +2,7 @@
const test = require('tap').test;
const sandbox = require('@log4js-node/sandboxed-module');
const appender = require('../../lib/appenders/logFaces-UDP');
function setupLogging(category, options) {
const fakeDgram = {
@ -53,6 +54,11 @@ function setupLogging(category, options) {
}
test('logFaces appender', (batch) => {
batch.test('should export a configure function', (t) => {
t.type(appender.configure, 'function');
t.end();
});
batch.test('when using UDP receivers', (t) => {
const setup = setupLogging('udpCategory', {
application: 'LFS-UDP',

@ -2,6 +2,7 @@
const test = require('tap').test;
const sandbox = require('@log4js-node/sandboxed-module');
const appender = require('../../lib/appenders/logstashHTTP');
function setupLogging(category, options) {
const fakeAxios = {
@ -50,6 +51,11 @@ function setupLogging(category, options) {
}
test('logstashappender', (batch) => {
batch.test('should export a configure function', (t) => {
t.type(appender.configure, 'function');
t.end();
});
batch.test('when using HTTP receivers', (t) => {
const setup = setupLogging('myCategory', {
application: 'logstash-sample',

@ -1,6 +1,8 @@
'use strict';
const process = require('process');
const test = require('tap').test;
const debug = require('debug');
const log4js = require('../../lib/log4js');
const fs = require('fs');
@ -47,6 +49,41 @@ test('multiFile appender', (batch) => {
});
});
batch.test('should close file after timeout', (t) => {
/* checking that the file is closed after a timeout is done by looking at the debug logs
since detecting file locks with node.js is platform specific.
*/
const debugWasEnabled = debug.enabled('log4js:multiFile');
const debugLogs = [];
const originalWrite = process.stderr.write;
process.stderr.write = (string, encoding, fd) => {
debugLogs.push(string);
if (debugWasEnabled) {
originalWrite.apply(process.stderr, [string, encoding, fd]);
}
};
debug.enable('log4js:multiFile');
log4js.configure({
appenders: {
multi: {
type: 'multiFile', base: 'logs/', property: 'label', extension: '.log', timeout: 20
}
},
categories: { default: { appenders: ['multi'], level: 'info' } }
});
const loggerC = log4js.getLogger('cheese');
loggerC.addContext('label', 'C');
loggerC.info('I am in logger C');
setTimeout(() => {
t.contains(debugLogs[debugLogs.length - 1], 'C not used for > 20 ms => close');
if (!debugWasEnabled) {
debug.disable('log4js:multiFile');
}
process.stderr.write = originalWrite;
t.end();
}, 50);
});
batch.test('should fail silently if loggingEvent property has no value', (t) => {
log4js.configure({
appenders: {

@ -3,6 +3,7 @@
const test = require('tap').test;
const log4js = require('../../lib/log4js');
const net = require('net');
const childProcess = require('child_process');
const sandbox = require('@log4js-node/sandboxed-module');
test('multiprocess appender shutdown (master)', { timeout: 2000 }, (t) => {
@ -88,3 +89,48 @@ test('multiprocess appender shutdown (worker)', (t) => {
t.end();
}, 500);
});
test('multiprocess appender crash (worker)', (t) => {
const loggerPort = 12346;
const messages = [];
const fakeConsole = {
log: function (msg) {
messages.push(msg);
}
};
const log4jsWithFakeConsole = sandbox.require(
'../../lib/log4js',
{
globals: {
console: fakeConsole
}
}
);
log4jsWithFakeConsole.configure({
appenders: {
console: { type: 'console', layout: { type: 'messagePassThrough' } },
multi: {
type: 'multiprocess',
mode: 'master',
loggerPort: loggerPort,
appender: 'console'
}
},
categories: { default: { appenders: ['multi'], level: 'debug' } }
});
setTimeout(() => {
const worker = childProcess.fork(
require.resolve('./multiprocess-worker'),
['start-multiprocess-worker', loggerPort]
);
setTimeout(() => {
worker.kill();
setTimeout(() => {
t.equal(messages[0], 'Logging from worker');
log4jsWithFakeConsole.shutdown(() => t.end());
}, 250);
}, 250);
}, 250);
});

@ -236,6 +236,14 @@ test('Multiprocess Appender', (batch) => {
assert.end();
});
t.test('should log the error on "error" event', (assert) => {
net.cbs.error(new Error('Expected error'));
const logEvents = recording.replay();
assert.plan(2);
assert.equal(logEvents.length, 1);
assert.equal('A worker log process hung up unexpectedly', logEvents[0].data[0]);
});
t.test('when a client connects', (assert) => {
const logString = `${JSON.stringify({
level: { level: 10000, levelStr: 'DEBUG' },

@ -0,0 +1,11 @@
if (process.argv.indexOf('start-multiprocess-worker') >= 0) {
const log4js = require('../../lib/log4js');
const port = parseInt(process.argv[process.argv.length - 1], 10);
log4js.configure({
appenders: {
multi: { type: 'multiprocess', mode: 'worker', loggerPort: port }
},
categories: { default: { appenders: ['multi'], level: 'debug' } }
});
log4js.getLogger('worker').info('Logging from worker');
}

@ -2,6 +2,7 @@
const test = require('tap').test;
const sandbox = require('@log4js-node/sandboxed-module');
const appender = require('../../lib/appenders/rabbitmq');
function setupLogging(category, options) {
const fakeRabbitmq = {
@ -52,6 +53,11 @@ function setupLogging(category, options) {
}
test('log4js rabbitmqAppender', (batch) => {
batch.test('should export a configure function', (t) => {
t.type(appender.configure, 'function');
t.end();
});
batch.test('rabbitmq setup', (t) => {
const result = setupLogging('rabbitmq setup', {
host: '123.123.123.123',

@ -2,6 +2,7 @@
const test = require('tap').test;
const sandbox = require('@log4js-node/sandboxed-module');
const appender = require('../../lib/appenders/redis');
function setupLogging(category, options) {
const fakeRedis = {
@ -56,6 +57,11 @@ function setupLogging(category, options) {
}
test('log4js redisAppender', (batch) => {
batch.test('should export a configure function', (t) => {
t.type(appender.configure, 'function');
t.end();
});
batch.test('redis setup', (t) => {
const result = setupLogging('redis setup', {
host: '123.123.123.123',

types/log4js.d.ts (vendored)

@ -1,12 +1,13 @@
// Type definitions for log4js
export interface Log4js {
getLogger,
configure,
addLayout,
connectLogger,
levels,
shutdown
getLogger(category?: string): Logger;
configure(filename: string): Log4js;
configure(config: Configuration): Log4js;
addLayout(name: string, config: (a: any) => (logEvent: LoggingEvent) => string): void;
connectLogger(logger: Logger, options: { format?: string; level?: string; nolog?: any; }): any; // express.Handler;
levels(): Levels;
shutdown(cb?: (error: Error) => void): void | null;
}
export function getLogger(category?: string): Logger;
@ -106,15 +107,15 @@ export interface FileAppender {
// the path of the file where you want your logs written.
filename: string;
// the maximum size (in bytes) for the log file. If not specified, then no log rolling will happen.
maxLogSize?: number;
maxLogSize?: number | string;
// (default value = 5) - the number of old log files to keep during log rolling.
backups?: number;
// defaults to basic layout
layout?: Layout;
numBackups?: number;
compress?: boolean; // compress the backups
// keep the file extension when rotating logs
keepFileExt?: boolean;
// keep the file extension when rotating logs
keepFileExt?: boolean;
encoding?: string;
mode?: number;
flags?: string;
@ -125,7 +126,7 @@ export interface SyncfileAppender {
// the path of the file where you want your logs written.
filename: string;
// the maximum size (in bytes) for the log file. If not specified, then no log rolling will happen.
maxLogSize?: number;
maxLogSize?: number | string;
// (default value = 5) - the number of old log files to keep during log rolling.
backups?: number;
// defaults to basic layout
@ -161,8 +162,8 @@ export interface DateFileAppender {
compress?: boolean;
// include the pattern in the name of the current log file as well as the backups.(default false)
alwaysIncludePattern?: boolean;
// keep the file extension when rotating logs
keepFileExt?: boolean;
// keep the file extension when rotating logs
keepFileExt?: boolean;
// if this value is greater than zero, then files older than that many days will be deleted during log rolling.(default 0)
daysToKeep?: number;
}
@ -376,7 +377,7 @@ export interface Logger {
isLevelEnabled(level?: string): boolean;
isTraceEnabled(): boolean;
isTraceEnabled(): boolean;
isDebugEnabled(): boolean;
isInfoEnabled(): boolean;
isWarnEnabled(): boolean;
@ -391,15 +392,15 @@ export interface Logger {
clearContext(): void;
trace(message: string, ...args: any[]): void;
trace(message: any, ...args: any[]): void;
debug(message: string, ...args: any[]): void;
debug(message: any, ...args: any[]): void;
info(message: string, ...args: any[]): void;
info(message: any, ...args: any[]): void;
warn(message: string, ...args: any[]): void;
warn(message: any, ...args: any[]): void;
error(message: string, ...args: any[]): void;
error(message: any, ...args: any[]): void;
fatal(message: string, ...args: any[]): void;
fatal(message: any, ...args: any[]): void;
}

@ -4,6 +4,9 @@ log4js.configure('./filename');
const logger1 = log4js.getLogger();
logger1.level = 'debug';
logger1.debug("Some debug messages");
logger1.fatal({
whatever: 'foo'
})
const logger3 = log4js.getLogger('cheese');
logger3.trace('Entering cheese testing');
@ -52,6 +55,7 @@ log4js.configure({
});
const logger4 = log4js.getLogger('thing');
logger4.log('logging a thing');
const logger5 = log4js.getLogger('json-test');
logger5.info('this is just a test');

types/tsconfig.json (new file)

@ -0,0 +1,9 @@
{
"compileOnSave": false,
"compilerOptions": {
"strict": true,
"noUnusedParameters": true,
"noUnusedLocals": true,
"noEmit": true
}
}