mirror of
https://github.com/jsdoc/jsdoc.git
synced 2025-12-08 19:46:11 +00:00
chore: use Prettier to format source files
This commit is contained in:
parent
3025520e15
commit
1305499207
@@ -9,4 +9,4 @@ indent_size = 2
 [{**/*.js,**/*.css,**/*.json}]
 indent_style = space
-indent_size = 4
+indent_size = 2
@@ -1,3 +1,3 @@
 module.exports = {
-  extends: '@jsdoc'
+  extends: ['@jsdoc', 'plugin:prettier/recommended'],
 };
4 .github/ISSUE_TEMPLATE.md vendored
@@ -40,8 +40,8 @@ Your debug output here
 ### Your environment

-| Software         | Version
-| ---------------- | -------
+| Software         | Version |
+| ---------------- | ------- |
 | JSDoc |
 | Node.js |
 | npm |
18 .github/PULL_REQUEST_TEMPLATE.md vendored
@@ -5,14 +5,14 @@ https://github.com/jsdoc3/jsdoc/blob/master/CONTRIBUTING.md
 https://github.com/jsdoc3/jsdoc/blob/master/CODE_OF_CONDUCT.md
 -->

-| Q                | A
-| ---------------- | ---
-| Bug fix?         | yes/no
-| New feature?     | yes/no
-| Breaking change? | yes/no
-| Deprecations?    | yes/no
-| Tests added?     | yes/no
-| Fixed issues     | comma-separated list of issues fixed by the pull request, if any
-| License          | Apache-2.0
+| Q                | A |
+| ---------------- | ---------------------------------------------------------------- |
+| Bug fix?         | yes/no |
+| New feature?     | yes/no |
+| Breaking change? | yes/no |
+| Deprecations?    | yes/no |
+| Tests added?     | yes/no |
+| Fixed issues     | comma-separated list of issues fixed by the pull request, if any |
+| License          | Apache-2.0 |

 <!-- Describe your changes below in as much detail as possible. -->
5 .prettierignore Normal file
@@ -0,0 +1,5 @@
+# Ignore test fixtures.
+**/test/fixtures/**
+
+# Ignore code coverage reports.
+.nyc_output/
3 .prettierrc.js Normal file
@@ -0,0 +1,3 @@
+module.exports = {
+  ...require('./packages/jsdoc-prettier-config'),
+};
@@ -1,11 +1,7 @@
 {
-  "extends": [
-    "config:base"
-  ],
+  "extends": ["config:base"],
   "statusCheckVerify": true,
-  "ignoreDeps": [
-    "taffydb"
-  ],
+  "ignoreDeps": ["taffydb"],
   "automerge": true,
   "automergeType": "branch",
   "rangeStrategy": "bump"
1092 CHANGES.md
File diff suppressed because it is too large
@@ -11,20 +11,20 @@ of experience, nationality, personal appearance, race, religion, or sexual ident

 Examples of behavior that contributes to creating a positive environment include:

-* Using welcoming and inclusive language
-* Being respectful of differing viewpoints and experiences
-* Gracefully accepting constructive criticism
-* Focusing on what is best for the community
-* Showing empathy towards other community members
+- Using welcoming and inclusive language
+- Being respectful of differing viewpoints and experiences
+- Gracefully accepting constructive criticism
+- Focusing on what is best for the community
+- Showing empathy towards other community members

 Examples of unacceptable behavior by participants include:

-* The use of sexualized language or imagery and unwelcome sexual attention or advances
-* Trolling, insulting/derogatory comments, and personal or political attacks
-* Public or private harassment
-* Publishing others' private information, such as a physical or electronic address, without explicit
-  permission
-* Other conduct which could reasonably be considered inappropriate in a professional setting
+- The use of sexualized language or imagery and unwelcome sexual attention or advances
+- Trolling, insulting/derogatory comments, and personal or political attacks
+- Public or private harassment
+- Publishing others' private information, such as a physical or electronic address, without explicit
+  permission
+- Other conduct which could reasonably be considered inappropriate in a professional setting

 ## Our Responsibilities
@@ -1,5 +1,4 @@
-Pull Requests
--------------
+## Pull Requests

 If you're thinking about making some changes, maybe fixing a bug, or adding a
 snazzy new feature, first, thank you. Contributions are very welcome. Things
@@ -9,7 +8,7 @@ you set up your branches and work with git, are just suggestions, but pretty goo
 ones.

 1. **Create a remote to track the base jsdoc/jsdoc repository**
-   This is just a convenience to make it easier to update your ```<tracking branch>```
+   This is just a convenience to make it easier to update your `<tracking branch>`
    (more on that shortly). You would execute something like:

        git remote add base git://github.com/jsdoc/jsdoc.git
@@ -17,7 +16,7 @@ ones.
    Here 'base' is the name of the remote. Feel free to use whatever you want.

 2. **Set up a tracking branch for the base repository**
-   We're gonna call this your ```<tracking branch>```. You will only ever update
+   We're gonna call this your `<tracking branch>`. You will only ever update
    this branch by pulling from the 'base' remote. (as opposed to 'origin')

        git branch --track pullpost base/master
@@ -26,21 +25,21 @@ ones.
    Here 'pullpost' is the name of the branch. Feel free to use whatever you want.

 3. **Create your change branch**
-   Once you are in ```<tracking branch>```, make sure it's up to date, then create
+   Once you are in `<tracking branch>`, make sure it's up to date, then create
    a branch for your changes off of that one.

        git branch fix-for-issue-395
        git checkout fix-for-issue-395

    Here 'fix-for-issue-395' is the name of the branch. Feel free to use whatever
-   you want. We'll call this the ```<change branch>```. This is the branch that
+   you want. We'll call this the `<change branch>`. This is the branch that
    you will eventually issue your pull request from.

    The purpose of these first three steps is to make sure that your merge request
    has a nice clean diff that only involves the changes related to your fix/feature.

 4. **Make your changes**
-   On your ```<change branch>``` make any changes relevant to your fix/feature. Don't
+   On your `<change branch>` make any changes relevant to your fix/feature. Don't
    group fixes for multiple unrelated issues or multiple unrelated features together.
    Create a separate branch for each unrelated changeset. For instance, if you're
    fixing a bug in the parser and adding some new UI to the default template, those
@@ -56,7 +55,7 @@ ones.
    Commit your changes and publish your branch (or push it if it's already published)

 7. **Issue your pull request**
-   On github.com, switch to your ```<change branch>``` and click the 'Pull Request'
+   On github.com, switch to your `<change branch>` and click the 'Pull Request'
    button. Enter some meaningful information about the pull request. If it's a bugfix
    that doesn't already have an issue associated with it, provide some info on what
    situations that bug occurs in and a sense of its severity. If it does already have
53 README.md
@@ -6,8 +6,7 @@ An API documentation generator for JavaScript.

 Want to contribute to JSDoc? Please read [`CONTRIBUTING.md`](CONTRIBUTING.md).

-Installation and Usage
-----------------------
+## Installation and Usage

 JSDoc supports stable versions of Node.js 8.15.0 and later. You can install
 JSDoc globally or in your project's `node_modules` folder.
@@ -51,41 +50,41 @@ and customize your documentation. Here are a few of them:

 ### Templates

-+ [jaguarjs-jsdoc](https://github.com/davidshimjs/jaguarjs-jsdoc)
-+ [DocStrap](https://github.com/docstrap/docstrap)
-  ([example](https://docstrap.github.io/docstrap))
-+ [jsdoc3Template](https://github.com/DBCDK/jsdoc3Template)
+- [jaguarjs-jsdoc](https://github.com/davidshimjs/jaguarjs-jsdoc)
+- [DocStrap](https://github.com/docstrap/docstrap)
+  ([example](https://docstrap.github.io/docstrap))
+- [jsdoc3Template](https://github.com/DBCDK/jsdoc3Template)
   ([example](https://github.com/danyg/jsdoc3Template/wiki#wiki-screenshots))
-+ [minami](https://github.com/Nijikokun/minami)
-+ [docdash](https://github.com/clenemt/docdash)
-  ([example](http://clenemt.github.io/docdash/))
-+ [tui-jsdoc-template](https://github.com/nhnent/tui.jsdoc-template)
-  ([example](https://nhnent.github.io/tui.jsdoc-template/latest/))
-+ [better-docs](https://github.com/SoftwareBrothers/better-docs)
-  ([example](https://softwarebrothers.github.io/admin-bro-dev/index.html))
+- [minami](https://github.com/Nijikokun/minami)
+- [docdash](https://github.com/clenemt/docdash)
+  ([example](http://clenemt.github.io/docdash/))
+- [tui-jsdoc-template](https://github.com/nhnent/tui.jsdoc-template)
+  ([example](https://nhnent.github.io/tui.jsdoc-template/latest/))
+- [better-docs](https://github.com/SoftwareBrothers/better-docs)
+  ([example](https://softwarebrothers.github.io/admin-bro-dev/index.html))

 ### Build tools

-+ [JSDoc Grunt plugin](https://github.com/krampstudio/grunt-jsdoc)
-+ [JSDoc Gulp plugin](https://github.com/mlucool/gulp-jsdoc3)
-+ [JSDoc GitHub Action](https://github.com/andstor/jsdoc-action)
+- [JSDoc Grunt plugin](https://github.com/krampstudio/grunt-jsdoc)
+- [JSDoc Gulp plugin](https://github.com/mlucool/gulp-jsdoc3)
+- [JSDoc GitHub Action](https://github.com/andstor/jsdoc-action)

 ### Other tools

-+ [jsdoc-to-markdown](https://github.com/jsdoc2md/jsdoc-to-markdown)
-+ [Integrating GitBook with
-  JSDoc](https://medium.com/@kevinast/integrate-gitbook-jsdoc-974be8df6fb3)
+- [jsdoc-to-markdown](https://github.com/jsdoc2md/jsdoc-to-markdown)
+- [Integrating GitBook with
+  JSDoc](https://medium.com/@kevinast/integrate-gitbook-jsdoc-974be8df6fb3)

 ## For more information

-+ Documentation is available at [jsdoc.app](https://jsdoc.app/).
-+ Contribute to the docs at
-  [jsdoc/jsdoc.github.io](https://github.com/jsdoc/jsdoc.github.io).
-+ [Join JSDoc's Slack channel](https://jsdoc-slack.appspot.com/).
-+ Ask for help on the
-  [JSDoc Users mailing list](http://groups.google.com/group/jsdoc-users).
-+ Post questions tagged `jsdoc` to
-  [Stack Overflow](http://stackoverflow.com/questions/tagged/jsdoc).
+- Documentation is available at [jsdoc.app](https://jsdoc.app/).
+- Contribute to the docs at
+  [jsdoc/jsdoc.github.io](https://github.com/jsdoc/jsdoc.github.io).
+- [Join JSDoc's Slack channel](https://jsdoc-slack.appspot.com/).
+- Ask for help on the
+  [JSDoc Users mailing list](http://groups.google.com/group/jsdoc-users).
+- Post questions tagged `jsdoc` to
+  [Stack Overflow](http://stackoverflow.com/questions/tagged/jsdoc).

 ## License
19 gulpfile.js
@@ -2,6 +2,7 @@ const eslint = require('gulp-eslint');
 const { exec } = require('child_process');
 const gulp = require('gulp');
 const path = require('path');
+const prettier = require('gulp-prettier');

 function execCb(cb, err, stdout, stderr) {
   console.log(stdout);
@@ -15,15 +16,11 @@ const options = {
     'packages/**/*.js',
     '!packages/**/test/*.js',
     '!packages/**/test/**/*.js',
-    '!packages/**/static/*.js'
-  ],
-  lintPaths: [
-    '*.js',
-    'packages/**/*.js',
-    '!packages/**/static/*.js'
+    '!packages/**/static/*.js',
   ],
+  lintPaths: ['*.js', 'packages/**/*.js', '!packages/**/static/*.js'],
   nodeBin: path.resolve(__dirname, './packages/jsdoc/jsdoc.js'),
-  nodePath: process.execPath
+  nodePath: process.execPath,
 };

 function coverage(cb) {
@@ -32,8 +29,13 @@ function coverage(cb) {
   return exec(cmd, execCb.bind(null, cb));
 }

+function format() {
+  return gulp.src(options.lintPaths).pipe(prettier());
+}
+
 function lint() {
-  return gulp.src(options.lintPaths)
+  return gulp
+    .src(options.lintPaths)
     .pipe(eslint())
     .pipe(eslint.formatEach())
     .pipe(eslint.failAfterError());
@@ -47,5 +49,6 @@ function test(cb) {

 exports.coverage = coverage;
 exports.default = gulp.series(lint, test);
+exports.format = format;
 exports.lint = lint;
 exports.test = test;
@@ -1,6 +1,4 @@
 {
-  "packages": [
-    "packages/*"
-  ],
+  "packages": ["packages/*"],
   "version": "independent"
 }
16671 package-lock.json generated
File diff suppressed because it is too large
@@ -7,14 +7,18 @@
     "@jsdoc/test-matchers": "^0.1.6",
     "ajv": "^8.6.3",
     "eslint": "^7.32.0",
+    "eslint-config-prettier": "^8.3.0",
+    "eslint-plugin-prettier": "^4.0.0",
     "gulp": "^4.0.2",
     "gulp-eslint": "^6.0.0",
+    "gulp-prettier": "^4.0.0",
     "jasmine": "^3.9.0",
     "jasmine-console-reporter": "^3.1.0",
     "klaw-sync": "^6.0.0",
     "lerna": "^4.0.0",
     "mock-fs": "^5.1.0",
-    "nyc": "^15.1.0"
+    "nyc": "^15.1.0",
+    "prettier": "^2.4.1"
   },
   "engines": {
     "node": ">=v14.17.6"
@@ -3,7 +3,7 @@ const { EventBus } = require('@jsdoc/util');
 const flags = require('./flags');
 const help = require('./help');
 const { LEVELS, Logger } = require('./logger');
-const {default: ow} = require('ow');
+const { default: ow } = require('ow');
 const yargs = require('yargs-parser');

 function validateChoice(flagInfo, choices, values) {
@@ -13,9 +13,7 @@ function validateChoice(flagInfo, choices, values) {

   for (let value of values) {
     if (!choices.includes(value)) {
-      throw new TypeError(
-        `The flag ${flagNames} accepts only these values: ${choices.join(', ')}`
-      );
+      throw new TypeError(`The flag ${flagNames} accepts only these values: ${choices.join(', ')}`);
     }
   }
 }
@@ -37,13 +35,13 @@ const { KNOWN_FLAGS, YARGS_FLAGS } = (() => {
     boolean: [],
     coerce: {},
     narg: {},
-    normalize: []
+    normalize: [],
   };

   // `_` contains unparsed arguments.
   names.add('_');

-  Object.keys(flags).forEach(flag => {
+  Object.keys(flags).forEach((flag) => {
     const value = flags[flag];

     names.add(flag);
@@ -76,7 +74,7 @@ const { KNOWN_FLAGS, YARGS_FLAGS } = (() => {

   return {
     KNOWN_FLAGS: names,
-    YARGS_FLAGS: opts
+    YARGS_FLAGS: opts,
   };
 })();

@@ -103,11 +101,11 @@ class Engine {
     ow(opts.version, ow.optional.string);

     this._bus = new EventBus('jsdoc', {
-      cache: _.isBoolean(opts._cacheEventBus) ? opts._cacheEventBus : true
+      cache: _.isBoolean(opts._cacheEventBus) ? opts._cacheEventBus : true,
     });
     this._logger = new Logger({
       emitter: this._bus,
-      level: opts.logLevel
+      level: opts.logLevel,
     });
     this.flags = [];
     this.revision = opts.revision;
@@ -147,8 +145,7 @@ class Engine {
     const maxLength = opts.maxLength || Infinity;

     return (
-      `Options:\n${help({ maxLength })}\n\n` +
-      'Visit https://jsdoc.app/ for more information.'
+      `Options:\n${help({ maxLength })}\n\n` + 'Visit https://jsdoc.app/ for more information.'
     );
   }

@@ -188,8 +185,7 @@ class Engine {
     for (let flag of parsedFlagNames) {
       if (!KNOWN_FLAGS.has(flag)) {
         throw new TypeError(
-          'Unknown command-line option: ' +
-            (flag.length === 1 ? `-${flag}` : `--${flag}`)
+          'Unknown command-line option: ' + (flag.length === 1 ? `-${flag}` : `--${flag}`)
         );
       }
     }
@@ -199,7 +195,7 @@ class Engine {
     if (parsedFlags[flag] && flags[flag].choices) {
       let flagInfo = {
         name: flag,
-        alias: flags[flag].alias
+        alias: flags[flag].alias,
       };

       validateChoice(flagInfo, flags[flag].choices, parsedFlags[flag]);
@@ -14,94 +14,94 @@ module.exports = {
     choices: ['all', 'package', 'private', 'protected', 'public', 'undefined'],
     defaultDescription: 'All except `private`',
     description: 'Document only symbols with the specified access level.',
-    requiresArg: true
+    requiresArg: true,
   },
   configure: {
     alias: 'c',
     description: 'The configuration file to use.',
     normalize: true,
-    requiresArg: true
+    requiresArg: true,
   },
   debug: {
     boolean: true,
-    description: 'Log information to help with debugging.'
+    description: 'Log information to help with debugging.',
   },
   destination: {
     alias: 'd',
     default: './out',
     description: 'The output directory.',
     normalize: true,
-    requiresArg: true
+    requiresArg: true,
   },
   encoding: {
     alias: 'e',
     default: 'utf8',
     description: 'The encoding to assume when reading source files.',
-    requiresArg: true
+    requiresArg: true,
   },
   explain: {
     alias: 'X',
     boolean: true,
-    description: 'Print the parse results to the console and exit.'
+    description: 'Print the parse results to the console and exit.',
   },
   help: {
     alias: 'h',
     boolean: true,
-    description: 'Print help information and exit.'
+    description: 'Print help information and exit.',
   },
   match: {
     description: 'Run only tests whose names contain this value.',
-    requiresArg: true
+    requiresArg: true,
   },
   package: {
     alias: 'P',
     description: 'The path to the `package.json` file to use.',
     normalize: true,
-    requiresArg: true
+    requiresArg: true,
   },
   pedantic: {
     boolean: true,
-    description: 'Treat errors as fatal errors, and treat warnings as errors.'
+    description: 'Treat errors as fatal errors, and treat warnings as errors.',
   },
   private: {
     alias: 'p',
     boolean: true,
-    description: 'Document private symbols (equivalent to `--access all`).'
+    description: 'Document private symbols (equivalent to `--access all`).',
   },
   query: {
     alias: 'q',
-    coerce: ((str) => cast(querystring.parse(str))),
+    coerce: (str) => cast(querystring.parse(str)),
     description: 'A query string to parse and store (for example, `foo=bar&baz=true`).',
-    requiresArg: true
+    requiresArg: true,
   },
   readme: {
     alias: 'R',
     description: 'The `README` file to include in the documentation.',
     normalize: true,
-    requiresArg: true
+    requiresArg: true,
   },
   recurse: {
     alias: 'r',
     boolean: true,
-    description: 'Recurse into subdirectories to find source files.'
+    description: 'Recurse into subdirectories to find source files.',
   },
   template: {
     alias: 't',
     description: 'The template package to use.',
-    requiresArg: true
+    requiresArg: true,
   },
   test: {
     alias: 'T',
     boolean: true,
-    description: 'Run all tests and exit.'
+    description: 'Run all tests and exit.',
   },
   verbose: {
     boolean: true,
-    description: 'Log detailed information to the console.'
+    description: 'Log detailed information to the console.',
   },
   version: {
     alias: 'v',
     boolean: true,
-    description: 'Display the version number and exit.'
-  }
+    description: 'Display the version number and exit.',
+  },
 };
@@ -11,7 +11,7 @@ function padRight(str, length) {
 function findMaxLength(arr) {
   let max = 0;

-  arr.forEach(({length}) => {
+  arr.forEach(({ length }) => {
     max = Math.max(max, length);
   });

@@ -24,7 +24,7 @@ function concatWithMaxLength(items, maxLength) {
   // to prevent endless loops, always use the first item, regardless of length
   result += items.shift();

-  while (items.length && (result.length + items[0].length < maxLength)) {
+  while (items.length && result.length + items[0].length < maxLength) {
     result += ` ${items.shift()}`;
   }

@@ -48,13 +48,13 @@ function concatWithMaxLength(items, maxLength) {
 * @param {Object} opts - Options for formatting the text.
 * @param {number} opts.maxLength - The maximum length of each line.
 */
-function formatHelpInfo({names, descriptions}, {maxLength}) {
+function formatHelpInfo({ names, descriptions }, { maxLength }) {
   const MARGIN_SIZE = 4;
   const GUTTER_SIZE = MARGIN_SIZE;
   const results = [];

   const maxNameLength = findMaxLength(names);
-  const wrapDescriptionAt = maxLength - (MARGIN_SIZE * 2) - GUTTER_SIZE - maxNameLength;
+  const wrapDescriptionAt = maxLength - MARGIN_SIZE * 2 - GUTTER_SIZE - maxNameLength;

   // Build the string for each flag.
   names.forEach((name, i) => {
@@ -97,12 +97,12 @@ function formatHelpInfo({names, descriptions}, {maxLength}) {
 module.exports = ({ maxLength }) => {
   const flagInfo = {
     names: [],
-    descriptions: []
+    descriptions: [],
   };

   Object.keys(flags)
     .sort()
-    .forEach(flagName => {
+    .forEach((flagName) => {
       const flagDetail = flags[flagName];
       let description = '';
       let name = '';
@@ -131,5 +131,5 @@ module.exports = ({ maxLength }) => {
     flagInfo.descriptions.push(description);
   });

-  return `${formatHelpInfo(flagInfo, {maxLength}).join('\n')}`;
+  return `${formatHelpInfo(flagInfo, { maxLength }).join('\n')}`;
 };
@@ -1,5 +1,5 @@
 const _ = require('lodash');
-const {default: ow} = require('ow');
+const { default: ow } = require('ow');

 /**
  * Logging levels for the JSDoc logger. The default logging level is
@@ -63,7 +63,7 @@ const LEVELS = {
   *
   * @alias module:@jsdoc/cli.LOG_LEVELS.VERBOSE
   */
-  VERBOSE: 1000
+  VERBOSE: 1000,
 };

 const DEFAULT_LEVEL = LEVELS.WARN;
@@ -73,14 +73,14 @@ const FUNCS = {
   [LEVELS.INFO]: 'info',
   [LEVELS.FATAL]: 'error',
   [LEVELS.VERBOSE]: 'debug',
-  [LEVELS.WARN]: 'warn'
+  [LEVELS.WARN]: 'warn',
 };
 const LEVELS_BY_NUMBER = _.invert(LEVELS);
 const PREFIXES = {
   [LEVELS.DEBUG]: 'DEBUG: ',
   [LEVELS.ERROR]: 'ERROR: ',
   [LEVELS.FATAL]: 'FATAL: ',
-  [LEVELS.WARN]: 'WARNING: '
+  [LEVELS.WARN]: 'WARNING: ',
 };

 // Add a prefix to a log message if necessary.
@@ -98,14 +98,17 @@ class Logger {
   constructor(opts) {
     ow(opts, ow.object);
     // We validate `opts.level` in the setter, so no need to validate it here.
-    ow(opts.emitter, ow.object.partialShape({
+    ow(
+      opts.emitter,
+      ow.object.partialShape({
       off: ow.function,
       on: ow.function,
-      once: ow.function
-    }));
+        once: ow.function,
+      })
+    );

     this._console = opts._console || console;
-    const emitter = this._emitter = opts.emitter;
+    const emitter = (this._emitter = opts.emitter);

     this.level = opts.level || DEFAULT_LEVEL;

@@ -121,10 +124,7 @@ class Logger {
       levelNameLower = levelName.toLowerCase();
       levelNumber = LEVELS[levelName];

-      emitter.on(
-        `logger:${levelNameLower}`,
-        (...args) => this._maybeLog(levelNumber, args)
-      );
+      emitter.on(`logger:${levelNameLower}`, (...args) => this._maybeLog(levelNumber, args));
     }
   }

@@ -144,7 +144,9 @@ class Logger {

     if (_.isUndefined(LEVELS_BY_NUMBER[level])) {
       errorMsg = `Unrecognized logging level ${level}. Known levels are: `;
-      errorMsg += Object.keys(LEVELS).map(k => `${k}: ${LEVELS[k]}`).join(', ');
+      errorMsg += Object.keys(LEVELS)
+        .map((k) => `${k}: ${LEVELS[k]}`)
+        .join(', ');

       throw new TypeError(errorMsg);
     }
@@ -155,5 +157,5 @@ class Logger {

 module.exports = {
   LEVELS,
-  Logger
+  Logger,
 };
123 packages/jsdoc-cli/package-lock.json generated
@@ -1,8 +1,129 @@
 {
   "name": "@jsdoc/cli",
   "version": "0.2.5",
-  "lockfileVersion": 1,
+  "lockfileVersion": 2,
   "requires": true,
+  "packages": {
+    "": {
+      "name": "@jsdoc/cli",
+      "version": "0.2.5",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "lodash": "^4.17.21",
+        "ow": "^0.27.0",
+        "strip-bom": "^4.0.0",
+        "yargs-parser": "^20.2.9"
+      },
+      "engines": {
+        "node": ">=v14.17.6"
+      }
+    },
+    "node_modules/@sindresorhus/is": {
+      "version": "4.0.1",
+      "resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-4.0.1.tgz",
+      "integrity": "sha512-Qm9hBEBu18wt1PO2flE7LPb30BHMQt1eQgbV76YntdNk73XZGpn3izvGTYxbGgzXKgbCjiia0uxTd3aTNQrY/g==",
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/sindresorhus/is?sponsor=1"
+      }
+    },
+    "node_modules/callsites": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
+      "integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==",
+      "engines": {
+        "node": ">=6"
+      }
+    },
+    "node_modules/dot-prop": {
+      "version": "6.0.1",
+      "resolved": "https://registry.npmjs.org/dot-prop/-/dot-prop-6.0.1.tgz",
+      "integrity": "sha512-tE7ztYzXHIeyvc7N+hR3oi7FIbf/NIjVP9hmAt3yMXzrQ072/fpjGLx2GxNxGxUl5V73MEqYzioOMoVhGMJ5cA==",
+      "dependencies": {
+        "is-obj": "^2.0.0"
+      },
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/is-obj": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/is-obj/-/is-obj-2.0.0.tgz",
+      "integrity": "sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==",
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/lodash": {
+      "version": "4.17.21",
+      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
+      "integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
+    },
+    "node_modules/lodash.isequal": {
+      "version": "4.5.0",
+      "resolved": "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz",
+      "integrity": "sha1-QVxEePK8wwEgwizhDtMib30+GOA="
+    },
+    "node_modules/ow": {
+      "version": "0.27.0",
+      "resolved": "https://registry.npmjs.org/ow/-/ow-0.27.0.tgz",
+      "integrity": "sha512-SGnrGUbhn4VaUGdU0EJLMwZWSupPmF46hnTRII7aCLCrqixTAC5eKo8kI4/XXf1eaaI8YEVT+3FeGNJI9himAQ==",
+      "dependencies": {
+        "@sindresorhus/is": "^4.0.1",
+        "callsites": "^3.1.0",
+        "dot-prop": "^6.0.1",
+        "lodash.isequal": "^4.5.0",
+        "type-fest": "^1.2.1",
+        "vali-date": "^1.0.0"
+      },
+      "engines": {
+        "node": ">=12"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/strip-bom": {
+      "version": "4.0.0",
+      "resolved": "https://registry.npmjs.org/strip-bom/-/strip-bom-4.0.0.tgz",
+      "integrity": "sha512-3xurFv5tEgii33Zi8Jtp55wEIILR9eh34FAW00PZf+JnSsTmV/ioewSgQl97JHvgjoRGwPShsWm+IdrxB35d0w==",
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/type-fest": {
+      "version": "1.2.2",
+      "resolved": "https://registry.npmjs.org/type-fest/-/type-fest-1.2.2.tgz",
+      "integrity": "sha512-pfkPYCcuV0TJoo/jlsUeWNV8rk7uMU6ocnYNvca1Vu+pyKi8Rl8Zo2scPt9O72gCsXIm+dMxOOWuA3VFDSdzWA==",
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/vali-date": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/vali-date/-/vali-date-1.0.0.tgz",
+      "integrity": "sha1-G5BKWWCfsyjvB4E4Qgk09rhnCaY=",
+      "engines": {
+        "node": ">=0.10.0"
+      }
+    },
+    "node_modules/yargs-parser": {
+      "version": "20.2.9",
+      "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-20.2.9.tgz",
+      "integrity": "sha512-y11nGElTIV+CT3Zv9t7VKl+Q3hTQoT9a1Qzezhhl6Rp21gJ/IVTW7Z3y9EWXhuUBC2Shnf+DX0antecpAwSP8w==",
+      "engines": {
+        "node": ">=10"
+      }
+    }
+  },
   "dependencies": {
     "@sindresorhus/is": {
       "version": "4.0.1",
@@ -194,7 +194,7 @@ describe('@jsdoc/cli/lib/engine', () => {

       expect(parsed.query).toEqual({
         foo: 'bar',
-        baz: true
+        baz: true,
       });
     });
   });
@@ -217,7 +217,7 @@ describe('@jsdoc/cli/lib/engine', () => {
       const revision = new Date();
       const instance = new Engine({
         version: '1.2.3',
-        revision
+        revision,
       });

       expect(instance.versionDetails).toBe(`JSDoc 1.2.3 (${revision.toUTCString()})`);
@@ -1,5 +1,5 @@
 const flags = require('../../../lib/flags');
-const {default: ow} = require('ow');
+const { default: ow } = require('ow');

 function validate(name, opts) {
   name = `--${name}`;
@@ -12,11 +12,11 @@ describe('@jsdoc/cli/lib/logger', () => {
   beforeEach(() => {
     bus = new EventBus('loggerTest', {
       _console: console,
-      cache: false
+      cache: false,
     });
     logger = new Logger({ emitter: bus });

-    ['debug', 'error', 'info', 'warn'].forEach(func => spyOn(console, func));
+    ['debug', 'error', 'info', 'warn'].forEach((func) => spyOn(console, func));
   });

   it('exports a Logger constructor', () => {
@@ -41,28 +41,29 @@ describe('@jsdoc/cli/lib/logger', () => {
     });

     it('accepts a valid level', () => {
-      expect(() => new Logger({
+      expect(
+        () =>
+          new Logger({
         emitter: bus,
-        level: LEVELS.VERBOSE
-      })).not.toThrow();
+            level: LEVELS.VERBOSE,
+          })
+      ).not.toThrow();
     });

     it('throws on an invalid level', () => {
-      expect(() => new Logger({
+      expect(
+        () =>
+          new Logger({
         emitter: bus,
-        level: LEVELS.VERBOSE + 1
-      })).toThrowErrorOfType(TYPE_ERROR);
+            level: LEVELS.VERBOSE + 1,
+          })
+      ).toThrowErrorOfType(TYPE_ERROR);
     });
   });

   describe('events', () => {
     it('passes all event arguments through', () => {
-      const args = [
-        'My name is %s %s %s',
-        'foo',
-        'bar',
-        'baz'
-      ];
+      const args = ['My name is %s %s %s', 'foo', 'bar', 'baz'];
       const eventType = 'logger:info';

       logger.level = LEVELS.VERBOSE;
@@ -9,5 +9,5 @@ const name = require('./lib/name');

module.exports = {
config,
name
name,
};

@@ -11,11 +11,11 @@ const stripJsonComments = require('strip-json-comments');

const MODULE_NAME = 'jsdoc';

const defaults = exports.defaults = {
const defaults = (exports.defaults = {
// TODO(hegemonic): Integrate CLI options with other options.
opts: {
destination: './out',
encoding: 'utf8'
encoding: 'utf8',
},
/**
* The JSDoc plugins to load.
@@ -45,7 +45,7 @@ const defaults = exports.defaults = {
* The type of source file. In general, you should use the value `module`. If none of your
* source files use ECMAScript >=2015 syntax, you can use the value `script`.
*/
type: 'module'
type: 'module',
},
/**
* Settings for interpreting JSDoc tags.
@@ -62,10 +62,7 @@ const defaults = exports.defaults = {
* If you specify two or more tag dictionaries, and a tag is defined in multiple
* dictionaries, JSDoc uses the definition from the first dictionary that includes that tag.
*/
dictionaries: [
'jsdoc',
'closure'
]
dictionaries: ['jsdoc', 'closure'],
},
/**
* Settings for generating output with JSDoc templates. Some JSDoc templates might ignore these
@@ -80,9 +77,9 @@ const defaults = exports.defaults = {
/**
* Set to `true` to use a monospaced font for all links.
*/
monospaceLinks: false
}
};
monospaceLinks: false,
},
});

// TODO: Consider exporting this class.
class Config {
@@ -106,7 +103,7 @@ const explorerSync = cosmiconfigSync(MODULE_NAME, {
'.json': loadJson,
'.yaml': loadYaml,
'.yml': loadYaml,
noExt: loadYaml
noExt: loadYaml,
},
searchPlaces: [
'package.json',
@@ -115,8 +112,8 @@ const explorerSync = cosmiconfigSync(MODULE_NAME, {
`.${MODULE_NAME}rc.yaml`,
`.${MODULE_NAME}rc.yml`,
`.${MODULE_NAME}rc.js`,
`${MODULE_NAME}.config.js`
]
`${MODULE_NAME}.config.js`,
],
});

exports.loadSync = (filepath) => {
@@ -128,8 +125,5 @@ exports.loadSync = (filepath) => {
loaded = explorerSync.search() || {};
}

return new Config(
loaded.filepath,
_.defaultsDeep({}, loaded.config, defaults)
);
return new Config(loaded.filepath, _.defaultsDeep({}, loaded.config, defaults));
};

@@ -19,7 +19,7 @@ exports.LONGNAMES = {
/** Longname used for doclets that do not have a longname, such as anonymous functions. */
ANONYMOUS: '<anonymous>',
/** Longname that represents global scope. */
GLOBAL: '<global>'
GLOBAL: '<global>',
};

// Module namespace prefix.
@@ -32,26 +32,26 @@ exports.MODULE_NAMESPACE = 'module:';
* @static
* @memberof module:jsdoc/name
*/
const SCOPE = exports.SCOPE = {
const SCOPE = (exports.SCOPE = {
NAMES: {
GLOBAL: 'global',
INNER: 'inner',
INSTANCE: 'instance',
STATIC: 'static'
STATIC: 'static',
},
PUNC: {
INNER: '~',
INSTANCE: '#',
STATIC: '.'
}
};
STATIC: '.',
},
});

// Keys must be lowercase.
const SCOPE_TO_PUNC = exports.SCOPE_TO_PUNC = {
const SCOPE_TO_PUNC = (exports.SCOPE_TO_PUNC = {
inner: SCOPE.PUNC.INNER,
instance: SCOPE.PUNC.INSTANCE,
static: SCOPE.PUNC.STATIC
};
static: SCOPE.PUNC.STATIC,
});

exports.PUNC_TO_SCOPE = _.invert(SCOPE_TO_PUNC);

@@ -91,14 +91,14 @@ exports.nameIsLongname = (name, memberof) => {
* @param {string} name - The name in which to change `prototype` to `#`.
* @returns {string} The updated name.
*/
const prototypeToPunc = exports.prototypeToPunc = name => {
const prototypeToPunc = (exports.prototypeToPunc = (name) => {
// Don't mangle symbols named `prototype`.
if (name === 'prototype') {
return name;
}

return name.replace(/(?:^|\.)prototype\.?/g, SCOPE.PUNC.INSTANCE);
};
});

/**
* Check whether a name begins with a character that identifies a scope.
@@ -106,7 +106,7 @@ const prototypeToPunc = exports.prototypeToPunc = name => {
* @param {string} name - The name to check.
* @returns {boolean} `true` if the name begins with a scope character; otherwise, `false`.
*/
exports.hasLeadingScope = name => REGEXP_LEADING_SCOPE.test(name);
exports.hasLeadingScope = (name) => REGEXP_LEADING_SCOPE.test(name);

/**
* Check whether a name ends with a character that identifies a scope.
@@ -114,7 +114,7 @@ exports.hasLeadingScope = name => REGEXP_LEADING_SCOPE.test(name);
* @param {string} name - The name to check.
* @returns {boolean} `true` if the name ends with a scope character; otherwise, `false`.
*/
exports.hasTrailingScope = name => REGEXP_TRAILING_SCOPE.test(name);
exports.hasTrailingScope = (name) => REGEXP_TRAILING_SCOPE.test(name);

/**
* Get a symbol's basename, which is the first part of its full name before any punctuation (other
@@ -128,7 +128,7 @@ exports.hasTrailingScope = name => REGEXP_TRAILING_SCOPE.test(name);
* @param {?string} [name] - The symbol's full name.
* @returns {?string} The symbol's basename.
*/
exports.getBasename = name => {
exports.getBasename = (name) => {
if (!name) {
return null;
}
@@ -137,7 +137,7 @@ exports.getBasename = name => {
};

// TODO: docs
exports.stripNamespace = longname => longname.replace(/^[a-zA-Z]+:/, '');
exports.stripNamespace = (longname) => longname.replace(/^[a-zA-Z]+:/, '');

// TODO: docs
function slice(longname, sliceChars, forcedMemberof) {
@@ -160,11 +160,10 @@ function slice(longname, sliceChars, forcedMemberof) {
let punc = '';

// Is there a leading bracket?
if ( /^\[/.test(p2) ) {
if (/^\[/.test(p2)) {
// Is it a static or instance member?
punc = p1 ? SCOPE.PUNC.INSTANCE : SCOPE.PUNC.STATIC;
p2 = p2.replace(/^\[/g, '')
.replace(/\]$/g, '');
p2 = p2.replace(/^\[/g, '').replace(/\]$/g, '');
}

token = `@{${tokens.length}}@`;
@@ -186,8 +185,7 @@ function slice(longname, sliceChars, forcedMemberof) {
if (parts[2]) {
scopePunc = parts[2];
}
}
else if (longname) {
} else if (longname) {
parts = longname.match(new RegExp(`^(:?(.+)([${sliceChars.join()}]))?(.+?)$`)) || [];
name = parts.pop() || '';
scopePunc = parts.pop() || '';
@@ -214,7 +212,7 @@ function slice(longname, sliceChars, forcedMemberof) {
memberof: memberof,
scope: scopePunc,
name: name,
variation: variation
variation: variation,
};
}

@@ -231,9 +229,7 @@ function slice(longname, sliceChars, forcedMemberof) {
* @param {string} forcedMemberof
* @returns {object} Representing the properties of the given name.
*/
exports.toParts = (longname, forcedMemberof) => slice(
longname, null, forcedMemberof
);
exports.toParts = (longname, forcedMemberof) => slice(longname, null, forcedMemberof);

// TODO: docs
/**
@@ -286,15 +282,11 @@ exports.hasAncestor = (parent, child) => {
};

// TODO: docs
const fromParts = exports.fromParts = ({memberof, scope, name, variation}) => [
(memberof || ''),
(scope || ''),
(name || ''),
(variation ? `(${variation})` : '')
].join('');
const fromParts = (exports.fromParts = ({ memberof, scope, name, variation }) =>
[memberof || '', scope || '', name || '', variation ? `(${variation})` : ''].join(''));

// TODO: docs
exports.stripVariation = name => {
exports.stripVariation = (name) => {
const parts = slice(name);

parts.variation = '';
@@ -310,7 +302,7 @@ function splitLongname(longname, options) {
const splitters = SCOPE_PUNC.concat('/');

options = _.defaults(options || {}, {
includeVariation: true
includeVariation: true,
});

do {
@@ -324,7 +316,7 @@ function splitLongname(longname, options) {

return {
chunks: chunks.reverse(),
nameInfo: nameInfo
nameInfo: nameInfo,
};
}

@@ -414,7 +406,7 @@ exports.longnamesToTree = (longnames, doclets) => {
const splitOptions = { includeVariation: false };
const tree = {};

longnames.forEach(longname => {
longnames.forEach((longname) => {
let currentLongname = '';
let currentParent = tree;
let nameInfo;
@@ -428,7 +420,7 @@ exports.longnamesToTree = (longnames, doclets) => {
processed = splitLongname(longname, splitOptions);
nameInfo = processed.nameInfo;

processed.chunks.forEach(chunk => {
processed.chunks.forEach((chunk) => {
currentLongname += chunk;

if (currentParent !== tree) {
@@ -493,17 +485,16 @@ function splitNameMatchingBrackets(nameDesc) {

return {
name: buffer.join(''),
description: RegExp.$1
description: RegExp.$1,
};
}


/**
* Split a string that starts with a name and ends with a description into separate parts.
* @param {string} str - The string that contains the name and description.
* @returns {object} An object with `name` and `description` properties.
*/
exports.splitNameAndDescription = str => {
exports.splitNameAndDescription = (str) => {
// Like: `name`, `[name]`, `name text`, `[name] text`, `name - text`, or `[name] - text`.
// To ensure that we don't get confused by leading dashes in Markdown list items, the hyphen
// must be on the same line as the name.
@@ -522,6 +513,6 @@ exports.splitNameAndDescription = str => {

return {
name: RegExp.$1,
description: RegExp.$2
description: RegExp.$2,
};
};

273
packages/jsdoc-core/package-lock.json
generated
@@ -1,8 +1,279 @@
{
"name": "@jsdoc/core",
"version": "0.4.0",
"lockfileVersion": 1,
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "@jsdoc/core",
"version": "0.4.0",
"license": "Apache-2.0",
"dependencies": {
"cosmiconfig": "^7.0.1",
"escape-string-regexp": "^4.0.0",
"lodash": "^4.17.21",
"strip-bom": "^4.0.0",
"strip-json-comments": "^3.1.1"
},
"engines": {
"node": ">=v14.17.6"
}
},
"node_modules/@babel/code-frame": {
"version": "7.14.5",
"resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.14.5.tgz",
"integrity": "sha512-9pzDqyc6OLDaqe+zbACgFkb6fKMNG6CObKpnYXChRsvYGyEdc7CA2BaqeOM+vOtCS5ndmJicPJhKAwYRI6UfFw==",
"dependencies": {
"@babel/highlight": "^7.14.5"
},
"engines": {
"node": ">=6.9.0"
}
},
"node_modules/@babel/helper-validator-identifier": {
"version": "7.14.9",
"resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.14.9.tgz",
"integrity": "sha512-pQYxPY0UP6IHISRitNe8bsijHex4TWZXi2HwKVsjPiltzlhse2znVcm9Ace510VT1kxIHjGJCZZQBX2gJDbo0g==",
"engines": {
"node": ">=6.9.0"
}
},
"node_modules/@babel/highlight": {
"version": "7.14.5",
"resolved": "https://registry.npmjs.org/@babel/highlight/-/highlight-7.14.5.tgz",
"integrity": "sha512-qf9u2WFWVV0MppaL877j2dBtQIDgmidgjGk5VIMw3OadXvYaXn66U1BFlH2t4+t3i+8PhedppRv+i40ABzd+gg==",
"dependencies": {
"@babel/helper-validator-identifier": "^7.14.5",
"chalk": "^2.0.0",
"js-tokens": "^4.0.0"
},
"engines": {
"node": ">=6.9.0"
}
},
"node_modules/@types/parse-json": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/@types/parse-json/-/parse-json-4.0.0.tgz",
"integrity": "sha512-//oorEZjL6sbPcKUaCdIGlIUeH26mgzimjBB77G6XRgnDl/L5wOnpyBGRe/Mmf5CVW3PwEBE1NjiMZ/ssFh4wA=="
},
"node_modules/ansi-styles": {
"version": "3.2.1",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-3.2.1.tgz",
"integrity": "sha512-VT0ZI6kZRdTh8YyJw3SMbYm/u+NqfsAxEpWO0Pf9sq8/e94WxxOpPKx9FR1FlyCtOVDNOQ+8ntlqFxiRc+r5qA==",
"dependencies": {
"color-convert": "^1.9.0"
},
"engines": {
"node": ">=4"
}
},
"node_modules/callsites": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
"integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==",
"engines": {
"node": ">=6"
}
},
"node_modules/chalk": {
"version": "2.4.2",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-2.4.2.tgz",
"integrity": "sha512-Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ==",
"dependencies": {
"ansi-styles": "^3.2.1",
"escape-string-regexp": "^1.0.5",
"supports-color": "^5.3.0"
},
"engines": {
"node": ">=4"
}
},
"node_modules/chalk/node_modules/escape-string-regexp": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-1.0.5.tgz",
"integrity": "sha1-G2HAViGQqN/2rjuyzwIAyhMLhtQ=",
"engines": {
"node": ">=0.8.0"
}
},
"node_modules/color-convert": {
"version": "1.9.3",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-1.9.3.tgz",
"integrity": "sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==",
"dependencies": {
"color-name": "1.1.3"
}
},
"node_modules/color-name": {
"version": "1.1.3",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.3.tgz",
"integrity": "sha1-p9BVi9icQveV3UIyj3QIMcpTvCU="
},
"node_modules/cosmiconfig": {
"version": "7.0.1",
"resolved": "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-7.0.1.tgz",
"integrity": "sha512-a1YWNUV2HwGimB7dU2s1wUMurNKjpx60HxBB6xUM8Re+2s1g1IIfJvFR0/iCF+XHdE0GMTKTuLR32UQff4TEyQ==",
"dependencies": {
"@types/parse-json": "^4.0.0",
"import-fresh": "^3.2.1",
"parse-json": "^5.0.0",
"path-type": "^4.0.0",
"yaml": "^1.10.0"
},
"engines": {
"node": ">=10"
}
},
"node_modules/error-ex": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz",
"integrity": "sha512-7dFHNmqeFSEt2ZBsCriorKnn3Z2pj+fd9kmI6QoWw4//DL+icEBfc0U7qJCisqrTsKTjw4fNFy2pW9OqStD84g==",
"dependencies": {
"is-arrayish": "^0.2.1"
}
},
"node_modules/escape-string-regexp": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz",
"integrity": "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==",
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/has-flag": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz",
"integrity": "sha1-tdRU3CGZriJWmfNGfloH87lVuv0=",
"engines": {
"node": ">=4"
}
},
"node_modules/import-fresh": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.0.tgz",
"integrity": "sha512-veYYhQa+D1QBKznvhUHxb8faxlrwUnxseDAbAp457E0wLNio2bOSKnjYDhMj+YiAq61xrMGhQk9iXVk5FzgQMw==",
"dependencies": {
"parent-module": "^1.0.0",
"resolve-from": "^4.0.0"
},
"engines": {
"node": ">=6"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-arrayish": {
"version": "0.2.1",
"resolved": "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.2.1.tgz",
"integrity": "sha1-d8mYQFJ6qOyxqLppe4BkWnqSap0="
},
"node_modules/js-tokens": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
"integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ=="
},
"node_modules/json-parse-even-better-errors": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz",
"integrity": "sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w=="
},
"node_modules/lines-and-columns": {
"version": "1.1.6",
"resolved": "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.1.6.tgz",
"integrity": "sha1-HADHQ7QzzQpOgHWPe2SldEDZ/wA="
},
"node_modules/lodash": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
},
"node_modules/parent-module": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz",
"integrity": "sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==",
"dependencies": {
"callsites": "^3.0.0"
},
"engines": {
"node": ">=6"
}
},
"node_modules/parse-json": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/parse-json/-/parse-json-5.2.0.tgz",
"integrity": "sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg==",
"dependencies": {
"@babel/code-frame": "^7.0.0",
"error-ex": "^1.3.1",
"json-parse-even-better-errors": "^2.3.0",
"lines-and-columns": "^1.1.6"
},
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/path-type": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/path-type/-/path-type-4.0.0.tgz",
"integrity": "sha512-gDKb8aZMDeD/tZWs9P6+q0J9Mwkdl6xMV8TjnGP3qJVJ06bdMgkbBlLU8IdfOsIsFz2BW1rNVT3XuNEl8zPAvw==",
"engines": {
"node": ">=8"
}
},
"node_modules/resolve-from": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz",
"integrity": "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==",
"engines": {
"node": ">=4"
}
},
"node_modules/strip-bom": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/strip-bom/-/strip-bom-4.0.0.tgz",
"integrity": "sha512-3xurFv5tEgii33Zi8Jtp55wEIILR9eh34FAW00PZf+JnSsTmV/ioewSgQl97JHvgjoRGwPShsWm+IdrxB35d0w==",
"engines": {
"node": ">=8"
}
},
"node_modules/strip-json-comments": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz",
"integrity": "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==",
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/supports-color": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz",
"integrity": "sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow==",
"dependencies": {
"has-flag": "^3.0.0"
},
"engines": {
"node": ">=4"
}
},
"node_modules/yaml": {
"version": "1.10.2",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz",
"integrity": "sha512-r3vXyErRCYJ7wg28yvBY5VSoAF8ZvlcW9/BwUzEtUsjvX/DKs24dIkuwjtuprwJJHsbyUbLApepYTR1BN4uHrg==",
"engines": {
"node": ">= 6"
}
}
},
"dependencies": {
"@babel/code-frame": {
"version": "7.14.5",

@@ -19,7 +19,7 @@ describe('@jsdoc/core/lib/config', () => {

it('returns an object with `config` and `filepath` properties', () => {
mockFs({
'conf.json': '{}'
'conf.json': '{}',
});

const conf = config.loadSync('conf.json');
@@ -30,7 +30,7 @@ describe('@jsdoc/core/lib/config', () => {

it('loads settings from the specified filepath if there is one', () => {
mockFs({
'conf.json': '{"foo":"bar"}'
'conf.json': '{"foo":"bar"}',
});

const conf = config.loadSync('conf.json');
@@ -40,7 +40,7 @@ describe('@jsdoc/core/lib/config', () => {

it('finds the config file when no filepath is specified', () => {
mockFs({
'package.json': '{"jsdoc":{"foo":"bar"}}'
'package.json': '{"jsdoc":{"foo":"bar"}}',
});

const conf = config.loadSync();
@@ -50,7 +50,7 @@ describe('@jsdoc/core/lib/config', () => {

it('parses JSON config files that have an extension and contain comments', () => {
mockFs({
'.jsdocrc.json': '// comment\n{"foo":"bar"}'
'.jsdocrc.json': '// comment\n{"foo":"bar"}',
});

const conf = config.loadSync();
@@ -60,7 +60,7 @@ describe('@jsdoc/core/lib/config', () => {

it('parses JSON files that start with a BOM', () => {
mockFs({
'.jsdocrc.json': '\uFEFF{"foo":"bar"}'
'.jsdocrc.json': '\uFEFF{"foo":"bar"}',
});

const conf = config.loadSync();
@@ -70,7 +70,7 @@ describe('@jsdoc/core/lib/config', () => {

it('parses YAML files that start with a BOM', () => {
mockFs({
'.jsdocrc.yaml': '\uFEFF{"foo":"bar"}'
'.jsdocrc.yaml': '\uFEFF{"foo":"bar"}',
});

const conf = config.loadSync();
@@ -80,7 +80,7 @@ describe('@jsdoc/core/lib/config', () => {

it('provides the default config if the user config is an empty object', () => {
mockFs({
'.jsdocrc.json': '{}'
'.jsdocrc.json': '{}',
});

const conf = config.loadSync();
@@ -96,7 +96,7 @@ describe('@jsdoc/core/lib/config', () => {

it('merges nested defaults with nested user settings as expected', () => {
mockFs({
'.jsdocrc.json': '{"tags":{"foo":"bar"}}'
'.jsdocrc.json': '{"tags":{"foo":"bar"}}',
});

const conf = config.loadSync();

@@ -88,13 +88,13 @@ describe('@jsdoc/core.name', () => {

// TODO(hegemonic): This has never worked
xit('handles longnames with quoted portions', () => {
expect(name.applyNamespace('foo."*don\'t.look~in#here!"', 'event'))
.toBe('foo.event:"*don\'t.look~in#here!"');
expect(name.applyNamespace('foo."*don\'t.look~in#here!"', 'event')).toBe(
'foo.event:"*don\'t.look~in#here!"'
);
});

it('handles longnames that already have namespaces', () => {
expect(name.applyNamespace('lib.Panel#event:open', 'event'))
.toBe('lib.Panel#event:open');
expect(name.applyNamespace('lib.Panel#event:open', 'event')).toBe('lib.Panel#event:open');
});
});

@@ -190,8 +190,7 @@ describe('@jsdoc/core.name', () => {

describe('PUNC_TO_SCOPE', () => {
it('has the same number of properties as SCOPE_TO_PUNC', () => {
expect(Object.keys(name.PUNC_TO_SCOPE).length)
.toBe(Object.keys(name.SCOPE_TO_PUNC).length);
expect(Object.keys(name.PUNC_TO_SCOPE).length).toBe(Object.keys(name.SCOPE_TO_PUNC).length);
});
});

@@ -265,18 +264,14 @@ describe('@jsdoc/core.name', () => {
});

it('strips a separator when it starts on the same line as the name', () => {
const parts = name.splitNameAndDescription(
'socket - The networking kind, not the wrench.'
);
const parts = name.splitNameAndDescription('socket - The networking kind, not the wrench.');

expect(parts.name).toBe('socket');
expect(parts.description).toBe('The networking kind, not the wrench.');
});

it('does not strip a separator that is preceded by a line break', () => {
const parts = name.splitNameAndDescription(
'socket\n - The networking kind, not the wrench.'
);
const parts = name.splitNameAndDescription('socket\n - The networking kind, not the wrench.');

expect(parts.name).toBe('socket');
expect(parts.description).toBe('- The networking kind, not the wrench.');

@@ -2,12 +2,12 @@ module.exports = {
env: {
es6: true,
jasmine: true,
node: true
node: true,
},

parserOptions: {
ecmaVersion: 2018,
sourceType: 'module'
sourceType: 'module',
},

rules: {
@@ -167,8 +167,8 @@ module.exports = {
'error',
{
before: false,
after: true
}
after: true,
},
],
'comma-style': ['error', 'last'],
'computed-property-spacing': ['error', 'never'],
@@ -186,25 +186,25 @@ module.exports = {
'implicit-arrow-linebreak': 'off',
indent: [
'error',
4,
2,
{
SwitchCase: 1
}
SwitchCase: 1,
},
],
'jsx-quotes': ['error', 'prefer-double'],
'key-spacing': [
'error',
{
beforeColon: false,
afterColon: true
}
afterColon: true,
},
],
'keyword-spacing': [
'error',
{
before: true,
after: true
}
after: true,
},
],
'line-comment-position': 'off',
'linebreak-style': 'off',
@@ -234,8 +234,8 @@ module.exports = {
'no-multiple-empty-lines': [
'error',
{
max: 2
}
max: 2,
},
],
'no-negated-condition': 'off',
'no-nested-ternary': 'error',
@@ -262,18 +262,18 @@ module.exports = {
{
blankLine: 'always',
prev: '*',
next: 'return'
next: 'return',
},
{
blankLine: 'always',
prev: ['const', 'let', 'var'],
next: '*'
next: '*',
},
{
blankLine: 'any',
prev: ['const', 'let', 'var'],
next: ['const', 'let', 'var']
}
next: ['const', 'let', 'var'],
},
],
'prefer-exponentiation-operator': 'error',
'prefer-object-spread': 'off',
@@ -285,11 +285,14 @@ module.exports = {
'sort-keys': 'off',
'sort-vars': 'off', // TODO: enable?
'space-before-blocks': ['error', 'always'],
'space-before-function-paren': ['error', {
'space-before-function-paren': [
'error',
{
anonymous: 'never',
named: 'never',
asyncArrow: 'always'
}],
asyncArrow: 'always',
},
],
'space-in-parens': 'off', // TODO: enable?
'space-infix-ops': 'error',
'space-unary-ops': 'error',
@@ -298,8 +301,8 @@ module.exports = {
'error',
{
after: true,
before: false
}
before: false,
},
],
'template-tag-spacing': ['error', 'never'],
'unicode-bom': ['error', 'never'],
@@ -312,16 +315,16 @@ module.exports = {
'error',
{
before: true,
after: true
}
after: true,
},
],
'constructor-super': 'error',
'generator-star-spacing': [
'error',
{
before: true,
after: false
}
after: false,
},
],
'no-class-assign': 'error',
'no-confusing-arrow': 'error',
@@ -330,8 +333,8 @@ module.exports = {
'no-duplicate-imports': [
'error',
{
includeExports: true
}
includeExports: true,
},
],
'no-new-symbol': 'error',
'no-restricted-exports': 'off',
@@ -354,6 +357,6 @@ module.exports = {
'sort-imports': 'error',
'symbol-description': 'error',
'template-curly-spacing': ['error', 'never'],
'yield-star-spacing': ['error', 'before']
}
'yield-star-spacing': ['error', 'before'],
},
};

@ -5,5 +5,5 @@ const { Syntax } = require('./lib/syntax');
|
||||
module.exports = {
|
||||
AstBuilder,
|
||||
astNode,
|
||||
Syntax
|
||||
Syntax,
|
||||
};
|
||||
|
||||
@@ -3,7 +3,7 @@ const babelParser = require('@babel/parser');
const { log } = require('@jsdoc/util');

// Exported so we can use them in tests.
const parserOptions = exports.parserOptions = {
const parserOptions = (exports.parserOptions = {
allowAwaitOutsideFunction: true,
allowImportExportEverywhere: true,
allowReturnOutsideFunction: true,
@@ -15,9 +15,12 @@ const parserOptions = exports.parserOptions = {
'classPrivateMethods',
'classPrivateProperties',
'classProperties',
['decorators', {
decoratorsBeforeExport: true
}],
[
'decorators',
{
decoratorsBeforeExport: true,
},
],
'doExpressions',
'dynamicImport',
'estree',
@@ -33,22 +36,24 @@ const parserOptions = exports.parserOptions = {
'objectRestSpread',
'optionalCatchBinding',
'optionalChaining',
['pipelineOperator', {
proposal: 'minimal'
}],
'throwExpressions'
[
'pipelineOperator',
{
proposal: 'minimal',
},
],
ranges: true
};
'throwExpressions',
],
ranges: true,
});

function parse(source, filename, sourceType) {
let ast;
const options = _.defaults({}, parserOptions, {sourceType});
const options = _.defaults({}, parserOptions, { sourceType });

try {
ast = babelParser.parse(source, options);
}
catch (e) {
} catch (e) {
log.error(`Unable to parse ${filename}: ${e.message}`);
}
||||
@@ -15,7 +15,7 @@ let uid = 100000000;
* @param {(Object|string)} node - The AST node to check, or the `type` property of a node.
* @return {boolean} Set to `true` if the node is a function or `false` in all other cases.
*/
const isFunction = exports.isFunction = node => {
const isFunction = (exports.isFunction = (node) => {
let type;

if (!node) {
@@ -24,14 +24,17 @@ const isFunction = exports.isFunction = node => {

if (typeof node === 'string') {
type = node;
}
else {
} else {
type = node.type;
}

return type === Syntax.FunctionDeclaration || type === Syntax.FunctionExpression ||
type === Syntax.MethodDefinition || type === Syntax.ArrowFunctionExpression;
};
return (
type === Syntax.FunctionDeclaration ||
type === Syntax.FunctionExpression ||
type === Syntax.MethodDefinition ||
type === Syntax.ArrowFunctionExpression
);
});

/**
* Check whether an AST node creates a new scope.
@@ -40,12 +43,18 @@ const isFunction = exports.isFunction = node => {
* @param {Object} node - The AST node to check.
* @return {Boolean} Set to `true` if the node creates a new scope, or `false` in all other cases.
*/
exports.isScope = node => // TODO: handle blocks with "let" declarations
Boolean(node) && typeof node === 'object' && (node.type === Syntax.CatchClause ||
node.type === Syntax.ClassDeclaration || node.type === Syntax.ClassExpression || isFunction(node));
exports.isScope = (
node // TODO: handle blocks with "let" declarations
) =>
Boolean(node) &&
typeof node === 'object' &&
(node.type === Syntax.CatchClause ||
node.type === Syntax.ClassDeclaration ||
node.type === Syntax.ClassExpression ||
isFunction(node));

// TODO: docs
exports.addNodeProperties = node => {
exports.addNodeProperties = (node) => {
const newProperties = {};

if (!node || typeof node !== 'object') {
@@ -55,7 +64,7 @@ exports.addNodeProperties = node => {
if (!node.nodeId) {
newProperties.nodeId = {
value: `astnode${uid++}`,
enumerable: true
enumerable: true,
};
}

@@ -63,7 +72,7 @@ exports.addNodeProperties = node => {
newProperties.parent = {
// `null` means 'no parent', so use `undefined` for now
value: undefined,
writable: true
writable: true,
};
}

@@ -71,7 +80,7 @@ exports.addNodeProperties = node => {
newProperties.enclosingScope = {
// `null` means 'no enclosing scope', so use `undefined` for now
value: undefined,
writable: true
writable: true,
};
}

@@ -80,7 +89,7 @@ exports.addNodeProperties = node => {
enumerable: true,
get() {
return this.parent ? this.parent.nodeId : null;
}
},
};
}

@@ -89,7 +98,7 @@ exports.addNodeProperties = node => {
enumerable: true,
get() {
return this.enclosingScope ? this.enclosingScope.nodeId : null;
}
},
};
}

@@ -99,7 +108,7 @@ exports.addNodeProperties = node => {
};

// TODO: docs
const nodeToValue = exports.nodeToValue = node => {
const nodeToValue = (exports.nodeToValue = (node) => {
let key;
let parent;
let str;
@@ -113,8 +122,7 @@ const nodeToValue = exports.nodeToValue = node => {
// JSON.stringify([,]).
if (!el) {
tempObject[i] = null;
}
else {
} else {
tempObject[i] = nodeToValue(el);
}
});
@@ -164,8 +172,7 @@ const nodeToValue = exports.nodeToValue = node => {
// we need a single value, so we use the first variable name
if (node.declaration.declarations) {
str = `exports.${nodeToValue(node.declaration.declarations[0])}`;
}
else {
} else {
str = `exports.${nodeToValue(node.declaration)}`;
}
}
@@ -202,8 +209,7 @@ const nodeToValue = exports.nodeToValue = node => {
str = nodeToValue(node.object);
if (node.computed) {
str += `[${node.property.raw}]`;
}
else {
} else {
str += `.${nodeToValue(node.property)}`;
}
break;
@@ -218,14 +224,20 @@ const nodeToValue = exports.nodeToValue = node => {
str = nodeToValue(parent.parent) || '';
}
// for the constructor of a module's default export, use a special name
else if (node.kind === 'constructor' && parent.parent &&
parent.parent.type === Syntax.ExportDefaultDeclaration) {
else if (
node.kind === 'constructor' &&
parent.parent &&
parent.parent.type === Syntax.ExportDefaultDeclaration
) {
str = 'module.exports';
}
// for the constructor of a module's named export, use the name of the export
// declaration
else if (node.kind === 'constructor' && parent.parent &&
parent.parent.type === Syntax.ExportNamedDeclaration) {
else if (
node.kind === 'constructor' &&
parent.parent &&
parent.parent.type === Syntax.ExportNamedDeclaration
) {
str = nodeToValue(parent.parent);
}
// for other constructors, use the name of the parent class
@@ -252,7 +264,7 @@ const nodeToValue = exports.nodeToValue = node => {

case Syntax.ObjectExpression:
tempObject = {};
node.properties.forEach(prop => {
node.properties.forEach((prop) => {
// ExperimentalSpreadProperty have no key
// like var hello = {...hi};
if (!prop.key) {
@@ -264,8 +276,7 @@ const nodeToValue = exports.nodeToValue = node => {
// preserve literal values so that the JSON form shows the correct type
if (prop.value.type === Syntax.Literal) {
tempObject[key] = prop.value.value;
}
else {
} else {
tempObject[key] = nodeToValue(prop);
}
});
@@ -288,8 +299,7 @@ const nodeToValue = exports.nodeToValue = node => {

if (node.prefix === true) {
str = cast(node.operator + str);
}
else {
} else {
// this shouldn't happen
throw new Error(`Found a UnaryExpression with a postfix operator: ${node}`);
}
@@ -304,13 +314,13 @@ const nodeToValue = exports.nodeToValue = node => {
}

return str;
};
});

// backwards compatibility
exports.nodeToString = nodeToValue;

// TODO: docs
const getParamNames = exports.getParamNames = node => {
const getParamNames = (exports.getParamNames = (node) => {
let params;

if (!node || !node.params) {
@@ -319,23 +329,27 @@ const getParamNames = exports.getParamNames = node => {

params = node.params.slice(0);

return params.map(param => nodeToValue(param));
};
return params.map((param) => nodeToValue(param));
});

// TODO: docs
const isAccessor = exports.isAccessor = node => Boolean(node) && typeof node === 'object' &&
const isAccessor = (exports.isAccessor = (node) =>
Boolean(node) &&
typeof node === 'object' &&
(node.type === Syntax.Property || node.type === Syntax.MethodDefinition) &&
(node.kind === 'get' || node.kind === 'set');
(node.kind === 'get' || node.kind === 'set'));

// TODO: docs
exports.isAssignment = node => Boolean(node) && typeof node === 'object' &&
exports.isAssignment = (node) =>
Boolean(node) &&
typeof node === 'object' &&
(node.type === Syntax.AssignmentExpression || node.type === Syntax.VariableDeclarator);

// TODO: docs
/**
* Retrieve information about the node, including its name and type.
*/
exports.getInfo = node => {
exports.getInfo = (node) => {
const info = {};

switch (node.type) {
@@ -374,14 +388,13 @@ exports.getInfo = node => {
// if this class is the default export, we need to use a special name
if (node.parent && node.parent.type === Syntax.ExportDefaultDeclaration) {
info.name = 'module.exports';
}
else {
} else {
info.name = node.id ? nodeToValue(node.id) : '';
}
info.type = info.node.type;
info.paramnames = [];

node.body.body.some(({kind, value}) => {
node.body.body.some(({ kind, value }) => {
if (kind === 'constructor') {
info.paramnames = getParamNames(value);

@@ -420,7 +433,7 @@ exports.getInfo = node => {
info.name = nodeToValue(node);
info.type = info.node.type;

if ( isFunction(info.node) ) {
if (isFunction(info.node)) {
info.paramnames = getParamNames(info.node);
}

@@ -431,11 +444,10 @@ exports.getInfo = node => {
case Syntax.ExportNamedDeclaration:
info.node = node;
info.name = nodeToValue(info.node);
info.type = info.node.declaration ? info.node.declaration.type :
Syntax.ObjectExpression;
info.type = info.node.declaration ? info.node.declaration.type : Syntax.ObjectExpression;

if (info.node.declaration) {
if ( isFunction(info.node.declaration) ) {
if (isFunction(info.node.declaration)) {
info.paramnames = getParamNames(info.node.declaration);
}

@@ -455,7 +467,7 @@ exports.getInfo = node => {
info.name = nodeToValue(info.node);
info.type = info.node.local.type;

if ( isFunction(info.node.local) ) {
if (isFunction(info.node.local)) {
info.paramnames = getParamNames(info.node.local);
}

@@ -508,15 +520,14 @@ exports.getInfo = node => {
info.value = nodeToValue(info.node);

// property names with unsafe characters must be quoted
if ( !/^[$_a-zA-Z0-9]*$/.test(info.name) ) {
if (!/^[$_a-zA-Z0-9]*$/.test(info.name)) {
info.name = `"${String(info.name).replace(/"/g, '\\"')}"`;
}

if ( isAccessor(node) ) {
if (isAccessor(node)) {
info.type = nodeToValue(info.node);
info.paramnames = getParamNames(info.node);
}
else {
} else {
info.type = info.node.type;
}

@@ -92,5 +92,5 @@ exports.Syntax = {
VariableDeclarator: 'VariableDeclarator',
WhileStatement: 'WhileStatement',
WithStatement: 'WithStatement',
YieldExpression: 'YieldExpression'
YieldExpression: 'YieldExpression',
};

@@ -24,7 +24,7 @@ describe('@jsdoc/parse/lib/ast-node', () => {
const literal = parse('1;').expression;
const memberExpression = parse('foo.bar;').expression;
const memberExpressionComputed1 = parse('foo["bar"];').expression;
const memberExpressionComputed2 = parse('foo[\'bar\'];').expression;
const memberExpressionComputed2 = parse("foo['bar'];").expression;
const methodDefinition1 = parse('class Foo { bar() {} }').body.body[0];
const methodDefinition2 = parse('var foo = () => class { bar() {} };').declarations[0].init.body
.body[0];
@@ -79,15 +79,15 @@ describe('@jsdoc/parse/lib/ast-node', () => {

describe('addNodeProperties', () => {
it('should return null for undefined input', () => {
expect( astNode.addNodeProperties() ).toBe(null);
expect(astNode.addNodeProperties()).toBe(null);
});

it('should return null if the input is not an object', () => {
expect( astNode.addNodeProperties('foo') ).toBe(null);
expect(astNode.addNodeProperties('foo')).toBe(null);
});

it('should preserve existing properties that are not "node properties"', () => {
const node = astNode.addNodeProperties({foo: 1});
const node = astNode.addNodeProperties({ foo: 1 });

expect(node).toBeObject();
expect(node.foo).toBe(1);
@@ -104,7 +104,7 @@ describe('@jsdoc/parse/lib/ast-node', () => {

it('should not overwrite an existing nodeId', () => {
const nodeId = 'foo';
const node = astNode.addNodeProperties({nodeId: nodeId});
const node = astNode.addNodeProperties({ nodeId: nodeId });

expect(node.nodeId).toBe(nodeId);
});
@@ -121,13 +121,13 @@ describe('@jsdoc/parse/lib/ast-node', () => {

it('should not overwrite an existing parent', () => {
const parent = {};
const node = astNode.addNodeProperties({parent: parent});
const node = astNode.addNodeProperties({ parent: parent });

expect(node.parent).toBe(parent);
});

it('should not overwrite a null parent', () => {
const node = astNode.addNodeProperties({parent: null});
const node = astNode.addNodeProperties({ parent: null });

expect(node.parent).toBeNull();
});
@@ -167,13 +167,13 @@ describe('@jsdoc/parse/lib/ast-node', () => {

it('should not overwrite an existing enclosingScope', () => {
const enclosingScope = {};
const node = astNode.addNodeProperties({enclosingScope: enclosingScope});
const node = astNode.addNodeProperties({ enclosingScope: enclosingScope });

expect(node.enclosingScope).toBe(enclosingScope);
});

it('should not overwrite a null enclosingScope', () => {
const node = astNode.addNodeProperties({enclosingScope: null});
const node = astNode.addNodeProperties({ enclosingScope: null });

expect(node.enclosingScope).toBeNull();
});
@@ -373,11 +373,7 @@ describe('@jsdoc/parse/lib/ast-node', () => {
it('should return a multi-item array if the input has multiple params', () => {
const params = astNode.getParamNames(functionDeclaration3);

expect(params).toEqual([
'bar',
'baz',
'qux'
]);
expect(params).toEqual(['bar', 'baz', 'qux']);
});

it('should include rest parameters', () => {
@@ -389,97 +385,97 @@ describe('@jsdoc/parse/lib/ast-node', () => {

describe('isAccessor', () => {
it('should return false for undefined values', () => {
expect( astNode.isAccessor() ).toBeFalse();
expect(astNode.isAccessor()).toBeFalse();
});

it('should return false if the parameter is not an object', () => {
expect( astNode.isAccessor('foo') ).toBeFalse();
expect(astNode.isAccessor('foo')).toBeFalse();
});

it('should return false for non-Property nodes', () => {
expect( astNode.isAccessor(binaryExpression) ).toBeFalse();
expect(astNode.isAccessor(binaryExpression)).toBeFalse();
});

it('should return false for Property nodes whose kind is "init"', () => {
expect( astNode.isAccessor(propertyInit) ).toBeFalse();
expect(astNode.isAccessor(propertyInit)).toBeFalse();
});

it('should return true for Property nodes whose kind is "get"', () => {
expect( astNode.isAccessor(propertyGet) ).toBeTrue();
expect(astNode.isAccessor(propertyGet)).toBeTrue();
});

it('should return true for Property nodes whose kind is "set"', () => {
expect( astNode.isAccessor(propertySet) ).toBeTrue();
expect(astNode.isAccessor(propertySet)).toBeTrue();
});
});

describe('isAssignment', () => {
it('should return false for undefined values', () => {
expect( astNode.isAssignment() ).toBeFalse();
expect(astNode.isAssignment()).toBeFalse();
});

it('should return false if the parameter is not an object', () => {
expect( astNode.isAssignment('foo') ).toBeFalse();
expect(astNode.isAssignment('foo')).toBeFalse();
});

it('should return false for nodes that are not assignments', () => {
expect( astNode.isAssignment(binaryExpression) ).toBeFalse();
expect(astNode.isAssignment(binaryExpression)).toBeFalse();
});

it('should return true for AssignmentExpression nodes', () => {
expect( astNode.isAssignment(assignmentExpression) ).toBeTrue();
expect(astNode.isAssignment(assignmentExpression)).toBeTrue();
});

it('should return true for VariableDeclarator nodes', () => {
expect( astNode.isAssignment(variableDeclarator1) ).toBeTrue();
expect(astNode.isAssignment(variableDeclarator1)).toBeTrue();
});
});

describe('isFunction', () => {
it('should recognize function declarations as functions', () => {
expect( astNode.isFunction(functionDeclaration1) ).toBeTrue();
expect(astNode.isFunction(functionDeclaration1)).toBeTrue();
});

it('should recognize function expressions as functions', () => {
expect( astNode.isFunction(functionExpression1) ).toBeTrue();
expect(astNode.isFunction(functionExpression1)).toBeTrue();
});

it('should recognize method definitions as functions', () => {
expect( astNode.isFunction(methodDefinition1) ).toBeTrue();
expect(astNode.isFunction(methodDefinition1)).toBeTrue();
});

it('should recognize arrow function expressions as functions', () => {
expect( astNode.isFunction(arrowFunctionExpression) ).toBeTrue();
expect(astNode.isFunction(arrowFunctionExpression)).toBeTrue();
});

it('should recognize non-functions', () => {
expect( astNode.isFunction(arrayExpression) ).toBeFalse();
expect(astNode.isFunction(arrayExpression)).toBeFalse();
});
});

describe('isScope', () => {
it('should return false for undefined values', () => {
expect( astNode.isScope() ).toBeFalse();
expect(astNode.isScope()).toBeFalse();
});

it('should return false if the parameter is not an object', () => {
expect( astNode.isScope('foo') ).toBeFalse();
expect(astNode.isScope('foo')).toBeFalse();
});

it('should return true for CatchClause nodes', () => {
expect( astNode.isScope({type: Syntax.CatchClause}) ).toBeTrue();
expect(astNode.isScope({ type: Syntax.CatchClause })).toBeTrue();
});

it('should return true for FunctionDeclaration nodes', () => {
expect( astNode.isScope({type: Syntax.FunctionDeclaration}) ).toBeTrue();
expect(astNode.isScope({ type: Syntax.FunctionDeclaration })).toBeTrue();
});

it('should return true for FunctionExpression nodes', () => {
expect( astNode.isScope({type: Syntax.FunctionExpression}) ).toBeTrue();
expect(astNode.isScope({ type: Syntax.FunctionExpression })).toBeTrue();
});

it('should return false for other nodes', () => {
expect( astNode.isScope({type: Syntax.NameExpression}) ).toBeFalse();
expect(astNode.isScope({ type: Syntax.NameExpression })).toBeFalse();
});
});

@@ -491,56 +487,61 @@ describe('@jsdoc/parse/lib/ast-node', () => {

describe('nodeToValue', () => {
it('should return `[null]` for the sparse array `[,]`', () => {
expect( astNode.nodeToValue(arrayExpression) ).toBe('[null]');
expect(astNode.nodeToValue(arrayExpression)).toBe('[null]');
});

it('should return the variable name for assignment expressions', () => {
expect( astNode.nodeToValue(assignmentExpression) ).toBe('foo');
expect(astNode.nodeToValue(assignmentExpression)).toBe('foo');
});

it('should return the function name for function declarations', () => {
expect( astNode.nodeToValue(functionDeclaration1) ).toBe('foo');
expect(astNode.nodeToValue(functionDeclaration1)).toBe('foo');
});

it('should return undefined for anonymous function expressions', () => {
expect( astNode.nodeToValue(functionExpression1) ).toBeUndefined();
expect(astNode.nodeToValue(functionExpression1)).toBeUndefined();
});

it('should return the identifier name for identifiers', () => {
expect( astNode.nodeToValue(identifier) ).toBe('foo');
expect(astNode.nodeToValue(identifier)).toBe('foo');
});

it('should return the literal value for literals', () => {
expect( astNode.nodeToValue(literal) ).toBe(1);
expect(astNode.nodeToValue(literal)).toBe(1);
});

it('should return the object and property for noncomputed member expressions', () => {
expect( astNode.nodeToValue(memberExpression) ).toBe('foo.bar');
expect(astNode.nodeToValue(memberExpression)).toBe('foo.bar');
});

it('should return the object and property, with a computed property that uses the same ' +
'quote character as the original source, for computed member expressions', () => {
expect( astNode.nodeToValue(memberExpressionComputed1) ).toBe('foo["bar"]');
expect( astNode.nodeToValue(memberExpressionComputed2) ).toBe('foo[\'bar\']');
});
it(
'should return the object and property, with a computed property that uses the same ' +
'quote character as the original source, for computed member expressions',
() => {
expect(astNode.nodeToValue(memberExpressionComputed1)).toBe('foo["bar"]');
expect(astNode.nodeToValue(memberExpressionComputed2)).toBe("foo['bar']");
}
);

// TODO: we can't test this here because JSDoc, not Babylon, adds the `parent` property to
// nodes. also, we currently return an empty string instead of `<anonymous>` in this case;
// see `module:@jsdoc/parse.astNode.nodeToValue` and the comment on
// `Syntax.MethodDefinition` for details
xit('should return `<anonymous>` for method definitions inside classes that were ' +
'returned by an arrow function expression', () => {
expect( astNode.nodeToValue(methodDefinition2) ).toBe('<anonymous>');
});
xit(
'should return `<anonymous>` for method definitions inside classes that were ' +
'returned by an arrow function expression',
() => {
expect(astNode.nodeToValue(methodDefinition2)).toBe('<anonymous>');
}
);

it('should return "this" for this expressions', () => {
expect( astNode.nodeToValue(thisExpression) ).toBe('this');
expect(astNode.nodeToValue(thisExpression)).toBe('this');
});

it('should return the operator and nodeToValue value for prefix unary expressions',
() => {
expect( astNode.nodeToValue(unaryExpression1) ).toBe('+1');
expect( astNode.nodeToValue(unaryExpression2) ).toBe('+foo');
it('should return the operator and nodeToValue value for prefix unary expressions', () => {
expect(astNode.nodeToValue(unaryExpression1)).toBe('+1');
expect(astNode.nodeToValue(unaryExpression2)).toBe('+foo');
});

it('should throw an error for postfix unary expressions', () => {
@@ -561,15 +562,15 @@ describe('@jsdoc/parse/lib/ast-node', () => {
});

it('should return the variable name for variable declarators', () => {
expect( astNode.nodeToValue(variableDeclarator1) ).toBe('foo');
expect(astNode.nodeToValue(variableDeclarator1)).toBe('foo');
});

it('should return an empty string for all other nodes', () => {
expect( astNode.nodeToValue(binaryExpression) ).toBe('');
expect(astNode.nodeToValue(binaryExpression)).toBe('');
});

it('should understand and ignore ExperimentalSpreadProperty', () => {
expect( astNode.nodeToValue(experimentalObjectRestSpread) ).toBe('{"three":4}');
expect(astNode.nodeToValue(experimentalObjectRestSpread)).toBe('{"three":4}');
});
});
});

14 packages/jsdoc-prettier-config/.npmignore Normal file
@@ -0,0 +1,14 @@
.editorconfig
.eslintignore
.eslintrc.js
.gitignore
.github/
.renovaterc.json
.travis.yml
CHANGES.md
CODE_OF_CONDUCT.md
CONTRIBUTING.md
gulpfile.js
lerna.json
packages/
test/

202 packages/jsdoc-prettier-config/LICENSE Normal file
@@ -0,0 +1,202 @@

Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
packages/jsdoc-prettier-config/README.md (new file, 3 lines)
@@ -0,0 +1,3 @@
# `@jsdoc/prettier-config`

A Prettier (https://prettier.io/) configuration for JSDoc.
packages/jsdoc-prettier-config/index.js (new file, 5 lines)
@@ -0,0 +1,5 @@
// https://prettier.io/docs/en/options.html
module.exports = {
  printWidth: 100,
  singleQuote: true,
};
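The options module above becomes a shared Prettier configuration. Such configs are typically consumed by pointing a project's `prettier` key at the package; a minimal sketch of a consuming `package.json` (the project name `my-project` is hypothetical; the package names and version ranges come from this commit):

```json
{
  "name": "my-project",
  "devDependencies": {
    "@jsdoc/prettier-config": "^0.0.1",
    "prettier": "^2.4.1"
  },
  "prettier": "@jsdoc/prettier-config"
}
```

With this in place, running Prettier in the project picks up `printWidth: 100` and `singleQuote: true` from the shared package instead of a local `.prettierrc`.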
packages/jsdoc-prettier-config/package.json (new file, 31 lines)
@@ -0,0 +1,31 @@
{
  "name": "@jsdoc/prettier-config",
  "version": "0.0.1",
  "description": "A Prettier (https://prettier.io/) configuration for JSDoc.",
  "keywords": [
    "prettier",
    "jsdoc"
  ],
  "author": "Jeff Williams <jeffrey.l.williams@gmail.com>",
  "homepage": "https://github.com/jsdoc/jsdoc",
  "license": "Apache-2.0",
  "main": "index.js",
  "peerDependencies": {
    "eslint-config-prettier": "^8.3.0",
    "eslint-plugin-prettier": "^4.0.0",
    "prettier": "^2.4.1"
  },
  "publishConfig": {
    "access": "public"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/jsdoc/jsdoc.git"
  },
  "scripts": {
    "test": "echo \"Error: run tests from root\" && exit 1"
  },
  "bugs": {
    "url": "https://github.com/jsdoc/jsdoc/issues"
  }
}
@@ -3,5 +3,5 @@ const type = require('./lib/type');
 
 module.exports = {
   inline,
-  type
+  type,
 };
 
@@ -70,14 +70,14 @@ exports.isInlineTag = (string, tagName) => regExpFactory(tagName, '^', '$').test
  * @return {module:@jsdoc/tag.inline.InlineTagResult} The updated string, as well as information
  * about the inline tags that were found.
  */
-const replaceInlineTags = exports.replaceInlineTags = (string, replacers) => {
+const replaceInlineTags = (exports.replaceInlineTags = (string, replacers) => {
   const tagInfo = [];
 
   function replaceMatch(replacer, tag, match, text) {
     const matchedTag = {
       completeTag: match,
       tag: tag,
-      text: text
+      text: text,
     };
 
     tagInfo.push(matchedTag);
@@ -87,7 +87,7 @@ const replaceInlineTags = exports.replaceInlineTags = (string, replacers) => {
 
   string = string || '';
 
-  Object.keys(replacers).forEach(replacer => {
+  Object.keys(replacers).forEach((replacer) => {
     const tagRegExp = regExpFactory(replacer);
     let matches;
     let previousString;
@@ -104,9 +104,9 @@ const replaceInlineTags = exports.replaceInlineTags = (string, replacers) => {
 
   return {
     tags: tagInfo,
-    newString: string.trim()
+    newString: string.trim(),
   };
-};
+});
 
 /**
  * Replace all instances of an inline tag with other text.
@@ -118,13 +118,13 @@ const replaceInlineTags = exports.replaceInlineTags = (string, replacers) => {
  * @return {module:@jsdoc/tag.inline.InlineTagResult} The updated string, as well as information
  * about the inline tags that were found.
  */
-const replaceInlineTag = exports.replaceInlineTag = (string, tag, replacer) => {
+const replaceInlineTag = (exports.replaceInlineTag = (string, tag, replacer) => {
   const replacers = {};
 
   replacers[tag] = replacer;
 
   return replaceInlineTags(string, replacers);
-};
+});
 
 /**
  * Extract inline tags from a string, replacing them with an empty string.
@@ -135,5 +135,4 @@ const replaceInlineTag = exports.replaceInlineTag = (string, tag, replacer) => {
  * about the inline tags that were found.
  */
 exports.extractInlineTag = (string, tag) =>
-  replaceInlineTag(string, tag, (str, {completeTag}) =>
-    str.replace(completeTag, ''));
+  replaceInlineTag(string, tag, (str, { completeTag }) => str.replace(completeTag, ''));
 
@@ -18,8 +18,7 @@ const { splitNameAndDescription } = require('@jsdoc/core').name;
 
 /** @private */
 function unescapeBraces(text) {
-  return text.replace(/\\\{/g, '{')
-    .replace(/\\\}/g, '}');
+  return text.replace(/\\\{/g, '{').replace(/\\\}/g, '}');
 }
 
 /**
@@ -72,7 +71,7 @@ function extractTypeExpression(string) {
 
   return {
     expression: unescapeBraces(expression),
-    newString: string.trim()
+    newString: string.trim(),
   };
 }
 
@@ -109,7 +108,7 @@ function getTagInfo(tagValue, canHaveName, canHaveType) {
   return {
     name: name,
     typeExpression: typeExpression,
-    text: text
+    text: text,
   };
 }
 
@@ -144,7 +143,7 @@ function getTagInfo(tagValue, canHaveName, canHaveType) {
 function parseName(tagInfo) {
   // like '[foo]' or '[ foo ]' or '[foo=bar]' or '[ foo=bar ]' or '[ foo = bar ]'
   // or 'foo=bar' or 'foo = bar'
-  if ( /^(\[)?\s*(.+?)\s*(\])?$/.test(tagInfo.name) ) {
+  if (/^(\[)?\s*(.+?)\s*(\])?$/.test(tagInfo.name)) {
     tagInfo.name = RegExp.$2;
     // were the "optional" brackets present?
     if (RegExp.$1 && RegExp.$3) {
@@ -152,7 +151,7 @@ function parseName(tagInfo) {
     }
 
     // like 'foo=bar' or 'foo = bar'
-    if ( /^(.+?)\s*=\s*(.+)$/.test(tagInfo.name) ) {
+    if (/^(.+?)\s*=\s*(.+)$/.test(tagInfo.name)) {
       tagInfo.name = RegExp.$1;
       tagInfo.defaultvalue = cast(RegExp.$2);
     }
@@ -189,19 +188,19 @@ function getTypeStrings(parsedType, isOutermostType) {
     case TYPES.TypeApplication:
       // if this is the outermost type, we strip the modifiers; otherwise, we keep them
       if (isOutermostType) {
-        applications = parsedType.applications.map(application =>
-          catharsis.stringify(application)).join(', ');
+        applications = parsedType.applications
+          .map((application) => catharsis.stringify(application))
+          .join(', ');
         typeString = `${getTypeStrings(parsedType.expression)[0]}.<${applications}>`;
 
         types.push(typeString);
-      }
-      else {
-        types.push( catharsis.stringify(parsedType) );
+      } else {
+        types.push(catharsis.stringify(parsedType));
       }
       break;
     case TYPES.TypeUnion:
-      parsedType.elements.forEach(element => {
-        types = types.concat( getTypeStrings(element) );
+      parsedType.elements.forEach((element) => {
+        types = types.concat(getTypeStrings(element));
       });
       break;
     case TYPES.UndefinedLiteral:
@@ -237,19 +236,18 @@ function parseTypeExpression(tagInfo) {
   try {
     parsedType = catharsis.parse(tagInfo.typeExpression, {
       jsdoc: true,
-      useCache: false
+      useCache: false,
     });
-  }
-  catch (e) {
+  } catch (e) {
     // always re-throw so the caller has a chance to report which file was bad
     throw new Error(`Invalid type expression "${tagInfo.typeExpression}": ${e.message}`);
   }
 
-  tagInfo.type = tagInfo.type.concat( getTypeStrings(parsedType, true) );
+  tagInfo.type = tagInfo.type.concat(getTypeStrings(parsedType, true));
   tagInfo.parsedType = parsedType;
 
   // Catharsis and JSDoc use the same names for 'optional' and 'nullable'...
-  ['optional', 'nullable'].forEach(key => {
+  ['optional', 'nullable'].forEach((key) => {
     if (parsedType[key] !== null && parsedType[key] !== undefined) {
       tagInfo[key] = parsedType[key];
     }
@@ -287,7 +285,7 @@ exports.parse = (tagValue, canHaveName, canHaveType) => {
   tagInfo = getTagInfo(tagValue, canHaveName, canHaveType);
   tagInfo.type = tagInfo.type || [];
 
-  typeParsers.forEach(parser => {
+  typeParsers.forEach((parser) => {
     tagInfo = parser(tagInfo);
   });
 
@@ -41,7 +41,7 @@ describe('@jsdoc/tag/lib/inline', () => {
   });
 
   it('allows regexp characters in the tag name', () => {
-    expect( isInlineTag('{@mytags hooray}', 'mytag\\S') ).toBeTrue();
+    expect(isInlineTag('{@mytags hooray}', 'mytag\\S')).toBeTrue();
   });
 
   it('returns false (rather than throwing) with invalid input', () => {
@@ -85,7 +85,7 @@ describe('@jsdoc/tag/lib/inline', () => {
   });
 
   it('works if the tag is the entire string', () => {
-    function replacer(string, {completeTag, text}) {
+    function replacer(string, { completeTag, text }) {
       expect(string).toBe('{@foo text in braces}');
       expect(completeTag).toBe('{@foo text in braces}');
       expect(text).toBe('text in braces');
@@ -93,8 +93,7 @@ describe('@jsdoc/tag/lib/inline', () => {
       return completeTag;
     }
 
-    const result = inline.replaceInlineTag('{@foo text in braces}', 'foo',
-      replacer);
+    const result = inline.replaceInlineTag('{@foo text in braces}', 'foo', replacer);
 
     expect(result.tags[0]).toBeObject();
     expect(result.tags[0].tag).toBe('foo');
@@ -103,7 +102,7 @@ describe('@jsdoc/tag/lib/inline', () => {
   });
 
   it('works if the tag is at the beginning of the string', () => {
-    function replacer(string, {completeTag, text}) {
+    function replacer(string, { completeTag, text }) {
       expect(string).toBe('{@foo test string} ahoy');
       expect(completeTag).toBe('{@foo test string}');
       expect(text).toBe('test string');
@@ -111,8 +110,7 @@ describe('@jsdoc/tag/lib/inline', () => {
       return string;
     }
 
-    const result = inline.replaceInlineTag('{@foo test string} ahoy', 'foo',
-      replacer);
+    const result = inline.replaceInlineTag('{@foo test string} ahoy', 'foo', replacer);
 
     expect(result.tags[0]).toBeObject();
     expect(result.tags[0].tag).toBe('foo');
@@ -121,7 +119,7 @@ describe('@jsdoc/tag/lib/inline', () => {
   });
 
   it('works if the tag is in the middle of the string', () => {
-    function replacer(string, {completeTag, text}) {
+    function replacer(string, { completeTag, text }) {
       expect(string).toBe('a {@foo test string} yay');
       expect(completeTag).toBe('{@foo test string}');
       expect(text).toBe('test string');
@@ -129,8 +127,7 @@ describe('@jsdoc/tag/lib/inline', () => {
       return string;
     }
 
-    const result = inline.replaceInlineTag('a {@foo test string} yay', 'foo',
-      replacer);
+    const result = inline.replaceInlineTag('a {@foo test string} yay', 'foo', replacer);
 
     expect(result.tags[0]).toBeObject();
     expect(result.tags[0].tag).toBe('foo');
@@ -139,7 +136,7 @@ describe('@jsdoc/tag/lib/inline', () => {
   });
 
   it('works if the tag is at the end of the string', () => {
-    function replacer(string, {completeTag, text}) {
+    function replacer(string, { completeTag, text }) {
       expect(string).toBe('a {@foo test string}');
       expect(completeTag).toBe('{@foo test string}');
       expect(text).toBe('test string');
@@ -166,12 +163,15 @@ describe('@jsdoc/tag/lib/inline', () => {
   });
 
   it('processes all occurrences of a tag', () => {
-    function replacer(string, {completeTag}) {
+    function replacer(string, { completeTag }) {
       return string.replace(completeTag, 'stuff');
     }
 
-    const result = inline.replaceInlineTag('some {@foo text} with multiple ' +
-      '{@foo tags}, {@foo like} {@foo this}', 'foo', replacer);
+    const result = inline.replaceInlineTag(
+      'some {@foo text} with multiple ' + '{@foo tags}, {@foo like} {@foo this}',
+      'foo',
+      replacer
+    );
 
     expect(result.tags.length).toBe(4);
 
@@ -213,7 +213,7 @@ describe('@jsdoc/tag/lib/inline', () => {
         expect(tagInfo.text).toBe('text');
 
         return string.replace(tagInfo.completeTag, 'stuff');
-      }
+      },
     };
     const result = inline.replaceInlineTags(text, replacers);
 
@@ -234,7 +234,7 @@ describe('@jsdoc/tag/lib/inline', () => {
         expect(tagInfo.text).toBe('multiple');
 
         return string.replace(tagInfo.completeTag, 'awesome');
-      }
+      },
     };
     const result = inline.replaceInlineTags(text, replacers);
 
@@ -54,7 +54,7 @@ describe('@jsdoc/tag/lib/type', () => {
   it('extracts a name, but not a type, if canHaveName is true and canHaveType is false', () => {
     const name = 'bar';
     const desc = 'The bar parameter.';
-    const info = type.parse( buildText(null, name, desc), true, false );
+    const info = type.parse(buildText(null, name, desc), true, false);
 
     expect(info.type).toBeEmptyArray();
     expect(info.name).toBe(name);
@@ -101,26 +101,26 @@ describe('@jsdoc/tag/lib/type', () => {
     const desc = '{string} foo';
     const info = type.parse(desc, true, true);
 
-    expect(info.type).toEqual( ['string'] );
+    expect(info.type).toEqual(['string']);
   });
 
   it('recognizes the entire list of possible types', () => {
     let desc = '{(string|number)} foo';
     let info = type.parse(desc, true, true);
 
-    expect(info.type).toEqual( ['string', 'number'] );
+    expect(info.type).toEqual(['string', 'number']);
 
     desc = '{ ( string | number ) } foo';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['string', 'number'] );
+    expect(info.type).toEqual(['string', 'number']);
 
     desc = '{ ( string | number)} foo';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['string', 'number'] );
+    expect(info.type).toEqual(['string', 'number']);
 
     desc = '{(string|number|boolean|function)} foo';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['string', 'number', 'boolean', 'function'] );
+    expect(info.type).toEqual(['string', 'number', 'boolean', 'function']);
   });
 
   it('does not find any type if there is no text in braces', () => {
@@ -189,27 +189,27 @@ describe('@jsdoc/tag/lib/type', () => {
     let desc = '{Object} cookie {@type Monster}';
     let info = type.parse(desc, true, true);
 
-    expect(info.type).toEqual( ['Monster'] );
+    expect(info.type).toEqual(['Monster']);
     expect(info.text).toBe('');
 
     desc = '{Object} cookie - {@type Monster}';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['Monster'] );
+    expect(info.type).toEqual(['Monster']);
    expect(info.text).toBe('');
 
     desc = '{Object} cookie - The cookie parameter. {@type Monster}';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['Monster'] );
+    expect(info.type).toEqual(['Monster']);
     expect(info.text).toBe('The cookie parameter.');
 
     desc = '{Object} cookie - The cookie parameter. {@type (Monster|Jar)}';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['Monster', 'Jar'] );
+    expect(info.type).toEqual(['Monster', 'Jar']);
     expect(info.text).toBe('The cookie parameter.');
 
     desc = '{Object} cookie - The cookie parameter. {@type (Monster|Jar)} Mmm, cookie.';
     info = type.parse(desc, true, true);
-    expect(info.type).toEqual( ['Monster', 'Jar'] );
+    expect(info.type).toEqual(['Monster', 'Jar']);
     expect(info.text).toBe('The cookie parameter. Mmm, cookie.');
   });
 
@@ -217,27 +217,27 @@ describe('@jsdoc/tag/lib/type', () => {
   it('parses JSDoc-style optional parameters', () => {
     let name = '[qux]';
     const desc = 'The qux parameter.';
-    let info = type.parse( buildText(null, name, desc), true, false );
+    let info = type.parse(buildText(null, name, desc), true, false);
 
     expect(info.name).toBe('qux');
     expect(info.text).toBe(desc);
     expect(info.optional).toBeTrue();
 
     name = '[ qux ]';
-    info = type.parse( buildText(null, name, desc), true, false );
+    info = type.parse(buildText(null, name, desc), true, false);
     expect(info.name).toBe('qux');
     expect(info.text).toBe(desc);
     expect(info.optional).toBeTrue();
 
     name = '[qux=hooray]';
-    info = type.parse( buildText(null, name, desc), true, false );
+    info = type.parse(buildText(null, name, desc), true, false);
     expect(info.name).toBe('qux');
     expect(info.text).toBe(desc);
     expect(info.optional).toBeTrue();
     expect(info.defaultvalue).toBe('hooray');
 
     name = '[ qux = hooray ]';
-    info = type.parse( buildText(null, name, desc), true, false );
+    info = type.parse(buildText(null, name, desc), true, false);
     expect(info.name).toBe('qux');
     expect(info.text).toBe(desc);
     expect(info.optional).toBeTrue();
@@ -251,7 +251,7 @@ describe('@jsdoc/tag/lib/type', () => {
     const desc = '{...string} foo - Foo.';
     const info = type.parse(desc, true, true);
 
-    expect(info.type).toEqual( ['string'] );
+    expect(info.type).toEqual(['string']);
     expect(info.variable).toBeTrue();
   });
 
@@ -3,5 +3,5 @@ const TaskRunner = require('./lib/task-runner');
 
 module.exports = {
   Task,
-  TaskRunner
+  TaskRunner,
 };
 
@@ -1,7 +1,7 @@
 const _ = require('lodash');
 const { DepGraph } = require('dependency-graph');
 const Emittery = require('emittery');
-const {default: ow} = require('ow');
+const { default: ow } = require('ow');
 const Queue = require('p-queue').default;
 const v = require('./validators');
 
@@ -44,18 +44,18 @@ module.exports = class TaskRunner extends Emittery {
   _addTaskEmitters(task) {
     const u = {};
 
-    u.start = task.on('start', t => this.emit('taskStart', t));
-    u.end = task.on('end', t => this.emit('taskEnd', t));
-    u.error = task.on('error', (e => {
+    u.start = task.on('start', (t) => this.emit('taskStart', t));
+    u.end = task.on('end', (t) => this.emit('taskEnd', t));
+    u.error = task.on('error', (e) => {
       this.emit('taskError', {
         task: e.task,
-        error: e.error
+        error: e.error,
       });
 
       if (!this._error) {
         this._error = e.error;
       }
-    }));
+    });
 
     this._unsubscribers.set(task.name, u);
   }
@@ -69,13 +69,11 @@ module.exports = class TaskRunner extends Emittery {
       return null;
     }
 
-    return () => tasks.reduce((p, taskName) => {
+    return () =>
+      tasks.reduce((p, taskName) => {
        const task = this._nameToTask.get(taskName);
 
-      return p.then(
-        this._bindTaskFunc(task),
-        e => Promise.reject(e)
-      );
+        return p.then(this._bindTaskFunc(task), (e) => Promise.reject(e));
      }, Promise.resolve());
   }
 
@@ -112,8 +110,9 @@ module.exports = class TaskRunner extends Emittery {
       errorText = 'unknown tasks';
     }
 
-    return new v.UnknownDependencyError(`The task ${dependent} depends on ${errorText}: ` +
-      `${unknownDeps.join(', ')}`);
+    return new v.UnknownDependencyError(
+      `The task ${dependent} depends on ${errorText}: ` + `${unknownDeps.join(', ')}`
+    );
   }
 
   _orderTasks() {
@@ -146,10 +145,9 @@ module.exports = class TaskRunner extends Emittery {
     if (!error) {
       try {
         // Get standalone tasks with no dependencies and no dependents.
-        parallel = graph.overallOrder(true)
-          .filter(task => !(graph.dependentsOf(task).length));
+        parallel = graph.overallOrder(true).filter((task) => !graph.dependentsOf(task).length);
         // Get tasks with dependencies, in a correctly ordered list.
-        sequential = graph.overallOrder().filter(task => !parallel.includes(task));
+        sequential = graph.overallOrder().filter((task) => !parallel.includes(task));
       } catch (e) {
         error = this._newDependencyCycleError(e.cyclePath);
       }
@@ -158,7 +156,7 @@ module.exports = class TaskRunner extends Emittery {
     return {
       error,
       parallel,
-      sequential
+      sequential,
     };
   }
 
@@ -208,7 +206,7 @@ module.exports = class TaskRunner extends Emittery {
 
   end() {
     this.emit('end', {
-      error: this._error
+      error: this._error,
     });
     this._queue.clear();
     this._init();
@@ -283,7 +281,8 @@ module.exports = class TaskRunner extends Emittery {
       taskFuncs.push(taskSequence);
     }
 
-    endPromise = this._queue.addAll(taskFuncs).then(() => {
+    endPromise = this._queue.addAll(taskFuncs).then(
+      () => {
        this.end();
 
        if (this._error) {
@@ -291,11 +290,13 @@ module.exports = class TaskRunner extends Emittery {
        } else {
          return Promise.resolve();
        }
-    }, e => {
+      },
+      (e) => {
        this.end();
 
        return Promise.reject(e);
-    });
+      }
+    );
 
     this.emit('start');
     this._running = true;
 
@@ -1,5 +1,5 @@
 const Emittery = require('emittery');
-const {default: ow} = require('ow');
+const { default: ow } = require('ow');
 
 module.exports = class Task extends Emittery {
   constructor(opts = {}) {
@@ -9,10 +9,7 @@ module.exports = class Task extends Emittery {
 
     ow(opts.name, ow.optional.string);
     ow(opts.func, ow.optional.function);
-    ow(opts.dependsOn, ow.any(
-      ow.optional.string,
-      ow.optional.array.ofType(ow.string)
-    ));
+    ow(opts.dependsOn, ow.any(ow.optional.string, ow.optional.array.ofType(ow.string)));
 
     if (typeof opts.dependsOn === 'string') {
       deps = [opts.dependsOn];
@@ -38,10 +35,10 @@ module.exports = class Task extends Emittery {
 
         return Promise.resolve();
       },
-      error => {
+      (error) => {
         this.emit('error', {
           task: this,
-          error
+          error,
         });
         this.emit('end', this);
 
@@ -1,10 +1,10 @@
-const {default: ow} = require('ow');
+const { default: ow } = require('ow');
 const Task = require('./task');
 
 function checkTask(t) {
   return {
     validator: t instanceof Task,
-    message: `Expected ${t} to be a Task object`
+    message: `Expected ${t} to be a Task object`,
   };
 }
 
@@ -43,5 +43,5 @@ module.exports = {
 
     this.name = 'UnknownTaskError';
   }
-  }
+  },
 };
 
packages/jsdoc-task-runner/package-lock.json (generated, 160 lines changed)
@@ -1,8 +1,166 @@
 {
   "name": "@jsdoc/task-runner",
   "version": "0.1.10",
-  "lockfileVersion": 1,
+  "lockfileVersion": 2,
   "requires": true,
+  "packages": {
+    "": {
+      "name": "@jsdoc/task-runner",
+      "version": "0.1.10",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "dependency-graph": "^0.11.0",
+        "emittery": "^0.10.0",
+        "ow": "^0.27.0",
+        "p-queue": "^6.6.2"
+      },
+      "engines": {
+        "node": ">=v14.17.6"
+      }
+    },
+    "node_modules/@sindresorhus/is": {
+      "version": "4.0.1",
+      "resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-4.0.1.tgz",
+      "integrity": "sha512-Qm9hBEBu18wt1PO2flE7LPb30BHMQt1eQgbV76YntdNk73XZGpn3izvGTYxbGgzXKgbCjiia0uxTd3aTNQrY/g==",
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/sindresorhus/is?sponsor=1"
+      }
+    },
+    "node_modules/callsites": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
+      "integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==",
+      "engines": {
+        "node": ">=6"
+      }
+    },
+    "node_modules/dependency-graph": {
+      "version": "0.11.0",
+      "resolved": "https://registry.npmjs.org/dependency-graph/-/dependency-graph-0.11.0.tgz",
+      "integrity": "sha512-JeMq7fEshyepOWDfcfHK06N3MhyPhz++vtqWhMT5O9A3K42rdsEDpfdVqjaqaAhsw6a+ZqeDvQVtD0hFHQWrzg==",
+      "engines": {
+        "node": ">= 0.6.0"
+      }
+    },
+    "node_modules/dot-prop": {
+      "version": "6.0.1",
+      "resolved": "https://registry.npmjs.org/dot-prop/-/dot-prop-6.0.1.tgz",
+      "integrity": "sha512-tE7ztYzXHIeyvc7N+hR3oi7FIbf/NIjVP9hmAt3yMXzrQ072/fpjGLx2GxNxGxUl5V73MEqYzioOMoVhGMJ5cA==",
+      "dependencies": {
+        "is-obj": "^2.0.0"
+      },
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/emittery": {
+      "version": "0.10.0",
+      "resolved": "https://registry.npmjs.org/emittery/-/emittery-0.10.0.tgz",
+      "integrity": "sha512-AGvFfs+d0JKCJQ4o01ASQLGPmSCxgfU9RFXvzPvZdjKK8oscynksuJhWrSTSw7j7Ep/sZct5b5ZhYCi8S/t0HQ==",
+      "engines": {
+        "node": ">=12"
+      },
+      "funding": {
+        "url": "https://github.com/sindresorhus/emittery?sponsor=1"
+      }
+    },
+    "node_modules/eventemitter3": {
+      "version": "4.0.7",
+      "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-4.0.7.tgz",
+      "integrity": "sha512-8guHBZCwKnFhYdHr2ysuRWErTwhoN2X8XELRlrRwpmfeY2jjuUN4taQMsULKUVo1K4DvZl+0pgfyoysHxvmvEw=="
+    },
+    "node_modules/is-obj": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/is-obj/-/is-obj-2.0.0.tgz",
+      "integrity": "sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==",
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/lodash.isequal": {
+      "version": "4.5.0",
+      "resolved": "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz",
+      "integrity": "sha1-QVxEePK8wwEgwizhDtMib30+GOA="
+    },
+    "node_modules/ow": {
+      "version": "0.27.0",
+      "resolved": "https://registry.npmjs.org/ow/-/ow-0.27.0.tgz",
+      "integrity": "sha512-SGnrGUbhn4VaUGdU0EJLMwZWSupPmF46hnTRII7aCLCrqixTAC5eKo8kI4/XXf1eaaI8YEVT+3FeGNJI9himAQ==",
+      "dependencies": {
+        "@sindresorhus/is": "^4.0.1",
+        "callsites": "^3.1.0",
+        "dot-prop": "^6.0.1",
+        "lodash.isequal": "^4.5.0",
+        "type-fest": "^1.2.1",
+        "vali-date": "^1.0.0"
+      },
+      "engines": {
+        "node": ">=12"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/p-finally": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/p-finally/-/p-finally-1.0.0.tgz",
+      "integrity": "sha1-P7z7FbiZpEEjs0ttzBi3JDNqLK4=",
+      "engines": {
+        "node": ">=4"
+      }
+    },
+    "node_modules/p-queue": {
+      "version": "6.6.2",
+      "resolved": "https://registry.npmjs.org/p-queue/-/p-queue-6.6.2.tgz",
+      "integrity": "sha512-RwFpb72c/BhQLEXIZ5K2e+AhgNVmIejGlTgiB9MzZ0e93GRvqZ7uSi0dvRF7/XIXDeNkra2fNHBxTyPDGySpjQ==",
+      "dependencies": {
+        "eventemitter3": "^4.0.4",
+        "p-timeout": "^3.2.0"
+      },
+      "engines": {
+        "node": ">=8"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/p-timeout": {
+      "version": "3.2.0",
+      "resolved": "https://registry.npmjs.org/p-timeout/-/p-timeout-3.2.0.tgz",
+      "integrity": "sha512-rhIwUycgwwKcP9yTOOFK/AKsAopjjCakVqLHePO3CC6Mir1Z99xT+R63jZxAT5lFZLa2inS5h+ZS2GvR99/FBg==",
+      "dependencies": {
+        "p-finally": "^1.0.0"
+      },
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/type-fest": {
+      "version": "1.2.2",
+      "resolved": "https://registry.npmjs.org/type-fest/-/type-fest-1.2.2.tgz",
+      "integrity": "sha512-pfkPYCcuV0TJoo/jlsUeWNV8rk7uMU6ocnYNvca1Vu+pyKi8Rl8Zo2scPt9O72gCsXIm+dMxOOWuA3VFDSdzWA==",
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/vali-date": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/vali-date/-/vali-date-1.0.0.tgz",
+      "integrity": "sha1-G5BKWWCfsyjvB4E4Qgk09rhnCaY=",
+      "engines": {
+        "node": ">=0.10.0"
+      }
+    }
+  },
   "dependencies": {
|
||||
"@sindresorhus/is": {
|
||||
"version": "4.0.1",
|
||||
|
||||
@ -20,7 +20,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
let barResult;
const fakeTask = {
name: 'foo',
func: () => Promise.resolve()
func: () => Promise.resolve(),
};
let foo;
let fooResult;

@ -30,23 +30,25 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
runner = new TaskRunner({});
foo = new Task({
name: 'foo',
func: () => new Promise(resolve => {
func: () =>
new Promise((resolve) => {
fooResult = true;
resolve();
})
}),
});
fooResult = null;
bar = new Task({
name: 'bar',
func: () => new Promise(resolve => {
func: () =>
new Promise((resolve) => {
barResult = true;
resolve();
})
}),
});
barResult = null;
badTask = new Task({
name: 'badTask',
func: () => Promise.reject(new Error())
func: () => Promise.reject(new Error()),
});
});

@ -116,7 +118,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
promise = runner.once('taskStart');

foo.run();
await promise.then(event => {
await promise.then((event) => {
expect(event).toBe(foo);
});
});

@ -203,7 +205,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {

runner.addTasks({
foo,
bar
bar,
});
tasks = runner.tasks;

@ -214,10 +216,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
it('adds all the tasks in an array', () => {
let tasks;

runner.addTasks([
foo,
bar
]);
runner.addTasks([foo, bar]);
tasks = runner.tasks;

expect(tasks.foo).toBe(foo);

@ -227,7 +226,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
it('returns `this`', () => {
const result = runner.addTasks({
foo,
bar
bar,
});

expect(result).toBe(runner);

@ -319,10 +318,10 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
let startEvent;
let endEvent;

runner.on('taskStart', e => {
runner.on('taskStart', (e) => {
startEvent = e;
});
runner.on('taskEnd', e => {
runner.on('taskEnd', (e) => {
endEvent = e;
});
runner.addTask(foo);

@ -340,10 +339,10 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
runner.addTask(badTask);
runner.removeTask(badTask);

badTask.on('error', e => {
badTask.on('error', (e) => {
errorEvent = e;
});
runner.on('taskError', e => {
runner.on('taskError', (e) => {
taskErrorEvent = e;
});

@ -436,7 +435,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
it('removes all the tasks in an object', () => {
const tasks = {
foo,
bar
bar,
};

runner.addTasks(tasks);

@ -446,10 +445,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
});

it('removes all the tasks in an array', () => {
const tasks = [
foo,
bar
];
const tasks = [foo, bar];

runner.addTasks(tasks);
runner.removeTasks(tasks);

@ -483,7 +479,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
a = 5;

return Promise.resolve();
}
},
});
taskB = new Task({
name: 'b',

@ -492,7 +488,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {

return Promise.resolve();
},
dependsOn: ['a']
dependsOn: ['a'],
});
taskC = new Task({
name: 'c',

@ -501,7 +497,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {

return Promise.resolve();
},
dependsOn: ['b']
dependsOn: ['b'],
});
});

@ -573,14 +569,16 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
const context = {};
const t = new TaskRunner({});

t.addTask(new Task({
t.addTask(
new Task({
name: 'foo',
func: ctx => {
func: (ctx) => {
ctx.foo = 'foo';

return Promise.resolve();
}
}));
},
})
);
await t.run(context);

expect(context.foo).toBe('foo');

@ -590,14 +588,16 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
const context = {};
const t = new TaskRunner(context);

t.addTask(new Task({
t.addTask(
new Task({
name: 'foo',
func: ctx => {
func: (ctx) => {
ctx.foo = 'foo';

return Promise.resolve();
}
}));
},
})
);
await t.run();

expect(context.foo).toBe('foo');

@ -619,14 +619,16 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
const context = {};
const r = new TaskRunner(context);

r.addTask(new Task({
r.addTask(
new Task({
name: 'usesContext',
func: ctx => {
func: (ctx) => {
ctx.foo = 'bar';

return Promise.resolve();
}
}));
},
})
);

await r.run();
expect(context.foo).toBe('bar');

@ -639,21 +641,21 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
r.addTasks([
new Task({
name: 'one',
func: ctx => {
func: (ctx) => {
ctx.foo = 'bar';

return Promise.resolve();
}
},
}),
new Task({
name: 'two',
func: ctx => {
func: (ctx) => {
ctx.bar = ctx.foo + ' baz';

return Promise.resolve();
},
dependsOn: ['one']
})
dependsOn: ['one'],
}),
]);

await r.run();

@ -665,11 +667,13 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
it('errors if a task depends on an unknown task', async () => {
let error;

runner.addTask(new Task({
runner.addTask(
new Task({
name: 'badDependsOn',
func: () => Promise.resolve(),
dependsOn: ['mysteryTask']
}));
dependsOn: ['mysteryTask'],
})
);

try {
await runner.run();

@ -687,13 +691,13 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
new Task({
name: 'one',
func: () => Promise.resolve(),
dependsOn: ['two']
dependsOn: ['two'],
}),
new Task({
name: 'two',
func: () => Promise.resolve(),
dependsOn: ['one']
})
dependsOn: ['one'],
}),
]);

try {

@ -723,7 +727,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
let emitted;

runner.addTask(foo);
runner.on('end', e => {
runner.on('end', (e) => {
emitted = e;
});
await runner.run();

@ -737,7 +741,7 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
let error;

runner.addTask(badTask);
runner.on('end', e => {
runner.on('end', (e) => {
endError = e.error;
});

@ -757,14 +761,16 @@ describe('@jsdoc/task-runner/lib/task-runner', () => {
it('is true when the task runner is running', async () => {
let running;

runner.addTask(new Task({
runner.addTask(
new Task({
name: 'checkRunning',
func: () => {
running = runner.running;

return Promise.resolve();
}
}));
},
})
);
await runner.run();

expect(running).toBeTrue();
@ -39,7 +39,7 @@ describe('@jsdoc/task-runner/lib/task', () => {
const task = new Task({
name: 'foo',
func: () => Promise.resolve(),
dependsOn
dependsOn,
});

expect(task.dependsOn).toEqual(dependsOn);

@ -49,7 +49,7 @@ describe('@jsdoc/task-runner/lib/task', () => {
const task = new Task({
name: 'foo',
func: () => Promise.resolve(),
dependsOn: 'bar'
dependsOn: 'bar',
});

expect(task.dependsOn).toEqual(['bar']);

@ -60,7 +60,7 @@ describe('@jsdoc/task-runner/lib/task', () => {
return new Task({
name: 'foo',
func: () => Promise.resolve(),
dependsOn: 7
dependsOn: 7,
});
}

@ -72,7 +72,7 @@ describe('@jsdoc/task-runner/lib/task', () => {
return new Task({
name: 'foo',
func: () => Promise.resolve(),
dependsOn: [7]
dependsOn: [7],
});
}

@ -93,7 +93,7 @@ describe('@jsdoc/task-runner/lib/task', () => {

async function start() {
const task = new Task({
func: () => Promise.resolve()
func: () => Promise.resolve(),
});

await task.run();

@ -113,7 +113,7 @@ describe('@jsdoc/task-runner/lib/task', () => {

async function run() {
const task = new Task({
name: 'foo'
name: 'foo',
});

await task.run();

@ -132,11 +132,11 @@ describe('@jsdoc/task-runner/lib/task', () => {
const context = {};
const task = new Task({
name: 'foo',
func: c => {
func: (c) => {
c.foo = 'bar';

return Promise.resolve();
}
},
});

await task.run(context);

@ -149,10 +149,10 @@ describe('@jsdoc/task-runner/lib/task', () => {
let event;
const task = new Task({
name: 'foo',
func: () => Promise.resolve()
func: () => Promise.resolve(),
});

task.on('start', e => {
task.on('start', (e) => {
event = e;
});

@ -165,10 +165,10 @@ describe('@jsdoc/task-runner/lib/task', () => {
let event;
const task = new Task({
name: 'foo',
func: () => Promise.resolve()
func: () => Promise.resolve(),
});

task.on('end', e => {
task.on('end', (e) => {
event = e;
});

@ -182,10 +182,10 @@ describe('@jsdoc/task-runner/lib/task', () => {
let event;
const task = new Task({
name: 'foo',
func: () => Promise.reject(error)
func: () => Promise.reject(error),
});

task.on('error', e => {
task.on('error', (e) => {
event = e;
});

@ -195,7 +195,6 @@ describe('@jsdoc/task-runner/lib/task', () => {
// Expected behavior.
}

expect(event.error).toBe(error);
expect(event.task).toBe(task);
});

@ -30,5 +30,5 @@ addMatchers({
}

return valueName === otherName;
}
},
});
29
packages/jsdoc-test-matchers/package-lock.json
generated
@ -1,8 +1,35 @@
{
"name": "@jsdoc/test-matchers",
"version": "0.1.6",
"lockfileVersion": 1,
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "@jsdoc/test-matchers",
"version": "0.1.6",
"license": "Apache-2.0",
"dependencies": {
"add-matchers": "^0.6.2",
"jasmine-expect": "^5.0.0"
},
"engines": {
"node": ">=v14.17.6"
}
},
"node_modules/add-matchers": {
"version": "0.6.2",
"resolved": "https://registry.npmjs.org/add-matchers/-/add-matchers-0.6.2.tgz",
"integrity": "sha512-hVO2wodMei9RF00qe+506MoeJ/NEOdCMEkSJ12+fC3hx/5Z4zmhNiP92nJEF6XhmXokeB0hOtuQrjHCx2vmXrQ=="
},
"node_modules/jasmine-expect": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/jasmine-expect/-/jasmine-expect-5.0.0.tgz",
"integrity": "sha512-byn1zq0EQBA9UKs5A+H6gk5TRcanV+TqQMRxrjurGuqKkclaqgjw/vV6aT/jtf5tabXGonTH6VDZJ33Z1pxSxw==",
"dependencies": {
"add-matchers": "0.6.2"
}
}
},
"dependencies": {
"add-matchers": {
"version": "0.6.2",
@ -13,5 +13,5 @@ module.exports = {
cast,
EventBus,
fs,
log
log,
};

@ -1,6 +1,6 @@
const _ = require('lodash');
const EventEmitter = require('events').EventEmitter;
const {default: ow} = require('ow');
const { default: ow } = require('ow');

let cache = {};
const hasOwnProp = Object.prototype.hasOwnProperty;

@ -41,15 +41,13 @@ function castString(str) {
if (typeof str === 'string') {
if (str.includes('.')) {
number = parseFloat(str);
}
else {
} else {
number = parseInt(str, 10);
}

if (String(number) === str && !isNaN(number)) {
result = number;
}
else {
} else {
result = str;
}
}

@ -69,7 +67,7 @@ function castString(str) {
* @param {(string|Object|Array)} item - The item whose type or types will be converted.
* @return {*?} The converted value.
*/
const cast = module.exports = item => {
const cast = (module.exports = (item) => {
let result;

if (Array.isArray(item)) {

@ -77,19 +75,16 @@ const cast = module.exports = item => {
for (let i = 0, l = item.length; i < l; i++) {
result[i] = cast(item[i]);
}
}
else if (typeof item === 'object' && item !== null) {
} else if (typeof item === 'object' && item !== null) {
result = {};
Object.keys(item).forEach(prop => {
Object.keys(item).forEach((prop) => {
result[prop] = cast(item[prop]);
});
}
else if (typeof item === 'string') {
} else if (typeof item === 'string') {
result = castString(item);
}
else {
} else {
result = item;
}

return result;
};
});

@ -6,14 +6,14 @@ const _ = require('lodash');
const klawSync = require('klaw-sync');
const path = require('path');

exports.lsSync = ((dir, opts = {}) => {
exports.lsSync = (dir, opts = {}) => {
const depth = _.has(opts, 'depth') ? opts.depth : -1;

const files = klawSync(dir, {
depthLimit: depth,
filter: (f => !path.basename(f.path).startsWith('.')),
nodir: true
filter: (f) => !path.basename(f.path).startsWith('.'),
nodir: true,
});

return files.map(f => f.path);
});
return files.map((f) => f.path);
};

@ -3,7 +3,7 @@ const EventBus = require('./bus');
const bus = new EventBus('jsdoc');
const loggerFuncs = {};

['debug', 'error', 'info', 'fatal', 'verbose', 'warn'].forEach(fn => {
['debug', 'error', 'info', 'fatal', 'verbose', 'warn'].forEach((fn) => {
loggerFuncs[fn] = (...args) => bus.emit(`logger:${fn}`, ...args);
});
119
packages/jsdoc-util/package-lock.json
generated
@ -1,8 +1,125 @@
{
"name": "@jsdoc/util",
"version": "0.2.4",
"lockfileVersion": 1,
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "@jsdoc/util",
"version": "0.2.4",
"license": "Apache-2.0",
"dependencies": {
"klaw-sync": "^6.0.0",
"lodash": "^4.17.21",
"ow": "^0.27.0"
},
"engines": {
"node": ">=v14.17.6"
}
},
"node_modules/@sindresorhus/is": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-4.0.1.tgz",
"integrity": "sha512-Qm9hBEBu18wt1PO2flE7LPb30BHMQt1eQgbV76YntdNk73XZGpn3izvGTYxbGgzXKgbCjiia0uxTd3aTNQrY/g==",
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sindresorhus/is?sponsor=1"
}
},
"node_modules/callsites": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
"integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==",
"engines": {
"node": ">=6"
}
},
"node_modules/dot-prop": {
"version": "6.0.1",
"resolved": "https://registry.npmjs.org/dot-prop/-/dot-prop-6.0.1.tgz",
"integrity": "sha512-tE7ztYzXHIeyvc7N+hR3oi7FIbf/NIjVP9hmAt3yMXzrQ072/fpjGLx2GxNxGxUl5V73MEqYzioOMoVhGMJ5cA==",
"dependencies": {
"is-obj": "^2.0.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/graceful-fs": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.3.tgz",
"integrity": "sha512-a30VEBm4PEdx1dRB7MFK7BejejvCvBronbLjht+sHuGYj8PHs7M/5Z+rt5lw551vZ7yfTCj4Vuyy3mSJytDWRQ=="
},
"node_modules/is-obj": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/is-obj/-/is-obj-2.0.0.tgz",
"integrity": "sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==",
"engines": {
"node": ">=8"
}
},
"node_modules/klaw-sync": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/klaw-sync/-/klaw-sync-6.0.0.tgz",
"integrity": "sha512-nIeuVSzdCCs6TDPTqI8w1Yre34sSq7AkZ4B3sfOBbI2CgVSB4Du4aLQijFU2+lhAFCwt9+42Hel6lQNIv6AntQ==",
"dependencies": {
"graceful-fs": "^4.1.11"
}
},
"node_modules/lodash": {
"version": "4.17.21",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
},
"node_modules/lodash.isequal": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz",
"integrity": "sha1-QVxEePK8wwEgwizhDtMib30+GOA="
},
"node_modules/ow": {
"version": "0.27.0",
"resolved": "https://registry.npmjs.org/ow/-/ow-0.27.0.tgz",
"integrity": "sha512-SGnrGUbhn4VaUGdU0EJLMwZWSupPmF46hnTRII7aCLCrqixTAC5eKo8kI4/XXf1eaaI8YEVT+3FeGNJI9himAQ==",
"dependencies": {
"@sindresorhus/is": "^4.0.1",
"callsites": "^3.1.0",
"dot-prop": "^6.0.1",
"lodash.isequal": "^4.5.0",
"type-fest": "^1.2.1",
"vali-date": "^1.0.0"
},
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/type-fest": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/type-fest/-/type-fest-1.2.2.tgz",
"integrity": "sha512-pfkPYCcuV0TJoo/jlsUeWNV8rk7uMU6ocnYNvca1Vu+pyKi8Rl8Zo2scPt9O72gCsXIm+dMxOOWuA3VFDSdzWA==",
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/vali-date": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/vali-date/-/vali-date-1.0.0.tgz",
"integrity": "sha1-G5BKWWCfsyjvB4E4Qgk09rhnCaY=",
"engines": {
"node": ">=0.10.0"
}
}
},
"dependencies": {
"@sindresorhus/is": {
"version": "4.0.1",
@ -45,14 +45,14 @@ describe('@jsdoc/util/lib/cast', () => {
it('casts values of properties in nested objects', () => {
const result = cast({
foo: {
bar: 'true'
}
bar: 'true',
},
});

expect(result).toEqual({
foo: {
bar: true
}
bar: true,
},
});
});

@ -22,18 +22,18 @@ describe('@jsdoc/util/lib/fs', () => {
meniscus: '',
toes: {
phalanx: '',
'.big-toe-phalanx': ''
}
}
}
}
'.big-toe-phalanx': '',
},
},
},
},
});
});

const cwd = process.cwd();

function resolvePaths(files) {
return files.map(f => path.join(cwd, f)).sort();
return files.map((f) => path.join(cwd, f)).sort();
}

const allFiles = resolvePaths([

@ -42,7 +42,7 @@ describe('@jsdoc/util/lib/fs', () => {
'head/mouth',
'head/nose',
'head/shoulders/knees/meniscus',
'head/shoulders/knees/toes/phalanx'
'head/shoulders/knees/toes/phalanx',
]);

it('gets all non-hidden files from all levels by default', () => {

@ -54,12 +54,7 @@ describe('@jsdoc/util/lib/fs', () => {
it('limits recursion depth when asked', () => {
const files = fsUtil.lsSync(cwd, { depth: 1 }).sort();

expect(files).toEqual(resolvePaths([
'head/eyes',
'head/ears',
'head/mouth',
'head/nose'
]));
expect(files).toEqual(resolvePaths(['head/eyes', 'head/ears', 'head/mouth', 'head/nose']));
});

it('treats a depth of -1 as infinite', () => {

@ -9,7 +9,7 @@ describe('@jsdoc/util/lib/log', () => {
});

it('provides the expected functions', () => {
fns.forEach(fn => {
fns.forEach((fn) => {
expect(log[fn]).toBeFunction();
});
});

@ -18,10 +18,10 @@ describe('@jsdoc/util/lib/log', () => {
const bus = new EventBus('jsdoc');

it('sends events to the event bus', () => {
fns.forEach(fn => {
fns.forEach((fn) => {
let event;

bus.once(`logger:${fn}`, e => {
bus.once(`logger:${fn}`, (e) => {
event = e;
});
log[fn]('testing');
@ -20,14 +20,14 @@ module.exports = (() => {
packageJson: null,
shouldExitWithError: false,
shouldPrintHelp: false,
tmpdir: null
tmpdir: null,
};

const bus = new EventBus('jsdoc');
const cli = {};
const engine = new Engine();
const FATAL_ERROR_MESSAGE = 'Exiting JSDoc because an error occurred. See the previous log ' +
'messages for details.';
const FATAL_ERROR_MESSAGE =
'Exiting JSDoc because an error occurred. See the previous log ' + 'messages for details.';
const LOG_LEVELS = Engine.LOG_LEVELS;

// TODO: docs

@ -35,13 +35,14 @@ module.exports = (() => {
const fs = require('fs');

// allow this to throw--something is really wrong if we can't read our own package file
const info = JSON.parse(stripBom(fs.readFileSync(path.join(env.dirname, 'package.json'),
'utf8')));
const info = JSON.parse(
stripBom(fs.readFileSync(path.join(env.dirname, 'package.json'), 'utf8'))
);
const revision = new Date(parseInt(info.revision, 10));

env.version = {
number: info.version,
revision: revision.toUTCString()
revision: revision.toUTCString(),
};

engine.version = env.version.number;

@ -57,13 +58,9 @@ module.exports = (() => {

try {
env.opts = engine.parseFlags(env.args);
}
catch (e) {
} catch (e) {
props.shouldPrintHelp = true;
cli.exit(
1,
`${e.message}\n`
);
cli.exit(1, `${e.message}\n`);

return cli;
}

@ -71,12 +68,8 @@ module.exports = (() => {
try {
conf = config.loadSync(env.opts.configure);
env.conf = conf.config;
}
catch (e) {
cli.exit(
1,
`Cannot parse the config file ${conf.filepath}: ${e}\n${FATAL_ERROR_MESSAGE}`
);
} catch (e) {
cli.exit(1, `Cannot parse the config file ${conf.filepath}: ${e}\n${FATAL_ERROR_MESSAGE}`);

return cli;
}

@ -102,16 +95,14 @@ module.exports = (() => {
} else {
if (env.opts.debug) {
engine.logLevel = LOG_LEVELS.DEBUG;
}
else if (env.opts.verbose) {
} else if (env.opts.verbose) {
engine.logLevel = LOG_LEVELS.INFO;
}

if (env.opts.pedantic) {
bus.once('logger:warn', recoverableError);
bus.once('logger:error', fatalError);
}
else {
} else {
bus.once('logger:error', recoverableError);
}

@ -127,8 +118,8 @@ module.exports = (() => {
log.debug('Environment info: %j', {
env: {
conf: env.conf,
opts: env.opts
}
opts: env.opts,
},
});
};

@ -155,21 +146,17 @@ module.exports = (() => {
// If we already need to exit with an error, don't do any more work.
if (props.shouldExitWithError) {
cmd = () => Promise.resolve(0);
}
else if (opts.help) {
} else if (opts.help) {
cmd = cli.printHelp;
}
else if (opts.test) {
} else if (opts.test) {
cmd = cli.runTests;
}
else if (opts.version) {
} else if (opts.version) {
cmd = cli.printVersion;
}
else {
} else {
cmd = cli.main;
}

return cmd().then(errorCode => {
return cmd().then((errorCode) => {
if (!errorCode && props.shouldExitWithError) {
errorCode = 1;
}

@ -206,7 +193,8 @@ module.exports = (() => {

return Promise.resolve(0);
} else {
return cli.createParser()
return cli
.createParser()
.parseFiles()
.processParseResults()
.then(() => {

@ -221,9 +209,8 @@ module.exports = (() => {
const fs = require('fs');

try {
return stripJsonComments( fs.readFileSync(filepath, 'utf8') );
}
catch (e) {
return stripJsonComments(fs.readFileSync(filepath, 'utf8'));
} catch (e) {
log.error(`Unable to read the package file ${filepath}`);

return null;

@ -249,12 +236,12 @@ module.exports = (() => {
for (let i = 0, l = sourceFiles.length; i < l; i++) {
sourceFile = sourceFiles[i];

if ( !env.opts.package && /\bpackage\.json$/i.test(sourceFile) ) {
if (!env.opts.package && /\bpackage\.json$/i.test(sourceFile)) {
packageJson = readPackageJson(sourceFile);
sourceFiles.splice(i--, 1);
}

if ( !env.opts.readme && /(\bREADME|\.md)$/i.test(sourceFile) ) {
if (!env.opts.readme && /(\bREADME|\.md)$/i.test(sourceFile)) {
env.opts.readme = sourceFile;
sourceFiles.splice(i--, 1);
}

@ -285,8 +272,11 @@ module.exports = (() => {
filter = new Filter(env.conf.source);
scanner = new Scanner();

env.sourceFiles = scanner.scan(env.opts._,
(env.opts.recurse ? env.conf.recurseDepth : undefined), filter);
env.sourceFiles = scanner.scan(
env.opts._,
env.opts.recurse ? env.conf.recurseDepth : undefined,
filter
);
}

return cli;

@ -339,8 +329,7 @@ module.exports = (() => {
cli.dumpParseResults();

return Promise.resolve();
}
else {
} else {
return cli.generateDocs();
}
};

@ -363,8 +352,7 @@
// TODO: Just look for a `publish` function in the specified module, not a `publish.js`
// file _and_ a `publish` function.
template = require(`${env.opts.template}/publish`);
}
catch (e) {
} catch (e) {
log.fatal(`Unable to load template: ${e.message}` || e);
}

@ -373,15 +361,12 @@
let publishPromise;

log.info('Generating output files...');
publishPromise = template.publish(
taffy(props.docs),
env.opts
);
publishPromise = template.publish(taffy(props.docs), env.opts);

return Promise.resolve(publishPromise);
}
else {
message = `${env.opts.template} does not export a "publish" function. ` +
} else {
message =
`${env.opts.template} does not export a "publish" function. ` +
'Global "publish" functions are no longer supported.';
log.fatal(message);
@ -16,15 +16,15 @@
require = require('requizzle')({
requirePaths: {
before: [path.join(__dirname, 'lib')],
after: [path.join(__dirname, 'node_modules')]
after: [path.join(__dirname, 'node_modules')],
},
infect: true
infect: true,
});
/* eslint-enable no-global-assign, no-redeclare */

// resolve the path if it's a symlink
if ( fs.statSync(jsdocPath).isSymbolicLink() ) {
jsdocPath = path.resolve( path.dirname(jsdocPath), fs.readlinkSync(jsdocPath) );
if (fs.statSync(jsdocPath).isSymbolicLink()) {
jsdocPath = path.resolve(path.dirname(jsdocPath), fs.readlinkSync(jsdocPath));
}

env = require('./lib/jsdoc/env');

@ -46,10 +46,7 @@ global.env = (() => require('./lib/jsdoc/env'))();
(async () => {
const cli = require('./cli');

cli.setVersionInfo()
.loadConfig()
.configureLogger()
.logStart();
cli.setVersionInfo().loadConfig().configureLogger().logStart();

await cli.runCommand();
})();
@ -6,7 +6,7 @@
const _ = require('lodash');
const { fromParts, SCOPE, toParts } = require('@jsdoc/core').name;
const jsdoc = {
doclet: require('jsdoc/doclet')
doclet: require('jsdoc/doclet'),
};

const hasOwnProp = Object.prototype.hasOwnProperty;
@ -18,7 +18,7 @@ function mapDependencies(index, propertyName) {
const kinds = ['class', 'external', 'interface', 'mixin'];
let len = 0;

Object.keys(index).forEach(indexName => {
Object.keys(index).forEach((indexName) => {
doclets = index[indexName];
for (let i = 0, ii = doclets.length; i < ii; i++) {
doc = doclets[i];
@ -49,7 +49,7 @@ class Sorter {
this.visited[key] = true;

if (this.dependencies[key]) {
Object.keys(this.dependencies[key]).forEach(path => {
Object.keys(this.dependencies[key]).forEach((path) => {
this.visit(path);
});
}
@ -59,7 +59,7 @@ class Sorter {
}

sort() {
Object.keys(this.dependencies).forEach(key => {
Object.keys(this.dependencies).forEach((key) => {
this.visit(key);
});

@ -73,11 +73,11 @@ function sort(dependencies) {
return sorter.sort();
}

function getMembers(longname, {index}, scopes) {
function getMembers(longname, { index }, scopes) {
const memberof = index.memberof[longname] || [];
const members = [];

memberof.forEach(candidate => {
memberof.forEach((candidate) => {
if (scopes.includes(candidate.scope)) {
members.push(candidate);
}
@ -86,7 +86,7 @@ function getMembers(longname, {index}, scopes) {
return members;
}

function getDocumentedLongname(longname, {index}) {
function getDocumentedLongname(longname, { index }) {
const doclets = index.documented[longname] || [];

return doclets[doclets.length - 1];
@ -98,7 +98,7 @@ function addDocletProperty(doclets, propName, value) {
}
}

function reparentDoclet({longname}, child) {
function reparentDoclet({ longname }, child) {
const parts = toParts(child.longname);

parts.memberof = longname;
@ -106,7 +106,7 @@ function reparentDoclet({longname}, child) {
child.longname = fromParts(parts);
}

function parentIsClass({kind}) {
function parentIsClass({ kind }) {
return kind === 'class';
}

@ -140,8 +140,7 @@ function updateAddedDoclets(doclet, additions, indexes) {
if (typeof indexes[doclet.longname] !== 'undefined') {
// replace the existing doclet
additions[indexes[doclet.longname]] = doclet;
}
else {
} else {
// add the doclet to the array, and track its index
additions.push(doclet);
indexes[doclet.longname] = additions.length - 1;
@ -158,7 +157,7 @@ function updateAddedDoclets(doclet, additions, indexes) {
* @return {void}
*/
function updateDocumentedDoclets(doclet, documented) {
if ( !hasOwnProp.call(documented, doclet.longname) ) {
if (!hasOwnProp.call(documented, doclet.longname)) {
documented[doclet.longname] = [];
}

@ -176,7 +175,7 @@ function updateDocumentedDoclets(doclet, documented) {
*/
function updateMemberofDoclets(doclet, memberof) {
if (doclet.memberof) {
if ( !hasOwnProp.call(memberof, doclet.memberof) ) {
if (!hasOwnProp.call(memberof, doclet.memberof)) {
memberof[doclet.memberof] = [];
}

@ -208,7 +207,7 @@ function changeMemberof(longname, newMemberof) {
}

// TODO: try to reduce overlap with similar methods
function getInheritedAdditions(doclets, docs, {documented, memberof}) {
function getInheritedAdditions(doclets, docs, { documented, memberof }) {
let additionIndexes;
const additions = [];
let childDoclet;
@ -227,7 +226,7 @@ function getInheritedAdditions(doclets, docs, {documented, memberof}) {
doc = doclets[i];
parents = doc.augments;

if ( parents && (doc.kind === 'class' || doc.kind === 'interface') ) {
if (parents && (doc.kind === 'class' || doc.kind === 'interface')) {
// reset the lookup table of added doclet indexes by longname
additionIndexes = {};

@ -266,23 +265,22 @@ function getInheritedAdditions(doclets, docs, {documented, memberof}) {
// Indicate what the descendant is overriding. (We only care about the closest
// ancestor. For classes A > B > C, if B#a overrides A#a, and C#a inherits B#a,
// we don't want the doclet for C#a to say that it overrides A#a.)
if ( hasOwnProp.call(docs.index.longname, member.longname) ) {
if (hasOwnProp.call(docs.index.longname, member.longname)) {
member.overrides = parentDoclet.longname;
}
else {
} else {
delete member.overrides;
}

// Add the ancestor's docs unless the descendant overrides the ancestor AND
// documents the override.
if ( !hasOwnProp.call(documented, member.longname) ) {
if (!hasOwnProp.call(documented, member.longname)) {
updateAddedDoclets(member, additions, additionIndexes);
updateDocumentedDoclets(member, documented);
updateMemberofDoclets(member, memberof);
}
// If the descendant used an @inheritdoc or @override tag, add the ancestor's
// docs, and ignore the existing doclets.
else if ( explicitlyInherits(documented[member.longname]) ) {
else if (explicitlyInherits(documented[member.longname])) {
// Ignore any existing doclets. (This is safe because we only get here if
// `member.longname` is an own property of `documented`.)
addDocletProperty(documented[member.longname], 'ignore', true);
@ -306,8 +304,7 @@ function getInheritedAdditions(doclets, docs, {documented, memberof}) {
// If the descendant overrides the ancestor and documents the override,
// update the doclets to indicate what the descendant is overriding.
else {
addDocletProperty(documented[member.longname], 'overrides',
parentDoclet.longname);
addDocletProperty(documented[member.longname], 'overrides', parentDoclet.longname);
}
}
}
@ -325,8 +322,7 @@ function updateMixes(mixedDoclet, mixedLongname) {
// take the fast path if there's no array of mixed-in longnames
if (!mixedDoclet.mixes) {
mixedDoclet.mixes = [mixedLongname];
}
else {
} else {
// find the short name of the longname we're mixing in
mixedName = toParts(mixedLongname).name;
// find the short name of each previously mixed-in symbol
@ -345,7 +341,7 @@ function updateMixes(mixedDoclet, mixedLongname) {
}

// TODO: try to reduce overlap with similar methods
function getMixedInAdditions(mixinDoclets, allDoclets, {documented, memberof}) {
function getMixedInAdditions(mixinDoclets, allDoclets, { documented, memberof }) {
let additionIndexes;
const additions = [];
const commentedDoclets = documented;
@ -398,12 +394,12 @@ function getMixedInAdditions(mixinDoclets, allDoclets, {documented, memberof}) {
}

function updateImplements(implDoclets, implementedLongname) {
if ( !Array.isArray(implDoclets) ) {
if (!Array.isArray(implDoclets)) {
implDoclets = [implDoclets];
}

implDoclets.forEach(implDoclet => {
if ( !hasOwnProp.call(implDoclet, 'implements') ) {
implDoclets.forEach((implDoclet) => {
if (!hasOwnProp.call(implDoclet, 'implements')) {
implDoclet.implements = [];
}

@ -414,7 +410,7 @@ function updateImplements(implDoclets, implementedLongname) {
}

// TODO: try to reduce overlap with similar methods
function getImplementedAdditions(implDoclets, allDoclets, {documented, memberof}) {
function getImplementedAdditions(implDoclets, allDoclets, { documented, memberof }) {
let additionIndexes;
const additions = [];
let childDoclet;
@ -464,26 +460,24 @@ function getImplementedAdditions(implDoclets, allDoclets, {documented, memberof}
updateImplements(implementationDoclet, parentDoclet.longname);

// If there's no implementation, move along.
implExists = hasOwnProp.call(allDoclets.index.longname,
implementationDoclet.longname);
implExists = hasOwnProp.call(allDoclets.index.longname, implementationDoclet.longname);
if (!implExists) {
continue;
}

// Add the interface's docs unless the implementation is already documented.
if ( !hasOwnProp.call(commentedDoclets, implementationDoclet.longname) ) {
if (!hasOwnProp.call(commentedDoclets, implementationDoclet.longname)) {
updateAddedDoclets(implementationDoclet, additions, additionIndexes);
updateDocumentedDoclets(implementationDoclet, commentedDoclets);
updateMemberofDoclets(implementationDoclet, memberof);
}
// If the implementation used an @inheritdoc or @override tag, add the
// interface's docs, and ignore the existing doclets.
else if ( explicitlyInherits(commentedDoclets[implementationDoclet.longname]) ) {
else if (explicitlyInherits(commentedDoclets[implementationDoclet.longname])) {
// Ignore any existing doclets. (This is safe because we only get here if
// `implementationDoclet.longname` is an own property of
// `commentedDoclets`.)
addDocletProperty(commentedDoclets[implementationDoclet.longname], 'ignore',
true);
addDocletProperty(commentedDoclets[implementationDoclet.longname], 'ignore', true);

updateAddedDoclets(implementationDoclet, additions, additionIndexes);
updateDocumentedDoclets(implementationDoclet, commentedDoclets);
@ -504,8 +498,10 @@ function getImplementedAdditions(implDoclets, allDoclets, {documented, memberof}
// If there's an implementation, and it's documented, update the doclets to
// indicate what the implementation is implementing.
else {
updateImplements(commentedDoclets[implementationDoclet.longname],
parentDoclet.longname);
updateImplements(
commentedDoclets[implementationDoclet.longname],
parentDoclet.longname
);
}
}
}
@ -517,15 +513,15 @@ function getImplementedAdditions(implDoclets, allDoclets, {documented, memberof}

function augment(doclets, propertyName, docletFinder) {
const index = doclets.index.longname;
const dependencies = sort( mapDependencies(index, propertyName) );
const dependencies = sort(mapDependencies(index, propertyName));

dependencies.forEach(depName => {
dependencies.forEach((depName) => {
const additions = docletFinder(index[depName], doclets, doclets.index);

additions.forEach(addition => {
additions.forEach((addition) => {
const longname = addition.longname;

if ( !hasOwnProp.call(index, longname) ) {
if (!hasOwnProp.call(index, longname)) {
index[longname] = [];
}
index[longname].push(addition);
@ -544,7 +540,7 @@ function augment(doclets, propertyName, docletFinder) {
* @param {!Object} doclets.index - The doclet index.
* @return {void}
*/
exports.addInherited = doclets => {
exports.addInherited = (doclets) => {
augment(doclets, 'augments', getInheritedAdditions);
};

@ -563,7 +559,7 @@ exports.addInherited = doclets => {
* @param {!Object} doclets.index - The doclet index.
* @return {void}
*/
exports.addMixedIn = doclets => {
exports.addMixedIn = (doclets) => {
augment(doclets, 'mixes', getMixedInAdditions);
};

@ -584,7 +580,7 @@ exports.addMixedIn = doclets => {
* @param {!Object} doclets.index - The doclet index.
* @return {void}
*/
exports.addImplemented = doclets => {
exports.addImplemented = (doclets) => {
augment(doclets, 'implements', getImplementedAdditions);
};

@ -599,7 +595,7 @@ exports.addImplemented = doclets => {
*
* @return {void}
*/
exports.augmentAll = doclets => {
exports.augmentAll = (doclets) => {
exports.addMixedIn(doclets);
exports.addImplemented(doclets);
exports.addInherited(doclets);

@ -5,8 +5,8 @@
const _ = require('lodash');
const { SCOPE } = require('@jsdoc/core').name;

function cloneBorrowedDoclets({borrowed, longname}, doclets) {
borrowed.forEach(({from, as}) => {
function cloneBorrowedDoclets({ borrowed, longname }, doclets) {
borrowed.forEach(({ from, as }) => {
const borrowedDoclets = doclets.index.longname[from];
let borrowedAs = as || from;
let parts;
@ -14,15 +14,14 @@ function cloneBorrowedDoclets({borrowed, longname}, doclets) {

if (borrowedDoclets) {
borrowedAs = borrowedAs.replace(/^prototype\./, SCOPE.PUNC.INSTANCE);
_.cloneDeep(borrowedDoclets).forEach(clone => {
_.cloneDeep(borrowedDoclets).forEach((clone) => {
// TODO: this will fail on longnames like '"Foo#bar".baz'
parts = borrowedAs.split(SCOPE.PUNC.INSTANCE);

if (parts.length === 2) {
clone.scope = SCOPE.NAMES.INSTANCE;
scopePunc = SCOPE.PUNC.INSTANCE;
}
else {
} else {
clone.scope = SCOPE.NAMES.STATIC;
scopePunc = SCOPE.PUNC.STATIC;
}
@ -42,7 +41,7 @@ function cloneBorrowedDoclets({borrowed, longname}, doclets) {
moving docs from the "borrowed" array and into the general docs, then
deleting the "borrowed" array.
*/
exports.resolveBorrows = doclets => {
exports.resolveBorrows = (doclets) => {
for (let doclet of doclets.index.borrowed) {
cloneBorrowedDoclets(doclet, doclets);
delete doclet.borrowed;

@ -15,7 +15,7 @@ const {
PUNC_TO_SCOPE,
SCOPE,
SCOPE_TO_PUNC,
toParts
toParts,
} = require('@jsdoc/core').name;
const helper = require('jsdoc/util/templateHelper');
const path = require('path');
@ -28,7 +28,7 @@ const DEFAULT_SCOPE = SCOPE.NAMES.STATIC;
function fakeMeta(node) {
return {
type: node ? node.type : null,
node: node
node: node,
};
}

@ -40,31 +40,26 @@ function codeToKind(code) {

if (isFunction(code.type) && code.type !== Syntax.MethodDefinition) {
kind = 'function';
}
else if (code.type === Syntax.MethodDefinition && node) {
} else if (code.type === Syntax.MethodDefinition && node) {
if (node.kind === 'constructor') {
kind = 'class';
}
else if (node.kind !== 'get' && node.kind !== 'set') {
} else if (node.kind !== 'get' && node.kind !== 'set') {
kind = 'function';
}
}
else if (code.type === Syntax.ClassDeclaration || code.type === Syntax.ClassExpression) {
} else if (code.type === Syntax.ClassDeclaration || code.type === Syntax.ClassExpression) {
kind = 'class';
}
else if (code.type === Syntax.ExportAllDeclaration) {
} else if (code.type === Syntax.ExportAllDeclaration) {
// this value will often be an Identifier for a variable, which isn't very useful
kind = codeToKind(fakeMeta(node.source));
}
else if (code.type === Syntax.ExportDefaultDeclaration ||
code.type === Syntax.ExportNamedDeclaration) {
} else if (
code.type === Syntax.ExportDefaultDeclaration ||
code.type === Syntax.ExportNamedDeclaration
) {
kind = codeToKind(fakeMeta(node.declaration));
}
else if (code.type === Syntax.ExportSpecifier) {
} else if (code.type === Syntax.ExportSpecifier) {
// this value will often be an Identifier for a variable, which isn't very useful
kind = codeToKind(fakeMeta(node.local));
}
else if (node && node.parent && isFunction(node.parent)) {
} else if (node && node.parent && isFunction(node.parent)) {
kind = 'param';
}

@ -82,7 +77,8 @@ function unwrap(docletSrc) {
// use the /m flag on regex to avoid having to guess what this platform's newline is
docletSrc =
// remove opening slash+stars
docletSrc.replace(/^\/\*\*+/, '')
docletSrc
.replace(/^\/\*\*+/, '')
// replace closing star slash with end-marker
.replace(/\**\*\/$/, '\\Z')
// remove left margin like: spaces+star or spaces+end-marker
@ -110,7 +106,7 @@ function toTags(docletSrc) {
.replace(/^(\s*)@(\S)/gm, '$1\\@$2')
// then split on that arbitrary sequence
.split('\\@')
.forEach($ => {
.forEach(($) => {
if ($) {
parsedTag = $.match(/^(\S+)(?:\s+(\S[\s\S]*))?/);

@ -121,7 +117,7 @@ function toTags(docletSrc) {
if (tagTitle) {
tagData.push({
title: tagTitle,
text: tagText
text: tagText,
});
}
}
@ -131,13 +127,12 @@ function toTags(docletSrc) {
return tagData;
}

function fixDescription(docletSrc, {code}) {
function fixDescription(docletSrc, { code }) {
let isClass;

if (!/^\s*@/.test(docletSrc) && docletSrc.replace(/\s/g, '').length) {
isClass = code &&
(code.type === Syntax.ClassDeclaration ||
code.type === Syntax.ClassExpression);
isClass =
code && (code.type === Syntax.ClassDeclaration || code.type === Syntax.ClassExpression);

docletSrc = `${isClass ? '@classdesc' : '@description'} ${docletSrc}`;
}
@ -184,31 +179,33 @@ function resolve(doclet) {
name = doclet.longname = doclet.meta.code.funcscope + SCOPE.PUNC.INNER + name;
}

if (memberof || doclet.forceMemberof) { // @memberof tag given
if (memberof || doclet.forceMemberof) {
// @memberof tag given
memberof = prototypeToPunc(memberof);

// The name is a complete longname, like `@name foo.bar` with `@memberof foo`.
if (name && nameIsLongname(name, memberof) && name !== memberof) {
about = toParts(name, (doclet.forceMemberof ? memberof : undefined));
about = toParts(name, doclet.forceMemberof ? memberof : undefined);
}
// The name and memberof are identical and refer to a module, like `@name module:foo` with
// `@memberof module:foo`.
else if (name && name === memberof && name.indexOf(MODULE_NAMESPACE) === 0) {
about = toParts(name, (doclet.forceMemberof ? memberof : undefined));
about = toParts(name, doclet.forceMemberof ? memberof : undefined);
}
// The name and memberof are identical, like `@name foo` with `@memberof foo`.
else if (name && name === memberof) {
doclet.scope = doclet.scope || DEFAULT_SCOPE;
name = memberof + SCOPE_TO_PUNC[doclet.scope] + name;
about = toParts(name, (doclet.forceMemberof ? memberof : undefined));
about = toParts(name, doclet.forceMemberof ? memberof : undefined);
}
// Like `@memberof foo#` or `@memberof foo~`.
else if (name && hasTrailingScope(memberof) ) {
about = toParts(memberof + name, (doclet.forceMemberof ? memberof : undefined));
}
else if (name && doclet.scope) {
about = toParts(memberof + (SCOPE_TO_PUNC[doclet.scope] || '') + name,
(doclet.forceMemberof ? memberof : undefined));
else if (name && hasTrailingScope(memberof)) {
about = toParts(memberof + name, doclet.forceMemberof ? memberof : undefined);
} else if (name && doclet.scope) {
about = toParts(
memberof + (SCOPE_TO_PUNC[doclet.scope] || '') + name,
doclet.forceMemberof ? memberof : undefined
);
}
} else {
// No memberof.
@ -227,24 +224,22 @@ function resolve(doclet) {
doclet.setLongname(about.longname);
}

if (doclet.scope === SCOPE.NAMES.GLOBAL) { // via @global tag?
if (doclet.scope === SCOPE.NAMES.GLOBAL) {
// via @global tag?
doclet.setLongname(doclet.name);
delete doclet.memberof;
}
else if (about.scope) {
if (about.memberof === LONGNAMES.GLOBAL) { // via @memberof <global> ?
} else if (about.scope) {
if (about.memberof === LONGNAMES.GLOBAL) {
// via @memberof <global> ?
doclet.scope = SCOPE.NAMES.GLOBAL;
}
else {
} else {
doclet.scope = PUNC_TO_SCOPE[about.scope];
}
}
else if (doclet.name && doclet.memberof && !doclet.longname) {
} else if (doclet.name && doclet.memberof && !doclet.longname) {
if (hasLeadingScope(doclet.name)) {
doclet.scope = PUNC_TO_SCOPE[RegExp.$1];
doclet.name = doclet.name.substr(1);
}
else if (doclet.meta.code && doclet.meta.code.name) {
} else if (doclet.meta.code && doclet.meta.code.name) {
// HACK: Handle cases where an ES 2015 class is a static memberof something else, and
// the class has instance members. In these cases, we have to detect the instance
// members' scope by looking at the meta info. There's almost certainly a better way to
@ -252,10 +247,7 @@ function resolve(doclet) {
metaName = String(doclet.meta.code.name);
puncAndName = SCOPE.PUNC.INSTANCE + doclet.name;
puncAndNameIndex = metaName.indexOf(puncAndName);
if (
puncAndNameIndex !== -1 &&
(puncAndNameIndex === metaName.length - puncAndName.length)
) {
if (puncAndNameIndex !== -1 && puncAndNameIndex === metaName.length - puncAndName.length) {
doclet.scope = SCOPE.NAMES.INSTANCE;
}
}
@ -311,7 +303,7 @@ function getFilepath(doclet) {
}

function clone(source, target, properties) {
properties.forEach(property => {
properties.forEach((property) => {
switch (typeof source[property]) {
case 'function':
// do nothing
@ -340,8 +332,10 @@ function clone(source, target, properties) {
*/
function copyMostProperties(primary, secondary, target, exclude) {
const primaryProperties = _.difference(Object.getOwnPropertyNames(primary), exclude);
const secondaryProperties = _.difference(Object.getOwnPropertyNames(secondary),
exclude.concat(primaryProperties));
const secondaryProperties = _.difference(
Object.getOwnPropertyNames(secondary),
exclude.concat(primaryProperties)
);

clone(primary, target, primaryProperties);
clone(secondary, target, secondaryProperties);
@ -359,13 +353,18 @@ function copyMostProperties(primary, secondary, target, exclude) {
* @param {Array.<string>} include - The names of properties to copy.
*/
function copySpecificProperties(primary, secondary, target, include) {
include.forEach(property => {
if ({}.hasOwnProperty.call(primary, property) && primary[property] &&
primary[property].length) {
include.forEach((property) => {
if (
{}.hasOwnProperty.call(primary, property) &&
primary[property] &&
primary[property].length
) {
target[property] = _.cloneDeep(primary[property]);
}
else if ({}.hasOwnProperty.call(secondary, property) && secondary[property] &&
secondary[property].length) {
} else if (
{}.hasOwnProperty.call(secondary, property) &&
secondary[property] &&
secondary[property].length
) {
target[property] = _.cloneDeep(secondary[property]);
}
});
@ -418,10 +417,10 @@ class Doclet {
}

if (!this.kind && this.meta && this.meta.code) {
this.addTag( 'kind', codeToKind(this.meta.code) );
this.addTag('kind', codeToKind(this.meta.code));
}

if (this.variation && this.longname && !/\)$/.test(this.longname) ) {
if (this.variation && this.longname && !/\)$/.test(this.longname)) {
this.longname += `(${this.variation})`;
}

@ -501,7 +500,8 @@ class Doclet {
if (!scopeNames.includes(scope)) {
filepath = getFilepath(this);

errorMessage = `The scope name "${scope}" is not recognized. Use one of the ` +
errorMessage =
`The scope name "${scope}" is not recognized. Use one of the ` +
`following values: ${scopeNames}`;
if (filepath) {
errorMessage += ` (Source file: ${filepath})`;
@ -609,7 +609,9 @@ class Doclet {
* @namespace
*/
this.meta.code = this.meta.code || {};
if (meta.id) { this.meta.code.id = meta.id; }
if (meta.id) {
this.meta.code.id = meta.id;
}
if (meta.code) {
if (meta.code.name) {
/**
@ -628,7 +630,7 @@ class Doclet {
if (meta.code.node) {
Object.defineProperty(this.meta.code, 'node', {
value: meta.code.node,
enumerable: false
enumerable: false,
});
}
if (meta.code.funcscope) {
@ -659,15 +661,8 @@ exports.Doclet = Doclet;
* doclets.
*/
exports.combine = (primary, secondary) => {
const copyMostPropertiesExclude = [
'params',
'properties',
'undocumented'
];
const copySpecificPropertiesInclude = [
'params',
'properties'
];
const copyMostPropertiesExclude = ['params', 'properties', 'undocumented'];
const copySpecificPropertiesInclude = ['params', 'properties'];
const target = new Doclet('');

// First, copy most properties to the target doclet.

@ -14,7 +14,7 @@ module.exports = {
*/
run: {
start: new Date(),
finish: null
finish: null,
},

/**
@ -72,6 +72,6 @@ module.exports = {
*/
version: {
number: null,
revision: null
}
revision: null,
},
};

@ -80,8 +80,7 @@ class Package {

try {
packageInfo = JSON.parse(json ? stripBom(json) : '{}');
}
catch (e) {
} catch (e) {
log.error(`Unable to parse the package file: ${e.message}`);
packageInfo = {};
}

@ -5,7 +5,7 @@
const dictionary = require('jsdoc/tag/dictionary');

function addHandlers(handlers, parser) {
Object.keys(handlers).forEach(eventName => {
Object.keys(handlers).forEach((eventName) => {
parser.on(eventName, handlers[eventName]);
});
}

@ -18,11 +18,11 @@ const EVENT_REGEXP = 'event:[\\S]+';
const PACKAGE_REGEXP = 'package:[\\S]+';

const STRING_SCHEMA = {
type: STRING
type: STRING,
};

// information about the code associated with a doclet
const META_SCHEMA = exports.META_SCHEMA = {
const META_SCHEMA = (exports.META_SCHEMA = {
type: OBJECT,
additionalProperties: false,
properties: {
@ -31,62 +31,63 @@ const META_SCHEMA = exports.META_SCHEMA = {
additionalProperties: false,
properties: {
funcscope: {
type: STRING
type: STRING,
},
id: {
type: STRING
type: STRING,
},
name: {},
node: {
type: OBJECT
type: OBJECT,
},
paramnames: {
type: ARRAY,
uniqueItems: true,
items: {
type: STRING
}
type: STRING,
},
},
type: {
type: STRING
type: STRING,
},
value: {},
},
value: {}
}
},
columnno: {
title: 'The column number of the code associated with this doclet.',
type: NUMBER
type: NUMBER,
},
filename: {
title: 'The name of the file that contains the code associated with this doclet.',
type: STRING
type: STRING,
},
lineno: {
title: 'The line number of the code associated with this doclet.',
type: NUMBER
type: NUMBER,
},
path: {
title: 'The path in which the code associated with this doclet is located.',
type: STRING
type: STRING,
},
range: {
title: 'The positions of the first and last characters of the code associated with ' +
title:
'The positions of the first and last characters of the code associated with ' +
'this doclet.',
type: ARRAY,
minItems: 2,
maxItems: 2,
items: {
type: NUMBER
}
type: NUMBER,
},
},
vars: {
type: OBJECT
}
}
};
type: OBJECT,
},
},
});

// type property containing type names
const TYPE_PROPERTY_SCHEMA = exports.TYPE_PROPERTY_SCHEMA = {
const TYPE_PROPERTY_SCHEMA = (exports.TYPE_PROPERTY_SCHEMA = {
type: OBJECT,
additionalProperties: false,
properties: {
@ -94,65 +95,65 @@ const TYPE_PROPERTY_SCHEMA = exports.TYPE_PROPERTY_SCHEMA = {
type: ARRAY,
minItems: 1,
items: {
type: STRING
}
type: STRING,
},
},
// type parser output
parsedType: {
type: OBJECT,
additionalProperties: true
}
}
};
additionalProperties: true,
},
},
});

// enumeration properties
const ENUM_PROPERTY_SCHEMA = exports.ENUM_PROPERTY_SCHEMA = {
const ENUM_PROPERTY_SCHEMA = (exports.ENUM_PROPERTY_SCHEMA = {
type: OBJECT,
additionalProperties: false,
properties: {
comment: {
type: STRING
type: STRING,
},
defaultvalue: {},
description: {
type: STRING_OPTIONAL
type: STRING_OPTIONAL,
},
kind: {
type: STRING,
enum: ['member']
enum: ['member'],
},
longname: {
type: STRING
type: STRING,
},
memberof: {
type: STRING
type: STRING,
},
meta: META_SCHEMA,
name: {
type: STRING
type: STRING,
},
// is this member nullable? (derived from the type expression)
nullable: {
type: BOOLEAN_OPTIONAL
type: BOOLEAN_OPTIONAL,
},
// is this member optional? (derived from the type expression)
optional: {
type: BOOLEAN_OPTIONAL
type: BOOLEAN_OPTIONAL,
},
scope: {
type: STRING,
enum: ['static']
enum: ['static'],
},
type: TYPE_PROPERTY_SCHEMA,
// can this member be provided more than once? (derived from the type expression)
variable: {
type: BOOLEAN_OPTIONAL
}
}
};
type: BOOLEAN_OPTIONAL,
},
},
});

// function parameter, or object property defined with @property tag
const PARAM_SCHEMA = exports.PARAM_SCHEMA = {
const PARAM_SCHEMA = (exports.PARAM_SCHEMA = {
type: OBJECT,
additionalProperties: false,
properties: {
@ -160,61 +161,56 @@ const PARAM_SCHEMA = exports.PARAM_SCHEMA = {
|
||||
defaultvalue: {},
|
||||
// a description of the parameter
|
||||
description: {
|
||||
type: STRING_OPTIONAL
|
||||
type: STRING_OPTIONAL,
|
||||
},
|
||||
// what name does this parameter have within the function?
|
||||
name: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// can the value for this parameter be null?
|
||||
nullable: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
// is a value for this parameter optional?
|
||||
optional: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
// what are the types of value expected for this parameter?
|
||||
type: TYPE_PROPERTY_SCHEMA,
|
||||
// can this parameter be repeated?
|
||||
variable: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
}
|
||||
}
|
||||
};
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
const DOCLET_SCHEMA = exports.DOCLET_SCHEMA = {
|
||||
const DOCLET_SCHEMA = (exports.DOCLET_SCHEMA = {
|
||||
type: OBJECT,
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
// what access privileges are allowed
|
||||
access: {
|
||||
type: STRING,
|
||||
enum: [
|
||||
'package',
|
||||
'private',
|
||||
'protected',
|
||||
'public'
|
||||
]
|
||||
enum: ['package', 'private', 'protected', 'public'],
|
||||
},
|
||||
alias: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
async: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
augments: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
author: {
|
||||
type: ARRAY,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
borrowed: {
|
||||
type: ARRAY,
|
||||
@ -225,56 +221,56 @@ const DOCLET_SCHEMA = exports.DOCLET_SCHEMA = {
|
||||
properties: {
|
||||
// name of the target
|
||||
as: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// name of the source
|
||||
from: {
|
||||
type: STRING
|
||||
}
|
||||
}
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
// a description of the class that this constructor belongs to
|
||||
classdesc: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
comment: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
copyright: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
defaultvalue: {},
|
||||
defaultvaluetype: {
|
||||
type: STRING,
|
||||
enum: [OBJECT, ARRAY]
|
||||
enum: [OBJECT, ARRAY],
|
||||
},
|
||||
// is usage of this symbol deprecated?
|
||||
deprecated: {
|
||||
type: [STRING, BOOLEAN]
|
||||
type: [STRING, BOOLEAN],
|
||||
},
|
||||
// a description
|
||||
description: {
|
||||
type: STRING_OPTIONAL
|
||||
type: STRING_OPTIONAL,
|
||||
},
|
||||
// something else to consider
|
||||
examples: {
|
||||
type: ARRAY,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
exceptions: {
|
||||
type: ARRAY,
|
||||
items: PARAM_SCHEMA
|
||||
items: PARAM_SCHEMA,
|
||||
},
|
||||
// the path to another constructor
|
||||
extends: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
// the path to another doc object
|
||||
fires: {
|
||||
@ -282,47 +278,47 @@ const DOCLET_SCHEMA = exports.DOCLET_SCHEMA = {
|
||||
uniqueItems: true,
|
||||
items: {
|
||||
type: STRING,
|
||||
pattern: EVENT_REGEXP
|
||||
}
|
||||
pattern: EVENT_REGEXP,
|
||||
},
|
||||
},
|
||||
forceMemberof: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
generator: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
hideconstructor: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
ignore: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
implementations: {
|
||||
type: ARRAY,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
implements: {
|
||||
type: ARRAY,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
inheritdoc: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
inherited: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
inherits: {
|
||||
type: STRING,
|
||||
dependency: {
|
||||
inherited: true
|
||||
}
|
||||
inherited: true,
|
||||
},
|
||||
},
|
||||
isEnum: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
// what kind of symbol is this?
|
||||
kind: {
|
||||
@ -341,83 +337,83 @@ const DOCLET_SCHEMA = exports.DOCLET_SCHEMA = {
|
||||
'namespace',
|
||||
'package',
|
||||
'param',
|
||||
'typedef'
|
||||
]
|
||||
'typedef',
|
||||
],
|
||||
},
|
||||
license: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
listens: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
items: {
|
||||
type: STRING,
|
||||
pattern: EVENT_REGEXP
|
||||
}
|
||||
pattern: EVENT_REGEXP,
|
||||
},
|
||||
},
|
||||
longname: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// probably a leading substring of the path
|
||||
memberof: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// information about this doc
|
||||
meta: META_SCHEMA,
|
||||
// was this doclet mixed in?
|
||||
mixed: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
mixes: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
modifies: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
items: PARAM_SCHEMA
|
||||
items: PARAM_SCHEMA,
|
||||
},
|
||||
// probably a trailing substring of the path
|
||||
name: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// is this member nullable? (derived from the type expression)
|
||||
nullable: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
// is this member optional? (derived from the type expression)
|
||||
optional: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
// does this member explicitly override the parent?
|
||||
override: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
overrides: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// are there function parameters associated with this doc?
|
||||
params: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
items: PARAM_SCHEMA
|
||||
items: PARAM_SCHEMA,
|
||||
},
|
||||
preserveName: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
properties: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
minItems: 1,
|
||||
items: {
|
||||
anyOf: [ENUM_PROPERTY_SCHEMA, PARAM_SCHEMA]
|
||||
}
|
||||
anyOf: [ENUM_PROPERTY_SCHEMA, PARAM_SCHEMA],
|
||||
},
|
||||
},
|
||||
readonly: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
// the symbol being documented requires another symbol
|
||||
requires: {
|
||||
@ -425,38 +421,33 @@ const DOCLET_SCHEMA = exports.DOCLET_SCHEMA = {
|
||||
uniqueItems: true,
|
||||
minItems: 1,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
returns: {
|
||||
type: ARRAY,
|
||||
minItems: 1,
|
||||
items: PARAM_SCHEMA
|
||||
items: PARAM_SCHEMA,
|
||||
},
|
||||
// what sort of parent scope does this symbol have?
|
||||
scope: {
|
||||
type: STRING,
|
||||
enum: [
|
||||
'global',
|
||||
'inner',
|
||||
'instance',
|
||||
'static'
|
||||
]
|
||||
enum: ['global', 'inner', 'instance', 'static'],
|
||||
},
|
||||
// something else to consider
|
||||
see: {
|
||||
type: ARRAY,
|
||||
minItems: 1,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
// at what previous version was this doc added?
|
||||
since: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
summary: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// arbitrary tags associated with this doc
|
||||
tags: {
|
||||
@ -467,140 +458,140 @@ const DOCLET_SCHEMA = exports.DOCLET_SCHEMA = {
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
originalTitle: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
text: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
title: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
value: {
|
||||
oneOf: [STRING_SCHEMA, PARAM_SCHEMA]
|
||||
}
|
||||
}
|
||||
}
|
||||
oneOf: [STRING_SCHEMA, PARAM_SCHEMA],
|
||||
},
|
||||
'this': {
|
||||
type: STRING
|
||||
},
|
||||
},
|
||||
},
|
||||
this: {
|
||||
type: STRING,
|
||||
},
|
||||
todo: {
|
||||
type: ARRAY,
|
||||
minItems: 1,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
// what type is the value that this doc is associated with, like `number`
|
||||
type: TYPE_PROPERTY_SCHEMA,
|
||||
undocumented: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
// can this member be provided more than once? (derived from the type expression)
|
||||
variable: {
|
||||
type: BOOLEAN_OPTIONAL
|
||||
type: BOOLEAN_OPTIONAL,
|
||||
},
|
||||
variation: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// what is the version of this doc
|
||||
version: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// is a member left to be implemented during inheritance?
|
||||
virtual: {
|
||||
type: BOOLEAN
|
||||
type: BOOLEAN,
|
||||
},
|
||||
yields: {
|
||||
type: ARRAY,
|
||||
minItems: 1,
|
||||
items: PARAM_SCHEMA
|
||||
}
|
||||
}
|
||||
};
|
||||
items: PARAM_SCHEMA,
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
const CONTACT_INFO_SCHEMA = exports.CONTACT_INFO_SCHEMA = {
|
||||
const CONTACT_INFO_SCHEMA = (exports.CONTACT_INFO_SCHEMA = {
|
||||
type: OBJECT,
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
email: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
name: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
url: {
|
||||
type: STRING,
|
||||
format: 'uri'
|
||||
}
|
||||
}
|
||||
};
|
||||
format: 'uri',
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
const BUGS_SCHEMA = exports.BUGS_SCHEMA = {
|
||||
const BUGS_SCHEMA = (exports.BUGS_SCHEMA = {
|
||||
type: OBJECT,
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
email: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
url: {
|
||||
type: STRING,
|
||||
format: 'uri'
|
||||
}
|
||||
}
|
||||
};
|
||||
format: 'uri',
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
const PACKAGE_SCHEMA = exports.PACKAGE_SCHEMA = {
|
||||
const PACKAGE_SCHEMA = (exports.PACKAGE_SCHEMA = {
|
||||
type: OBJECT,
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
author: {
|
||||
anyOf: [STRING_SCHEMA, CONTACT_INFO_SCHEMA]
|
||||
anyOf: [STRING_SCHEMA, CONTACT_INFO_SCHEMA],
|
||||
},
|
||||
bugs: {
|
||||
anyOf: [STRING_SCHEMA, BUGS_SCHEMA]
|
||||
anyOf: [STRING_SCHEMA, BUGS_SCHEMA],
|
||||
},
|
||||
contributors: {
|
||||
type: ARRAY,
|
||||
minItems: 0,
|
||||
items: {
|
||||
anyOf: [STRING_SCHEMA, CONTACT_INFO_SCHEMA]
|
||||
}
|
||||
anyOf: [STRING_SCHEMA, CONTACT_INFO_SCHEMA],
|
||||
},
|
||||
},
|
||||
dependencies: {
|
||||
type: OBJECT
|
||||
type: OBJECT,
|
||||
},
|
||||
description: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
devDependencies: {
|
||||
type: OBJECT
|
||||
type: OBJECT,
|
||||
},
|
||||
engines: {
|
||||
type: OBJECT
|
||||
type: OBJECT,
|
||||
},
|
||||
files: {
|
||||
type: ARRAY,
|
||||
uniqueItems: true,
|
||||
minItems: 0,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
homepage: {
|
||||
type: STRING,
|
||||
format: 'uri'
|
||||
format: 'uri',
|
||||
},
|
||||
keywords: {
|
||||
type: ARRAY,
|
||||
minItems: 0,
|
||||
items: {
|
||||
type: STRING
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
kind: {
|
||||
type: STRING,
|
||||
enum: ['package']
|
||||
enum: ['package'],
|
||||
},
|
||||
licenses: {
|
||||
type: ARRAY,
|
||||
@ -610,47 +601,47 @@ const PACKAGE_SCHEMA = exports.PACKAGE_SCHEMA = {
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
type: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
url: {
|
||||
type: STRING,
|
||||
format: 'uri'
|
||||
}
|
||||
}
|
||||
}
|
||||
format: 'uri',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
longname: {
|
||||
type: STRING,
|
||||
pattern: PACKAGE_REGEXP
|
||||
pattern: PACKAGE_REGEXP,
|
||||
},
|
||||
main: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
name: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
repository: {
|
||||
type: OBJECT,
|
||||
additionalProperties: false,
|
||||
properties: {
|
||||
type: {
|
||||
type: STRING
|
||||
type: STRING,
|
||||
},
|
||||
// we don't use `format: 'uri'` here because repo URLs are atypical
|
||||
url: {
|
||||
type: STRING
|
||||
}
|
||||
}
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
},
|
||||
version: {
|
||||
type: STRING
|
||||
}
|
||||
}
|
||||
};
|
||||
type: STRING,
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
exports.DOCLETS_SCHEMA = {
|
||||
type: ARRAY,
|
||||
items: {
|
||||
anyOf: [DOCLET_SCHEMA, PACKAGE_SCHEMA]
|
||||
}
|
||||
anyOf: [DOCLET_SCHEMA, PACKAGE_SCHEMA],
|
||||
},
|
||||
};
|
||||
|
||||
@@ -7,7 +7,7 @@ function makeRegExp(config) {
let regExp = null;

if (config) {
regExp = (typeof config === 'string') ? new RegExp(config) : config;
regExp = typeof config === 'string' ? new RegExp(config) : config;
}

return regExp;
@@ -23,11 +23,10 @@ class Filter {
* @param {(string|RegExp)} opts.includePattern
* @param {(string|RegExp)} opts.excludePattern
*/
constructor({exclude, includePattern, excludePattern}) {
constructor({ exclude, includePattern, excludePattern }) {
this._cwd = process.cwd();
this.exclude = exclude && Array.isArray(exclude) ?
exclude.map($ => path.resolve(this._cwd, $)) :
null;
this.exclude =
exclude && Array.isArray(exclude) ? exclude.map(($) => path.resolve(this._cwd, $)) : null;
this.includePattern = makeRegExp(includePattern);
this.excludePattern = makeRegExp(excludePattern);
}
@@ -41,17 +40,17 @@ class Filter {

filepath = path.resolve(this._cwd, filepath);

if ( this.includePattern && !this.includePattern.test(filepath) ) {
if (this.includePattern && !this.includePattern.test(filepath)) {
included = false;
}

if ( this.excludePattern && this.excludePattern.test(filepath) ) {
if (this.excludePattern && this.excludePattern.test(filepath)) {
included = false;
}

if (this.exclude) {
this.exclude.forEach(exclude => {
if ( filepath.indexOf(exclude) === 0 ) {
this.exclude.forEach((exclude) => {
if (filepath.indexOf(exclude) === 0) {
included = false;
}
});
@@ -16,9 +16,9 @@ class CurrentModule {
this.originalName = doclet.meta.code.name || '';
}
}
function filterByLongname({longname}) {
function filterByLongname({ longname }) {
// you can't document prototypes
if ( /#$/.test(longname) ) {
if (/#$/.test(longname)) {
return true;
}

@@ -32,8 +32,7 @@ function createDoclet(comment, e) {

try {
doclet = new Doclet(comment, e);
}
catch (error) {
} catch (error) {
flatComment = comment.replace(/[\r\n]/g, '');
msg = `cannot create a doclet for the comment "${flatComment}": ${error.message}`;
log.error(msg);
@@ -88,13 +87,15 @@ function setModuleScopeMemberOf(parser, doclet) {
if (currentModule && currentModule.longname !== doclet.name) {
if (!doclet.scope) {
// is this a method definition? if so, we usually get the scope from the node directly
if (doclet.meta && doclet.meta.code && doclet.meta.code.node &&
doclet.meta.code.node.type === Syntax.MethodDefinition) {
if (
doclet.meta &&
doclet.meta.code &&
doclet.meta.code.node &&
doclet.meta.code.node.type === Syntax.MethodDefinition
) {
// special case for constructors of classes that have @alias tags
if (doclet.meta.code.node.kind === 'constructor') {
parentDoclet = parser._getDocletById(
doclet.meta.code.node.parent.parent.nodeId
);
parentDoclet = parser._getDocletById(doclet.meta.code.node.parent.parent.nodeId);

if (parentDoclet && parentDoclet.alias) {
// the constructor should use the same name as the class
@@ -104,18 +105,20 @@ function setModuleScopeMemberOf(parser, doclet) {
// and we shouldn't try to set a memberof value
skipMemberof = true;
}
}
else if (doclet.meta.code.node.static) {
} else if (doclet.meta.code.node.static) {
doclet.addTag('static');
}
else {
} else {
doclet.addTag('instance');
}
}
// is this something that the module exports? if so, it's a static member
else if (doclet.meta && doclet.meta.code && doclet.meta.code.node &&
else if (
doclet.meta &&
doclet.meta.code &&
doclet.meta.code.node &&
doclet.meta.code.node.parent &&
doclet.meta.code.node.parent.type === Syntax.ExportNamedDeclaration) {
doclet.meta.code.node.parent.type === Syntax.ExportNamedDeclaration
) {
doclet.addTag('static');
}
// otherwise, it must be an inner member
@@ -147,7 +150,7 @@ function addDoclet(parser, newDoclet) {
e = { doclet: newDoclet };
parser.emit('newDoclet', e);

if ( !e.defaultPrevented && !filterByLongname(e.doclet) ) {
if (!e.defaultPrevented && !filterByLongname(e.doclet)) {
parser.addResult(e.doclet);
}
}
@@ -160,7 +163,7 @@ function processAlias(parser, doclet, astNode) {
memberofName = parser.resolveThis(astNode);

// "class" refers to the owner of the prototype, not the prototype itself
if ( /^(.+?)(\.prototype|#)$/.test(memberofName) ) {
if (/^(.+?)(\.prototype|#)$/.test(memberofName)) {
memberofName = RegExp.$1;
}
doclet.alias = memberofName;
@@ -202,8 +205,7 @@ function findSymbolMemberof(parser, doclet, astNode, nameStartsWith, trailingPun
else if (doclet.name === 'module.exports' && currentModule) {
doclet.addTag('name', currentModule.longname);
doclet.postProcess();
}
else {
} else {
memberof = parser.resolveThis(astNode);

// like the following at the top level of a module:
@@ -211,15 +213,14 @@ function findSymbolMemberof(parser, doclet, astNode, nameStartsWith, trailingPun
if (nameStartsWith === 'this' && currentModule && !memberof) {
memberof = currentModule.longname;
scopePunc = SCOPE.PUNC.STATIC;
}
else {
} else {
scopePunc = SCOPE.PUNC.INSTANCE;
}
}

return {
memberof: memberof,
scopePunc: scopePunc
scopePunc: scopePunc,
};
}

@@ -250,12 +251,9 @@ function addSymbolMemberof(parser, doclet, astNode) {
scopePunc = memberofInfo.scopePunc;

if (memberof) {
doclet.name = doclet.name ?
memberof + scopePunc + doclet.name :
memberof;
doclet.name = doclet.name ? memberof + scopePunc + doclet.name : memberof;
}
}
else {
} else {
memberofInfo = parser.astnodeToMemberof(astNode);
basename = memberofInfo.basename;
memberof = memberofInfo.memberof;
@@ -265,8 +263,7 @@ function addSymbolMemberof(parser, doclet, astNode) {
if (memberof) {
doclet.addTag('memberof', memberof);
if (basename) {
doclet.name = (doclet.name || '')
.replace(new RegExp(`^${escape(basename)}.`), '');
doclet.name = (doclet.name || '').replace(new RegExp(`^${escape(basename)}.`), '');
}
}
// otherwise, add the defaults for a module (if we're currently in a module)
@@ -290,8 +287,7 @@ function newSymbolDoclet(parser, docletSrc, e) {
}

newDoclet.postProcess();
}
else {
} else {
return false;
}

@@ -299,8 +295,11 @@ function newSymbolDoclet(parser, docletSrc, e) {
// a) the doclet is a memberof something
// b) the doclet represents a module
// c) we're in a module that exports only this symbol
if ( !newDoclet.memberof && newDoclet.kind !== 'module' &&
(!currentModule || currentModule.longname !== newDoclet.name) ) {
if (
!newDoclet.memberof &&
newDoclet.kind !== 'module' &&
(!currentModule || currentModule.longname !== newDoclet.name)
) {
newDoclet.scope = SCOPE.NAMES.GLOBAL;
}

@@ -319,11 +318,11 @@ function newSymbolDoclet(parser, docletSrc, e) {
* Attach these event handlers to a particular instance of a parser.
* @param parser
*/
exports.attachTo = parser => {
exports.attachTo = (parser) => {
// Handle JSDoc "virtual comments" that include one of the following:
// + A `@name` tag
// + Another tag that accepts a name, such as `@function`
parser.on('jsdocCommentFound', e => {
parser.on('jsdocCommentFound', (e) => {
const comments = e.comment.split(/@also\b/g);
let newDoclet;

@@ -349,7 +348,7 @@ exports.attachTo = parser => {
});

// Handle named symbols in the code. May or may not have a JSDoc comment attached.
parser.on('symbolFound', e => {
parser.on('symbolFound', (e) => {
const comments = e.comment.split(/@also\b/g);

for (let i = 0, l = comments.length; i < l; i++) {
@@ -13,9 +13,9 @@ const { Walker } = require('jsdoc/src/walker');
const hasOwnProp = Object.prototype.hasOwnProperty;

// TODO: docs
const PARSERS = exports.PARSERS = {
js: 'jsdoc/src/parser'
};
const PARSERS = (exports.PARSERS = {
js: 'jsdoc/src/parser',
});
/* eslint-disable no-script-url */
// Prefix for JavaScript strings that were provided in lieu of a filename.
const SCHEMA = 'javascript:';
@@ -27,7 +27,7 @@ class DocletCache {
}

get(itemName) {
if ( !hasOwnProp.call(this._doclets, itemName) ) {
if (!hasOwnProp.call(this._doclets, itemName)) {
return null;
}

@@ -36,7 +36,7 @@ class DocletCache {
}

put(itemName, value) {
if ( !hasOwnProp.call(this._doclets, itemName) ) {
if (!hasOwnProp.call(this._doclets, itemName)) {
this._doclets[itemName] = [];
}

@@ -55,8 +55,7 @@ exports.createParser = (type, conf) => {

if (hasOwnProp.call(PARSERS, type)) {
modulePath = PARSERS[type];
}
else {
} else {
log.fatal(`The parser type "${type}" is not recognized.`);

return null;
@@ -67,20 +66,27 @@ exports.createParser = (type, conf) => {

// TODO: docs
function pretreat(code) {
return code
return (
code
// comment out hashbang at the top of the file, like: #!/usr/bin/env node
.replace(/^(#![\S \t]+\r?\n)/, '// $1')

// to support code minifiers that preserve /*! comments, treat /*!* as equivalent to /**
.replace(/\/\*!\*/g, '/**')
// merge adjacent doclets
.replace(/\*\/\/\*\*+/g, '@also');
.replace(/\*\/\/\*\*+/g, '@also')
);
}

// TODO: docs
function definedInScope(doclet, basename) {
return Boolean(doclet) && Boolean(doclet.meta) && Boolean(doclet.meta.vars) &&
Boolean(basename) && hasOwnProp.call(doclet.meta.vars, basename);
return (
Boolean(doclet) &&
Boolean(doclet.meta) &&
Boolean(doclet.meta.vars) &&
Boolean(basename) &&
hasOwnProp.call(doclet.meta.vars, basename)
);
}

// TODO: docs
@@ -105,13 +111,13 @@ class Parser extends EventEmitter {
visitor: {
get() {
return this._visitor;
}
},
},
walker: {
get() {
return this._walker;
}
}
},
},
});
}

@@ -122,12 +128,12 @@ class Parser extends EventEmitter {
borrowed: [],
documented: {},
longname: {},
memberof: {}
memberof: {},
};
this._byNodeId = new DocletCache();
this._byLongname = new DocletCache();
this._byLongname.put(LONGNAMES.GLOBAL, {
meta: {}
meta: {},
});
}

@@ -174,13 +180,11 @@ class Parser extends EventEmitter {
if (sourceFile.indexOf(SCHEMA) === 0) {
sourceCode = sourceFile.substr(SCHEMA.length);
filename = `[[string${i}]]`;
}
else {
} else {
filename = sourceFile;
try {
sourceCode = fs.readFileSync(filename, encoding);
}
catch (err) {
} catch (err) {
log.error(`Unable to read and parse the source file $(unknown): ${err}`);
}
}
@@ -193,7 +197,7 @@ class Parser extends EventEmitter {

this.emit('parseComplete', {
sourcefiles: parsedFiles,
doclets: this._resultBuffer
doclets: this._resultBuffer,
});
log.debug('Finished parsing source files.');

@@ -220,14 +224,14 @@ class Parser extends EventEmitter {
this._resultBuffer.push(doclet);

// track all doclets by longname
if ( !hasOwnProp.call(index.longname, doclet.longname) ) {
if (!hasOwnProp.call(index.longname, doclet.longname)) {
index.longname[doclet.longname] = [];
}
index.longname[doclet.longname].push(doclet);

// track all doclets that have a memberof by memberof
if (doclet.memberof) {
if ( !hasOwnProp.call(index.memberof, doclet.memberof) ) {
if (!hasOwnProp.call(index.memberof, doclet.memberof)) {
index.memberof[doclet.memberof] = [];
}
index.memberof[doclet.memberof].push(doclet);
@@ -235,14 +239,14 @@ class Parser extends EventEmitter {

// track longnames of documented symbols
if (!doclet.undocumented) {
if ( !hasOwnProp.call(index.documented, doclet.longname) ) {
if (!hasOwnProp.call(index.documented, doclet.longname)) {
index.documented[doclet.longname] = [];
}
index.documented[doclet.longname].push(doclet);
}

// track doclets with a `borrowed` property
if ( hasOwnProp.call(doclet, 'borrowed') ) {
if (hasOwnProp.call(doclet, 'borrowed')) {
index.borrowed.push(doclet);
}
}
@@ -261,7 +265,7 @@ class Parser extends EventEmitter {
_parseSourceCode(sourceCode, sourceName) {
let ast;
let e = {
filename: sourceName
filename: sourceName,
};
let sourceType;

@@ -271,7 +275,7 @@ class Parser extends EventEmitter {
if (!e.defaultPrevented) {
e = {
filename: sourceName,
source: sourceCode
source: sourceCode,
};
this.emit('beforeParse', e);
sourceCode = e.source;
@@ -308,14 +312,16 @@ class Parser extends EventEmitter {
}
// keep references to undocumented anonymous functions, too, as they might have scoped vars
else if (
(node.type === Syntax.FunctionDeclaration || node.type === Syntax.FunctionExpression ||
(node.type === Syntax.FunctionDeclaration ||
node.type === Syntax.FunctionExpression ||
node.type === Syntax.ArrowFunctionExpression) &&
!this._getDocletById(node.nodeId) ) {
!this._getDocletById(node.nodeId)
) {
fakeDoclet = {
longname: LONGNAMES.ANONYMOUS,
meta: {
code: e.code
}
code: e.code,
},
};
this._byNodeId.put(node.nodeId, fakeDoclet);
this._byLongname.put(fakeDoclet.longname, fakeDoclet);
@@ -352,29 +358,29 @@ class Parser extends EventEmitter {
const result = {};
const type = node.type;

if ( (type === Syntax.FunctionDeclaration || type === Syntax.FunctionExpression ||
type === Syntax.ArrowFunctionExpression || type === Syntax.VariableDeclarator) &&
node.enclosingScope ) {
if (
(type === Syntax.FunctionDeclaration ||
type === Syntax.FunctionExpression ||
type === Syntax.ArrowFunctionExpression ||
type === Syntax.VariableDeclarator) &&
node.enclosingScope
) {
doclet = this._getDocletById(node.enclosingScope.nodeId);

if (!doclet) {
result.memberof = LONGNAMES.ANONYMOUS + SCOPE.PUNC.INNER;
}
else {
} else {
result.memberof = doclet.longname + SCOPE.PUNC.INNER;
}
}
else if (type === Syntax.ClassPrivateProperty || type === Syntax.ClassProperty) {
} else if (type === Syntax.ClassPrivateProperty || type === Syntax.ClassProperty) {
doclet = this._getDocletById(node.enclosingScope.nodeId);

if (!doclet) {
result.memberof = LONGNAMES.ANONYMOUS + SCOPE.PUNC.INSTANCE;
}
else {
} else {
result.memberof = doclet.longname + SCOPE.PUNC.INSTANCE;
}
}
else if (type === Syntax.MethodDefinition && node.kind === 'constructor') {
} else if (type === Syntax.MethodDefinition && node.kind === 'constructor') {
doclet = this._getDocletById(node.enclosingScope.nodeId);

// global classes aren't a member of anything
@@ -385,31 +391,30 @@ class Parser extends EventEmitter {
// special case for methods in classes that are returned by arrow function expressions; for
// other method definitions, we get the memberof from the node name elsewhere. yes, this is
// confusing...
else if (type === Syntax.MethodDefinition && node.parent.parent.parent &&
node.parent.parent.parent.type === Syntax.ArrowFunctionExpression) {
else if (
type === Syntax.MethodDefinition &&
node.parent.parent.parent &&
node.parent.parent.parent.type === Syntax.ArrowFunctionExpression
) {
doclet = this._getDocletById(node.enclosingScope.nodeId);

if (doclet) {
result.memberof = doclet.longname +
(node.static === true ?
SCOPE.PUNC.STATIC :
SCOPE.PUNC.INSTANCE);
result.memberof =
doclet.longname + (node.static === true ? SCOPE.PUNC.STATIC : SCOPE.PUNC.INSTANCE);
}
}
else {
} else {
// check local references for aliases
scope = node;
basename = getBasename( astNode.nodeToValue(node) );
basename = getBasename(astNode.nodeToValue(node));

// walk up the scope chain until we find the scope in which the node is defined
while (scope.enclosingScope) {
doclet = this._getDocletById(scope.enclosingScope.nodeId);
if ( doclet && definedInScope(doclet, basename) ) {
if (doclet && definedInScope(doclet, basename)) {
result.memberof = doclet.meta.vars[basename];
result.basename = basename;
break;
}
else {
} else {
// move up
scope = scope.enclosingScope;
}
@@ -417,11 +422,10 @@ class Parser extends EventEmitter {

// do we know that it's a global?
doclet = this._getDocletByLongname(LONGNAMES.GLOBAL);
if ( doclet && definedInScope(doclet, basename) ) {
if (doclet && definedInScope(doclet, basename)) {
result.memberof = doclet.meta.vars[basename];
result.basename = basename;
}
else {
} else {
doclet = this._getDocletById(node.parent.nodeId);

// set the result if we found a doclet. (if we didn't, the AST node may describe a
@@ -442,7 +446,7 @@ class Parser extends EventEmitter {
* @return {module:jsdoc/doclet.Doclet?} The doclet for the lowest-level class in the node's scope
* chain.
*/
_getParentClass({enclosingScope}) {
_getParentClass({ enclosingScope }) {
let doclet;
let parts;
let scope = enclosingScope;
@@ -457,7 +461,7 @@ class Parser extends EventEmitter {

if (doclet) {
// is the doclet for a class? if so, we're done
if ( isClass(doclet) ) {
if (isClass(doclet)) {
break;
}

@@ -466,7 +470,7 @@ class Parser extends EventEmitter {
parts = toParts(doclet.longname);
if (parts.scope === SCOPE.PUNC.INSTANCE) {
doclet = this._getDocletByLongname(parts.memberof);
if ( isClass(doclet) ) {
if (isClass(doclet)) {
break;
}
}
@@ -476,7 +480,7 @@ class Parser extends EventEmitter {
scope = scope.enclosingScope;
}

return (isClass(doclet) ? doclet : null);
return isClass(doclet) ? doclet : null;
}

// TODO: docs
@@ -493,8 +497,11 @@ class Parser extends EventEmitter {
// Properties are handled below.
if (node.type !== Syntax.Property && node.enclosingScope) {
// For ES2015 constructor functions, we use the class declaration to resolve `this`.
if (node.parent && node.parent.type === Syntax.MethodDefinition &&
node.parent.kind === 'constructor') {
if (
node.parent &&
node.parent.type === Syntax.MethodDefinition &&
node.parent.kind === 'constructor'
) {
doclet = this._getDocletById(node.parent.parent.parent.nodeId);
|
||||
}
|
||||
// Otherwise, if there's an enclosing scope, we use the enclosing scope to resolve `this`.
|
||||
@ -504,18 +511,16 @@ class Parser extends EventEmitter {
|
||||
|
||||
if (!doclet) {
|
||||
result = LONGNAMES.ANONYMOUS; // TODO handle global this?
|
||||
}
|
||||
else if (doclet.this) {
|
||||
} else if (doclet.this) {
|
||||
result = doclet.this;
|
||||
}
|
||||
else if (doclet.kind === 'function' && doclet.memberof) {
|
||||
} else if (doclet.kind === 'function' && doclet.memberof) {
|
||||
parentClass = this._getParentClass(node);
|
||||
|
||||
// like: function Foo() { this.bar = function(n) { /** blah */ this.name = n; };
|
||||
// or: Foo.prototype.bar = function(n) { /** blah */ this.name = n; };
|
||||
// or: var Foo = exports.Foo = function(n) { /** blah */ this.name = n; };
|
||||
// or: Foo.constructor = function(n) { /** blah */ this.name = n; }
|
||||
if ( parentClass || /\.constructor$/.test(doclet.longname) ) {
|
||||
if (parentClass || /\.constructor$/.test(doclet.longname)) {
|
||||
result = doclet.memberof;
|
||||
}
|
||||
// like: function notAClass(n) { /** global this */ this.name = n; }
|
||||
@ -524,15 +529,13 @@ class Parser extends EventEmitter {
|
||||
}
|
||||
}
|
||||
// like: var foo = function(n) { /** blah */ this.bar = n; }
|
||||
else if ( doclet.kind === 'member' && astNode.isAssignment(node) ) {
|
||||
else if (doclet.kind === 'member' && astNode.isAssignment(node)) {
|
||||
result = doclet.longname;
|
||||
}
|
||||
// walk up to the closest class we can find
|
||||
else if (doclet.kind === 'class' || doclet.kind === 'interface' ||
|
||||
doclet.kind === 'module') {
|
||||
else if (doclet.kind === 'class' || doclet.kind === 'interface' || doclet.kind === 'module') {
|
||||
result = doclet.longname;
|
||||
}
|
||||
else if (node.enclosingScope) {
|
||||
} else if (node.enclosingScope) {
|
||||
result = this.resolveThis(node.enclosingScope);
|
||||
}
|
||||
}
|
||||
@ -543,8 +546,7 @@ class Parser extends EventEmitter {
|
||||
if (!doclet) {
|
||||
// The object wasn't documented, so we don't know what name to use.
|
||||
result = '';
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
result = doclet.longname;
|
||||
}
|
||||
}
|
||||
@ -566,7 +568,7 @@ class Parser extends EventEmitter {
|
||||
* @return {Array.<module:jsdoc/doclet.Doclet>} An array of doclets for the parent object or objects, or
|
||||
* an empty array if no doclets are found.
|
||||
*/
|
||||
resolvePropertyParents({parent}) {
|
||||
resolvePropertyParents({ parent }) {
|
||||
let currentAncestor = parent;
|
||||
let nextAncestor = currentAncestor.parent;
|
||||
let doclet;
|
||||
@ -599,7 +601,7 @@ class Parser extends EventEmitter {
|
||||
* @param {astnode} node
|
||||
* @param {string} basename The leftmost name in the long name: in foo.bar.zip the basename is foo.
|
||||
*/
|
||||
resolveVar({enclosingScope, type}, basename) {
|
||||
resolveVar({ enclosingScope, type }, basename) {
|
||||
let doclet;
|
||||
let result;
|
||||
const scope = enclosingScope;
|
||||
@ -608,16 +610,13 @@ class Parser extends EventEmitter {
|
||||
// scope (see #685 and #693)
|
||||
if (type === Syntax.FunctionDeclaration) {
|
||||
result = '';
|
||||
}
|
||||
else if (!scope) {
|
||||
} else if (!scope) {
|
||||
result = ''; // global
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
doclet = this._getDocletById(scope.nodeId);
|
||||
if ( definedInScope(doclet, basename) ) {
|
||||
if (definedInScope(doclet, basename)) {
|
||||
result = doclet.longname;
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
result = this.resolveVar(scope, basename);
|
||||
}
|
||||
}
|
||||
@ -629,7 +628,7 @@ class Parser extends EventEmitter {
|
||||
resolveEnum(e) {
|
||||
const doclets = this.resolvePropertyParents(e.code.node.parent);
|
||||
|
||||
doclets.forEach(doclet => {
|
||||
doclets.forEach((doclet) => {
|
||||
if (doclet && doclet.isEnum) {
|
||||
doclet.properties = doclet.properties || [];
|
||||
|
||||
|
||||
@@ -28,29 +28,27 @@ class Scanner extends EventEmitter {
searchPaths = searchPaths || [];
depth = depth || 1;

searchPaths.forEach($ => {
searchPaths.forEach(($) => {
const filepath = path.resolve(process.cwd(), decodeURIComponent($));

try {
currentFile = statSync(filepath);
}
catch (e) {
} catch (e) {
log.error(`Unable to find the source file or directory ${filepath}`);

return;
}

if ( currentFile.isFile() ) {
if (currentFile.isFile()) {
filePaths.push(filepath);
}
else {
} else {
filePaths = filePaths.concat(lsSync(filepath, depth));
}
});

filePaths = filePaths.filter($ => filter.isIncluded($));
filePaths = filePaths.filter(($) => filter.isIncluded($));

filePaths = filePaths.filter($ => {
filePaths = filePaths.filter(($) => {
const e = { fileName: $ };

this.emit('sourceFileFound', e);

@@ -13,7 +13,7 @@ const { Syntax } = require('@jsdoc/parse');
* @private
* @param {!Object} comment - A comment node with `type` and `value` properties.
*/
function getRawComment({value}) {
function getRawComment({ value }) {
return `/*${value}*/`;
}

@@ -23,7 +23,7 @@ function getRawComment({value}) {
* @param {!Object} comment - A comment node with `type` and `value` properties.
* @return {boolean} `true` if the comment is a block comment, `false` otherwise.
*/
function isBlockComment({type}) {
function isBlockComment({ type }) {
return type === 'CommentBlock';
}

@@ -35,8 +35,12 @@ function isBlockComment({type}) {
* @memberof module:jsdoc/src/parser.Parser
*/
function isValidJsdoc(commentSrc) {
return commentSrc && commentSrc.length > 4 && commentSrc.indexOf('/**') === 0 &&
commentSrc.indexOf('/***') !== 0;
return (
commentSrc &&
commentSrc.length > 4 &&
commentSrc.indexOf('/**') === 0 &&
commentSrc.indexOf('/***') !== 0
);
}

// TODO: docs
@@ -52,7 +56,7 @@ function getLeadingJsdocComment(node) {
// treat the comment closest to the node as the leading comment
comment = getRawComment(leadingComments[leadingComments.length - 1]);

if ( !isValidJsdoc(comment) ) {
if (!isValidJsdoc(comment)) {
comment = null;
}
}
@@ -63,18 +67,23 @@ function getLeadingJsdocComment(node) {

// TODO: docs
function makeVarsFinisher(scopeDoclet) {
return ({doclet, code}) => {
return ({ doclet, code }) => {
// no need to evaluate all things related to scopeDoclet again, just use it
if ( scopeDoclet && doclet && (doclet.alias || doclet.memberof) ) {
if (scopeDoclet && doclet && (doclet.alias || doclet.memberof)) {
scopeDoclet.meta.vars[code.name] = doclet.longname;
}
};
}

// Given an event, get the parent node's doclet.
function getParentDocletFromEvent(parser, {doclet}) {
if (doclet && doclet.meta && doclet.meta.code && doclet.meta.code.node &&
doclet.meta.code.node.parent) {
function getParentDocletFromEvent(parser, { doclet }) {
if (
doclet &&
doclet.meta &&
doclet.meta.code &&
doclet.meta.code.node &&
doclet.meta.code.node.parent
) {
return parser._getDocletById(doclet.meta.code.node.parent.nodeId);
}

@@ -92,7 +101,7 @@ function getParentDocletFromEvent(parser, {doclet}) {
* doclet.
*/
function makeInlineParamsFinisher(parser) {
return e => {
return (e) => {
let documentedParams;
let knownParams;
let param;
@@ -126,11 +135,11 @@ function makeInlineParamsFinisher(parser) {

// if we ran out of documented params, or we're at the parameter's actual position,
// splice in the param at the current index
if ( !param || i === knownParams.indexOf(e.doclet.name) ) {
if (!param || i === knownParams.indexOf(e.doclet.name)) {
documentedParams.splice(i, 0, {
type: e.doclet.type || {},
description: '',
name: e.doclet.name
name: e.doclet.name,
});

// the doclet is no longer needed
@@ -155,7 +164,7 @@ function makeInlineParamsFinisher(parser) {
function findRestParam(params) {
let restParam = null;

params.some(param => {
params.some((param) => {
if (param.type === Syntax.RestElement) {
restParam = param;

@@ -178,7 +187,7 @@ function findRestParam(params) {
* the parameter is repeatable.
*/
function makeRestParamFinisher() {
return e => {
return (e) => {
const doclet = e.doclet;
let documentedParams;
let restNode;
@@ -188,10 +197,12 @@ function makeRestParamFinisher() {
}

documentedParams = doclet.params = doclet.params || [];
restNode = findRestParam(e.code.node.params ||
restNode = findRestParam(
e.code.node.params ||
(e.code.node.value && e.code.node.value.params) ||
(e.code.node.init && e.code.node.init.params) ||
[]);
[]
);

if (restNode) {
for (let i = documentedParams.length - 1; i >= 0; i--) {
@@ -215,11 +226,10 @@ function makeRestParamFinisher() {
function findDefaultParams(params) {
const defaultParams = [];

params.forEach(param => {
params.forEach((param) => {
if (param.type === Syntax.AssignmentPattern) {
defaultParams.push(param);
}
else {
} else {
defaultParams.push(null);
}
});
@@ -242,7 +252,7 @@ function findDefaultParams(params) {
* parameters.
*/
function makeDefaultParamFinisher() {
return e => {
return (e) => {
let defaultValues;
const doclet = e.doclet;
let documentedParams;
@@ -264,22 +274,22 @@ function makeDefaultParamFinisher() {
}

// if the current parameter doesn't appear to be documented, move to the next one
paramName = params[i].type === Syntax.AssignmentPattern ?
params[i].left.name :
params[i].name;
paramName =
params[i].type === Syntax.AssignmentPattern ? params[i].left.name : params[i].name;
if (paramName !== documentedParams[j].name) {
continue;
}

// add the default value iff a) a literal default value is defined in the code,
// b) no default value is documented, and c) the default value is not an empty string
if (defaultValues[i] &&
if (
defaultValues[i] &&
defaultValues[i].right &&
defaultValues[i].right.type === Syntax.Literal &&
typeof documentedParams[j].defaultvalue === 'undefined' &&
defaultValues[i].right.value !== '') {
documentedParams[j].defaultvalue =
astNode.nodeToValue(defaultValues[i].right);
defaultValues[i].right.value !== ''
) {
documentedParams[j].defaultvalue = astNode.nodeToValue(defaultValues[i].right);
}

// move to the next documented param
@@ -299,15 +309,17 @@ function makeDefaultParamFinisher() {
* @return {function} A function that merges the constructor's doclet into the class's doclet.
*/
function makeConstructorFinisher(parser) {
return e => {
return (e) => {
let combined;
const eventDoclet = e.doclet;
let parentDoclet;

// for class declarations that are named module exports, the node that's documented is the
// ExportNamedDeclaration, not the ClassDeclaration
if (e.code.node.parent.parent.parent &&
e.code.node.parent.parent.parent.type === Syntax.ExportNamedDeclaration) {
if (
e.code.node.parent.parent.parent &&
e.code.node.parent.parent.parent.type === Syntax.ExportNamedDeclaration
) {
parentDoclet = parser._getDocletById(e.code.node.parent.parent.parent.nodeId);
}
// otherwise, we want the ClassDeclaration
@@ -336,15 +348,18 @@ function makeConstructorFinisher(parser) {
* @return {function} A function that adds an `async` property to the doclet of async functions.
*/
function makeAsyncFunctionFinisher() {
return e => {
return (e) => {
const doclet = e.doclet;

if (!doclet) {
return;
}

if ( e.code.node.async || (e.code.node.value && e.code.node.value.async) ||
(e.code.node.init && e.code.node.init.async) ) {
if (
e.code.node.async ||
(e.code.node.value && e.code.node.value.async) ||
(e.code.node.init && e.code.node.init.async)
) {
doclet.async = true;
}
};
@@ -357,7 +372,7 @@ function makeAsyncFunctionFinisher() {
* @return {function} A function that marks a doclet as private.
*/
function makePrivatePropertyFinisher() {
return ({doclet}) => {
return ({ doclet }) => {
doclet.access = 'private';
};
}
@@ -369,15 +384,18 @@ function makePrivatePropertyFinisher() {
* @return {function} A function that marks a doclet as a generator function.
*/
function makeGeneratorFinisher() {
return e => {
return (e) => {
const doclet = e.doclet;

if (!doclet) {
return;
}

if ( e.code.node.generator || (e.code.node.init && e.code.node.init.generator) ||
(e.code.node.value && e.code.node.value.generator) ) {
if (
e.code.node.generator ||
(e.code.node.init && e.code.node.init.generator) ||
(e.code.node.value && e.code.node.value.generator)
) {
doclet.generator = true;
}
};
@@ -399,7 +417,7 @@ class SymbolFound {
this.finishers = extras.finishers || [];

// make sure the event includes properties that don't have default values
Object.keys(extras).forEach(key => {
Object.keys(extras).forEach((key) => {
this[key] = extras[key];
});
}
@@ -408,7 +426,7 @@ class SymbolFound {
// TODO: docs
class JsdocCommentFound {
// TODO: docs
constructor({loc, range}, rawComment, filename) {
constructor({ loc, range }, rawComment, filename) {
this.comment = rawComment;
this.lineno = loc.start.line;
this.columnno = loc.start.column;
@@ -416,16 +434,18 @@ class JsdocCommentFound {
this.range = range;

Object.defineProperty(this, 'event', {
value: 'jsdocCommentFound'
value: 'jsdocCommentFound',
});
}
}

// TODO: docs
function hasComments(node) {
return (node && node.leadingComments && node.leadingComments.length) ||
return (
(node && node.leadingComments && node.leadingComments.length) ||
(node && node.trailingComments && node.trailingComments.length) ||
(node && node.innerComments && node.innerComments.length);
(node && node.innerComments && node.innerComments.length)
);
}

// TODO: docs
@@ -440,21 +460,20 @@ function updateCommentNode(commentNode, comment) {

// TODO: docs
// TODO: note that it's essential to call this function before you try to resolve names!
function trackVars(parser, {enclosingScope}, {code, finishers}) {
function trackVars(parser, { enclosingScope }, { code, finishers }) {
let doclet;
const enclosingScopeId = enclosingScope ? enclosingScope.nodeId : null;

if (enclosingScopeId) {
doclet = parser._getDocletById(enclosingScopeId);
}
else {
} else {
doclet = parser._getDocletByLongname(LONGNAMES.GLOBAL);
}

if (doclet) {
doclet.meta.vars = doclet.meta.vars || {};
doclet.meta.vars[code.name] = null;
finishers.push( makeVarsFinisher(doclet) );
finishers.push(makeVarsFinisher(doclet));
}
}

@@ -465,7 +484,7 @@ function makeSymbolFoundEvent(node, parser, filename) {
let parent;

const extras = {
code: astNode.getInfo(node)
code: astNode.getInfo(node),
};

switch (node.type) {
@@ -486,7 +505,7 @@ function makeSymbolFoundEvent(node, parser, filename) {
case Syntax.AssignmentPattern:
parent = node.parent;

if ( node.leadingComments && parent && astNode.isFunction(parent) ) {
if (node.leadingComments && parent && astNode.isFunction(parent)) {
extras.finishers = [makeInlineParamsFinisher(parser)];
e = new SymbolFound(node, filename, extras);

@@ -511,10 +530,7 @@ function makeSymbolFoundEvent(node, parser, filename) {

// like `#b = 1` in: class A { #b = 1; }
case Syntax.ClassPrivateProperty:
extras.finishers = [
parser.resolveEnum,
makePrivatePropertyFinisher()
];
extras.finishers = [parser.resolveEnum, makePrivatePropertyFinisher()];

e = new SymbolFound(node, filename, extras);

@@ -569,7 +585,7 @@ function makeSymbolFoundEvent(node, parser, filename) {
// handle async functions
makeAsyncFunctionFinisher(),
// handle generator functions
makeGeneratorFinisher()
makeGeneratorFinisher(),
];

e = new SymbolFound(node, filename, extras);
@@ -589,7 +605,7 @@ function makeSymbolFoundEvent(node, parser, filename) {
parent = node.parent;

// function parameters with inline comments
if ( node.leadingComments && parent && astNode.isFunction(parent) ) {
if (node.leadingComments && parent && astNode.isFunction(parent)) {
extras.finishers = [makeInlineParamsFinisher(parser)];
e = new SymbolFound(node, filename, extras);

@@ -619,11 +635,11 @@ function makeSymbolFoundEvent(node, parser, filename) {
// handle async functions
makeAsyncFunctionFinisher(),
// handle generator functions
makeGeneratorFinisher()
makeGeneratorFinisher(),
];
// for constructors, we attempt to merge the constructor's docs into the class's docs
if (node.kind === 'constructor') {
extras.finishers.push( makeConstructorFinisher(parser) );
extras.finishers.push(makeConstructorFinisher(parser));
}

e = new SymbolFound(node, filename, extras);
@@ -651,7 +667,7 @@ function makeSymbolFoundEvent(node, parser, filename) {
case Syntax.RestElement:
parent = node.parent;

if ( node.leadingComments && parent && astNode.isFunction(parent) ) {
if (node.leadingComments && parent && astNode.isFunction(parent)) {
extras.finishers = [makeInlineParamsFinisher(parser)];
e = new SymbolFound(node, filename, extras);

@@ -670,7 +686,7 @@ function makeSymbolFoundEvent(node, parser, filename) {
// handle async functions
makeAsyncFunctionFinisher(),
// handle generator functions
makeGeneratorFinisher()
makeGeneratorFinisher(),
];

e = new SymbolFound(node, filename, extras);
@@ -691,7 +707,7 @@ function makeSymbolFoundEvent(node, parser, filename) {

if (!e) {
e = {
finishers: []
finishers: [],
};
}

@@ -707,10 +723,7 @@ class Visitor {
// ESTree node visitors added by plugins
this._nodeVisitors = [];
// built-in visitors
this._visitors = [
this.visitNodeComments,
this.visitNode
];
this._visitors = [this.visitNodeComments, this.visitNode];
}

/**
@@ -762,10 +775,10 @@ class Visitor {
let rawComment;

function addComments(source) {
comments = comments.concat( source.slice(0) );
comments = comments.concat(source.slice(0));
}

if ( !hasComments(node) && (!node.type || !isBlock) ) {
if (!hasComments(node) && (!node.type || !isBlock)) {
return true;
}

@@ -783,8 +796,12 @@ class Visitor {

// ...or if they were comments from the end of the file that were erroneously attached to a
// `'use strict';` declaration (https://github.com/babel/babel/issues/6688).
if (node.type === Syntax.ExpressionStatement && node.directive === 'use strict' &&
node.trailingComments && node.trailingComments.length) {
if (
node.type === Syntax.ExpressionStatement &&
node.directive === 'use strict' &&
node.trailingComments &&
node.trailingComments.length
) {
// to be safe, we verify that the trailing comments came after the next node in the Program
// body, which means the comments were attached to the wrong node
if (node.parent.body.length > 1) {
@@ -805,7 +822,7 @@ class Visitor {
for (let comment of comments) {
rawComment = getRawComment(comment);

if ( isValidJsdoc(rawComment) ) {
if (isValidJsdoc(rawComment)) {
e = new JsdocCommentFound(comment, rawComment, filename);

parser.emit(e.event, e, parser);

@@ -32,7 +32,8 @@ function moveTrailingComments(source, target, count) {
}

target.trailingComments = source.trailingComments.slice(
source.trailingComments.length - count, count
source.trailingComments.length - count,
count
);
source.trailingComments = source.trailingComments.slice(0);
}
@@ -43,7 +44,7 @@ function leafNode(node, parent, state, cb) {}
/* eslint-enable no-empty-function, no-unused-vars */

// TODO: docs
const walkers = exports.walkers = {};
const walkers = (exports.walkers = {});

walkers[Syntax.ArrayExpression] = (node, parent, state, cb) => {
for (let element of node.elements) {
@@ -109,7 +110,7 @@ walkers[Syntax.BlockStatement] = (node, parent, state, cb) => {

walkers[Syntax.BreakStatement] = leafNode;

walkers[Syntax.CallExpression] = function(node, parent, state, cb) {
walkers[Syntax.CallExpression] = function (node, parent, state, cb) {
cb(node.callee, node, state);

if (node.arguments) {
@@ -642,13 +643,12 @@ class Walker {
const state = {
filename: filename,
nodes: [],
scopes: []
scopes: [],
};

function logUnknownNodeType({type}) {
function logUnknownNodeType({ type }) {
log.debug(
`Found a node with unrecognized type ${type}. Ignoring the node and its ` +
'descendants.'
`Found a node with unrecognized type ${type}. Ignoring the node and its ` + 'descendants.'
);
}

@@ -8,7 +8,7 @@ const path = require('path');
const tag = {
dictionary: require('jsdoc/tag/dictionary'),
validator: require('jsdoc/tag/validator'),
type: require('@jsdoc/tag').type
type: require('@jsdoc/tag').type,
};

// Check whether the text is the same as a symbol name with leading or trailing whitespace. If so,
@@ -24,10 +24,9 @@ function trim(text, opts, meta) {
opts = opts || {};
text = String(typeof text === 'undefined' ? '' : text);

if ( mustPreserveWhitespace(text, meta) ) {
if (mustPreserveWhitespace(text, meta)) {
text = `"${text}"`;
}
else if (opts.keepsWhitespace) {
} else if (opts.keepsWhitespace) {
text = text.replace(/^[\n\r\f]+|[\n\r\f]+$/g, '');
if (opts.removesIndent) {
match = text.match(/^([ \t]+)/);
@@ -36,8 +35,7 @@ function trim(text, opts, meta) {
text = text.replace(indentMatcher, '');
}
}
}
else {
} else {
text = text.replace(/^\s+|\s+$/g, '');
}

@@ -49,18 +47,21 @@ function addHiddenProperty(obj, propName, propValue) {
value: propValue,
writable: true,
enumerable: Boolean(env.opts.debug),
configurable: true
configurable: true,
});
}

function parseType({text, originalTitle}, {canHaveName, canHaveType}, meta) {
function parseType({ text, originalTitle }, { canHaveName, canHaveType }, meta) {
try {
return tag.type.parse(text, canHaveName, canHaveType);
}
catch (e) {
} catch (e) {
log.error(
'Unable to parse a tag\'s type expression%s with tag title "%s" and text "%s": %s',
meta.filename ? ( ` for source file ${path.join(meta.path, meta.filename)}${meta.lineno ? (` in line ${meta.lineno}`) : ''}` ) : '',
meta.filename
? ` for source file ${path.join(meta.path, meta.filename)}${
meta.lineno ? ` in line ${meta.lineno}` : ''
}`
: '',
originalTitle,
text,
e.message
@@ -89,12 +90,12 @@ function processTagText(tagInstance, tagDef, meta) {
if (tagType.type) {
if (tagType.type.length) {
tagInstance.value.type = {
names: tagType.type
names: tagType.type,
};
addHiddenProperty(tagInstance.value.type, 'parsedType', tagType.parsedType);
}

['optional', 'nullable', 'variable', 'defaultvalue'].forEach(prop => {
['optional', 'nullable', 'variable', 'defaultvalue'].forEach((prop) => {
if (typeof tagType[prop] !== 'undefined') {
tagInstance.value[prop] = tagType[prop];
}
@@ -111,8 +112,7 @@ function processTagText(tagInstance, tagDef, meta) {
tagInstance.value.name = tagType.name;
}
}
}
else {
} else {
tagInstance.value = tagInstance.text;
}
}
@@ -155,7 +155,7 @@ class Tag {
tagDef = tag.dictionary.lookUp(this.title);
trimOpts = {
keepsWhitespace: tagDef.keepsWhitespace,
removesIndent: tagDef.removesIndent
removesIndent: tagDef.removesIndent,
};

/**

@@ -7,7 +7,7 @@ const hasOwnProp = Object.prototype.hasOwnProperty;

const DEFINITIONS = {
closure: 'closureTags',
jsdoc: 'jsdocTags'
jsdoc: 'jsdocTags',
};

let dictionary;
@@ -22,10 +22,10 @@ class TagDefinition {
this.title = dict.normalize(title);

Object.defineProperty(this, '_dictionary', {
value: dict
value: dict,
});

Object.keys(etc).forEach(p => {
Object.keys(etc).forEach((p) => {
self[p] = etc[p];
});
}
@@ -73,7 +73,7 @@ class Dictionary {
this._defineNamespace(tagDef.title);
}
if (tagDef.synonyms) {
tagDef.synonyms.forEach(synonym => {
tagDef.synonyms.forEach((synonym) => {
this.defineSynonym(title, synonym);
});
}
@@ -105,7 +105,10 @@ class Dictionary {
'Unable to load tag definitions.'
);
} else {
dictionaries.slice().reverse().forEach(dictName => {
dictionaries
.slice()
.reverse()
.forEach((dictName) => {
const tagDefs = definitions[DEFINITIONS[dictName]];

if (!tagDefs) {
@@ -144,7 +147,7 @@ class Dictionary {
lookup(title) {
title = this.normalize(title);

if ( hasOwnProp.call(this._tags, title) ) {
if (hasOwnProp.call(this._tags, title)) {
return this._tags[title];
}

@@ -162,7 +165,7 @@ class Dictionary {
normalize(title) {
const canonicalName = title.toLowerCase();

if ( hasOwnProp.call(this._tagSynonyms, canonicalName) ) {
if (hasOwnProp.call(this._tagSynonyms, canonicalName)) {
return this._tagSynonyms[canonicalName];
}

@ -19,7 +19,7 @@ const MODULE_NAMESPACE = 'module:';
|
||||
const NOOP_TAG = {
|
||||
onTagged: () => {
|
||||
// Do nothing.
|
||||
}
|
||||
},
|
||||
};
|
||||
|
||||
// Clone a tag definition, excluding synonyms.
|
||||
@ -28,14 +28,14 @@ function cloneTagDef(tagDef, extras) {
|
||||
|
||||
delete newTagDef.synonyms;
|
||||
|
||||
return (extras ? _.extend(newTagDef, extras) : newTagDef);
|
||||
return extras ? _.extend(newTagDef, extras) : newTagDef;
|
||||
}
|
||||
|
||||
function getSourcePaths() {
|
||||
const sourcePaths = env.sourceFiles.slice(0) || [];
|
||||
|
||||
if (env.opts._) {
|
||||
env.opts._.forEach(sourcePath => {
|
||||
env.opts._.forEach((sourcePath) => {
|
||||
const resolved = path.resolve(process.cwd(), sourcePath);
|
||||
|
||||
if (!sourcePaths.includes(resolved)) {
|
||||
@@ -52,15 +52,15 @@ function filepathMinusPrefix(filepath) {
const sourcePaths = getSourcePaths();
let result = '';

commonPrefix = sourcePaths.length > 1 ?
commonPathPrefix(sourcePaths) :
path.dirname(sourcePaths[0] || '') + path.sep;
commonPrefix =
sourcePaths.length > 1
? commonPathPrefix(sourcePaths)
: path.dirname(sourcePaths[0] || '') + path.sep;

if (filepath) {
filepath = path.normalize(filepath);
// always use forward slashes in the result
result = (filepath + path.sep).replace(commonPrefix, '')
.replace(/\\/g, '/');
result = (filepath + path.sep).replace(commonPrefix, '').replace(/\\/g, '/');
}

if (result.length > 0 && result[result.length - 1] !== '/') {
@@ -71,46 +71,46 @@ function filepathMinusPrefix(filepath) {
}

/** @private */
function setDocletKindToTitle(doclet, {title}) {
doclet.addTag( 'kind', title );
function setDocletKindToTitle(doclet, { title }) {
doclet.addTag('kind', title);
}

function setDocletScopeToTitle(doclet, {title}) {
function setDocletScopeToTitle(doclet, { title }) {
try {
doclet.setScope(title);
}
catch (e) {
} catch (e) {
log.error(e.message);
}
}

function setDocletNameToValue(doclet, {value, text}) {
if (value && value.description) { // as in a long tag
function setDocletNameToValue(doclet, { value, text }) {
if (value && value.description) {
// as in a long tag
doclet.addTag('name', value.description);
}
else if (text) { // or a short tag
} else if (text) {
// or a short tag
doclet.addTag('name', text);
}
}

function setDocletNameToValueName(doclet, {value}) {
function setDocletNameToValueName(doclet, { value }) {
if (value && value.name) {
doclet.addTag('name', value.name);
}
}

function setDocletDescriptionToValue(doclet, {value}) {
function setDocletDescriptionToValue(doclet, { value }) {
if (value) {
doclet.addTag('description', value);
}
}

function setDocletTypeToValueType(doclet, {value}) {
function setDocletTypeToValueType(doclet, { value }) {
if (value && value.type) {
// Add the type names and other type properties (such as `optional`).
// Don't overwrite existing properties.
Object.keys(value).forEach(prop => {
if ( !hasOwnProp.call(doclet, prop) ) {
Object.keys(value).forEach((prop) => {
if (!hasOwnProp.call(doclet, prop)) {
doclet[prop] = value[prop];
}
});
@@ -126,17 +126,18 @@ function setNameToFile(doclet) {
}
}

function setDocletMemberof(doclet, {value}) {
function setDocletMemberof(doclet, { value }) {
if (value && value !== '<global>') {
doclet.setMemberof(value);
}
}

function applyNamespaceToTag(docletOrNs, tag) {
if (typeof docletOrNs === 'string') { // ns
if (typeof docletOrNs === 'string') {
// ns
tag.value = applyNamespace(tag.value, docletOrNs);
}
else { // doclet
} else {
// doclet
if (!docletOrNs.name) {
return; // error?
}
@@ -162,19 +163,18 @@ function parseTypeText(text) {
return tagType.typeExpression || text;
}

function parseBorrows(doclet, {text}) {
function parseBorrows(doclet, { text }) {
const m = /^([\s\S]+?)(?:\s+as\s+([\s\S]+))?$/.exec(text);

if (m) {
if (m[1] && m[2]) {
return {
target: m[1],
source: m[2]
source: m[2],
};
}
else if (m[1]) {
} else if (m[1]) {
return {
target: m[1]
target: m[1],
};
}

@@ -193,20 +193,18 @@ function firstWordOf(string) {

if (m) {
return m[1];
}
else {
} else {
return '';
}
}

function combineTypes({value}) {
function combineTypes({ value }) {
let combined;

if (value && value.type) {
if (value.type.names.length === 1) {
combined = value.type.names[0];
}
else {
} else {
combined = `(${value.type.names.join('|')})`;
}
}
@@ -215,7 +213,7 @@ function combineTypes({value}) {
}

// Tags that JSDoc uses internally, and that must always be defined.
const internalTags = exports.internalTags = {
const internalTags = (exports.internalTags = {
// Special separator tag indicating that multiple doclets should be generated for the same
// comment. Used internally (and by some JSDoc users, although it's not officially supported).
// In the following example, the parser will replace `//**` with an `@also` tag:
@@ -229,85 +227,84 @@ const internalTags = exports.internalTags = {
also: {
onTagged() {
// let the parser handle it; we define the tag here to avoid "not a known tag" errors
}
},
},
description: {
mustHaveValue: true,
onTagged: (doclet, {value}) => {
onTagged: (doclet, { value }) => {
doclet.description = value;
},
synonyms: ['desc']
synonyms: ['desc'],
},
kind: {
mustHaveValue: true,
onTagged: (doclet, {value}) => {
onTagged: (doclet, { value }) => {
doclet.kind = value;
}
},
},
name: {
mustHaveValue: true,
onTagged: (doclet, {value}) => {
onTagged: (doclet, { value }) => {
doclet.name = value;
}
},
},
undocumented: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.undocumented = true;
doclet.comment = '';
}
}
};
},
},
});

// Core JSDoc tags that are shared with other tag dictionaries.
let baseTags = exports.baseTags = {
let baseTags = (exports.baseTags = {
abstract: {
mustNotHaveValue: true,
onTagged(doclet) {
// we call this `virtual` because `abstract` is a reserved word
doclet.virtual = true;
},
synonyms: ['virtual']
synonyms: ['virtual'],
},
access: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
// only valid values are package, private, protected and public
if ( /^(package|private|protected|public)$/i.test(value) ) {
if (/^(package|private|protected|public)$/i.test(value)) {
doclet.access = value.toLowerCase();
}
else {
} else {
delete doclet.access;
}
}
},
},
alias: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.alias = value;
}
},
},
async: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.async = true;
}
},
},
augments: {
mustHaveValue: true,
// Allow augments value to be specified as a normal type, e.g. {Type}
onTagText: parseTypeText,
onTagged(doclet, {value}) {
doclet.augment( firstWordOf(value) );
onTagged(doclet, { value }) {
doclet.augment(firstWordOf(value));
},
synonyms: ['extends']
synonyms: ['extends'],
},
author: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.author = doclet.author || [];
doclet.author.push(value);
}
},
},
// this symbol has a member that should use the same docs as another symbol
borrows: {
@@ -316,7 +313,7 @@ let baseTags = exports.baseTags = {
const borrows = parseBorrows(doclet, tag);

doclet.borrow(borrows.target, borrows.source);
}
},
},
class: {
onTagged(doclet, tag) {
@@ -328,8 +325,10 @@ let baseTags = exports.baseTags = {
if (tag.originalTitle === 'class') {
// multiple words after @class?
looksLikeDesc = (tag.value || '').match(/\S+\s+\S+/);
if ((looksLikeDesc || /@construct(s|or)\b/i.test(doclet.comment)) &&
!/@classdesc\b/i.test(doclet.comment)) {
if (
(looksLikeDesc || /@construct(s|or)\b/i.test(doclet.comment)) &&
!/@classdesc\b/i.test(doclet.comment)
) {
// treat the @class tag as a @classdesc tag instead
doclet.classdesc = tag.value;

@@ -339,12 +338,12 @@ let baseTags = exports.baseTags = {

setDocletNameToValue(doclet, tag);
},
synonyms: ['constructor']
synonyms: ['constructor'],
},
classdesc: {
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.classdesc = value;
}
},
},
constant: {
canHaveType: true,
@@ -354,36 +353,33 @@ let baseTags = exports.baseTags = {
setDocletNameToValueName(doclet, tag);
setDocletTypeToValueType(doclet, tag);
},
synonyms: ['const']
synonyms: ['const'],
},
constructs: {
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
let ownerClassName;

if (!value) {
// this can be resolved later in the handlers
ownerClassName = '{@thisClass}';
}
else {
} else {
ownerClassName = firstWordOf(value);
}
doclet.addTag('alias', ownerClassName);
doclet.addTag('kind', 'class');
}
},
},
copyright: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.copyright = value;
}
},
},
default: {
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
if (value) {
doclet.defaultvalue = value;
}
else if (doclet.meta && doclet.meta.code &&
typeof doclet.meta.code.value !== 'undefined') {
} else if (doclet.meta && doclet.meta.code && typeof doclet.meta.code.value !== 'undefined') {
switch (doclet.meta.code.type) {
case Syntax.ArrayExpression:
doclet.defaultvalue = nodeToValue(doclet.meta.code.node);
@@ -405,13 +401,13 @@ let baseTags = exports.baseTags = {
}
}
},
synonyms: ['defaultvalue']
synonyms: ['defaultvalue'],
},
deprecated: {
// value is optional
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.deprecated = value || true;
}
},
},
enum: {
canHaveType: true,
@@ -419,33 +415,33 @@ let baseTags = exports.baseTags = {
doclet.kind = doclet.kind || 'member';
doclet.isEnum = true;
setDocletTypeToValueType(doclet, tag);
}
},
},
event: {
isNamespace: true,
onTagged(doclet, tag) {
setDocletKindToTitle(doclet, tag);
setDocletNameToValue(doclet, tag);
}
},
},
example: {
keepsWhitespace: true,
removesIndent: true,
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.examples = doclet.examples || [];
doclet.examples.push(value);
}
},
},
exports: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
const modName = firstWordOf(value);

// in case the user wrote something like `/** @exports module:foo */`:
doclet.addTag( 'alias', stripModuleNamespace(modName) );
doclet.addTag('alias', stripModuleNamespace(modName));
doclet.addTag('kind', 'module');
}
},
},
external: {
canHaveType: true,
@@ -455,12 +451,11 @@ let baseTags = exports.baseTags = {
if (tag.value && tag.value.type) {
setDocletTypeToValueType(doclet, tag);
doclet.addTag('name', doclet.type.names[0]);
}
else {
} else {
setDocletNameToValue(doclet, tag);
}
},
synonyms: ['host']
synonyms: ['host'],
},
file: {
onTagged(doclet, tag) {
@@ -470,7 +465,7 @@ let baseTags = exports.baseTags = {

doclet.preserveName = true;
},
synonyms: ['fileoverview', 'overview']
synonyms: ['fileoverview', 'overview'],
},
fires: {
mustHaveValue: true,
@@ -479,64 +474,64 @@ let baseTags = exports.baseTags = {
applyNamespaceToTag('event', tag);
doclet.fires.push(tag.value);
},
synonyms: ['emits']
synonyms: ['emits'],
},
function: {
onTagged(doclet, tag) {
setDocletKindToTitle(doclet, tag);
setDocletNameToValue(doclet, tag);
},
synonyms: ['func', 'method']
synonyms: ['func', 'method'],
},
generator: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.generator = true;
}
},
},
global: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.scope = SCOPE.NAMES.GLOBAL;
delete doclet.memberof;
}
},
},
hideconstructor: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.hideconstructor = true;
}
},
},
ignore: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.ignore = true;
}
},
},
implements: {
mustHaveValue: true,
onTagText: parseTypeText,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.implements = doclet.implements || [];
doclet.implements.push(value);
}
},
},
inheritdoc: {
mustNotHaveValue: true,
onTagged(doclet) {
// use an empty string so JSDoc can support `@inheritdoc Foo#bar` in the future
doclet.inheritdoc = '';
}
},
},
inner: {
onTagged(doclet, tag) {
setDocletScopeToTitle(doclet, tag);
}
},
},
instance: {
onTagged(doclet, tag) {
setDocletScopeToTitle(doclet, tag);
}
},
},
interface: {
canHaveName: true,
@@ -545,19 +540,19 @@ let baseTags = exports.baseTags = {
if (tag.value) {
setDocletNameToValueName(doclet, tag);
}
}
},
},
lends: {
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.alias = value || LONGNAMES.GLOBAL;
doclet.addTag('undocumented');
}
},
},
license: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.license = value;
}
},
},
listens: {
mustHaveValue: true,
@@ -565,7 +560,7 @@ let baseTags = exports.baseTags = {
doclet.listens = doclet.listens || [];
applyNamespaceToTag('event', tag);
doclet.listens.push(tag.value);
}
},
},
member: {
canHaveType: true,
@@ -575,7 +570,7 @@ let baseTags = exports.baseTags = {
setDocletNameToValueName(doclet, tag);
setDocletTypeToValueType(doclet, tag);
},
synonyms: ['var']
synonyms: ['var'],
},
memberof: {
mustHaveValue: true,
@@ -589,29 +584,29 @@ let baseTags = exports.baseTags = {
}
setDocletMemberof(doclet, tag);
},
synonyms: ['memberof!']
synonyms: ['memberof!'],
},
// this symbol mixes in all of the specified object's members
mixes: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
const source = firstWordOf(value);

doclet.mix(source);
}
},
},
mixin: {
onTagged(doclet, tag) {
setDocletKindToTitle(doclet, tag);
setDocletNameToValue(doclet, tag);
}
},
},
modifies: {
canHaveType: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.modifies = doclet.modifies || [];
doclet.modifies.push(value);
}
},
},
module: {
canHaveType: true,
@@ -626,7 +621,7 @@ let baseTags = exports.baseTags = {
doclet.name = stripModuleNamespace(doclet.name);

setDocletTypeToValueType(doclet, tag);
}
},
},
namespace: {
canHaveType: true,
@@ -634,64 +629,64 @@ let baseTags = exports.baseTags = {
setDocletKindToTitle(doclet, tag);
setDocletNameToValue(doclet, tag);
setDocletTypeToValueType(doclet, tag);
}
},
},
package: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.access = 'package';
}
},
},
param: {
canHaveType: true,
canHaveName: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.params = doclet.params || [];
doclet.params.push(value || {});
},
synonyms: ['arg', 'argument']
synonyms: ['arg', 'argument'],
},
private: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.access = 'private';
}
},
},
property: {
mustHaveValue: true,
canHaveType: true,
canHaveName: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.properties = doclet.properties || [];
doclet.properties.push(value);
},
synonyms: ['prop']
synonyms: ['prop'],
},
protected: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.access = 'protected';
}
},
},
public: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.access = 'public';
}
},
},
readonly: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.readonly = true;
}
},
},
requires: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
let requiresName;

// inline link tags are passed through as-is so that `@requires {@link foo}` works
if ( isInlineTag(value, 'link\\S*') ) {
if (isInlineTag(value, 'link\\S*')) {
requiresName = value;
}
// otherwise, assume it's a module
@@ -704,62 +699,62 @@ let baseTags = exports.baseTags = {

doclet.requires = doclet.requires || [];
doclet.requires.push(requiresName);
}
},
},
returns: {
mustHaveValue: true,
canHaveType: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.returns = doclet.returns || [];
doclet.returns.push(value);
},
synonyms: ['return']
synonyms: ['return'],
},
see: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.see = doclet.see || [];
doclet.see.push(value);
}
},
},
since: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.since = value;
}
},
},
static: {
onTagged(doclet, tag) {
setDocletScopeToTitle(doclet, tag);
}
},
},
summary: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.summary = value;
}
},
'this': {
},
this: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.this = firstWordOf(value);
}
},
},
todo: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.todo = doclet.todo || [];
doclet.todo.push(value);
}
},
},
throws: {
mustHaveValue: true,
canHaveType: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.exceptions = doclet.exceptions || [];
doclet.exceptions.push(value);
},
synonyms: ['exception']
synonyms: ['exception'],
},
type: {
mustHaveValue: true,
@@ -781,7 +776,7 @@ let baseTags = exports.baseTags = {
closeIdx = text.indexOf(CLOSE_BRACE);

// a type expression is at least one character long
if ( openIdx !== 0 || closeIdx <= openIdx + 1) {
if (openIdx !== 0 || closeIdx <= openIdx + 1) {
text = OPEN_BRACE + text + CLOSE_BRACE;
}

@@ -796,7 +791,7 @@ let baseTags = exports.baseTags = {
doclet.addTag('returns', tag.text);
}
}
}
},
},
typedef: {
canHaveType: true,
@@ -810,46 +805,43 @@ let baseTags = exports.baseTags = {
// callbacks are always type {function}
if (tag.originalTitle === 'callback') {
doclet.type = {
names: [
'function'
]
names: ['function'],
};
}
else {
} else {
setDocletTypeToValueType(doclet, tag);
}
}
},
synonyms: ['callback']
synonyms: ['callback'],
},
variation: {
mustHaveValue: true,
onTagged(doclet, tag) {
let value = tag.value;

if ( /^\((.+)\)$/.test(value) ) {
if (/^\((.+)\)$/.test(value)) {
value = RegExp.$1;
}

doclet.variation = value;
}
},
},
version: {
mustHaveValue: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.version = value;
}
},
},
yields: {
mustHaveValue: true,
canHaveType: true,
onTagged(doclet, {value}) {
onTagged(doclet, { value }) {
doclet.yields = doclet.yields || [];
doclet.yields.push(value);
},
synonyms: ['yield']
}
};
synonyms: ['yield'],
},
});

baseTags = _.extend(baseTags, internalTags);

@@ -865,7 +857,7 @@ exports.closureTags = {
setDocletTypeToValueType(doclet, tag);
},
// Closure Compiler only
synonyms: ['define']
synonyms: ['define'],
},
constructor: cloneTagDef(baseTags.class),
deprecated: cloneTagDef(baseTags.deprecated),
@@ -884,7 +876,7 @@ exports.closureTags = {
setDocletDescriptionToValue(doclet, tag);

doclet.preserveName = true;
}
},
},
final: cloneTagDef(baseTags.readonly),
implements: cloneTagDef(baseTags.implements),
@@ -895,7 +887,7 @@ exports.closureTags = {
canHaveName: false,
mustNotHaveValue: true,
// Closure Compiler only
synonyms: ['record']
synonyms: ['record'],
}),
lends: cloneTagDef(baseTags.lends),
license: cloneTagDef(baseTags.license),
@@ -910,14 +902,14 @@ exports.closureTags = {
nosideeffects: {
onTagged(doclet) {
doclet.modifies = [];
}
},
},
// Closure Compiler only
override: {
mustNotHaveValue: true,
onTagged(doclet) {
doclet.override = true;
}
},
},
package: {
canHaveType: true,
@@ -927,7 +919,7 @@ exports.closureTags = {
if (tag.value && tag.value.type) {
setDocletTypeToValueType(doclet, tag);
}
}
},
},
param: cloneTagDef(baseTags.param),
// Closure Compiler only
@@ -944,7 +936,7 @@ exports.closureTags = {
if (tag.value && tag.value.type) {
setDocletTypeToValueType(doclet, tag);
}
}
},
},
protected: {
canHaveType: true,
@@ -954,7 +946,7 @@ exports.closureTags = {
if (tag.value && tag.value.type) {
setDocletTypeToValueType(doclet, tag);
}
}
},
},
public: {
canHaveType: true,
@@ -964,7 +956,7 @@ exports.closureTags = {
if (tag.value && tag.value.type) {
setDocletTypeToValueType(doclet, tag);
}
}
},
},
return: cloneTagDef(baseTags.returns),
// Closure Compiler only
@@ -973,23 +965,23 @@ exports.closureTags = {
suppress: NOOP_TAG,
// Closure Compiler only
template: NOOP_TAG,
'this': {
this: {
canHaveType: true,
onTagged(doclet, tag) {
doclet.this = combineTypes(tag);
}
},
},
throws: cloneTagDef(baseTags.throws),
type: cloneTagDef(baseTags.type, {
mustNotHaveDescription: false
mustNotHaveDescription: false,
}),
typedef: {
canHaveType: true,
onTagged(doclet, tag) {
setDocletKindToTitle(doclet, tag);
setDocletTypeToValueType(doclet, tag);
}
},
},
// Closure Compiler only
unrestricted: NOOP_TAG
unrestricted: NOOP_TAG,
};

@@ -5,7 +5,7 @@
const env = require('jsdoc/env');
const { log } = require('@jsdoc/util');

function buildMessage(tagName, {filename, lineno, comment}, desc) {
function buildMessage(tagName, { filename, lineno, comment }, desc) {
let result = `The @${tagName} tag ${desc}. File: ${filename}, line: ${lineno}`;

if (comment) {
@@ -18,15 +18,16 @@ function buildMessage(tagName, {filename, lineno, comment}, desc) {
/**
 * Validate the given tag.
 */
exports.validate = ({title, text, value}, tagDef, meta) => {
exports.validate = ({ title, text, value }, tagDef, meta) => {
const allowUnknownTags = env.conf.tags.allowUnknownTags;

// handle cases where the tag definition does not exist
if (!tagDef) {
// log an error if unknown tags are not allowed
if (!allowUnknownTags ||
(Array.isArray(allowUnknownTags) &&
!allowUnknownTags.includes(title))) {
if (
!allowUnknownTags ||
(Array.isArray(allowUnknownTags) && !allowUnknownTags.includes(title))
) {
log.error(buildMessage(title, meta, 'is not a known tag'));
}

@@ -41,11 +42,10 @@ exports.validate = ({title, text, value}, tagDef, meta) => {

// check for minor issues that are usually harmless
else if (text && tagDef.mustNotHaveValue) {
log.warn(buildMessage(title, meta,
'does not permit a value; the value will be ignored'));
}
else if (value && value.description && tagDef.mustNotHaveDescription) {
log.warn(buildMessage(title, meta,
'does not permit a description; the description will be ignored'));
log.warn(buildMessage(title, meta, 'does not permit a value; the value will be ignored'));
} else if (value && value.description && tagDef.mustNotHaveDescription) {
log.warn(
buildMessage(title, meta, 'does not permit a description; the description will be ignored')
);
}
};

@@ -21,7 +21,7 @@ class Template {
this.settings = {
evaluate: /<\?js([\s\S]+?)\?>/g,
interpolate: /<\?js=([\s\S]+?)\?>/g,
escape: /<\?js~([\s\S]+?)\?>/g
escape: /<\?js~([\s\S]+?)\?>/g,
};
}


@@ -28,20 +28,20 @@ const linkMap = {
urlToLongname: {},

// one-way lookup (IDs are only unique per file)
longnameToId: {}
longnameToId: {},
};

const longnameToUrl = exports.longnameToUrl = linkMap.longnameToUrl;
const longnameToId = exports.longnameToId = linkMap.longnameToId;
const longnameToUrl = (exports.longnameToUrl = linkMap.longnameToUrl);
const longnameToId = (exports.longnameToId = linkMap.longnameToId);

const registerLink = exports.registerLink = (longname, fileUrl) => {
const registerLink = (exports.registerLink = (longname, fileUrl) => {
linkMap.longnameToUrl[longname] = fileUrl;
linkMap.urlToLongname[fileUrl] = longname;
};
});

const registerId = exports.registerId = (longname, fragment) => {
const registerId = (exports.registerId = (longname, fragment) => {
linkMap.longnameToId[longname] = fragment;
};
});

function getNamespace(kind) {
if (dictionary.isNamespace(kind)) {
@@ -77,7 +77,7 @@ function makeUniqueFilename(filename, str) {

// append enough underscores to make the filename unique
while (nonUnique) {
if ( hasOwnProp.call(files, key) ) {
if (hasOwnProp.call(files, key)) {
filename += '_';
key = filename.toLowerCase();
} else {
@@ -103,7 +103,7 @@ function makeUniqueFilename(filename, str) {
 * @param {string} str The string to convert.
 * @return {string} The filename to use for the string.
 */
const getUniqueFilename = exports.getUniqueFilename = str => {
const getUniqueFilename = (exports.getUniqueFilename = (str) => {
const namespaces = dictionary.getNamespaces().join('|');
let basename = (str || '')
// use - instead of : in namespace prefixes
@@ -125,7 +125,7 @@ const getUniqueFilename = exports.getUniqueFilename = str => {
basename = basename.length ? basename : '_';

return makeUniqueFilename(basename, str) + exports.fileExtension;
};
});

/**
 * Get a longname's filename if one has been registered; otherwise, generate a unique filename, then
@@ -135,10 +135,9 @@ const getUniqueFilename = exports.getUniqueFilename = str => {
function getFilename(longname) {
let fileUrl;

if ( hasOwnProp.call(longnameToUrl, longname) ) {
if (hasOwnProp.call(longnameToUrl, longname)) {
fileUrl = longnameToUrl[longname];
}
else {
} else {
fileUrl = getUniqueFilename(longname);
registerLink(longname, fileUrl);
}
@@ -156,8 +155,12 @@ function getFilename(longname) {
 * `false`.
 */
function isModuleExports(doclet) {
return doclet.longname && doclet.longname === doclet.name &&
doclet.longname.indexOf(MODULE_NAMESPACE) === 0 && doclet.kind !== 'module';
return (
doclet.longname &&
doclet.longname === doclet.name &&
doclet.longname.indexOf(MODULE_NAMESPACE) === 0 &&
doclet.kind !== 'module'
);
}

function makeUniqueId(filename, id) {
@@ -171,11 +174,10 @@ function makeUniqueId(filename, id) {

// append enough underscores to make the identifier unique
while (nonUnique) {
if ( hasOwnProp.call(ids, filename) && hasOwnProp.call(ids[filename], key) ) {
if (hasOwnProp.call(ids, filename) && hasOwnProp.call(ids[filename], key)) {
id += '_';
key = id.toLowerCase();
}
else {
} else {
nonUnique = false;
}
}
@@ -192,14 +194,12 @@ function makeUniqueId(filename, id) {
 * @private
 */
function getId(longname, id) {
if ( hasOwnProp.call(longnameToId, longname) ) {
if (hasOwnProp.call(longnameToId, longname)) {
id = longnameToId[longname];
}
else if (!id) {
} else if (!id) {
// no ID required
return '';
}
else {
} else {
id = makeUniqueId(longname, id);
registerId(longname, id);
}
@@ -220,22 +220,20 @@ function getId(longname, id) {
 */
exports.getUniqueId = makeUniqueId;

const htmlsafe = exports.htmlsafe = str => {
const htmlsafe = (exports.htmlsafe = (str) => {
if (typeof str !== 'string') {
str = String(str);
}

return str.replace(/&/g, '&amp;')
.replace(/</g, '&lt;');
};
return str.replace(/&/g, '&amp;').replace(/</g, '&lt;');
});

function parseType(longname) {
let err;

try {
return catharsis.parse(longname, {jsdoc: true});
}
catch (e) {
return catharsis.parse(longname, { jsdoc: true });
} catch (e) {
err = new Error(`unable to parse ${longname}: ${e.message}`);
log.error(err);

@ -247,12 +245,12 @@ function stringifyType(parsedType, cssClass, stringifyLinkMap) {
|
||||
return require('catharsis').stringify(parsedType, {
|
||||
cssClass: cssClass,
|
||||
htmlSafe: true,
|
||||
links: stringifyLinkMap
|
||||
links: stringifyLinkMap,
|
||||
});
|
||||
}
|
||||
|
||||
function hasUrlPrefix(text) {
|
||||
return (/^(http|ftp)s?:\/\//).test(text);
|
||||
return /^(http|ftp)s?:\/\//.test(text);
|
||||
}
|
||||
|
||||
function isComplexTypeExpression(expr) {
|
||||
@ -311,19 +309,22 @@ function buildLink(longname, linkText, options) {
|
||||
// @see <http://example.org>
|
||||
// @see http://example.org
|
||||
stripped = longname ? longname.replace(/^<|>$/g, '') : '';
|
||||
if ( hasUrlPrefix(stripped) ) {
|
||||
if (hasUrlPrefix(stripped)) {
|
||||
fileUrl = stripped;
|
||||
text = linkText || stripped;
|
||||
}
|
||||
// handle complex type expressions that may require multiple links
|
||||
// (but skip anything that looks like an inline tag or HTML tag)
|
||||
else if (longname && isComplexTypeExpression(longname) && /\{@.+\}/.test(longname) === false &&
|
||||
/^<[\s\S]+>/.test(longname) === false) {
|
||||
else if (
|
||||
longname &&
|
||||
isComplexTypeExpression(longname) &&
|
||||
/\{@.+\}/.test(longname) === false &&
|
||||
/^<[\s\S]+>/.test(longname) === false
|
||||
) {
|
||||
parsedType = parseType(longname);
|
||||
|
||||
return stringifyType(parsedType, options.cssClass, options.linkMap);
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
fileUrl = hasOwnProp.call(options.linkMap, longname) ? options.linkMap[longname] : '';
|
||||
text = linkText || (options.shortenName ? getShortName(longname) : longname);
|
||||
}
|
||||
@ -332,8 +333,7 @@ function buildLink(longname, linkText, options) {
|
||||
|
||||
if (!fileUrl) {
|
||||
return text;
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
return `<a href="${encodeURI(fileUrl + fragmentString)}"${classString}>${text}</a>`;
|
||||
}
|
||||
}
|
||||
@ -358,27 +358,25 @@ function buildLink(longname, linkText, options) {
|
||||
* append to the link target.
|
||||
* @return {string} The HTML link, or a plain-text string if the link is not available.
|
||||
*/
|
||||
const linkto = exports.linkto = (longname, linkText, cssClass, fragmentId) => buildLink(longname, linkText, {
|
||||
const linkto = (exports.linkto = (longname, linkText, cssClass, fragmentId) =>
|
||||
buildLink(longname, linkText, {
|
||||
cssClass: cssClass,
|
||||
fragmentId: fragmentId,
|
||||
linkMap: longnameToUrl
|
||||
});
|
||||
linkMap: longnameToUrl,
|
||||
}));
|
||||
|
||||
function useMonospace(tag, text) {
|
||||
let cleverLinks;
|
||||
let monospaceLinks;
|
||||
let result;
|
||||
|
||||
if ( hasUrlPrefix(text) ) {
|
||||
if (hasUrlPrefix(text)) {
|
||||
result = false;
|
||||
}
|
||||
else if (tag === 'linkplain') {
|
||||
} else if (tag === 'linkplain') {
|
||||
result = false;
|
||||
}
|
||||
else if (tag === 'linkcode') {
|
||||
} else if (tag === 'linkcode') {
|
||||
result = true;
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
cleverLinks = env.conf.templates.cleverLinks;
|
||||
monospaceLinks = env.conf.templates.monospaceLinks;
|
||||
|
||||
@ -410,7 +408,7 @@ function splitLinkText(text) {
|
||||
|
||||
return {
|
||||
linkText: linkText,
|
||||
target: target || text
|
||||
target: target || text,
|
||||
};
|
||||
}
|
||||
|
||||
@ -428,7 +426,7 @@ function shouldShortenLongname() {
|
||||
* @param {string} str - The string to search for `{@link ...}` tags.
|
||||
* @return {string} The linkified text.
|
||||
*/
|
||||
exports.resolveLinks = str => {
|
||||
exports.resolveLinks = (str) => {
|
||||
let replacers;
|
||||
|
||||
function extractLeadingText(string, completeTag) {
|
||||
@ -450,11 +448,11 @@ exports.resolveLinks = str => {
|
||||
|
||||
return {
|
||||
leadingText: leadingText,
|
||||
string: string
|
||||
string: string,
|
||||
};
|
||||
}
|
||||
|
||||
function processLink(string, {completeTag, text, tag}) {
|
||||
function processLink(string, { completeTag, text, tag }) {
|
||||
const leading = extractLeadingText(string, completeTag);
|
||||
let linkText = leading.leadingText;
|
||||
let monospace;
|
||||
@ -469,17 +467,20 @@ exports.resolveLinks = str => {
|
||||
|
||||
monospace = useMonospace(tag, text);
|
||||
|
||||
return string.replace( completeTag, buildLink(target, linkText, {
|
||||
return string.replace(
|
||||
completeTag,
|
||||
buildLink(target, linkText, {
|
||||
linkMap: longnameToUrl,
|
||||
monospace: monospace,
|
||||
shortenName: shouldShortenLongname()
|
||||
}) );
|
||||
shortenName: shouldShortenLongname(),
|
||||
})
|
||||
);
|
||||
}
|
||||
|
||||
replacers = {
|
||||
link: processLink,
|
||||
linkcode: processLink,
|
||||
linkplain: processLink
|
||||
linkplain: processLink,
|
||||
};
|
||||
|
||||
return inline.replaceInlineTags(str, replacers).newString;
|
||||
@ -491,7 +492,7 @@ exports.resolveLinks = str => {
|
||||
* @param {string} str - The tag text.
|
||||
* @return {string} The linkified text.
|
||||
*/
|
||||
exports.resolveAuthorLinks = str => {
|
||||
exports.resolveAuthorLinks = (str) => {
|
||||
let author = '';
|
||||
let matches;
|
||||
|
||||
@ -500,8 +501,7 @@ exports.resolveAuthorLinks = str => {
|
||||
|
||||
if (matches && matches.length === 3) {
|
||||
author = `<a href="mailto:${matches[2]}">${htmlsafe(matches[1])}</a>`;
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
author = htmlsafe(str);
|
||||
}
|
||||
}
|
||||
@ -519,7 +519,7 @@ exports.resolveAuthorLinks = str => {
|
||||
* does not match.
|
||||
* @return {array<object>} The matching items.
|
||||
*/
|
||||
const find = exports.find = (data, spec) => data(spec).get();
|
||||
const find = (exports.find = (data, spec) => data(spec).get());
|
||||
|
||||
/**
|
||||
* Retrieve all of the following types of members from a set of doclets:
|
||||
@ -535,33 +535,33 @@ const find = exports.find = (data, spec) => data(spec).get();
|
||||
* @return {object} An object with `classes`, `externals`, `globals`, `mixins`, `modules`,
|
||||
* `events`, and `namespaces` properties. Each property contains an array of objects.
|
||||
*/
|
||||
exports.getMembers = data => {
|
||||
exports.getMembers = (data) => {
|
||||
const members = {
|
||||
classes: find( data, {kind: 'class'} ),
|
||||
externals: find( data, {kind: 'external'} ),
|
||||
events: find( data, {kind: 'event'} ),
|
||||
classes: find(data, { kind: 'class' }),
|
||||
externals: find(data, { kind: 'external' }),
|
||||
events: find(data, { kind: 'event' }),
|
||||
globals: find(data, {
|
||||
kind: ['member', 'function', 'constant', 'typedef'],
|
||||
memberof: { isUndefined: true }
|
||||
memberof: { isUndefined: true },
|
||||
}),
|
||||
mixins: find( data, {kind: 'mixin'} ),
|
||||
modules: find( data, {kind: 'module'} ),
|
||||
namespaces: find( data, {kind: 'namespace'} ),
|
||||
interfaces: find( data, {kind: 'interface'} )
|
||||
mixins: find(data, { kind: 'mixin' }),
|
||||
modules: find(data, { kind: 'module' }),
|
||||
namespaces: find(data, { kind: 'namespace' }),
|
||||
interfaces: find(data, { kind: 'interface' }),
|
||||
};
|
||||
|
||||
// strip quotes from externals, since we allow quoted names that would normally indicate a
|
||||
// namespace hierarchy (as in `@external "jquery.fn"`)
|
||||
// TODO: we should probably be doing this for other types of symbols, here or elsewhere; see
|
||||
// jsdoc3/jsdoc#396
|
||||
members.externals = members.externals.map(doclet => {
|
||||
members.externals = members.externals.map((doclet) => {
|
||||
doclet.name = doclet.name.replace(/(^"|"$)/g, '');
|
||||
|
||||
return doclet;
|
||||
});
|
||||
|
||||
// functions that are also modules (as in `module.exports = function() {};`) are not globals
|
||||
members.globals = members.globals.filter(doclet => !isModuleExports(doclet));
|
||||
members.globals = members.globals.filter((doclet) => !isModuleExports(doclet));
|
||||
|
||||
return members;
|
||||
};
|
||||
@ -572,7 +572,7 @@ exports.getMembers = data => {
|
||||
* @param {object} d The doclet whose attributes will be retrieved.
|
||||
* @return {array<string>} The member attributes for the doclet.
|
||||
*/
|
||||
exports.getAttribs = d => {
|
||||
exports.getAttribs = (d) => {
|
||||
const attribs = [];
|
||||
|
||||
if (!d) {
|
||||
@ -613,8 +613,7 @@ exports.getAttribs = d => {
|
||||
|
||||
if (d.nullable === true) {
|
||||
attribs.push('nullable');
|
||||
}
|
||||
else if (d.nullable === false) {
|
||||
} else if (d.nullable === false) {
|
||||
attribs.push('non-null');
|
||||
}
|
||||
|
||||
@ -628,7 +627,7 @@ exports.getAttribs = d => {
|
||||
* @param {string} [cssClass] - The CSS class to include in the `class` attribute for each link.
|
||||
* @return {Array.<string>} HTML links to allowed types for the member.
|
||||
*/
|
||||
exports.getSignatureTypes = ({type}, cssClass) => {
|
||||
exports.getSignatureTypes = ({ type }, cssClass) => {
|
||||
let types = [];
|
||||
|
||||
if (type && type.names) {
|
||||
@ -636,7 +635,7 @@ exports.getSignatureTypes = ({type}, cssClass) => {
|
||||
}
|
||||
|
||||
if (types && types.length) {
|
||||
types = types.map(t => linkto(t, htmlsafe(t), cssClass));
|
||||
types = types.map((t) => linkto(t, htmlsafe(t), cssClass));
|
||||
}
|
||||
|
||||
return types;
|
||||
@ -652,16 +651,15 @@ exports.getSignatureTypes = ({type}, cssClass) => {
|
||||
* @return {array<string>} An array of parameter names, with or without `<span>` tags wrapping the
|
||||
* names of optional parameters.
|
||||
*/
|
||||
exports.getSignatureParams = ({params}, optClass) => {
|
||||
exports.getSignatureParams = ({ params }, optClass) => {
|
||||
const pnames = [];
|
||||
|
||||
if (params) {
|
||||
params.forEach(p => {
|
||||
params.forEach((p) => {
|
||||
if (p.name && !p.name.includes('.')) {
|
||||
if (p.optional && optClass) {
|
||||
pnames.push(`<span class="${optClass}">${p.name}</span>`);
|
||||
}
|
||||
else {
|
||||
} else {
|
||||
pnames.push(p.name);
|
||||
}
|
||||
}
|
||||
@ -678,11 +676,11 @@ exports.getSignatureParams = ({params}, optClass) => {
|
||||
* @param {string} [cssClass] - The CSS class to include in the `class` attribute for each link.
|
||||
* @return {Array.<string>} HTML links to types that the member can return or yield.
|
||||
*/
|
||||
exports.getSignatureReturns = ({yields, returns}, cssClass) => {
|
||||
exports.getSignatureReturns = ({ yields, returns }, cssClass) => {
|
||||
let returnTypes = [];
|
||||
|
||||
if (yields || returns) {
|
||||
(yields || returns).forEach(r => {
|
||||
(yields || returns).forEach((r) => {
|
||||
if (r && r.type && r.type.names) {
|
||||
if (!returnTypes.length) {
|
||||
returnTypes = r.type.names;
|
||||
@ -692,7 +690,7 @@ exports.getSignatureReturns = ({yields, returns}, cssClass) => {
|
||||
}
|
||||
|
||||
if (returnTypes && returnTypes.length) {
|
||||
returnTypes = returnTypes.map(r => linkto(r, htmlsafe(r), cssClass));
|
||||
returnTypes = returnTypes.map((r) => linkto(r, htmlsafe(r), cssClass));
|
||||
}
|
||||
|
||||
return returnTypes;
|
||||
@ -713,7 +711,7 @@ exports.getAncestors = (data, doclet) => {
|
||||
|
||||
while (doc) {
|
||||
previousDoc = doc;
|
||||
doc = find(data, {longname: doc.memberof})[0];
|
||||
doc = find(data, { longname: doc.memberof })[0];
|
||||
|
||||
// prevent infinite loop that can be caused by duplicated module definitions
|
||||
if (previousDoc === doc) {
|
||||
@ -740,7 +738,7 @@ exports.getAncestorLinks = (data, doclet, cssClass) => {
|
||||
const ancestors = exports.getAncestors(data, doclet);
|
||||
const links = [];
|
||||
|
||||
ancestors.forEach(ancestor => {
|
||||
ancestors.forEach((ancestor) => {
|
||||
const linkText = (SCOPE_TO_PUNC[ancestor.scope] || '') + ancestor.name;
|
||||
const link = linkto(ancestor.longname, linkText, cssClass);
|
||||
|
||||
@ -748,7 +746,7 @@ exports.getAncestorLinks = (data, doclet, cssClass) => {
|
||||
});
|
||||
|
||||
if (links.length) {
|
||||
links[links.length - 1] += (SCOPE_TO_PUNC[doclet.scope] || '');
|
||||
links[links.length - 1] += SCOPE_TO_PUNC[doclet.scope] || '';
|
||||
}
|
||||
|
||||
return links;
|
||||
@ -760,7 +758,7 @@ exports.getAncestorLinks = (data, doclet, cssClass) => {
|
||||
*
|
||||
* @param {TAFFY} data - The TaffyDB database to search.
|
||||
*/
|
||||
exports.addEventListeners = data => {
|
||||
exports.addEventListeners = (data) => {
|
||||
// just a cache to prevent me doing so many lookups
|
||||
const _events = {};
|
||||
let doc;
|
||||
@ -768,7 +766,7 @@ exports.addEventListeners = data => {
|
||||
// TODO: do this on the *pruned* data
|
||||
// find all doclets that @listen to something.
|
||||
/* eslint-disable no-invalid-this */
|
||||
const listeners = find(data, function() {
|
||||
const listeners = find(data, function () {
|
||||
return this.listens && this.listens.length;
|
||||
});
|
||||
/* eslint-enable no-invalid-this */
|
||||
@ -777,12 +775,14 @@ exports.addEventListeners = data => {
|
||||
return;
|
||||
}
|
||||
|
||||
listeners.forEach(({listens, longname}) => {
|
||||
listeners.forEach(({ listens, longname }) => {
|
||||
l = listens;
|
||||
l.forEach(eventLongname => {
|
||||
doc = _events[eventLongname] || find(data, {
|
||||
l.forEach((eventLongname) => {
|
||||
doc =
|
||||
_events[eventLongname] ||
|
||||
find(data, {
|
||||
longname: eventLongname,
|
||||
kind: 'event'
|
||||
kind: 'event',
|
||||
})[0];
|
||||
if (doc) {
|
||||
if (!doc.listeners) {
|
||||
@ -807,26 +807,29 @@ exports.addEventListeners = data => {
|
||||
* @param {TAFFY} data The TaffyDB database to prune.
|
||||
* @return {TAFFY} The pruned database.
|
||||
*/
|
||||
exports.prune = data => {
|
||||
data({undocumented: true}).remove();
|
||||
data({ignore: true}).remove();
|
||||
data({memberof: '<anonymous>'}).remove();
|
||||
exports.prune = (data) => {
|
||||
data({ undocumented: true }).remove();
|
||||
data({ ignore: true }).remove();
|
||||
data({ memberof: '<anonymous>' }).remove();
|
||||
|
||||
if (!env.opts.access || (env.opts.access && !env.opts.access.includes('all'))) {
|
||||
if (env.opts.access && !env.opts.access.includes('package')) {
|
||||
data({access: 'package'}).remove();
|
||||
data({ access: 'package' }).remove();
|
||||
}
|
||||
if (env.opts.access && !env.opts.access.includes('public')) {
|
||||
data({access: 'public'}).remove();
|
||||
data({ access: 'public' }).remove();
|
||||
}
|
||||
if (env.opts.access && !env.opts.access.includes('protected')) {
|
||||
data({access: 'protected'}).remove();
|
||||
data({ access: 'protected' }).remove();
|
||||
}
|
||||
if (!env.opts.private && (!env.opts.access || (env.opts.access && !env.opts.access.includes('private')))) {
|
||||
data({access: 'private'}).remove();
|
||||
if (
|
||||
!env.opts.private &&
|
||||
(!env.opts.access || (env.opts.access && !env.opts.access.includes('private')))
|
||||
) {
|
||||
data({ access: 'private' }).remove();
|
||||
}
|
||||
if (env.opts.access && !env.opts.access.includes('undefined')) {
|
||||
data({access: {isUndefined: true}}).remove();
|
||||
data({ access: { isUndefined: true } }).remove();
|
||||
}
|
||||
}
|
||||
|
||||
@ -845,7 +848,7 @@ exports.prune = data => {
|
||||
* @param {module:jsdoc/doclet.Doclet} doclet - The doclet that will be used to create the URL.
|
||||
* @return {string} The URL to the generated documentation for the doclet.
|
||||
*/
|
||||
exports.createLink = doclet => {
|
||||
exports.createLink = (doclet) => {
|
||||
let fakeContainer;
|
||||
let filename;
|
||||
let fileUrl;
|
||||
@ -865,11 +868,11 @@ exports.createLink = doclet => {
|
||||
}
|
||||
|
||||
// the doclet gets its own HTML file
|
||||
if ( containers.includes(doclet.kind) || isModuleExports(doclet) ) {
|
||||
if (containers.includes(doclet.kind) || isModuleExports(doclet)) {
|
||||
filename = getFilename(longname);
|
||||
}
|
||||
// mistagged version of a doclet that gets its own HTML file
|
||||
else if ( !containers.includes(doclet.kind) && fakeContainer ) {
|
||||
else if (!containers.includes(doclet.kind) && fakeContainer) {
|
||||
filename = getFilename(doclet.memberof || longname);
|
||||
if (doclet.name !== doclet.longname) {
|
||||
fragment = formatNameForLink(doclet);
|
||||
@ -879,13 +882,13 @@ exports.createLink = doclet => {
|
||||
// the doclet is within another HTML file
|
||||
else {
|
||||
filename = getFilename(doclet.memberof || exports.globalName);
|
||||
if ( (doclet.name !== doclet.longname) || (doclet.scope === SCOPE.NAMES.GLOBAL) ) {
|
||||
if (doclet.name !== doclet.longname || doclet.scope === SCOPE.NAMES.GLOBAL) {
|
||||
fragment = formatNameForLink(doclet);
|
||||
fragment = getId(longname, fragment);
|
||||
}
|
||||
}
|
||||
|
||||
fileUrl = encodeURI( filename + fragmentHash(fragment) );
|
||||
fileUrl = encodeURI(filename + fragmentHash(fragment));
|
||||
|
||||
return fileUrl;
|
||||
};
|
||||
|
||||
128
packages/jsdoc/package-lock.json
generated
@ -1,8 +1,134 @@
|
||||
{
|
||||
"name": "jsdoc",
|
||||
"version": "4.0.0-dev.16",
|
||||
"lockfileVersion": 1,
|
||||
"lockfileVersion": 2,
|
||||
"requires": true,
|
||||
"packages": {
|
||||
"": {
|
||||
"name": "jsdoc",
|
||||
"version": "4.0.0-dev.16",
|
||||
"license": "Apache-2.0",
|
||||
"dependencies": {
|
||||
"@babel/parser": "^7.15.7",
|
||||
"bluebird": "^3.7.2",
|
||||
"catharsis": "^0.9.0",
|
||||
"code-prettify": "^0.1.0",
|
||||
"color-themes-for-google-code-prettify": "^2.0.4",
|
||||
"common-path-prefix": "^3.0.0",
|
||||
"escape-string-regexp": "^4.0.0",
|
||||
"lodash": "^4.17.21",
|
||||
"open-sans-fonts": "^1.6.2",
|
||||
"requizzle": "^0.2.3",
|
||||
"strip-bom": "^4.0.0",
|
||||
"strip-json-comments": "^3.1.1",
|
||||
"taffydb": "2.6.2"
|
||||
},
|
||||
"bin": {
|
||||
"jsdoc": "jsdoc.js"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=v14.17.6"
|
||||
}
|
||||
},
|
||||
"node_modules/@babel/parser": {
|
||||
"version": "7.15.7",
|
||||
"resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.15.7.tgz",
|
||||
"integrity": "sha512-rycZXvQ+xS9QyIcJ9HXeDWf1uxqlbVFAUq0Rq0dbc50Zb/+wUe/ehyfzGfm9KZZF0kBejYgxltBXocP+gKdL2g==",
|
||||
"bin": {
|
||||
"parser": "bin/babel-parser.js"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=6.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/bluebird": {
|
||||
"version": "3.7.2",
|
||||
"resolved": "https://registry.npmjs.org/bluebird/-/bluebird-3.7.2.tgz",
|
||||
"integrity": "sha512-XpNj6GDQzdfW+r2Wnn7xiSAd7TM3jzkxGXBGTtWKuSXv1xUV+azxAm8jdWZN06QTQk+2N2XB9jRDkvbmQmcRtg=="
|
||||
},
|
||||
"node_modules/catharsis": {
|
||||
"version": "0.9.0",
|
||||
"resolved": "https://registry.npmjs.org/catharsis/-/catharsis-0.9.0.tgz",
|
||||
"integrity": "sha512-prMTQVpcns/tzFgFVkVp6ak6RykZyWb3gu8ckUpd6YkTlacOd3DXGJjIpD4Q6zJirizvaiAjSSHlOsA+6sNh2A==",
|
||||
"dependencies": {
|
||||
"lodash": "^4.17.15"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 10"
|
||||
}
|
||||
},
|
||||
"node_modules/code-prettify": {
|
||||
"version": "0.1.0",
|
||||
"resolved": "https://registry.npmjs.org/code-prettify/-/code-prettify-0.1.0.tgz",
|
||||
"integrity": "sha1-RocMyMGlDQm61TmzOpg9vUqjSx4="
|
||||
},
|
||||
"node_modules/color-themes-for-google-code-prettify": {
|
||||
"version": "2.0.4",
|
||||
"resolved": "https://registry.npmjs.org/color-themes-for-google-code-prettify/-/color-themes-for-google-code-prettify-2.0.4.tgz",
|
||||
"integrity": "sha1-3urPZX/WhXaGR1TU5IbXjf2x54Q=",
|
||||
"engines": {
|
||||
"node": ">=5.9.0"
|
||||
}
|
||||
},
|
||||
"node_modules/common-path-prefix": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/common-path-prefix/-/common-path-prefix-3.0.0.tgz",
|
||||
"integrity": "sha512-QE33hToZseCH3jS0qN96O/bSh3kaw/h+Tq7ngyY9eWDUnTlTNUyqfqvCXioLe5Na5jFsL78ra/wuBU4iuEgd4w=="
|
||||
},
|
||||
"node_modules/escape-string-regexp": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz",
|
||||
"integrity": "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==",
|
||||
"engines": {
|
||||
"node": ">=10"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/sindresorhus"
|
||||
}
|
||||
},
|
||||
"node_modules/lodash": {
|
||||
"version": "4.17.21",
|
||||
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
|
||||
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
|
||||
},
|
||||
"node_modules/open-sans-fonts": {
|
||||
"version": "1.6.2",
|
||||
"resolved": "https://registry.npmjs.org/open-sans-fonts/-/open-sans-fonts-1.6.2.tgz",
|
||||
"integrity": "sha512-vsJ6/Mm0TdUKQJqxfkXJy+0K2X0QeRuTmxQq9YE1ycziw6CbDPolDsHhQ6+ImoV/7OTh8K8ZTGklY1Z5nUAwug=="
|
||||
},
|
||||
"node_modules/requizzle": {
|
||||
"version": "0.2.3",
|
||||
"resolved": "https://registry.npmjs.org/requizzle/-/requizzle-0.2.3.tgz",
|
||||
"integrity": "sha512-YanoyJjykPxGHii0fZP0uUPEXpvqfBDxWV7s6GKAiiOsiqhX6vHNyW3Qzdmqp/iq/ExbhaGbVrjB4ruEVSM4GQ==",
|
||||
"dependencies": {
|
||||
"lodash": "^4.17.14"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-bom": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/strip-bom/-/strip-bom-4.0.0.tgz",
|
||||
"integrity": "sha512-3xurFv5tEgii33Zi8Jtp55wEIILR9eh34FAW00PZf+JnSsTmV/ioewSgQl97JHvgjoRGwPShsWm+IdrxB35d0w==",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-json-comments": {
|
||||
"version": "3.1.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz",
|
||||
"integrity": "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/sindresorhus"
|
||||
}
|
||||
},
|
||||
"node_modules/taffydb": {
|
||||
"version": "2.6.2",
|
||||
"resolved": "https://registry.npmjs.org/taffydb/-/taffydb-2.6.2.tgz",
|
||||
"integrity": "sha1-fLy2S1oUG2ou/CxdLGe04VCyomg="
|
||||
}
|
||||
},
|
||||
"dependencies": {
|
||||
"@babel/parser": {
|
||||
"version": "7.15.7",
|
||||
|
||||
@@ -12,10 +12,10 @@ exports.handlers = {
  /// @param e.source
  ///
  beforeParse(e) {
    e.source = e.source.replace(/(\n[ \t]*\/\/\/[^\n]*)+/g, $ => {
      const replacement = `\n/**${$.replace(/^[ \t]*\/\/\//mg, '').replace(/(\n$|$)/, '*/$1')}`;
    e.source = e.source.replace(/(\n[ \t]*\/\/\/[^\n]*)+/g, ($) => {
      const replacement = `\n/**${$.replace(/^[ \t]*\/\/\//gm, '').replace(/(\n$|$)/, '*/$1')}`;

      return replacement;
    });
  }
  },
};

@@ -13,5 +13,5 @@ exports.handlers = {
    } else {
      e.source = ''; // If file has no comments, parser should still receive no code
    }
  }
  },
};

@@ -7,12 +7,12 @@ exports.handlers = {
  /**
   * Translate HTML tags in descriptions into safe entities. Replaces <, & and newlines
   */
  newDoclet({doclet}) {
  newDoclet({ doclet }) {
    if (doclet.description) {
      doclet.description = doclet.description
        .replace(/&/g, '&amp;')
        .replace(/</g, '&lt;')
        .replace(/\r\n|\n|\r/g, '<br>');
    }
  }
  },
};

@@ -18,7 +18,7 @@ let events = conf.include || [
  'newDoclet',
  'fileComplete',
  'parseComplete',
  'processingComplete'
  'processingComplete',
];

// Don't dump the excluded parser events
@@ -63,10 +63,14 @@ function replaceNodeObjects(o) {
function cleanse(e) {
  let result = {};

  Object.keys(e).forEach(prop => {
  Object.keys(e).forEach((prop) => {
    // by default, don't stringify properties that contain an array of functions
    if (!conf.includeFunctions && Array.isArray(e[prop]) && e[prop][0] &&
      String(typeof e[prop][0]) === 'function') {
    if (
      !conf.includeFunctions &&
      Array.isArray(e[prop]) &&
      e[prop][0] &&
      String(typeof e[prop][0]) === 'function'
    ) {
      result[prop] = `function[${e[prop].length}]`;
    }
    // never include functions that belong to the object
@@ -85,11 +89,15 @@ function cleanse(e) {

exports.handlers = {};

events.forEach(eventType => {
  exports.handlers[eventType] = e => {
    console.log(JSON.stringify({
events.forEach((eventType) => {
  exports.handlers[eventType] = (e) => {
    console.log(
      JSON.stringify({
        type: eventType,
        content: cleanse(e)
      }), null, 4);
        content: cleanse(e),
      }),
      null,
      4
    );
  };
});

@@ -41,7 +41,7 @@ function hasUniqueValues(obj) {
  let isUnique = true;
  const seen = [];

  Object.keys(obj).forEach(key => {
  Object.keys(obj).forEach((key) => {
    if (seen.includes(obj[key])) {
      isUnique = false;
    }
@@ -55,7 +55,7 @@ function hasUniqueValues(obj) {
function getParamNames(params) {
  const names = [];

  params.forEach(param => {
  params.forEach((param) => {
    let name = param.name || '';

    if (param.variable) {
@@ -69,7 +69,7 @@ function getParamNames(params) {
  return names.length ? names.join(', ') : '';
}

function getParamVariation({params}) {
function getParamVariation({ params }) {
  return getParamNames(params || []);
}

@@ -79,7 +79,7 @@ function getUniqueVariations(doclets) {
  const docletKeys = Object.keys(doclets);

  function getUniqueNumbers() {
    docletKeys.forEach(doclet => {
    docletKeys.forEach((doclet) => {
      let newLongname;

      while (true) {
@@ -88,7 +88,7 @@ function getUniqueVariations(doclets) {

        // is this longname + variation unique?
        newLongname = `${doclets[doclet].longname}(${variations[doclet]})`;
        if ( !functionDoclets[newLongname] ) {
        if (!functionDoclets[newLongname]) {
          break;
        }
      }
@@ -97,18 +97,18 @@ function getUniqueVariations(doclets) {

  function getUniqueNames() {
    // start by trying to preserve existing variations
    docletKeys.forEach(doclet => {
    docletKeys.forEach((doclet) => {
      variations[doclet] = doclets[doclet].variation || getParamVariation(doclets[doclet]);
    });

    // if they're identical, try again, without preserving existing variations
    if ( !hasUniqueValues(variations) ) {
      docletKeys.forEach(doclet => {
    if (!hasUniqueValues(variations)) {
      docletKeys.forEach((doclet) => {
        variations[doclet] = getParamVariation(doclets[doclet]);
      });

      // if they're STILL identical, switch to numeric variations
      if ( !hasUniqueValues(variations) ) {
      if (!hasUniqueValues(variations)) {
        getUniqueNumbers();
      }
    }
@@ -117,8 +117,7 @@ function getUniqueVariations(doclets) {
  // are we already using numeric variations? if so, keep doing that
  if (functionDoclets[`${doclets.newDoclet.longname}(1)`]) {
    getUniqueNumbers();
  }
  else {
  } else {
    getUniqueNames();
  }

@@ -128,7 +127,7 @@ function getUniqueVariations(doclets) {
function ensureUniqueLongname(newDoclet) {
  const doclets = {
    oldDoclet: functionDoclets[newDoclet.longname],
    newDoclet: newDoclet
    newDoclet: newDoclet,
  };
  const docletKeys = Object.keys(doclets);
  let oldDocletLongname;
@@ -139,7 +138,7 @@ function ensureUniqueLongname(newDoclet) {
    // if the shared longname has a variation, like MyClass#myLongname(variation),
    // remove the variation
    if (doclets.oldDoclet.variation || doclets.oldDoclet.variation === '') {
      docletKeys.forEach(doclet => {
      docletKeys.forEach((doclet) => {
        doclets[doclet].longname = doclets[doclet].longname.replace(/\([\s\S]*\)$/, '');
        doclets[doclet].variation = null;
      });
@@ -148,7 +147,7 @@ function ensureUniqueLongname(newDoclet) {
    variations = getUniqueVariations(doclets);

    // update the longnames/variations
    docletKeys.forEach(doclet => {
    docletKeys.forEach((doclet) => {
      doclets[doclet].longname += `(${variations[doclet]})`;
      doclets[doclet].variation = variations[doclet];
    });
@@ -177,5 +176,5 @@ exports.handlers = {

  parseComplete() {
    functionDoclets = null;
  }
  },
};

@@ -18,7 +18,7 @@ exports.handlers = {
   * @partial "partial_doc.jsdoc"
   */
  beforeParse(e) {
    e.source = e.source.replace(/(@partial ".*")+/g, $ => {
    e.source = e.source.replace(/(@partial ".*")+/g, ($) => {
      const pathArg = $.match(/".*"/)[0].replace(/"/g, '');
      const fullPath = path.join(e.filename, '..', pathArg);

@@ -26,5 +26,5 @@ exports.handlers = {

      return partialData;
    });
  }
  },
};

@@ -15,5 +15,5 @@ exports.handlers = {
    if (e.filename.match(/\.erb$/)) {
      e.source = e.source.replace(/<%.*%>/g, '');
    }
  }
  },
};

@@ -7,9 +7,9 @@ exports.handlers = {
  /**
   * Make your descriptions more shoutier.
   */
  newDoclet({doclet}) {
  newDoclet({ doclet }) {
    if (typeof doclet.description === 'string') {
      doclet.description = doclet.description.toUpperCase();
    }
  }
  },
};

@@ -17,7 +17,7 @@ exports.handlers = {
   *
   * @source { "filename": "sourcetag.js", "lineno": 9 }
   */
  newDoclet({doclet}) {
  newDoclet({ doclet }) {
    let tags = doclet.tags;
    let tag;
    let value;
@@ -25,7 +25,7 @@ exports.handlers = {
    // any user-defined tags in this doclet?
    if (typeof tags !== 'undefined') {
      // only interested in the @source tags
      tags = tags.filter(({title}) => title === 'source');
      tags = tags.filter(({ title }) => title === 'source');

      if (tags.length) {
        // take the first one
@@ -33,8 +33,7 @@ exports.handlers = {

        try {
          value = JSON.parse(tag.value);
        }
        catch (ex) {
        } catch (ex) {
          log.error(
            '@source tag expects a valid JSON value, like ' +
            '{ "filename": "myfile.js", "lineno": 123 }.'
@@ -48,5 +47,5 @@ exports.handlers = {
        doclet.meta.lineno = value.lineno || '';
      }
    }
  }
  },
};

@@ -7,7 +7,7 @@ exports.handlers = {
  /**
   * Autogenerate summaries, if missing, from the description, if present.
   */
  newDoclet({doclet}) {
  newDoclet({ doclet }) {
    let endTag;
    let tags;
    let stack;
@@ -25,7 +25,7 @@ exports.handlers = {
      tags = doclet.summary.match(/<[^>]+>/g) || [];
      stack = [];

      tags.forEach(tag => {
      tags.forEach((tag) => {
        const idx = tag.indexOf('/');

        if (idx === -1) {
@@ -54,5 +54,5 @@ exports.handlers = {
      // template decide whether to wrap the summary in a <p> tag
      doclet.summary = doclet.summary.replace(/^<p>(.*)<\/p>$/i, '$1');
    }
  }
  },
};

@@ -14,7 +14,9 @@ describe('escapeHtml plugin', () => {
  it("should escape '&', '<' and newlines in doclet descriptions", () => {
    const doclet = docSet.getByLongname('module:plugins/escapeHtml.handlers.newDoclet');

    expect(doclet[0].description).toBe('Translate HTML tags in descriptions into safe ' +
      'entities. Replaces &lt;, &amp; and newlines');
    expect(doclet[0].description).toBe(
      'Translate HTML tags in descriptions into safe ' +
        'entities. Replaces &lt;, &amp; and newlines'
    );
  });
});

@@ -10,7 +10,9 @@ describe('railsTemplate plugin', () => {
  require('jsdoc/src/handlers').attachTo(parser);

  it('should remove <% %> rails template tags from the source of *.erb files', () => {
    const docSet = parser.parse([path.join(env.dirname, 'plugins/test/fixtures/railsTemplate.js.erb')]);
    const docSet = parser.parse([
      path.join(env.dirname, 'plugins/test/fixtures/railsTemplate.js.erb'),
    ]);

    expect(docSet[2].description).toEqual('Remove rails tags from the source input (e.g. )');
  });

@@ -25,7 +25,7 @@ describe('summarize', () => {
  it('should not change the summary if it is already defined', () => {
    const doclet = {
      summary: 'This is a summary.',
      description: 'Descriptions are good.'
      description: 'Descriptions are good.',
    };

    handler({ doclet: doclet });
@@ -43,7 +43,7 @@ describe('summarize', () => {

  it('should use the first sentence as the summary', () => {
    const doclet = {
      description: 'This sentence is the summary. This sentence is not.'
      description: 'This sentence is the summary. This sentence is not.',
    };

    handler({ doclet: doclet });
@@ -53,7 +53,7 @@ describe('summarize', () => {

  it('should not add an extra period if there is only one sentence in the description', () => {
    const doclet = {
      description: 'This description has only one sentence.'
      description: 'This description has only one sentence.',
    };

    handler({ doclet: doclet });
@@ -61,31 +61,37 @@ describe('summarize', () => {
    expect(doclet.summary).toBe('This description has only one sentence.');
});
|
||||
|
||||
it('should use the entire description, plus a period, as the summary if the description ' +
|
||||
'does not contain a period', () => {
|
||||
it(
|
||||
'should use the entire description, plus a period, as the summary if the description ' +
|
||||
'does not contain a period',
|
||||
() => {
|
||||
const doclet = {
|
||||
description: 'This is a description'
|
||||
description: 'This is a description',
|
||||
};
|
||||
|
||||
handler({ doclet: doclet });
|
||||
|
||||
expect(doclet.summary).toBe('This is a description.');
|
||||
});
|
||||
}
|
||||
);
|
||||
|
||||
it('should use the entire description as the summary if the description contains only ' +
|
||||
'one sentence', () => {
|
||||
it(
|
||||
'should use the entire description as the summary if the description contains only ' +
|
||||
'one sentence',
|
||||
() => {
|
||||
const doclet = {
|
||||
description: 'This is a description.'
|
||||
description: 'This is a description.',
|
||||
};
|
||||
|
||||
handler({ doclet: doclet });
|
||||
|
||||
expect(doclet.description).toBe('This is a description.');
|
||||
});
|
||||
}
|
||||
);
|
||||
|
||||
it('should work when an HTML tag immediately follows the first sentence', () => {
|
||||
const doclet = {
|
||||
description: 'This sentence is the summary.<small>This sentence is small.</small>'
|
||||
description: 'This sentence is the summary.<small>This sentence is small.</small>',
|
||||
};
|
||||
|
||||
handler({ doclet: doclet });
|
||||
@ -95,7 +101,7 @@ describe('summarize', () => {
|
||||
|
||||
it('should generate valid HTML if a tag is opened, but not closed, in the summary', () => {
|
||||
const doclet = {
|
||||
description: 'This description has <em>a tag. The tag straddles</em> sentences.'
|
||||
description: 'This description has <em>a tag. The tag straddles</em> sentences.',
|
||||
};
|
||||
|
||||
handler({ doclet: doclet });
|
||||
@ -105,7 +111,7 @@ describe('summarize', () => {
|
||||
|
||||
it('should not include a <p> tag in the summary', () => {
|
||||
const doclet = {
|
||||
description: '<p>This description contains HTML.</p><p>And plenty of it!</p>'
|
||||
description: '<p>This description contains HTML.</p><p>And plenty of it!</p>',
|
||||
};
|
||||
|
||||
handler({ doclet: doclet });
|
||||
|
||||
@ -7,10 +7,10 @@
|
||||
*/
|
||||
|
||||
exports.handlers = {
|
||||
newDoclet({doclet}) {
|
||||
newDoclet({ doclet }) {
|
||||
// Ignore comment blocks for all symbols that begin with underscore
|
||||
if (doclet.name.charAt(0) === '_' || doclet.name.substr(0, 6) === 'this._') {
|
||||
doclet.access = 'private';
|
||||
}
|
||||
}
|
||||
},
|
||||
};
|
||||
|
||||
@ -5,7 +5,7 @@ To create or use your own template:
|
||||
|
||||
For example:
|
||||
|
||||
````javascript
|
||||
```javascript
|
||||
/** @module publish */
|
||||
|
||||
/**
|
||||
@ -15,13 +15,13 @@ For example:
|
||||
* all the symbols documented in your code.
|
||||
* @param {object} opts - An object with options information.
|
||||
*/
|
||||
exports.publish = function(data, opts) {
|
||||
exports.publish = function (data, opts) {
|
||||
// do stuff here to generate your output files
|
||||
};
|
||||
````
|
||||
```
|
||||
|
||||
To invoke JSDoc 3 with your own template, use the `-t` command line option, and specify the path to your template folder:
|
||||
|
||||
````
|
||||
```
|
||||
./jsdoc mycode.js -t /path/to/mycooltemplate
|
||||
````
|
||||
```
|
||||
|
||||
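Taken together, the hunks above are mechanical restyling rather than behavior changes: trailing commas added to multi-line literals, parentheses added around single arrow-function parameters, a space inserted after anonymous `function`, `catch` cuddled onto the closing brace, and long calls wrapped. These all match Prettier's default output, so the configuration behind this commit can stay small. As a sketch (the repository's actual Prettier settings are not shown in this diff, so the option values here are assumptions inferred from the formatting):

```json
{
  "printWidth": 100,
  "singleQuote": true
}
```

With `plugin:prettier/recommended` added to the ESLint config (as in this commit's `.eslintrc.js` change), any formatting drift is then surfaced as lint errors, and `npx prettier --write .` reapplies the style across the tree.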
Some files were not shown because too many files have changed in this diff.