Mirror of https://github.com/serverless/serverless.git (synced 2026-01-18 14:58:43 +00:00)

Merge branch 'master' into reduce-scope-of-rights-of-default-iamlambdarole

# Conflicts:
#	lib/plugins/aws/deploy/compile/functions/index.js
#	lib/plugins/aws/deploy/compile/functions/tests/index.js

Commit 2a5cbcc07e
@@ -1,6 +1,6 @@
The MIT License (MIT)

Copyright (c) 2015 Serverless, Inc. http://www.serverless.com
Copyright (c) 2016 Serverless, Inc. http://www.serverless.com

The following license applies to all parts of this software except as
documented below:
README.md
@@ -10,8 +10,6 @@ Serverless is an MIT open-source project, actively maintained by a full-time, ve

## Links

* [Guide to Serverless](./docs/01-guide/README.md)
* [Features](#features)
* [Documentation v.1](./docs/README.md) / [v.0](http://serverless.readme.io)

@@ -54,6 +52,34 @@ Check out our in-depth [Guide to Serverless](./docs/01-guide/README.md) for more

* An ecosystem of serverless services and plugins.
* A passionate and welcoming community!

## <a name="v1-plugins"></a>Plugins (V1.0)

Use these plugins to overwrite or extend the Framework's functionality...

* [serverless-webpack](https://github.com/elastic-coders/serverless-webpack) - Bundle your lambdas with Webpack
* [serverless-alexa-plugin](https://github.com/rajington/serverless-alexa-plugin) - Support Alexa Lambda events
* [serverless-run-function](https://github.com/lithin/serverless-run-function-plugin) - Run functions locally
* [serverless-plugin-write-env-vars](https://github.com/silvermine/serverless-plugin-write-env-vars)
* [serverless-plugin-multiple-responses](https://github.com/silvermine/serverless-plugin-multiple-responses)
* [serverless-build](https://github.com/nfour/serverless-build-plugin)
* [serverless-scriptable](https://github.com/wei-xu-myob/serverless-scriptable-plugin)
* [serverless-plugin-stage-variables](https://github.com/svdgraaf/serverless-plugin-stage-variables)
* [serverless-dynamodb-local](https://github.com/99xt/serverless-dynamodb-local/tree/v1)

## <a name="v1-services"></a>Services & Projects (V1.0)

Pre-written functions you can use instantly and example implementations...

* [serverless-authentication-boilerplate](https://github.com/laardee/serverless-authentication-boilerplate)
* [serverless-examples](https://github.com/andymac4182/serverless_example)
* [serverless-npm-registry](https://github.com/craftship/yith)
* [serverless-pokego](https://github.com/jch254/pokego-serverless)
* [serverless-pocket-app](https://github.com/s0enke/weekly2pocket)
* [serverless-quotebot](https://github.com/pmuens/quotebot)
* [serverless-slackbot](https://github.com/conveyal/trevorbot)
* [serverless-garden-aid](https://github.com/garden-aid/web-bff)
* [serverless-react-boilerplate](https://github.com/99xt/serverless-react-boilerplate)

## <a name="contributing"></a>Contributing

We love our contributors! Please read our [Contributing Document](CONTRIBUTING.md) to learn how you can start working on the Framework yourself.

@@ -121,7 +147,7 @@ Below are projects and plugins relating to version 0.5 and below. Note that thes

You can read the v0.5.x documentation at [readme.io](https://serverless.readme.io/v0.5.0/docs).

## v0.5.x Projects
## Projects (v0.5.x)

Serverless Projects are shareable and installable. You can publish them to npm and install them via the Serverless Framework CLI by using `$ serverless project install <project-name>`

* [serverless-graphql](https://github.com/serverless/serverless-graphql) - Official Serverless boilerplate to kick start your project
* [serverless-starter](https://github.com/serverless/serverless-starter) - A simple boilerplate for new projects (JavaScript) with a few architectural options
@@ -131,7 +157,7 @@ Serverless Projects are shareable and installable. You can publish them to npm

* [sc5-serverless-boilerplate](https://github.com/SC5/sc5-serverless-boilerplate) - A boilerplate for test driven development of REST endpoints
* [MoonMail](https://github.com/microapps/MoonMail) - Build your own email marketing infrastructure using Lambda + SES

## v0.5.x Plugins
## Plugins (v0.5.x)

Serverless is composed of Plugins. A group of default Plugins ship with the Framework, and here are some others you can add to improve/help your workflow:

* [Meta Sync](https://github.com/serverless/serverless-meta-sync) - Securely sync the variables in your project's `_meta/variables` across your team.
* [Offline](https://github.com/dherault/serverless-offline) - Emulate AWS Lambda and API Gateway locally to speed up your development cycles.
@@ -28,3 +28,7 @@ services:
    image: qlik/gradle
    volumes:
      - ./tmp/serverless-integration-test-aws-java-gradle:/app
  aws-scala-sbt:
    image: hseeberger/scala-sbt
    volumes:
      - ./tmp/serverless-integration-test-aws-scala-sbt:/app
@@ -1,45 +1,38 @@
<!--
title: Including/Excluding files from packaging
title: Excluding files from packaging
menuText: Packaging Services
layout: Doc
-->

# Including/Excluding files from packaging
# Excluding files from packaging

Sometimes you might like to have more control over your function artifacts and how they are packaged.

You can use the `package` and `include/exclude` configuration for more control over the packaging process.

## Include
The `include` config allows you to selectively include files into the created package. Only the configured paths will be included in the package. If both include and exclude are defined exclude is applied first, then include so files are guaranteed to be included.
You can use the `package` and `exclude` configuration for more control over the packaging process.

## Exclude

Exclude allows you to define paths that will be excluded from the resulting artifact.
Exclude allows you to define globs that will be excluded from the resulting artifact.

## Artifact
For complete control over the packaging process you can specify your own zip file for your service. Serverless won't zip your service if this is configured so `include` and `exclude` will be ignored.
For complete control over the packaging process you can specify your own zip file for your service. Serverless won't zip your service if this is configured so `exclude` will be ignored.

## Example

```yaml
service: my-service
package:
  include:
    - lib
    - functions
  exclude:
    - tmp
    - tmp/**
    - .git
  artifact: path/to/my-artifact.zip
```
## Packaging functions separately

If you want even more control over your functions for deployment you can configure them to be packaged independently. This allows you more control for optimizing your deployment. To enable individual packaging set `individually` to true in the service wide packaging settings.

Then for every function you can use the same `include/exclude/artifact` config options as you can service wide. The `include/exclude` options will be merged with the service wide options to create one `include/exclude` config per function during packaging.
Then for every function you can use the same `exclude/artifact` config options as you can service wide. The `exclude` option will be merged with the service wide options to create one `exclude` config per function during packaging.

```yaml
service: my-service
@@ -51,9 +44,9 @@ functions:
  hello:
    handler: handler.hello
    package:
      include:
        # We're including this file so it will be in the final package of this function only
        - excluded-by-default.json
      exclude:
        # We're excluding this file so it will not be in the final package of this function only
        - included-by-default.json
  world:
    handler: handler.hello
    package:
@@ -82,7 +82,7 @@ provider:

##### Per Stage Profiles

As an advanced use-case, you can deploy different stages to different accounts by using different profiles per stage. In order to use different profiles per stage, you must leverage [variables](../01-guide/08-serverless-variables.md) and the provider profile setting.
As an advanced use-case, you can deploy different stages to different accounts by using different profiles per stage. In order to use different profiles per stage, you must leverage [variables](../../01-guide/08-serverless-variables.md) and the provider profile setting.

This example `serverless.yml` snippet will load the profile depending upon the stage specified in the command line options (or default to 'dev' if unspecified):
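The snippet itself is truncated in this hunk. As a rough, hedged sketch of what such a per-stage profile setup typically looks like (the `custom.profiles` key and the profile names are purely illustrative, and it assumes nested variable references are supported):

```yaml
service: my-service

custom:
  # hypothetical stage -> AWS profile mapping
  profiles:
    dev: devProfile
    prod: prodProfile

provider:
  name: aws
  # pick the profile matching the stage passed via --stage (defaulting to 'dev')
  profile: ${self:custom.profiles.${opt:stage, 'dev'}}
```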
@@ -22,9 +22,13 @@ provider:
  runtime: nodejs4.3 # Runtime used for all functions in this provider
  stage: dev # Set the default stage used. Default is dev
  region: us-east-1 # Overwrite the default region used. Default is us-east-1
  deploymentBucket: com.serverless.${self:provider.region}.deploys # Overwrite the default deployment bucket
  variableSyntax: '\${{([\s\S]+?)}}' # Overwrite the default "${}" variable syntax to be "${{}}" instead. This can be helpful if you want to use "${}" as a string without using it as a variable.
```
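With the syntax overridden like that, variables are written with double braces while single-brace `${}` text passes through as a plain string. A minimal illustrative sketch (not part of the diff; the `custom` key below is only an example):

```yaml
provider:
  variableSyntax: '\${{([\s\S]+?)}}'
  stage: ${{opt:stage}} # resolved as a variable under the custom syntax

custom:
  rawString: "${this-is-just-a-string}" # no longer treated as a variable
```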
### Deployment S3Bucket

The bucket must exist beforehand and be in the same region as the deploy.

## Additional function configuration

```yaml
@@ -71,12 +71,34 @@ functions:
          path: whatever
          request:
            template:
              text/xhtml: { "stage" : "$context.stage" }
              application/json: { "httpMethod" : "$context.httpMethod" }
              text/xhtml: '{ "stage" : "$context.stage" }'
              application/json: '{ "httpMethod" : "$context.httpMethod" }'
```

**Note:** The templates are defined as plain text here. However you can also reference an external file with the help of the `${file(templatefile)}` syntax.

**Note 2:** In .yml, strings containing `:`, `{`, `}`, `[`, `]`, `,`, `&`, `*`, `#`, `?`, `|`, `-`, `<`, `>`, `=`, `!`, `%`, `@`, `` ` `` must be quoted.

If you want to map querystrings to the event object, you can use the `$input.params('hub.challenge')` syntax from API Gateway, as follows:

```yml
functions:
  create:
    handler: posts.create
    events:
      - http:
          method: get
          path: whatever
          request:
            template:
              application/json: '{ "foo" : "$input.params(''bar'')" }'
```

**Note:** When using single-quoted strings, any single quote `'` inside the string must be doubled (`''`) to escape it.
You can then access the query string of `https://example.com/dev/whatever?bar=123` as `event.foo` in the Lambda function.
If you want to spread a string over multiple lines, you can use the `>` or `|` syntax, but the following lines all have to be indented by the same amount; [read more about the `>` syntax](http://stackoverflow.com/questions/3790454/in-yaml-how-do-i-break-a-string-over-multiple-lines).


### Pass Through Behavior
API Gateway provides multiple ways to handle requests where the Content-Type header does not match any of the specified mapping templates. When this happens, the request payload will either be passed through the integration request *without transformation* or rejected with a `415 - Unsupported Media Type`, depending on the configuration.
@@ -356,14 +378,14 @@ resources:
          - RootResourceId
        PathPart: serverless # the endpoint in your API that is set as proxy
        RestApiId:
          Ref: RestApiApigEvent
          Ref: ApiGatewayRestApi
    ProxyMethod:
      ResourceId:
        Ref: ProxyResource
      RestApiId:
        Ref: RestApiApigEvent
      Type: AWS::ApiGateway::Method
      Properties:
        ResourceId:
          Ref: ProxyResource
        RestApiId:
          Ref: ApiGatewayRestApi
        HttpMethod: GET # the method of your proxy. Is it GET or POST or ... ?
        MethodResponses:
          - StatusCode: 200
@@ -32,6 +32,23 @@ functions:
          event: s3:ObjectRemoved:*
```

## Setting filter rules

This will create a bucket `photos`. The `users` function is called whenever an image with `.jpg` extension is uploaded to folder `uploads` in the bucket. Check out the [AWS documentation](http://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html#notification-how-to-filtering) to learn more about all the different filter types that can be configured.

```yml
functions:
  users:
    handler: users.handler
    events:
      - s3:
          bucket: photos
          event: s3:ObjectCreated:*
          rules:
            - prefix: uploads/
            - suffix: .jpg
```

## Triggering separate functions from the same bucket

You can repeat the S3 event configuration in the same or separate functions so that one bucket triggers several functions, as sketched below. One caveat: you can't repeat the exact same configuration in both functions; the event type, for example, has to be different.
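A minimal illustrative sketch of that caveat (not part of the diff; the function and handler names are made up): two functions watch the same bucket, but one reacts to object creation and the other to object removal.

```yml
functions:
  resize:
    handler: images.resize
    events:
      - s3:
          bucket: photos
          event: s3:ObjectCreated:*
  cleanup:
    handler: images.cleanup
    events:
      - s3:
          bucket: photos
          event: s3:ObjectRemoved:*
```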
@@ -24,7 +24,8 @@ serverless info

### AWS

On AWS the info plugin uses the `Outputs` section of the CloudFormation stack. Outputs will include Lambda function ARN's, a `ServiceEndpoint` for the API Gateway endpoint and user provided custom Outputs.
On AWS the info plugin uses the `Outputs` section of the CloudFormation stack and the AWS SDK to gather the necessary information.
See below for an example of the output.

**Example:**

@@ -35,6 +36,8 @@ Service Information
service: my-serverless-service
stage: dev
region: us-east-1
api keys:
  myKey: some123valid456api789key1011for1213api1415gateway
endpoints:
  GET - https://dxaynpuzd4.execute-api.us-east-1.amazonaws.com/dev/users
functions:
@@ -62,12 +62,10 @@ class Serverless {
// load all plugins
this.pluginManager.loadAllPlugins(this.service.plugins);

// give the CLI the plugins so that it can print out plugin information
// such as options when the user enters --help
// give the CLI the plugins and commands so that it can print out
// information such as options when the user enters --help
this.cli.setLoadedPlugins(this.pluginManager.getPlugins());

// populate variables after processing options
return this.variables.populateService(this.pluginManager.cliOptions);
this.cli.setLoadedCommands(this.pluginManager.getCommands());
});
}

@@ -78,12 +76,19 @@ class Serverless {
this.utils.track(this);
}

if (!this.cli.displayHelp(this.processedInput) && this.processedInput.commands.length) {
// trigger the plugin lifecycle when there's something which should be processed
return this.pluginManager.run(this.processedInput.commands);
if (this.cli.displayHelp(this.processedInput)) {
return BbPromise.resolve();
}

return BbPromise.resolve();
// make sure the command exists before doing anything else
this.pluginManager.validateCommand(this.processedInput.commands);

// populate variables after --help, otherwise help may fail to print
// (https://github.com/serverless/serverless/issues/2041)
this.variables.populateService(this.pluginManager.cliOptions);

// trigger the plugin lifecycle when there's something which should be processed
return this.pluginManager.run(this.processedInput.commands);
}

getVersion() {
@@ -11,12 +11,17 @@ class CLI {
this.serverless = serverless;
this.inputArray = inputArray || null;
this.loadedPlugins = [];
this.loadedCommands = {};
}

setLoadedPlugins(plugins) {
this.loadedPlugins = plugins;
}

setLoadedCommands(commands) {
this.loadedCommands = commands;
}

processInput() {
let inputArray;

@@ -63,6 +68,52 @@ class CLI {
return false;
}

displayCommandUsage(commandObject, command) {
const dotsLength = 30;

// check if command has lifecycleEvents (can be executed)
if (commandObject.lifecycleEvents) {
const usage = commandObject.usage;
const dots = _.repeat('.', dotsLength - command.length);
this.consoleLog(`${chalk.yellow(command)} ${chalk.dim(dots)} ${usage}`);
}

_.forEach(commandObject.commands, (subcommandObject, subcommand) => {
this.displayCommandUsage(subcommandObject, `${command} ${subcommand}`);
});
}

displayCommandOptions(commandObject) {
const dotsLength = 40;
_.forEach(commandObject.options, (optionsObject, option) => {
let optionsDots = _.repeat('.', dotsLength - option.length);
const optionsUsage = optionsObject.usage;

if (optionsObject.required) {
optionsDots = optionsDots.slice(0, optionsDots.length - 18);
} else {
optionsDots = optionsDots.slice(0, optionsDots.length - 7);
}
if (optionsObject.shortcut) {
optionsDots = optionsDots.slice(0, optionsDots.length - 5);
}

const optionInfo = ` --${option}`;
let shortcutInfo = '';
let requiredInfo = '';
if (optionsObject.shortcut) {
shortcutInfo = ` / -${optionsObject.shortcut}`;
}
if (optionsObject.required) {
requiredInfo = ' (required)';
}

const thingsToLog = `${optionInfo}${shortcutInfo}${requiredInfo} ${
chalk.dim(optionsDots)} ${optionsUsage}`;
this.consoleLog(chalk.yellow(thingsToLog));
});
}

generateMainHelp() {
this.consoleLog('');
@@ -73,153 +124,36 @@ class CLI {

this.consoleLog('');

const sortedPlugins = _.sortBy(
this.loadedPlugins,
(plugin) => plugin.constructor.name
);

// TODO: implement recursive command exploration (now only 2 steps are possible)
const dotsLength = 25;
sortedPlugins.forEach((plugin) => {
_.forEach(plugin.commands,
(firstLevelCommandObject, firstLevelCommand) => {
// check if command has lifecycleEvents (can be executed)
if (firstLevelCommandObject.lifecycleEvents) {
const command = firstLevelCommand;
const usage = firstLevelCommandObject.usage;
const dots = _.repeat('.', dotsLength - command.length);
this.consoleLog(`${chalk
.yellow(command)} ${chalk
.dim(dots)} ${usage}`);
}
_.forEach(firstLevelCommandObject.commands,
(secondLevelCommandObject, secondLevelCommand) => {
// check if command has lifecycleEvents (can be executed)
if (secondLevelCommandObject.lifecycleEvents) {
const command = `${firstLevelCommand} ${secondLevelCommand}`;
const usage = secondLevelCommandObject.usage;
const dots = _.repeat('.', dotsLength - command.length);
this.consoleLog(`${chalk
.yellow(command)} ${chalk
.dim(dots)} ${usage}`);
}
});
});
_.forEach(this.loadedCommands, (details, command) => {
this.displayCommandUsage(details, command);
});

this.consoleLog('');

// print all the installed plugins
this.consoleLog(chalk.yellow.underline('Plugins'));
if (sortedPlugins.length) {

if (this.loadedPlugins.length) {
const sortedPlugins = _.sortBy(
this.loadedPlugins,
(plugin) => plugin.constructor.name
);

this.consoleLog(sortedPlugins.map((plugin) => plugin.constructor.name).join(', '));
} else {
this.consoleLog('No plugins added yet');
}
}

generateCommandsHelp(commands) {
const dotsLength = 40;
generateCommandsHelp(commandsArray) {
const command = this.serverless.pluginManager.getCommand(commandsArray);
const commandName = commandsArray.join(' ');

// TODO: use lodash utility functions to reduce loop usage
// TODO: support more than 2 levels of nested commands
if (commands.length === 1) {
this.loadedPlugins.forEach((plugin) => {
_.forEach(plugin.commands, (commandObject, command) => {
if (command === commands[0]) {
if (commandObject.lifecycleEvents) {
// print the name of the plugin
this.consoleLog(chalk.yellow.underline(`Plugin: ${plugin.constructor.name}`));
// print the command with the corresponding usage
const commandsDots = _.repeat('.', dotsLength - command.length);
const commandsUsage = commandObject.usage;
this.consoleLog(`${chalk
.yellow(command)} ${chalk
.dim(commandsDots)} ${commandsUsage}`);
// print all options
_.forEach(commandObject.options, (optionsObject, option) => {
let optionsDots = _.repeat('.', dotsLength - option.length);
const optionsUsage = optionsObject.usage;
// print the name of the plugin
this.consoleLog(chalk.yellow.underline(`Plugin: ${command.pluginName}`));

if (optionsObject.required) {
optionsDots = optionsDots.slice(0, optionsDots.length - 17);
} else {
optionsDots = optionsDots.slice(0, optionsDots.length - 7);
}
if (optionsObject.shortcut) {
optionsDots = optionsDots.slice(0, optionsDots.length - 5);
}

const optionInfo = ` --${option}`;
let shortcutInfo = '';
let requiredInfo = '';
if (optionsObject.shortcut) {
shortcutInfo = ` / -${optionsObject.shortcut}`;
}
if (optionsObject.required) {
requiredInfo = ' (required)';
}

const thingsToLog = `${optionInfo}${shortcutInfo}${requiredInfo} ${
chalk.dim(optionsDots)} ${optionsUsage}`;
this.consoleLog(chalk.yellow(thingsToLog));
});
}
}
});
});
} else {
this.loadedPlugins.forEach((plugin) => {
_.forEach(plugin.commands,
(firstLevelCommandObject, firstLevelCommand) => {
if (firstLevelCommand === commands[0]) {
_.forEach(firstLevelCommandObject.commands,
(secondLevelCommandObject, secondLevelCommand) => {
if (secondLevelCommand === commands[1]) {
if (secondLevelCommandObject.lifecycleEvents) {
// print the name of the plugin
this.consoleLog(chalk.yellow.underline(`Plugin: ${plugin.constructor.name}`));
// print the command with the corresponding usage
const commandsDots = _.repeat('.', dotsLength - secondLevelCommand.length);
const commandsUsage = secondLevelCommandObject.usage;
this.consoleLog(`${chalk
.yellow(secondLevelCommand)} ${chalk
.dim(commandsDots)} ${commandsUsage}`);
// print all options
_.forEach(secondLevelCommandObject.options, (optionsObject, option) => {
let optionsDots = _.repeat('.', dotsLength - option.length);
const optionsUsage = optionsObject.usage;

if (optionsObject.required) {
optionsDots = optionsDots.slice(0, optionsDots.length - 17);
} else {
optionsDots = optionsDots.slice(0, optionsDots.length - 7);
}
if (optionsObject.shortcut) {
optionsDots = optionsDots.slice(0, optionsDots.length - 5);
}

const optionInfo = ` --${option}`;
let shortcutInfo = '';
let requiredInfo = '';
if (optionsObject.shortcut) {
shortcutInfo = ` / -${optionsObject.shortcut}`;
}
if (optionsObject.required) {
requiredInfo = ' (required)';
}

const thingsToLog = `${optionInfo}${shortcutInfo}${requiredInfo} ${
chalk.dim(optionsDots)} ${optionsUsage}`;
this.consoleLog(chalk.yellow(thingsToLog));
});
}
}
});
}
});
});
}
this.displayCommandUsage(command, commandName);
this.displayCommandOptions(command);

this.consoleLog('');
}
@@ -1,11 +1,13 @@
'use strict';
const chalk = require('chalk');
const version = require('./../../package.json').version;

module.exports.SError = class ServerlessError extends Error {
constructor(message) {
constructor(message, statusCode) {
super(message);
this.name = this.constructor.name;
this.message = message;
this.statusCode = statusCode;
Error.captureStackTrace(this, this.constructor);
}
};

@@ -65,8 +67,13 @@ module.exports.logError = (e) => {
if (e.name !== 'ServerlessError') {
consoleLog(' ');
consoleLog(chalk.red(' Please report this error. We think it might be a bug.'));
consoleLog(' ');
}

consoleLog(chalk.yellow(' Your Environment Information -----------------------------'));
consoleLog(chalk.yellow(` OS: ${process.platform}`));
consoleLog(chalk.yellow(` Node Version: ${process.version.replace(/^[v|V]/, '')}`));
consoleLog(chalk.yellow(` Serverless Version: ${version}`));
consoleLog(' ');

// Failure exit
@@ -1,19 +1,20 @@
'use strict';

const path = require('path');
const _ = require('lodash');
const BbPromise = require('bluebird');
const _ = require('lodash');

class PluginManager {
constructor(serverless) {
this.serverless = serverless;
this.provider = null;

this.cliOptions = {};
this.cliCommands = [];

this.plugins = [];
this.commandsList = [];
this.commands = {};
this.hooks = {};
}

setProvider(provider) {
@@ -28,35 +29,137 @@ class PluginManager {
this.cliCommands = commands;
}

addPlugin(Plugin) {
const pluginInstance = new Plugin(this.serverless, this.cliOptions);

// ignore plugins that specify a different provider than the current one
if (pluginInstance.provider && (pluginInstance.provider !== this.provider)) {
return;
}

this.loadCommands(pluginInstance);
this.loadHooks(pluginInstance);

this.plugins.push(pluginInstance);
}

loadAllPlugins(servicePlugins) {
this.loadCorePlugins();
this.loadServicePlugins(servicePlugins);
}

validateCommands(commandsArray) {
// TODO: implement an option to get deeper than one level
if (!this.commands[commandsArray[0]]) {
const errorMessage = [
`command "${commandsArray[0]}" not found`,
' Run "serverless help" for a list of all available commands.',
].join();
throw new this.serverless.classes.Error(errorMessage);
loadPlugins(plugins) {
plugins.forEach((plugin) => {
const Plugin = require(plugin); // eslint-disable-line global-require

this.addPlugin(Plugin);
});
}

loadCorePlugins() {
const pluginsDirectoryPath = path.join(__dirname, '../plugins');

const corePlugins = this.serverless.utils
.readFileSync(path.join(pluginsDirectoryPath, 'Plugins.json')).plugins
.map((corePluginPath) => path.join(pluginsDirectoryPath, corePluginPath));

this.loadPlugins(corePlugins);
}

loadServicePlugins(servicePlugs) {
const servicePlugins = (typeof servicePlugs !== 'undefined' ? servicePlugs : []);

// we want to load plugins installed locally in the service
if (this.serverless && this.serverless.config && this.serverless.config.servicePath) {
module.paths.unshift(path.join(this.serverless.config.servicePath, 'node_modules'));
}

this.loadPlugins(servicePlugins);

// restore module paths
if (this.serverless && this.serverless.config && this.serverless.config.servicePath) {
module.paths.shift();
}
}

validateOptions(commandsArray) {
let options;
loadCommand(pluginName, details, key) {
const commands = _.mapValues(details.commands, (subDetails, subKey) =>
this.loadCommand(pluginName, subDetails, `${key}:${subKey}`)
);
return _.assign({}, details, { key, pluginName, commands });
}

// TODO: implement an option to get deeper than two levels
if (commandsArray.length === 1) {
options = this.commands[commandsArray[0]].options;
} else {
options = this.commands[commandsArray[0]].commands[commandsArray[1]].options;
loadCommands(pluginInstance) {
const pluginName = pluginInstance.constructor.name;
_.forEach(pluginInstance.commands, (details, key) => {
const command = this.loadCommand(pluginName, details, key);
this.commands[key] = _.merge({}, this.commands[key], command);
});
}

loadHooks(pluginInstance) {
_.forEach(pluginInstance.hooks, (hook, event) => {
this.hooks[event] = this.hooks[event] || [];
this.hooks[event].push(hook);
});
}

getCommands() {
return this.commands;
}

getCommand(commandsArray) {
return _.reduce(commandsArray, (current, name, index) => {
if (name in current.commands) {
return current.commands[name];
}
const commandName = commandsArray.slice(0, index + 1).join(' ');
const errorMessage = [
`Command "${commandName}" not found`,
' Run "serverless help" for a list of all available commands.',
].join();
throw new this.serverless.classes.Error(errorMessage);
}, { commands: this.commands });
}

getEvents(command) {
return _.flatMap(command.lifecycleEvents, (event) => [
`before:${command.key}:${event}`,
`${command.key}:${event}`,
`after:${command.key}:${event}`,
]);
}

getPlugins() {
return this.plugins;
}

run(commandsArray) {
const command = this.getCommand(commandsArray);

this.convertShortcutsIntoOptions(command);
this.validateOptions(command);

const events = this.getEvents(command);
const hooks = _.flatMap(events, (event) => this.hooks[event] || []);

if (hooks.length === 0) {
const errorMessage = 'The command you entered did not catch on any hooks';
throw new this.serverless.classes.Error(errorMessage);
}

_.forEach(options, (value, key) => {
return BbPromise.reduce(hooks, (__, hook) => hook(), null);
}

validateCommand(commandsArray) {
this.getCommand(commandsArray);
}

validateOptions(command) {
_.forEach(command.options, (value, key) => {
if (value.required && (this.cliOptions[key] === true || !(this.cliOptions[key]))) {
let requiredThings = `the --${key} option`;

if (value.shortcut) {
requiredThings += ` / -${value.shortcut} shortcut`;
}
@@ -74,163 +177,19 @@ class PluginManager {
});
}

run(commandsArray) {
// check if the command the user has entered is provided through a plugin
this.validateCommands(commandsArray);

// check if all options are passed
this.validateOptions(commandsArray);

const events = this.getEvents(commandsArray, this.commands);
const hooks = events.reduce((memo, event) => {
this.plugins.forEach((pluginInstance) => {
// if a provider is given it should only add the hook when the plugins provider matches
// the services provider
if (!pluginInstance.provider || (pluginInstance.provider === this.provider)) {
_.forEach(pluginInstance.hooks, (hook, hookKey) => {
if (hookKey === event) {
memo.push(hook);
}
});
}
});
return memo;
}, []);

if (hooks.length === 0) {
const errorMessage = `The command you entered was not found.
Did you spell it correctly?`;
throw new this.serverless.classes.Error(errorMessage);
}

return BbPromise.reduce(hooks, (__, hook) => hook(), null);
}

convertShortcutsIntoOptions(cliOptions, commands) {
// TODO: implement an option to get deeper than two levels
// check if the command entered is the one in the commands object which holds all commands
// this is necessary so that shortcuts are not treated like global citizens but command
// bound properties
if (this.cliCommands.length === 1) {
_.forEach(commands, (firstCommand, firstCommandKey) => {
if (_.includes(this.cliCommands, firstCommandKey)) {
_.forEach(firstCommand.options, (optionObject, optionKey) => {
if (optionObject.shortcut && _.includes(Object.keys(cliOptions),
optionObject.shortcut)) {
Object.keys(cliOptions).forEach((option) => {
if (option === optionObject.shortcut) {
this.cliOptions[optionKey] = this.cliOptions[option];
}
});
}
});
}
});
} else if (this.cliCommands.length === 2) {
_.forEach(commands, (firstCommand) => {
_.forEach(firstCommand.commands, (secondCommand, secondCommandKey) => {
if (_.includes(this.cliCommands, secondCommandKey)) {
_.forEach(secondCommand.options, (optionObject, optionKey) => {
if (optionObject.shortcut && _.includes(Object.keys(cliOptions),
optionObject.shortcut)) {
Object.keys(cliOptions).forEach((option) => {
if (option === optionObject.shortcut) {
this.cliOptions[optionKey] = this.cliOptions[option];
}
});
}
});
convertShortcutsIntoOptions(command) {
_.forEach(command.options, (optionObject, optionKey) => {
if (optionObject.shortcut && _.includes(Object.keys(this.cliOptions),
optionObject.shortcut)) {
Object.keys(this.cliOptions).forEach((option) => {
if (option === optionObject.shortcut) {
this.cliOptions[optionKey] = this.cliOptions[option];
}
});
});
}
}

addPlugin(Plugin) {
const pluginInstance = new Plugin(this.serverless, this.cliOptions);

this.loadCommands(pluginInstance);

// shortcuts should be converted into options so that the plugin
// author can use the option (instead of the shortcut)
this.convertShortcutsIntoOptions(this.cliOptions, this.commands);

this.plugins.push(pluginInstance);
}

loadCorePlugins() {
const pluginsDirectoryPath = path.join(__dirname, '../plugins');

const corePlugins = this.serverless.utils
.readFileSync(path.join(pluginsDirectoryPath, 'Plugins.json')).plugins;

corePlugins.forEach((corePlugin) => {
const Plugin = require(path // eslint-disable-line global-require
.join(pluginsDirectoryPath, corePlugin));

this.addPlugin(Plugin);
});
}

loadServicePlugins(servicePlugs) {
const servicePlugins = (typeof servicePlugs !== 'undefined' ? servicePlugs : []);

// we want to load plugins installed locally in the service
if (this.serverless && this.serverless.config && this.serverless.config.servicePath) {
module.paths.unshift(path.join(this.serverless.config.servicePath, 'node_modules'));
}

servicePlugins.forEach((servicePlugin) => {
const Plugin = require(servicePlugin); // eslint-disable-line global-require

this.addPlugin(Plugin);
});

// restore module paths
if (this.serverless && this.serverless.config && this.serverless.config.servicePath) {
module.paths.shift();
}
}

loadCommands(pluginInstance) {
this.commandsList.push(pluginInstance.commands);

// TODO: refactor ASAP as it slows down overall performance
// rebuild the commands
_.forEach(this.commandsList, (commands) => {
_.forEach(commands, (commandDetails, command) => {
this.commands[command] = commandDetails;
});
});
}

getEvents(commandsArray, availableCommands, pre) {
const prefix = (typeof pre !== 'undefined' ? pre : '');
const commandPart = commandsArray[0];

if (_.has(availableCommands, commandPart)) {
const commandDetails = availableCommands[commandPart];
if (commandsArray.length === 1) {
const events = [];
commandDetails.lifecycleEvents.forEach((event) => {
events.push(`before:${prefix}${commandPart}:${event}`);
events.push(`${prefix}${commandPart}:${event}`);
events.push(`after:${prefix}${commandPart}:${event}`);
});
return events;
}
if (_.has(commandDetails, 'commands')) {
return this.getEvents(commandsArray.slice(1, commandsArray.length),
commandDetails.commands, `${commandPart}:`);
}
}

return [];
});
}

getPlugins() {
return this.plugins;
}
}

module.exports = PluginManager;
@@ -69,10 +69,11 @@ class Service {
};
}

if (['aws', 'azure', 'google', 'ibm'].indexOf(serverlessFile.provider.name)) {
const providers = ['aws', 'azure', 'google', 'ibm'];
if (providers.indexOf(serverlessFile.provider.name) === -1) {
const errorMessage = [
`Provider "${serverlessFile.provider.name}" is not supported.`,
' Valid values for provider are: aws, azure, google, ibm.',
` Valid values for provider are: ${providers.join(', ')}.`,
' Please provide one of those values to the "provider" property in serverless.yml.',
].join('');
throw new SError(errorMessage);

@@ -89,7 +90,6 @@ class Service {
that.package.individually = serverlessFile.package.individually;
that.package.artifact = serverlessFile.package.artifact;
that.package.exclude = serverlessFile.package.exclude;
that.package.include = serverlessFile.package.include;
}

if (serverlessFile.defaults && serverlessFile.defaults.stage) {
@@ -36,7 +36,8 @@ class AwsCompileApigEvents {
_.forEach(this.serverless.service.functions, functionObj => {
if (functionObj.events) {
functionObj.events.forEach(event => {
if (event.http) noEndpoints = false;
// Allow events with empty http event to validate function
if ({}.hasOwnProperty.call(event, 'http')) noEndpoints = false;
});
}
});
@@ -38,21 +38,6 @@ module.exports = {

_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
newApiKeyObject);

// Add API Key to Outputs section
const newOutput = {
Description: apiKey,
Value: {
Ref: `ApiGatewayApiKey${apiKeyNumber}`,
},
};

const newOutputObject = {
[`ApiGatewayApiKey${apiKeyNumber}Value`]: newOutput,
};

_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Outputs,
newOutputObject);
});
}
@@ -8,15 +8,26 @@ module.exports = {
// validate that path and method exists for each http event in service
_.forEach(this.serverless.service.functions, (functionObject, functionName) => {
functionObject.events.forEach(event => {
if (event.http) {
if ({}.hasOwnProperty.call(event, 'http')) {
let method;
let path;

if (!event.http) {
const errorMessage = [
`Empty http event in function "${functionName}"`,
' in serverless.yml.',
' If you define an http event, make sure you pass a valid value for it,',
' either as string syntax, or object syntax.',
' Please check the docs for more options.',
].join('');
throw new this.serverless.classes.Error(errorMessage);
}

if (typeof event.http === 'object') {
method = event.http.method.toLowerCase();
method = event.http.method;
path = event.http.path;
} else if (typeof event.http === 'string') {
method = event.http.split(' ')[0].toLowerCase();
method = event.http.split(' ')[0];
path = event.http.split(' ')[1];
}

@@ -43,13 +54,15 @@ module.exports = {
throw new this.serverless.classes
.Error(errorMessage);
}
method = method.toLowerCase();

const allowedMethods = [
'get', 'post', 'put', 'patch', 'options', 'head', 'delete', 'any',
];
if (allowedMethods.indexOf(method) === -1) {
const errorMessage = [
`Invalid APIG method "${method}" in function "${functionName}".`,
' AWS supported methods are: get, post, put, patch, options, head, delete, any.',
` AWS supported methods are: ${allowedMethods.join(', ')}.`,
].join('');
throw new this.serverless.classes.Error(errorMessage);
}
@@ -82,20 +82,6 @@ describe('#compileApiKeys()', () => {
})
);

it('should add api keys cf output template', () => awsCompileApigEvents
.compileApiKeys().then(() => {
expect(
awsCompileApigEvents.serverless.service.provider.compiledCloudFormationTemplate
.Outputs.ApiGatewayApiKey1Value.Description
).to.equal('1234567890');

expect(
awsCompileApigEvents.serverless.service.provider.compiledCloudFormationTemplate
.Outputs.ApiGatewayApiKey1Value.Value.Ref
).to.equal('ApiGatewayApiKey1');
})
);

it('throw error if apiKey property is not an array', () => {
awsCompileApigEvents.serverless.service.provider.apiKeys = 2;
expect(() => awsCompileApigEvents.compileApiKeys()).to.throw(Error);
@@ -30,6 +30,19 @@ describe('#validate()', () => {
};
});

it('should reject an empty http event', () => {
awsCompileApigEvents.serverless.service.functions = {
first: {
events: [
{
http: null,
},
],
},
};
expect(() => awsCompileApigEvents.validate()).to.throw(Error);
});

it('should validate the http events "path" property', () => {
awsCompileApigEvents.serverless.service.functions = {
first: {
@@ -58,3 +58,19 @@ functions:
          bucket: confidential-information
          event: s3:ObjectRemoved:*
```

We can also specify filter rules.

```yml
# serverless.yml
functions:
  mail:
    handler: mail.removal
    events:
      - s3:
          bucket: confidential-information
          event: s3:ObjectRemoved:*
          rules:
            - prefix: inbox/
            - suffix: .eml
```
@@ -23,6 +23,7 @@ class AwsCompileS3Events {
if (event.s3) {
let bucketName;
let notificationEvent = 's3:ObjectCreated:*';
let filter = {};

if (typeof event.s3 === 'object') {
if (!event.s3.bucket) {

@@ -38,6 +39,33 @@ class AwsCompileS3Events {
if (event.s3.event) {
notificationEvent = event.s3.event;
}
if (event.s3.rules) {
if (!_.isArray(event.s3.rules)) {
const errorMessage = [
`S3 filter rules of function ${functionName} is not an array`,
' The correct syntax is: rules: [{ Name: Value }]',
' Please check the docs for more info.',
].join('');
throw new this.serverless.classes
.Error(errorMessage);
}
const rules = [];
event.s3.rules.forEach(rule => {
if (!_.isPlainObject(rule)) {
const errorMessage = [
`S3 filter rule ${rule} of function ${functionName} is not an object`,
' The correct syntax is: { Name: Value }',
' Please check the docs for more info.',
].join('');
throw new this.serverless.classes
.Error(errorMessage);
}
const name = Object.keys(rule)[0];
const value = rule[name];
rules.push({ Name: name, Value: value });
});
filter = { Filter: { S3Key: { Rules: rules } } };
}
} else if (typeof event.s3 === 'string') {
bucketName = event.s3;
} else {

@@ -56,7 +84,7 @@ class AwsCompileS3Events {
// check if the bucket already defined
// in another S3 event in the service
if (bucketsLambdaConfigurations[bucketName]) {
const newLambdaConfiguration = {
let newLambdaConfiguration = {
Event: notificationEvent,
Function: {
'Fn::GetAtt': [

@@ -66,6 +94,11 @@ class AwsCompileS3Events {
},
};

// Assign 'filter' if not empty
newLambdaConfiguration = _.assign(
newLambdaConfiguration,
filter
);
bucketsLambdaConfigurations[bucketName]
.push(newLambdaConfiguration);
} else {

@@ -80,6 +113,11 @@ class AwsCompileS3Events {
},
},
];
// Assign 'filter' if not empty
bucketsLambdaConfigurations[bucketName][0] = _.assign(
bucketsLambdaConfigurations[bucketName][0],
filter
);
}
s3EnabledFunctions.push(functionName);
}
@@ -51,6 +51,42 @@ describe('AwsCompileS3Events', () => {
expect(() => awsCompileS3Events.compileS3Events()).to.throw(Error);
});

it('should throw an error if the "rules" property is not an array', () => {
awsCompileS3Events.serverless.service.functions = {
first: {
events: [
{
s3: {
bucket: 'first-function-bucket',
event: 's3:ObjectCreated:Put',
rules: {},
},
},
],
},
};

expect(() => awsCompileS3Events.compileS3Events()).to.throw(Error);
});

it('should throw an error if the "rules" property is invalid', () => {
awsCompileS3Events.serverless.service.functions = {
first: {
events: [
{
s3: {
bucket: 'first-function-bucket',
event: 's3:ObjectCreated:Put',
rules: [[]],
},
},
],
},
};

expect(() => awsCompileS3Events.compileS3Events()).to.throw(Error);
});

it('should create corresponding resources when S3 events are given', () => {
awsCompileS3Events.serverless.service.functions = {
first: {

@@ -62,6 +98,9 @@ describe('AwsCompileS3Events', () => {
s3: {
bucket: 'first-function-bucket-two',
event: 's3:ObjectCreated:Put',
rules: [
{ prefix: 'subfolder/' },
],
},
},
],

@@ -79,6 +118,11 @@ describe('AwsCompileS3Events', () => {
expect(awsCompileS3Events.serverless.service.provider.compiledCloudFormationTemplate
.Resources.FirstLambdaPermissionS3.Type
).to.equal('AWS::Lambda::Permission');
expect(awsCompileS3Events.serverless.service.provider.compiledCloudFormationTemplate
.Resources.S3BucketFirstfunctionbuckettwo.Properties.NotificationConfiguration
.LambdaConfigurations[0].Filter).to.deep.equal({
S3Key: { Rules: [{ Name: 'prefix', Value: 'subfolder/' }] },
});
});

it('should create single bucket resource when the same bucket referenced repeatedly', () => {

@@ -92,6 +136,9 @@ describe('AwsCompileS3Events', () => {
s3: {
bucket: 'first-function-bucket-one',
event: 's3:ObjectCreated:Put',
rules: [
{ prefix: 'subfolder/' },
],
},
},
],
@@ -9,180 +9,108 @@ class AwsCompileFunctions {
this.options = options;
this.provider = 'aws';

this.compileFunctions = this.compileFunctions.bind(this);
this.compileFunction = this.compileFunction.bind(this);

this.hooks = {
'deploy:compileFunctions': this.compileFunctions.bind(this),
'deploy:compileFunctions': this.compileFunctions,
};
}

compileFunctions() {
if (typeof this.serverless.service.provider.iamRoleARN !== 'string') {
// merge in the iamRoleLambdaTemplate
const iamRoleLambdaExecutionTemplate = this.serverless.utils.readFileSync(
path.join(this.serverless.config.serverlessPath,
'plugins',
'aws',
'deploy',
'compile',
'functions',
'iam-role-lambda-execution-template.json')
);
compileFunction(functionName) {
const newFunction = this.cfLambdaFunctionTemplate();
const functionObject = this.serverless.service.getFunction(functionName);

_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
iamRoleLambdaExecutionTemplate);
const artifactFilePath = this.serverless.service.package.individually ?
functionObject.artifact :
this.serverless.service.package.artifact;

// merge in the iamPolicyLambdaTemplate
const iamPolicyLambdaExecutionTemplate = this.serverless.utils.readFileSync(
path.join(this.serverless.config.serverlessPath,
'plugins',
'aws',
'deploy',
'compile',
'functions',
'iam-policy-lambda-execution-template.json')
);

_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
iamPolicyLambdaExecutionTemplate);

// set the necessary variables for the IamPolicyLambda
this.serverless.service.provider.compiledCloudFormationTemplate
.Resources
.IamPolicyLambdaExecution
.Properties
.PolicyName = `${this.options.stage}-${this.serverless.service.service}-lambda`;

// augment with user-supplied custom iam role statements
if (this.serverless.service.provider.iamRoleStatements &&
this.serverless.service.provider.iamRoleStatements instanceof Array) {
this.serverless.service.provider.compiledCloudFormationTemplate
.Resources
.IamPolicyLambdaExecution
.Properties
.PolicyDocument
.Statement = this.serverless.service.provider.compiledCloudFormationTemplate
.Resources
.IamPolicyLambdaExecution
.Properties
.PolicyDocument
.Statement.concat(this.serverless.service.provider.iamRoleStatements);
}
if (!artifactFilePath) {
throw new Error(`No artifact path is set for function: ${functionName}`);
}

const functionTemplate = `
{
"Type": "AWS::Lambda::Function",
"Properties": {
"Code": {
"S3Bucket": { "Ref": "ServerlessDeploymentBucket" },
"S3Key": "S3Key"
},
"FunctionName": "FunctionName",
"Handler": "Handler",
"MemorySize": "MemorySize",
"Role": "Role",
"Runtime": "Runtime",
"Timeout": "Timeout"
}
}
`;
if (this.serverless.service.package.deploymentBucket) {
newFunction.Properties.Code.S3Bucket = this.serverless.service.package.deploymentBucket;
}

const outputTemplate = `
{
"Description": "Lambda function info",
"Value": "Value"
}
`;
const s3Folder = this.serverless.service.package.artifactDirectoryName;
const s3FileName = artifactFilePath.split(path.sep).pop();
newFunction.Properties.Code.S3Key = `${s3Folder}/${s3FileName}`;

this.serverless.service.getAllFunctions().forEach((functionName) => {
const newFunction = JSON.parse(functionTemplate);
const functionObject = this.serverless.service.getFunction(functionName);
if (!functionObject.handler) {
const errorMessage = [
`Missing "handler" property in function ${functionName}`,
' Please make sure you point to the correct lambda handler.',
' For example: handler.hello.',
' Please check the docs for more info',
].join('');
throw new this.serverless.classes
.Error(errorMessage);
}

const artifactFilePath = this.serverless.service.package.individually ?
functionObject.artifact :
this.serverless.service.package.artifact;
const Handler = functionObject.handler;
const FunctionName = functionObject.name;
const MemorySize = Number(functionObject.memorySize)
|| Number(this.serverless.service.provider.memorySize)
|| 1024;
const Timeout = Number(functionObject.timeout)
|| Number(this.serverless.service.provider.timeout)
|| 6;
const Runtime = this.serverless.service.provider.runtime
|| 'nodejs4.3';

if (!artifactFilePath) {
throw new Error(`No artifact path is set for function: ${functionName}`);
}
newFunction.Properties.Handler = Handler;
newFunction.Properties.FunctionName = FunctionName;
newFunction.Properties.MemorySize = MemorySize;
newFunction.Properties.Timeout = Timeout;
newFunction.Properties.Runtime = Runtime;

const s3Folder = this.serverless.service.package.artifactDirectoryName;
const s3FileName = artifactFilePath.split(path.sep).pop();
newFunction.Properties.Code.S3Key = `${s3Folder}/${s3FileName}`;
if (functionObject.description) {
newFunction.Properties.Description = functionObject.description;
}

if (!functionObject.handler) {
const errorMessage = [
`Missing "handler" property in function ${functionName}`,
' Please make sure you point to the correct lambda handler.',
' For example: handler.hello.',
' Please check the docs for more info',
].join('');
throw new this.serverless.classes
.Error(errorMessage);
}
if (typeof this.serverless.service.provider.iamRoleARN === 'string') {
newFunction.Properties.Role = this.serverless.service.provider.iamRoleARN;
} else {
newFunction.Properties.Role = { 'Fn::GetAtt': ['IamRoleLambdaExecution', 'Arn'] };
}

const Handler = functionObject.handler;
const FunctionName = functionObject.name;
const MemorySize = Number(functionObject.memorySize)
|| Number(this.serverless.service.provider.memorySize)
|| 1024;
const Timeout = Number(functionObject.timeout)
|| Number(this.serverless.service.provider.timeout)
|| 6;
const Runtime = this.serverless.service.provider.runtime
|| 'nodejs4.3';
if (!functionObject.vpc) functionObject.vpc = {};
if (!this.serverless.service.provider.vpc) this.serverless.service.provider.vpc = {};

newFunction.Properties.Handler = Handler;
newFunction.Properties.FunctionName = FunctionName;
newFunction.Properties.MemorySize = MemorySize;
newFunction.Properties.Timeout = Timeout;
newFunction.Properties.Runtime = Runtime;
newFunction.Properties.VpcConfig = {
SecurityGroupIds: functionObject.vpc.securityGroupIds ||
this.serverless.service.provider.vpc.securityGroupIds,
SubnetIds: functionObject.vpc.subnetIds || this.serverless.service.provider.vpc.subnetIds,
};

if (functionObject.description) {
newFunction.Properties.Description = functionObject.description;
}
if (!newFunction.Properties.VpcConfig.SecurityGroupIds
|| !newFunction.Properties.VpcConfig.SubnetIds) {
delete newFunction.Properties.VpcConfig;
}

if (typeof this.serverless.service.provider.iamRoleARN === 'string') {
newFunction.Properties.Role = this.serverless.service.provider.iamRoleARN;
} else {
newFunction.Properties.Role = { 'Fn::GetAtt': ['IamRoleLambdaExecution', 'Arn'] };
}
const normalizedFunctionName = functionName[0].toUpperCase() + functionName.substr(1);
const functionLogicalId = `${normalizedFunctionName}LambdaFunction`;
const newFunctionObject = {
[functionLogicalId]: newFunction,
};

if (!functionObject.vpc) functionObject.vpc = {};
if (!this.serverless.service.provider.vpc) this.serverless.service.provider.vpc = {};
_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
newFunctionObject);

newFunction.Properties.VpcConfig = {
SecurityGroupIds: functionObject.vpc.securityGroupIds ||
this.serverless.service.provider.vpc.securityGroupIds,
SubnetIds: functionObject.vpc.subnetIds || this.serverless.service.provider.vpc.subnetIds,
};
// Add function to Outputs section
const newOutput = this.cfOutputDescriptionTemplate();
newOutput.Value = { 'Fn::GetAtt': [functionLogicalId, 'Arn'] };

if (!newFunction.Properties.VpcConfig.SecurityGroupIds
|| !newFunction.Properties.VpcConfig.SubnetIds) {
delete newFunction.Properties.VpcConfig;
}
const newOutputObject = {
[`${functionLogicalId}Arn`]: newOutput,
};

const normalizedFunctionName = functionName[0].toUpperCase() + functionName.substr(1);
const functionLogicalId = `${normalizedFunctionName}LambdaFunction`;
const newFunctionObject = {
[functionLogicalId]: newFunction,
};
_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Outputs,
newOutputObject);

_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
newFunctionObject);

// Add function to Outputs section
const newOutput = JSON.parse(outputTemplate);
newOutput.Value = { 'Fn::GetAtt': [functionLogicalId, 'Arn'] };

const newOutputObject = {
[`${functionLogicalId}Arn`]: newOutput,
};

_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Outputs,
newOutputObject);

if (typeof this.serverless.service.provider.iamRoleARN !== 'string') { // only if default IAM
const logGroupTemplate = `
if (typeof this.serverless.service.provider.iamRoleARN !== 'string') { // only if default IAM
const logGroupTemplate = `
{
"${normalizedFunctionName}LogGroup": {
"Type" : "AWS::Logs::LogGroup",
@ -192,29 +120,62 @@ class AwsCompileFunctions {
|
||||
}
|
||||
}
|
||||
`;
|
||||
const newLogGroup = JSON.parse(logGroupTemplate);
|
||||
_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
|
||||
newLogGroup);
|
||||
const newLogGroup = JSON.parse(logGroupTemplate);
|
||||
_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
|
||||
newLogGroup);
|
||||
|
||||
this.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement[0]
|
||||
.Resource
|
||||
.push({
|
||||
'Fn::Join': [
|
||||
':',
|
||||
[
|
||||
{ 'Fn::GetAtt': [`${normalizedFunctionName}LogGroup`, 'Arn'] },
|
||||
'*',
|
||||
'*',
|
||||
],
|
||||
],
|
||||
});
|
||||
}
|
||||
});
|
||||
this.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement[0]
|
||||
.Resource
|
||||
.push({
|
||||
'Fn::Join': [
|
||||
':',
|
||||
[
|
||||
{ 'Fn::GetAtt': [`${normalizedFunctionName}LogGroup`, 'Arn'] },
|
||||
'*',
|
||||
'*',
|
||||
],
|
||||
],
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
  compileFunctions() {
    this.serverless.service
      .getAllFunctions()
      .forEach((functionName) => this.compileFunction(functionName));
  }

  // Helper functions
  cfLambdaFunctionTemplate() {
    return {
      Type: 'AWS::Lambda::Function',
      Properties: {
        Code: {
          S3Bucket: {
            Ref: 'ServerlessDeploymentBucket',
          },
          S3Key: 'S3Key',
        },
        FunctionName: 'FunctionName',
        Handler: 'Handler',
        MemorySize: 'MemorySize',
        Role: 'Role',
        Runtime: 'Runtime',
        Timeout: 'Timeout',
      },
    };
  }

  cfOutputDescriptionTemplate() {
    return {
      Description: 'Lambda function info',
      Value: 'Value',
    };
  }
}
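For orientation while reading the compile step above, this is roughly the resource it should emit for a hypothetical `hello` function. The service, stage, handler and S3 key are made-up placeholders, and the values assume the fallbacks shown in the diff (1024 MB memory, 6 second timeout, `nodejs4.3`, and the shared `IamRoleLambdaExecution` role because no `iamRoleARN` is set). It is a sketch for illustration, not output copied from the plugin.

```js
// Illustrative only: expected shape of the compiled resource for a hypothetical
// function named 'hello' with handler 'handler.hello' (all values are made up).
const expectedResource = {
  HelloLambdaFunction: {
    Type: 'AWS::Lambda::Function',
    Properties: {
      Code: {
        S3Bucket: { Ref: 'ServerlessDeploymentBucket' },
        S3Key: 'serverless/my-service/dev/<timestamp>/my-service.zip',
      },
      FunctionName: 'my-service-dev-hello',
      Handler: 'handler.hello',
      MemorySize: 1024,     // functionObject.memorySize || provider.memorySize || 1024
      Timeout: 6,           // functionObject.timeout || provider.timeout || 6
      Runtime: 'nodejs4.3', // provider.runtime || 'nodejs4.3'
      Role: { 'Fn::GetAtt': ['IamRoleLambdaExecution', 'Arn'] }, // default IAM role path
    },
  },
};
```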
|
||||
|
||||
|
||||
@ -80,175 +80,6 @@ describe('AwsCompileFunctions', () => {
|
||||
.to.deep.equal(`${s3Folder}/${s3FileName}`);
|
||||
});
|
||||
|
||||
it('should merge the IamRoleLambdaExecution template into the CloudFormation template', () => {
|
||||
const IamRoleLambdaExecutionTemplate = awsCompileFunctions.serverless.utils.readFileSync(
|
||||
path.join(
|
||||
__dirname,
|
||||
'..',
|
||||
'iam-role-lambda-execution-template.json'
|
||||
)
|
||||
);
|
||||
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamRoleLambdaExecution
|
||||
).to.deep.equal(IamRoleLambdaExecutionTemplate.IamRoleLambdaExecution);
|
||||
});
|
||||
|
||||
it('should merge IamPolicyLambdaExecution template into the CloudFormation template', () => {
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
// we check for the type here because a deep equality check will error out due to
|
||||
// the updates which are made after the merge (they are tested in a separate test)
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamPolicyLambdaExecution.Type
|
||||
).to.deep.equal('AWS::IAM::Policy');
|
||||
});
|
||||
|
||||
it('should update IamPolicyLambdaExecution PolicyName to join $stage-$service-lambda', () => {
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyName
|
||||
).to.equal(`${
|
||||
awsCompileFunctions.options.stage
|
||||
}-${
|
||||
awsCompileFunctions.serverless.service.service
|
||||
}-lambda`);
|
||||
});
|
||||
|
||||
it('should add a CloudWatch LogGroup resource', () => {
|
||||
const normalizedName = `${functionName[0].toUpperCase()}${functionName.substr(1)}LogGroup`;
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources[normalizedName]
|
||||
).to.deep.equal(
|
||||
{
|
||||
Type: 'AWS::Logs::LogGroup',
|
||||
Properties: {
|
||||
LogGroupName: `/aws/lambda/${functionName}`,
|
||||
},
|
||||
}
|
||||
);
|
||||
});
|
||||
|
||||
it('should update IamPolicyLambdaExecution with a logging resource for the function', () => {
|
||||
const service = awsCompileFunctions.serverless.service; // avoid 100 char lines below
|
||||
service.functions = {
|
||||
func0: {
|
||||
handler: 'func.function.handler',
|
||||
name: 'func0',
|
||||
},
|
||||
func1: {
|
||||
handler: 'func.function.handler',
|
||||
name: 'func1',
|
||||
},
|
||||
};
|
||||
const f = service.functions; // avoid 100 char lines below
|
||||
const normalizedNames = [
|
||||
`${f.func0.name[0].toUpperCase()}${f.func0.name.substr(1)}LogGroup`,
|
||||
`${f.func1.name[0].toUpperCase()}${f.func1.name.substr(1)}LogGroup`,
|
||||
];
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources[normalizedNames[0]]
|
||||
).to.deep.equal(
|
||||
{
|
||||
Type: 'AWS::Logs::LogGroup',
|
||||
Properties: {
|
||||
LogGroupName: `/aws/lambda/${service.functions.func0.name}`,
|
||||
},
|
||||
}
|
||||
);
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources[normalizedNames[1]]
|
||||
).to.deep.equal(
|
||||
{
|
||||
Type: 'AWS::Logs::LogGroup',
|
||||
Properties: {
|
||||
LogGroupName: `/aws/lambda/${service.functions.func1.name}`,
|
||||
},
|
||||
}
|
||||
);
|
||||
});
|
||||
|
||||
it('should update IamPolicyLambdaExecution with a logging resource for the function', () => {
|
||||
const normalizedName = `${functionName[0].toUpperCase()}${functionName.substr(1)}LogGroup`;
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement[0]
|
||||
.Resource
|
||||
).to.deep.equal(
|
||||
[
|
||||
{ 'Fn::Join': [':', [{ 'Fn::GetAtt': [normalizedName, 'Arn'] }, '*', '*']] },
|
||||
]
|
||||
);
|
||||
});
|
||||
|
||||
it('should update IamPolicyLambdaExecution with each function\'s logging resources', () => {
|
||||
const service = awsCompileFunctions.serverless.service; // avoid 100 char lines below
|
||||
service.functions = {
|
||||
func0: {
|
||||
handler: 'func.function.handler',
|
||||
name: 'func0',
|
||||
},
|
||||
func1: {
|
||||
handler: 'func.function.handler',
|
||||
name: 'func1',
|
||||
},
|
||||
};
|
||||
const f = service.functions; // avoid 100 char lines below
|
||||
const normalizedNames = [
|
||||
`${f.func0.name[0].toUpperCase()}${f.func0.name.substr(1)}LogGroup`,
|
||||
`${f.func1.name[0].toUpperCase()}${f.func1.name.substr(1)}LogGroup`,
|
||||
];
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement[0]
|
||||
.Resource
|
||||
).to.deep.equal(
|
||||
[
|
||||
{ 'Fn::Join': [':', [{ 'Fn::GetAtt': [normalizedNames[0], 'Arn'] }, '*', '*']] },
|
||||
{ 'Fn::Join': [':', [{ 'Fn::GetAtt': [normalizedNames[1], 'Arn'] }, '*', '*']] },
|
||||
]
|
||||
);
|
||||
});
|
||||
|
||||
it('should add custom IAM policy statements', () => {
|
||||
awsCompileFunctions.serverless.service.provider.name = 'aws';
|
||||
awsCompileFunctions.serverless.service.provider.iamRoleStatements = [
|
||||
{
|
||||
Effect: 'Allow',
|
||||
Action: [
|
||||
'something:SomethingElse',
|
||||
],
|
||||
Resource: 'some:aws:arn:xxx:*:*',
|
||||
},
|
||||
];
|
||||
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamPolicyLambdaExecution.Properties.PolicyDocument.Statement[1]
|
||||
).to.deep.equal(awsCompileFunctions.serverless.service.provider.iamRoleStatements[0]);
|
||||
});
|
||||
|
||||
it('should add iamRoleARN', () => {
|
||||
awsCompileFunctions.serverless.service.provider.name = 'aws';
|
||||
awsCompileFunctions.serverless.service.provider.iamRoleARN = 'some:aws:arn:xxx:*:*';
|
||||
@ -283,7 +114,7 @@ describe('AwsCompileFunctions', () => {
|
||||
name: 'new-service-dev-func',
|
||||
},
|
||||
};
|
||||
const compliedFunction = {
|
||||
const compiledFunction = {
|
||||
Type: 'AWS::Lambda::Function',
|
||||
Properties: {
|
||||
Code: {
|
||||
@ -305,7 +136,7 @@ describe('AwsCompileFunctions', () => {
|
||||
expect(
|
||||
awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.FuncLambdaFunction
|
||||
).to.deep.equal(compliedFunction);
|
||||
).to.deep.equal(compiledFunction);
|
||||
});
|
||||
|
||||
it('should create a function resource with VPC config', () => {
|
||||
@ -456,6 +287,55 @@ describe('AwsCompileFunctions', () => {
|
||||
).to.deep.equal(compiledFunction);
|
||||
});
|
||||
|
||||
it('should use a custom bucket if specified', () => {
|
||||
const bucketName = 'com.serverless.deploys';
|
||||
|
||||
awsCompileFunctions.serverless.service.package.deploymentBucket = bucketName;
|
||||
awsCompileFunctions.serverless.service.provider.runtime = 'python2.7';
|
||||
awsCompileFunctions.serverless.service.provider.memorySize = 128;
|
||||
awsCompileFunctions.serverless.service.functions = {
|
||||
func: {
|
||||
handler: 'func.function.handler',
|
||||
name: 'new-service-dev-func',
|
||||
},
|
||||
};
|
||||
const compiledFunction = {
|
||||
Type: 'AWS::Lambda::Function',
|
||||
Properties: {
|
||||
Code: {
|
||||
S3Key: `${awsCompileFunctions.serverless.service.package.artifactDirectoryName}/${
|
||||
awsCompileFunctions.serverless.service.package.artifact}`,
|
||||
S3Bucket: bucketName,
|
||||
},
|
||||
FunctionName: 'new-service-dev-func',
|
||||
Handler: 'func.function.handler',
|
||||
MemorySize: 128,
|
||||
Role: { 'Fn::GetAtt': ['IamRoleLambdaExecution', 'Arn'] },
|
||||
Runtime: 'python2.7',
|
||||
Timeout: 6,
|
||||
},
|
||||
};
|
||||
const coreCloudFormationTemplate = awsCompileFunctions.serverless.utils.readFileSync(
|
||||
path.join(
|
||||
__dirname,
|
||||
'..',
|
||||
'..',
|
||||
'..',
|
||||
'lib',
|
||||
'core-cloudformation-template.json'
|
||||
)
|
||||
);
|
||||
awsCompileFunctions.serverless.service.provider
|
||||
.compiledCloudFormationTemplate = coreCloudFormationTemplate;
|
||||
|
||||
awsCompileFunctions.compileFunctions();
|
||||
|
||||
expect(
|
||||
awsCompileFunctions.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.FuncLambdaFunction
|
||||
).to.deep.equal(compiledFunction);
|
||||
});
|
||||
|
||||
it('should include description if specified', () => {
|
||||
awsCompileFunctions.serverless.service.functions = {
|
||||
func: {
|
||||
|
||||
@ -10,6 +10,7 @@ const setBucketName = require('./lib/setBucketName');
|
||||
const cleanupS3Bucket = require('./lib/cleanupS3Bucket');
|
||||
const uploadArtifacts = require('./lib/uploadArtifacts');
|
||||
const updateStack = require('./lib/updateStack');
|
||||
const configureStack = require('./lib/configureStack');
|
||||
|
||||
const SDK = require('../');
|
||||
|
||||
@ -30,12 +31,17 @@ class AwsDeploy {
|
||||
cleanupS3Bucket,
|
||||
uploadArtifacts,
|
||||
updateStack,
|
||||
monitorStack
|
||||
monitorStack,
|
||||
configureStack
|
||||
);
|
||||
|
||||
this.hooks = {
|
||||
'before:deploy:initialize': () => BbPromise.bind(this)
|
||||
.then(this.validate),
|
||||
.then(this.validate),
|
||||
|
||||
'deploy:initialize': () => BbPromise.bind(this)
|
||||
.then(this.configureStack)
|
||||
.then(this.mergeCustomProviderResources),
|
||||
|
||||
'deploy:setupProviderConfiguration': () => BbPromise.bind(this)
|
||||
.then(this.createStack)
|
||||
@ -44,8 +50,6 @@ class AwsDeploy {
|
||||
'before:deploy:compileFunctions': () => BbPromise.bind(this)
|
||||
.then(this.generateArtifactDirectoryName),
|
||||
|
||||
'before:deploy:deploy': () => BbPromise.bind(this).then(this.mergeCustomProviderResources),
|
||||
|
||||
'deploy:deploy': () => BbPromise.bind(this)
|
||||
.then(this.setBucketName)
|
||||
.then(this.cleanupS3Bucket)
|
||||
|
||||
@@ -7,19 +7,26 @@ module.exports = {
  getObjectsToRemove() {
    // 4 old ones + the one which will be uploaded after the cleanup = 5
    const directoriesToKeepCount = 4;
    const serviceStage = `${this.serverless.service.service}/${this.options.stage}`;

    return this.sdk.request('S3',
      'listObjectsV2',
      { Bucket: this.bucketName },
      {
        Bucket: this.bucketName,
        Prefix: `serverless/${serviceStage}`,
      },
      this.options.stage,
      this.options.region)
      .then((result) => {
        if (result.Contents.length) {
          let directories = [];
          const regex = new RegExp(
            `serverless/${serviceStage}/(.+-.+-.+-.+)`
          );

          // get the unique directory names
          result.Contents.forEach((obj) => {
            const match = obj.Key.match(/(.+\-.+\-.+\-.+)\//);
            const match = obj.Key.match(regex);

            if (match) {
              const directoryName = match[1];
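The hunk above stops before the part of `getObjectsToRemove()` that actually decides which directories to drop, so the following is only a sketch of the retention idea, under the assumption that the timestamp-prefixed directory names sort chronologically: keep the newest `directoriesToKeepCount` deployments and remove everything older.

```js
// Sketch only (the real implementation is truncated in the hunk above): keep the
// newest `directoriesToKeepCount` deployment directories, return the rest.
function selectDirectoriesToRemove(directoryNames, directoriesToKeepCount) {
  // Names such as '151224711231-2016-08-18T15:42:00' start with a millisecond
  // timestamp, so a plain lexicographic sort is chronological here.
  const sorted = Array.from(new Set(directoryNames)).sort();
  return sorted.slice(0, Math.max(sorted.length - directoriesToKeepCount, 0));
}

// With the six directories used in the cleanup tests further down and a keep
// count of 4, this would return the two oldest ones ('113304333331-...' and
// '141264711231-...'), whose object keys under serverless/<service>/<stage>/
// are then handed to the S3 delete call.
```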
lib/plugins/aws/deploy/lib/configureStack.js (new file, 109 lines)
@ -0,0 +1,109 @@
|
||||
'use strict';
|
||||
|
||||
const _ = require('lodash');
|
||||
const BbPromise = require('bluebird');
|
||||
const path = require('path');
|
||||
|
||||
module.exports = {
|
||||
configureStack() {
|
||||
this.serverless.service.provider
|
||||
.compiledCloudFormationTemplate = this.serverless.utils.readFileSync(
|
||||
path.join(this.serverless.config.serverlessPath,
|
||||
'plugins',
|
||||
'aws',
|
||||
'deploy',
|
||||
'lib',
|
||||
'core-cloudformation-template.json')
|
||||
);
|
||||
|
||||
if (typeof this.serverless.service.provider.iamRoleARN !== 'string') {
|
||||
// merge in the iamRoleLambdaTemplate
|
||||
const iamRoleLambdaExecutionTemplate = this.serverless.utils.readFileSync(
|
||||
path.join(this.serverless.config.serverlessPath,
|
||||
'plugins',
|
||||
'aws',
|
||||
'deploy',
|
||||
'lib',
|
||||
'iam-role-lambda-execution-template.json')
|
||||
);
|
||||
|
||||
_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
|
||||
iamRoleLambdaExecutionTemplate);
|
||||
|
||||
// merge in the iamPolicyLambdaTemplate
|
||||
const iamPolicyLambdaExecutionTemplate = this.serverless.utils.readFileSync(
|
||||
path.join(this.serverless.config.serverlessPath,
|
||||
'plugins',
|
||||
'aws',
|
||||
'deploy',
|
||||
'lib',
|
||||
'iam-policy-lambda-execution-template.json')
|
||||
);
|
||||
|
||||
// set the necessary variables for the IamPolicyLambda
|
||||
iamPolicyLambdaExecutionTemplate
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyName = `${this.options.stage}-${this.serverless.service.service}-lambda`;
|
||||
|
||||
iamPolicyLambdaExecutionTemplate
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement[0]
|
||||
.Resource = `arn:aws:logs:${this.options.region}:*:*`;
|
||||
|
||||
_.merge(this.serverless.service.provider.compiledCloudFormationTemplate.Resources,
|
||||
iamPolicyLambdaExecutionTemplate);
|
||||
|
||||
|
||||
// add custom iam role statements
|
||||
if (this.serverless.service.provider.iamRoleStatements &&
|
||||
this.serverless.service.provider.iamRoleStatements instanceof Array) {
|
||||
this.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement = this.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement.concat(this.serverless.service.provider.iamRoleStatements);
|
||||
}
|
||||
}
|
||||
|
||||
const bucketName = this.serverless.service.provider.deploymentBucket;
|
||||
|
||||
if (bucketName) {
|
||||
return BbPromise.bind(this)
|
||||
.then(() => this.validateS3BucketName(bucketName))
|
||||
.then(() => this.sdk.request('S3',
|
||||
'getBucketLocation',
|
||||
{
|
||||
Bucket: bucketName,
|
||||
},
|
||||
this.options.stage,
|
||||
this.options.region
|
||||
))
|
||||
.then(result => {
|
||||
if (result.LocationConstraint !== this.options.region) {
|
||||
throw new this.serverless.classes.Error(
|
||||
'Deployment bucket is not in the same region as the lambda function'
|
||||
);
|
||||
}
|
||||
this.bucketName = bucketName;
|
||||
this.serverless.service.package.deploymentBucket = bucketName;
|
||||
this.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Outputs.ServerlessDeploymentBucketName.Value = bucketName;
|
||||
|
||||
delete this.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.ServerlessDeploymentBucket;
|
||||
});
|
||||
}
|
||||
|
||||
return BbPromise.resolve();
|
||||
},
|
||||
|
||||
};
|
||||
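To summarise the new `configureStack()` step just shown, here is a rough sketch of the template skeleton it is expected to assemble for a hypothetical service with the default IAM role and no custom deployment bucket; resource bodies are elided and all names, including the region, are illustrative.

```js
// Illustrative only: approximate shape of compiledCloudFormationTemplate after
// configureStack() for a hypothetical 'my-service' in stage 'dev' (default IAM
// role, no custom deploymentBucket). Resource bodies are elided.
const compiledTemplateSkeleton = {
  Resources: {
    ServerlessDeploymentBucket: {}, // from core-cloudformation-template.json
    IamRoleLambdaExecution: {},     // merged from iam-role-lambda-execution-template.json
    IamPolicyLambdaExecution: {     // merged from iam-policy-lambda-execution-template.json
      Type: 'AWS::IAM::Policy',
      Properties: {
        PolicyName: 'dev-my-service-lambda', // `${options.stage}-${service}-lambda`
        PolicyDocument: {
          Statement: [
            { Resource: 'arn:aws:logs:us-east-1:*:*' }, // logging statement; region is illustrative
            // ...any custom provider.iamRoleStatements are concatenated here
          ],
        },
      },
    },
  },
  Outputs: {
    ServerlessDeploymentBucketName: {}, // Value overridden when a custom bucket is used
  },
};
```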
@ -7,7 +7,7 @@ module.exports = {
|
||||
create() {
|
||||
this.serverless.cli.log('Creating Stack...');
|
||||
const stackName = `${this.serverless.service.service}-${this.options.stage}`;
|
||||
const coreCloudFormationTemplate = this.loadCoreCloudFormationTemplate();
|
||||
|
||||
const params = {
|
||||
StackName: stackName,
|
||||
OnFailure: 'ROLLBACK',
|
||||
@ -15,7 +15,8 @@ module.exports = {
|
||||
'CAPABILITY_IAM',
|
||||
],
|
||||
Parameters: [],
|
||||
TemplateBody: JSON.stringify(coreCloudFormationTemplate),
|
||||
TemplateBody: JSON.stringify(this.serverless.service.provider
|
||||
.compiledCloudFormationTemplate),
|
||||
Tags: [{
|
||||
Key: 'STAGE',
|
||||
Value: this.options.stage,
|
||||
@@ -31,9 +32,16 @@ module.exports = {

  createStack() {
    const stackName = `${this.serverless.service.service}-${this.options.stage}`;

    this.serverless.service.provider
      .compiledCloudFormationTemplate = this.loadCoreCloudFormationTemplate();
    if (/^[^a-zA-Z].+|.*[^a-zA-Z0-9\-].*/.test(stackName) || stackName.length > 128) {
      const errorMessage = [
        `The stack service name "${stackName}" is not valid. `,
        'A service name should only contain alphanumeric',
        ' (case sensitive) and hyphens. It should start',
        ' with an alphabetic character and shouldn\'t',
        ' exceed 128 characters.',
      ].join('');
      throw new this.serverless.classes.Error(errorMessage);
    }

    return BbPromise.bind(this)
      // always write the template to disk, whether we are deploying or not
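The guard added to `createStack()` above rejects service/stage combinations that CloudFormation would refuse as stack names. A quick, illustrative check of the regular expression with made-up names:

```js
// Illustrative only: exercising the stack-name guard above with made-up names.
const isInvalidStackName = (stackName) =>
  /^[^a-zA-Z].+|.*[^a-zA-Z0-9\-].*/.test(stackName) || stackName.length > 128;

console.log(isInvalidStackName('my-service-dev'));  // false - letters, digits, hyphens only
console.log(isInvalidStackName('1st-service-dev')); // true  - must start with a letter
console.log(isInvalidStackName('my_service-dev'));  // true  - underscores are not allowed
```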
@ -61,17 +69,6 @@ module.exports = {
|
||||
},
|
||||
|
||||
// helper methods
|
||||
loadCoreCloudFormationTemplate() {
|
||||
return this.serverless.utils.readFileSync(
|
||||
path.join(this.serverless.config.serverlessPath,
|
||||
'plugins',
|
||||
'aws',
|
||||
'deploy',
|
||||
'lib',
|
||||
'core-cloudformation-template.json')
|
||||
);
|
||||
},
|
||||
|
||||
writeCreateTemplateToDisk() {
|
||||
const cfTemplateFilePath = path.join(this.serverless.config.servicePath,
|
||||
'.serverless', 'cloudformation-template-create-stack.json');
|
||||
|
||||
@@ -5,8 +5,10 @@ const BbPromise = require('bluebird');

module.exports = {
  generateArtifactDirectoryName() {
    const date = new Date();
    const serviceStage = `${this.serverless.service.service}/${this.options.stage}`;
    const dateString = `${date.getTime().toString()}-${date.toISOString()}`;
    this.serverless.service.package
      .artifactDirectoryName = `${date.getTime().toString()}-${date.toISOString()}`;
      .artifactDirectoryName = `serverless/${serviceStage}/${dateString}`;

    return BbPromise.resolve();
  },

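The new `artifactDirectoryName` nests every deployment under a per-service, per-stage prefix instead of a bare timestamp. A small sketch of the resulting key layout (service and stage names are made up):

```js
// Illustrative only: how the new artifactDirectoryName composes (made-up values).
const service = 'my-service';
const stage = 'dev';
const date = new Date();
const dateString = `${date.getTime().toString()}-${date.toISOString()}`;

const artifactDirectoryName = `serverless/${service}/${stage}/${dateString}`;
// -> 'serverless/my-service/dev/<ms timestamp>-<ISO timestamp>'
// Uploaded objects then end up at keys such as
//   serverless/my-service/dev/<ms timestamp>-<ISO timestamp>/my-service.zip
// which is also the prefix that cleanupS3Bucket's getObjectsToRemove filters on.
```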
@@ -4,6 +4,10 @@ const BbPromise = require('bluebird');

module.exports = {
  setBucketName() {
    if (this.bucketName) {
      return BbPromise.resolve(this.bucketName);
    }

    if (this.options.noDeploy) {
      return BbPromise.resolve();
    }
|
||||
|
||||
@ -16,6 +16,7 @@ module.exports = {
|
||||
Bucket: this.bucketName,
|
||||
Key: `${this.serverless.service.package.artifactDirectoryName}/${fileName}`,
|
||||
Body: body,
|
||||
ContentType: 'application/json',
|
||||
};
|
||||
|
||||
return this.sdk.request('S3',
|
||||
@ -38,6 +39,7 @@ module.exports = {
|
||||
Bucket: this.bucketName,
|
||||
Key: `${this.serverless.service.package.artifactDirectoryName}/${fileName}`,
|
||||
Body: body,
|
||||
ContentType: 'application/zip',
|
||||
};
|
||||
|
||||
return this.sdk.request('S3',
|
||||
|
||||
@ -8,3 +8,4 @@ require('./cleanupS3Bucket');
|
||||
require('./uploadArtifacts');
|
||||
require('./updateStack');
|
||||
require('./index');
|
||||
require('./configureStack');
|
||||
|
||||
@ -9,13 +9,16 @@ const Serverless = require('../../../../Serverless');
|
||||
describe('cleanupS3Bucket', () => {
|
||||
let serverless;
|
||||
let awsDeploy;
|
||||
let s3Key;
|
||||
|
||||
beforeEach(() => {
|
||||
serverless = new Serverless();
|
||||
serverless.service.service = 'cleanupS3Bucket';
|
||||
const options = {
|
||||
stage: 'dev',
|
||||
region: 'us-east-1',
|
||||
};
|
||||
s3Key = `serverless/${serverless.service.service}/${options.stage}`;
|
||||
awsDeploy = new AwsDeploy(serverless, options);
|
||||
awsDeploy.bucketName = 'deployment-bucket';
|
||||
awsDeploy.serverless.cli = new serverless.classes.CLI();
|
||||
@ -35,6 +38,7 @@ describe('cleanupS3Bucket', () => {
|
||||
expect(listObjectsStub.args[0][0]).to.be.equal('S3');
|
||||
expect(listObjectsStub.args[0][1]).to.be.equal('listObjectsV2');
|
||||
expect(listObjectsStub.args[0][2].Bucket).to.be.equal(awsDeploy.bucketName);
|
||||
expect(listObjectsStub.args[0][2].Prefix).to.be.equal(`${s3Key}`);
|
||||
expect(listObjectsStub.calledWith(awsDeploy.options.stage, awsDeploy.options.region));
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
@ -43,18 +47,18 @@ describe('cleanupS3Bucket', () => {
|
||||
it('should return all to be removed service objects (except the last 4)', () => {
|
||||
const serviceObjects = {
|
||||
Contents: [
|
||||
{ Key: '151224711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '151224711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: '141321321541-2016-08-18T11:23:02/artifact.zip' },
|
||||
{ Key: '141321321541-2016-08-18T11:23:02/cloudformation.json' },
|
||||
{ Key: '142003031341-2016-08-18T12:46:04/artifact.zip' },
|
||||
{ Key: '142003031341-2016-08-18T12:46:04/cloudformation.json' },
|
||||
{ Key: '113304333331-2016-08-18T13:40:06/artifact.zip' },
|
||||
{ Key: '113304333331-2016-08-18T13:40:06/cloudformation.json' },
|
||||
{ Key: '903940390431-2016-08-18T23:42:08/artifact.zip' },
|
||||
{ Key: '903940390431-2016-08-18T23:42:08/cloudformation.json' },
|
||||
{ Key: `${s3Key}/151224711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}/151224711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
{ Key: `${s3Key}/141264711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}/141264711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
{ Key: `${s3Key}/141321321541-2016-08-18T11:23:02/artifact.zip` },
|
||||
{ Key: `${s3Key}/141321321541-2016-08-18T11:23:02/cloudformation.json` },
|
||||
{ Key: `${s3Key}/142003031341-2016-08-18T12:46:04/artifact.zip` },
|
||||
{ Key: `${s3Key}/142003031341-2016-08-18T12:46:04/cloudformation.json` },
|
||||
{ Key: `${s3Key}/113304333331-2016-08-18T13:40:06/artifact.zip` },
|
||||
{ Key: `${s3Key}/113304333331-2016-08-18T13:40:06/cloudformation.json` },
|
||||
{ Key: `${s3Key}/903940390431-2016-08-18T23:42:08/artifact.zip` },
|
||||
{ Key: `${s3Key}/903940390431-2016-08-18T23:42:08/cloudformation.json` },
|
||||
],
|
||||
};
|
||||
|
||||
@ -63,25 +67,42 @@ describe('cleanupS3Bucket', () => {
|
||||
|
||||
return awsDeploy.getObjectsToRemove().then((objectsToRemove) => {
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '141321321541-2016-08-18T11:23:02/artifact.zip' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/141321321541-2016-08-18T11:23:02/artifact.zip` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '141321321541-2016-08-18T11:23:02/cloudformation.json' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/141321321541-2016-08-18T11:23:02/cloudformation.json` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '142003031341-2016-08-18T12:46:04/artifact.zip' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/142003031341-2016-08-18T12:46:04/artifact.zip` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '142003031341-2016-08-18T12:46:04/cloudformation.json' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/142003031341-2016-08-18T12:46:04/cloudformation.json` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '151224711231-2016-08-18T15:42:00/artifact.zip' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/151224711231-2016-08-18T15:42:00/artifact.zip` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '151224711231-2016-08-18T15:42:00/cloudformation.json' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/151224711231-2016-08-18T15:42:00/cloudformation.json` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '903940390431-2016-08-18T23:42:08/artifact.zip' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/903940390431-2016-08-18T23:42:08/artifact.zip` });
|
||||
|
||||
expect(objectsToRemove).to.not
|
||||
.include({ Key: '903940390431-2016-08-18T23:42:08/cloudformation.json' });
|
||||
.include(
|
||||
{ Key: `${s3Key}${s3Key}/903940390431-2016-08-18T23:42:08/cloudformation.json` });
|
||||
|
||||
expect(listObjectsStub.calledOnce).to.be.equal(true);
|
||||
expect(listObjectsStub.args[0][0]).to.be.equal('S3');
|
||||
expect(listObjectsStub.args[0][1]).to.be.equal('listObjectsV2');
|
||||
expect(listObjectsStub.args[0][2].Bucket).to.be.equal(awsDeploy.bucketName);
|
||||
expect(listObjectsStub.args[0][2].Prefix).to.be.equal(`${s3Key}`);
|
||||
expect(listObjectsStub.calledWith(awsDeploy.options.stage, awsDeploy.options.region));
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
@ -90,12 +111,12 @@ describe('cleanupS3Bucket', () => {
|
||||
it('should return an empty array if there are less than 4 directories available', () => {
|
||||
const serviceObjects = {
|
||||
Contents: [
|
||||
{ Key: '151224711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '151224711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: '141321321541-2016-08-18T11:23:02/artifact.zip' },
|
||||
{ Key: '141321321541-2016-08-18T11:23:02/cloudformation.json' },
|
||||
{ Key: `${s3Key}151224711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}151224711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
{ Key: `${s3Key}141264711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}141264711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
{ Key: `${s3Key}141321321541-2016-08-18T11:23:02/artifact.zip` },
|
||||
{ Key: `${s3Key}141321321541-2016-08-18T11:23:02/cloudformation.json` },
|
||||
],
|
||||
};
|
||||
|
||||
@ -108,6 +129,7 @@ describe('cleanupS3Bucket', () => {
|
||||
expect(listObjectsStub.args[0][0]).to.be.equal('S3');
|
||||
expect(listObjectsStub.args[0][1]).to.be.equal('listObjectsV2');
|
||||
expect(listObjectsStub.args[0][2].Bucket).to.be.equal(awsDeploy.bucketName);
|
||||
expect(listObjectsStub.args[0][2].Prefix).to.be.equal(`${s3Key}`);
|
||||
expect(listObjectsStub.calledWith(awsDeploy.options.stage, awsDeploy.options.region));
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
@ -116,14 +138,14 @@ describe('cleanupS3Bucket', () => {
|
||||
it('should resolve if there are exactly 4 directories available', () => {
|
||||
const serviceObjects = {
|
||||
Contents: [
|
||||
{ Key: '151224711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '151224711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: '141321321541-2016-08-18T11:23:02/artifact.zip' },
|
||||
{ Key: '141321321541-2016-08-18T11:23:02/cloudformation.json' },
|
||||
{ Key: '142003031341-2016-08-18T12:46:04/artifact.zip' },
|
||||
{ Key: '142003031341-2016-08-18T12:46:04/cloudformation.json' },
|
||||
{ Key: `${s3Key}151224711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}151224711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
{ Key: `${s3Key}141264711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}141264711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
{ Key: `${s3Key}141321321541-2016-08-18T11:23:02/artifact.zip` },
|
||||
{ Key: `${s3Key}141321321541-2016-08-18T11:23:02/cloudformation.json` },
|
||||
{ Key: `${s3Key}142003031341-2016-08-18T12:46:04/artifact.zip` },
|
||||
{ Key: `${s3Key}142003031341-2016-08-18T12:46:04/cloudformation.json` },
|
||||
],
|
||||
};
|
||||
|
||||
@ -136,6 +158,7 @@ describe('cleanupS3Bucket', () => {
|
||||
expect(listObjectsStub.args[0][0]).to.be.equal('S3');
|
||||
expect(listObjectsStub.args[0][1]).to.be.equal('listObjectsV2');
|
||||
expect(listObjectsStub.args[0][2].Bucket).to.be.equal(awsDeploy.bucketName);
|
||||
expect(listObjectsStub.args[0][2].Prefix).to.be.equal(`${s3Key}`);
|
||||
expect(listObjectsStub.calledWith(awsDeploy.options.stage, awsDeploy.options.region));
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
@ -159,10 +182,10 @@ describe('cleanupS3Bucket', () => {
|
||||
|
||||
it('should remove all old service files from the S3 bucket if available', () => {
|
||||
const objectsToRemove = [
|
||||
{ Key: '113304333331-2016-08-18T13:40:06/artifact.zip' },
|
||||
{ Key: '113304333331-2016-08-18T13:40:06/cloudformation.json' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/artifact.zip' },
|
||||
{ Key: '141264711231-2016-08-18T15:42:00/cloudformation.json' },
|
||||
{ Key: `${s3Key}113304333331-2016-08-18T13:40:06/artifact.zip` },
|
||||
{ Key: `${s3Key}113304333331-2016-08-18T13:40:06/cloudformation.json` },
|
||||
{ Key: `${s3Key}141264711231-2016-08-18T15:42:00/artifact.zip` },
|
||||
{ Key: `${s3Key}141264711231-2016-08-18T15:42:00/cloudformation.json` },
|
||||
];
|
||||
|
||||
return awsDeploy.removeObjects(objectsToRemove).then(() => {
|
||||
|
||||
lib/plugins/aws/deploy/tests/configureStack.js (new file, 193 lines)
@ -0,0 +1,193 @@
|
||||
'use strict';
|
||||
|
||||
const sinon = require('sinon');
|
||||
const BbPromise = require('bluebird');
|
||||
const path = require('path');
|
||||
const expect = require('chai').expect;
|
||||
|
||||
const Serverless = require('../../../../Serverless');
|
||||
const AwsSdk = require('../');
|
||||
|
||||
describe('#configureStack', () => {
|
||||
let awsSdk;
|
||||
let serverless;
|
||||
|
||||
beforeEach(() => {
|
||||
serverless = new Serverless();
|
||||
const options = {
|
||||
stage: 'dev',
|
||||
region: 'us-east-1',
|
||||
};
|
||||
awsSdk = new AwsSdk(serverless, options);
|
||||
awsSdk.serverless.cli = new serverless.classes.CLI();
|
||||
});
|
||||
|
||||
it('should validate the region for the given S3 bucket', () => {
|
||||
const bucketName = 'com.serverless.deploys';
|
||||
|
||||
const getBucketLocationStub = sinon
|
||||
.stub(awsSdk.sdk, 'request').returns(
|
||||
BbPromise.resolve({ LocationConstraint: awsSdk.options.region })
|
||||
);
|
||||
|
||||
awsSdk.serverless.service.provider.deploymentBucket = bucketName;
|
||||
return awsSdk.configureStack()
|
||||
.then(() => {
|
||||
expect(getBucketLocationStub.args[0][0]).to.equal('S3');
|
||||
expect(getBucketLocationStub.args[0][1]).to.equal('getBucketLocation');
|
||||
expect(getBucketLocationStub.args[0][2].Bucket).to.equal(bucketName);
|
||||
});
|
||||
});
|
||||
|
||||
it('should reject an S3 bucket in the wrong region', () => {
|
||||
const bucketName = 'com.serverless.deploys';
|
||||
|
||||
const createStackStub = sinon
|
||||
.stub(awsSdk.sdk, 'request').returns(
|
||||
BbPromise.resolve({ LocationConstraint: 'us-west-1' })
|
||||
);
|
||||
|
||||
awsSdk.serverless.service.provider.deploymentBucket = 'com.serverless.deploys';
|
||||
return awsSdk.configureStack()
|
||||
.catch((err) => {
|
||||
expect(createStackStub.args[0][0]).to.equal('S3');
|
||||
expect(createStackStub.args[0][1]).to.equal('getBucketLocation');
|
||||
expect(createStackStub.args[0][2].Bucket).to.equal(bucketName);
|
||||
expect(err.message).to.contain('not in the same region');
|
||||
})
|
||||
.then(() => {});
|
||||
});
|
||||
|
||||
|
||||
it('should merge the IamRoleLambdaExecution template into the CloudFormation template', () => {
|
||||
const IamRoleLambdaExecutionTemplate = awsSdk.serverless.utils.readFileSync(
|
||||
path.join(
|
||||
__dirname,
|
||||
'..',
|
||||
'lib',
|
||||
'iam-role-lambda-execution-template.json'
|
||||
)
|
||||
);
|
||||
|
||||
return awsSdk.configureStack()
|
||||
.then(() => {
|
||||
expect(awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamRoleLambdaExecution
|
||||
).to.deep.equal(IamRoleLambdaExecutionTemplate.IamRoleLambdaExecution);
|
||||
});
|
||||
});
|
||||
|
||||
it('should merge IamPolicyLambdaExecution template into the CloudFormation template', () =>
|
||||
awsSdk.configureStack()
|
||||
.then(() => {
|
||||
// we check for the type here because a deep equality check will error out due to
|
||||
// the updates which are made after the merge (they are tested in a separate test)
|
||||
expect(awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamPolicyLambdaExecution.Type
|
||||
).to.deep.equal('AWS::IAM::Policy');
|
||||
})
|
||||
);
|
||||
|
||||
it('should update the necessary variables for the IamPolicyLambdaExecution', () =>
|
||||
awsSdk.configureStack()
|
||||
.then(() => {
|
||||
expect(awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyName
|
||||
).to.equal(
|
||||
`${
|
||||
awsSdk.options.stage
|
||||
}-${
|
||||
awsSdk.serverless.service.service
|
||||
}-lambda`
|
||||
);
|
||||
|
||||
expect(awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources
|
||||
.IamPolicyLambdaExecution
|
||||
.Properties
|
||||
.PolicyDocument
|
||||
.Statement[0]
|
||||
.Resource
|
||||
).to.equal(`arn:aws:logs:${awsSdk.options.region}:*:*`);
|
||||
})
|
||||
);
|
||||
|
||||
it('should add custom IAM policy statements', () => {
|
||||
awsSdk.serverless.service.provider.name = 'aws';
|
||||
awsSdk.serverless.service.provider.iamRoleStatements = [
|
||||
{
|
||||
Effect: 'Allow',
|
||||
Action: [
|
||||
'something:SomethingElse',
|
||||
],
|
||||
Resource: 'some:aws:arn:xxx:*:*',
|
||||
},
|
||||
];
|
||||
|
||||
|
||||
return awsSdk.configureStack()
|
||||
.then(() => {
|
||||
expect(awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamPolicyLambdaExecution.Properties.PolicyDocument.Statement[1]
|
||||
).to.deep.equal(awsSdk.serverless.service.provider.iamRoleStatements[0]);
|
||||
});
|
||||
});
|
||||
|
||||
it('should use a custom bucket if specified', () => {
|
||||
const bucketName = 'com.serverless.deploys';
|
||||
|
||||
awsSdk.serverless.service.provider.deploymentBucket = bucketName;
|
||||
|
||||
const coreCloudFormationTemplate = awsSdk.serverless.utils.readFileSync(
|
||||
path.join(
|
||||
__dirname,
|
||||
'..',
|
||||
'lib',
|
||||
'core-cloudformation-template.json'
|
||||
)
|
||||
);
|
||||
awsSdk.serverless.service.provider
|
||||
.compiledCloudFormationTemplate = coreCloudFormationTemplate;
|
||||
|
||||
sinon
|
||||
.stub(awsSdk.sdk, 'request')
|
||||
.returns(BbPromise.resolve({ LocationConstraint: awsSdk.options.region }));
|
||||
|
||||
return awsSdk.configureStack()
|
||||
.then(() => {
|
||||
expect(
|
||||
awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Outputs.ServerlessDeploymentBucketName.Value
|
||||
).to.equal(bucketName);
|
||||
// eslint-disable-next-line no-unused-expressions
|
||||
expect(
|
||||
awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.ServerlessDeploymentBucket
|
||||
).to.not.exist;
|
||||
});
|
||||
});
|
||||
|
||||
it('should not add IamPolicyLambdaExecution', () => {
|
||||
awsSdk.serverless.service.provider.iamRoleARN = 'some:aws:arn:xxx:*:*';
|
||||
|
||||
return awsSdk.configureStack()
|
||||
.then(() => expect(
|
||||
awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamPolicyLambdaExecution
|
||||
).to.not.exist);
|
||||
});
|
||||
|
||||
|
||||
it('should not add IamRole', () => {
|
||||
awsSdk.serverless.service.provider.iamRoleARN = 'some:aws:arn:xxx:*:*';
|
||||
|
||||
return awsSdk.configureStack()
|
||||
.then(() => expect(
|
||||
awsSdk.serverless.service.provider.compiledCloudFormationTemplate
|
||||
.Resources.IamRoleLambdaExecution
|
||||
).to.not.exist);
|
||||
});
|
||||
});
|
||||
@ -9,7 +9,6 @@ const Serverless = require('../../../../Serverless');
|
||||
const testUtils = require('../../../../../tests/utils');
|
||||
|
||||
describe('createStack', () => {
|
||||
let serverless;
|
||||
let awsDeploy;
|
||||
const tmpDirPath = testUtils.getTmpDirPath();
|
||||
|
||||
@ -25,7 +24,7 @@ describe('createStack', () => {
|
||||
};
|
||||
|
||||
beforeEach(() => {
|
||||
serverless = new Serverless();
|
||||
const serverless = new Serverless();
|
||||
serverless.utils.writeFileSync(serverlessYmlPath, serverlessYml);
|
||||
serverless.config.servicePath = tmpDirPath;
|
||||
const options = {
|
||||
@ -46,6 +45,9 @@ describe('createStack', () => {
|
||||
'core-cloudformation-template.json')
|
||||
);
|
||||
|
||||
awsDeploy.serverless.service.provider
|
||||
.compiledCloudFormationTemplate = coreCloudFormationTemplate;
|
||||
|
||||
const createStackStub = sinon
|
||||
.stub(awsDeploy.sdk, 'request').returns(BbPromise.resolve());
|
||||
|
||||
@ -60,7 +62,6 @@ describe('createStack', () => {
|
||||
.to.deep.equal([{ Key: 'STAGE', Value: awsDeploy.options.stage }]);
|
||||
expect(createStackStub.calledOnce).to.be.equal(true);
|
||||
expect(createStackStub.calledWith(awsDeploy.options.stage, awsDeploy.options.region));
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
});
|
||||
});
|
||||
@ -69,7 +70,16 @@ describe('createStack', () => {
|
||||
it('should store the core CloudFormation template in the provider object', () => {
|
||||
sinon.stub(awsDeploy.sdk, 'request').returns(BbPromise.resolve());
|
||||
|
||||
const coreCloudFormationTemplate = awsDeploy.loadCoreCloudFormationTemplate();
|
||||
const coreCloudFormationTemplate = awsDeploy.serverless.utils.readFileSync(
|
||||
path.join(__dirname,
|
||||
'..',
|
||||
'lib',
|
||||
'core-cloudformation-template.json')
|
||||
);
|
||||
|
||||
awsDeploy.serverless.service.provider
|
||||
.compiledCloudFormationTemplate = coreCloudFormationTemplate;
|
||||
|
||||
const writeCreateTemplateToDiskStub = sinon
|
||||
.stub(awsDeploy, 'writeCreateTemplateToDisk').returns(BbPromise.resolve());
|
||||
|
||||
@ -77,8 +87,6 @@ describe('createStack', () => {
|
||||
expect(writeCreateTemplateToDiskStub.calledOnce).to.be.equal(true);
|
||||
expect(awsDeploy.serverless.service.provider.compiledCloudFormationTemplate)
|
||||
.to.deep.equal(coreCloudFormationTemplate);
|
||||
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -90,8 +98,6 @@ describe('createStack', () => {
|
||||
|
||||
return awsDeploy.createStack().then(() => {
|
||||
expect(createStub.called).to.be.equal(false);
|
||||
awsDeploy.create.restore();
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -106,9 +112,6 @@ describe('createStack', () => {
|
||||
return awsDeploy.createStack().then(() => {
|
||||
expect(writeCreateTemplateToDiskStub.calledOnce).to.be.equal(true);
|
||||
expect(createStub.called).to.be.equal(false);
|
||||
|
||||
awsDeploy.writeCreateTemplateToDisk.restore();
|
||||
awsDeploy.create.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -119,12 +122,10 @@ describe('createStack', () => {
|
||||
.stub(awsDeploy, 'writeCreateTemplateToDisk').returns(BbPromise.resolve());
|
||||
sinon.stub(awsDeploy.sdk, 'request').returns(BbPromise.resolve());
|
||||
|
||||
return awsDeploy.createStack().then(() => {
|
||||
return awsDeploy.createStack().then((res) => {
|
||||
expect(writeCreateTemplateToDiskStub.calledOnce).to.be.equal(true);
|
||||
expect(awsDeploy.sdk.request.called).to.be.equal(true);
|
||||
|
||||
awsDeploy.writeCreateTemplateToDisk.restore();
|
||||
awsDeploy.sdk.request.restore();
|
||||
expect(res).to.equal('alreadyCreated');
|
||||
});
|
||||
});
|
||||
|
||||
@ -142,9 +143,6 @@ describe('createStack', () => {
|
||||
expect(createStub.called).to.be.equal(false);
|
||||
expect(e.name).to.be.equal('ServerlessError');
|
||||
expect(e.message).to.be.equal(errorMock);
|
||||
|
||||
awsDeploy.create.restore();
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -160,22 +158,10 @@ describe('createStack', () => {
|
||||
|
||||
return awsDeploy.createStack().then(() => {
|
||||
expect(createStub.calledOnce).to.be.equal(true);
|
||||
|
||||
awsDeploy.create.restore();
|
||||
awsDeploy.sdk.request.restore();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('#loadCoreCloudFormationTemplate', () => {
|
||||
it('should load the core CloudFormation template', () => {
|
||||
const template = awsDeploy.loadCoreCloudFormationTemplate();
|
||||
|
||||
expect(template.Resources.ServerlessDeploymentBucket.Type)
|
||||
.to.equal('AWS::S3::Bucket');
|
||||
});
|
||||
});
|
||||
|
||||
describe('#writeCreateTemplateToDisk', () => {
|
||||
it('should write the compiled CloudFormation template into the .serverless directory', () => {
|
||||
awsDeploy.serverless.service.provider.compiledCloudFormationTemplate = { key: 'value' };
|
||||
@ -185,8 +171,10 @@ describe('createStack', () => {
|
||||
'cloudformation-template-create-stack.json');
|
||||
|
||||
return awsDeploy.writeCreateTemplateToDisk().then(() => {
|
||||
expect(serverless.utils.fileExistsSync(templatePath)).to.equal(true);
|
||||
expect(serverless.utils.readFileSync(templatePath)).to.deep.equal({ key: 'value' });
|
||||
expect(awsDeploy.serverless.utils.fileExistsSync(templatePath)).to.equal(true);
|
||||
expect(awsDeploy.serverless.utils.readFileSync(templatePath)).to.deep.equal(
|
||||
{ key: 'value' }
|
||||
);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
@ -7,27 +7,32 @@ const BbPromise = require('bluebird');
|
||||
const sinon = require('sinon');
|
||||
|
||||
describe('AwsDeploy', () => {
|
||||
const serverless = new Serverless();
|
||||
const options = {
|
||||
stage: 'dev',
|
||||
region: 'us-east-1',
|
||||
};
|
||||
const awsDeploy = new AwsDeploy(serverless, options);
|
||||
awsDeploy.serverless.cli = new serverless.classes.CLI();
|
||||
let awsDeploy;
|
||||
beforeEach(() => {
|
||||
const serverless = new Serverless();
|
||||
const options = {
|
||||
stage: 'dev',
|
||||
region: 'us-east-1',
|
||||
};
|
||||
|
||||
awsDeploy = new AwsDeploy(serverless, options);
|
||||
awsDeploy.serverless.cli = new serverless.classes.CLI();
|
||||
});
|
||||
|
||||
describe('#constructor()', () => {
|
||||
it('should have hooks', () => expect(awsDeploy.hooks).to.be.not.empty);
|
||||
|
||||
it('should set the provider variable to "aws"', () => expect(awsDeploy.provider)
|
||||
.to.equal('aws'));
|
||||
});
|
||||
|
||||
describe('hooks', () => {
|
||||
it('should run "before:deploy:initialize" hook promise chain in order', () => {
|
||||
const validateStub = sinon
|
||||
.stub(awsDeploy, 'validate').returns(BbPromise.resolve());
|
||||
|
||||
return awsDeploy.hooks['before:deploy:initialize']().then(() => {
|
||||
expect(validateStub.calledOnce).to.be.equal(true);
|
||||
awsDeploy.validate.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -40,8 +45,6 @@ describe('AwsDeploy', () => {
|
||||
return awsDeploy.hooks['deploy:setupProviderConfiguration']().then(() => {
|
||||
expect(createStackStub.calledOnce).to.be.equal(true);
|
||||
expect(monitorStackStub.calledOnce).to.be.equal(true);
|
||||
awsDeploy.createStack.restore();
|
||||
awsDeploy.monitorStack.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -51,17 +54,19 @@ describe('AwsDeploy', () => {
|
||||
|
||||
return awsDeploy.hooks['before:deploy:compileFunctions']().then(() => {
|
||||
expect(generateArtifactDirectoryNameStub.calledOnce).to.be.equal(true);
|
||||
awsDeploy.generateArtifactDirectoryName.restore();
|
||||
});
|
||||
});
|
||||
|
||||
it('should run "before:deploy:deploy" promise chain in order', () => {
|
||||
it('should run "deploy:initialize" promise chain in order', () => {
|
||||
const configureStackStub = sinon
|
||||
.stub(awsDeploy, 'configureStack').returns(BbPromise.resolve());
|
||||
|
||||
const mergeCustomProviderResourcesStub = sinon
|
||||
.stub(awsDeploy, 'mergeCustomProviderResources').returns(BbPromise.resolve());
|
||||
|
||||
return awsDeploy.hooks['before:deploy:deploy']().then(() => {
|
||||
return awsDeploy.hooks['deploy:initialize']().then(() => {
|
||||
expect(configureStackStub.calledOnce).to.be.equal(true);
|
||||
expect(mergeCustomProviderResourcesStub.calledOnce).to.be.equal(true);
|
||||
awsDeploy.mergeCustomProviderResources.restore();
|
||||
});
|
||||
});
|
||||
|
||||
@ -87,12 +92,20 @@ describe('AwsDeploy', () => {
|
||||
.to.be.equal(true);
|
||||
expect(monitorStackStub.calledAfter(updateStackStub))
|
||||
.to.be.equal(true);
|
||||
});
|
||||
});
|
||||
|
||||
it('should notify about noDeploy', () => {
|
||||
sinon.stub(awsDeploy, 'setBucketName').returns(BbPromise.resolve());
|
||||
sinon.stub(awsDeploy, 'cleanupS3Bucket').returns(BbPromise.resolve());
|
||||
sinon.stub(awsDeploy, 'uploadArtifacts').returns(BbPromise.resolve());
|
||||
sinon.stub(awsDeploy, 'updateStack').returns(BbPromise.resolve());
|
||||
sinon.stub(awsDeploy, 'monitorStack').returns(BbPromise.resolve());
|
||||
sinon.stub(awsDeploy.serverless.cli, 'log').returns();
|
||||
awsDeploy.options.noDeploy = true;
|
||||
|
||||
return awsDeploy.hooks['deploy:deploy']().then(() => {
|
||||
|
||||
awsDeploy.setBucketName.restore();
|
||||
awsDeploy.cleanupS3Bucket.restore();
|
||||
awsDeploy.uploadArtifacts.restore();
|
||||
awsDeploy.updateStack.restore();
|
||||
awsDeploy.monitorStack.restore();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
@ -42,4 +42,12 @@ describe('#setBucketName()', () => {
|
||||
awsDeploy.sdk.getServerlessDeploymentBucketName.restore();
|
||||
});
|
||||
});
|
||||
|
||||
it('should resolve if the bucketName is already set', () => {
|
||||
const bucketName = 'someBucket';
|
||||
awsDeploy.bucketName = bucketName;
|
||||
return awsDeploy.setBucketName()
|
||||
.then(() => expect(getServerlessDeploymentBucketNameStub.calledOnce).to.be.false)
|
||||
.then(() => expect(awsDeploy.bucketName).to.equal(bucketName));
|
||||
});
|
||||
});
|
||||
|
||||
@ -69,16 +69,14 @@ class AwsDeployFunction {
|
||||
ZipFile: data,
|
||||
};
|
||||
|
||||
this.sdk.request(
|
||||
return this.sdk.request(
|
||||
'Lambda',
|
||||
'updateFunctionCode',
|
||||
params,
|
||||
this.options.stage, this.options.region
|
||||
);
|
||||
|
||||
this.serverless.cli.log(`Successfully deployed function "${this.options.function}"`);
|
||||
|
||||
return BbPromise.resolve();
|
||||
).then(() => {
|
||||
this.serverless.cli.log(`Successfully deployed function "${this.options.function}"`);
|
||||
});
|
||||
}
|
||||
|
||||
cleanup() {
|
||||
|
||||
@ -69,7 +69,7 @@ class SDK {
|
||||
].join('');
|
||||
err.message = errorMessage;
|
||||
}
|
||||
reject(new this.serverless.classes.Error(err.message));
|
||||
reject(new this.serverless.classes.Error(err.message, err.statusCode));
|
||||
} else {
|
||||
resolve(data);
|
||||
}
|
||||
@ -99,9 +99,7 @@ class SDK {
|
||||
},
|
||||
stage,
|
||||
region
|
||||
).then((result) =>
|
||||
result.StackResourceDetail.PhysicalResourceId
|
||||
);
|
||||
).then((result) => result.StackResourceDetail.PhysicalResourceId);
|
||||
}
|
||||
|
||||
getStackName(stage) {
|
||||
|
||||
@ -51,12 +51,13 @@ class AwsInfo {
|
||||
this.options.stage,
|
||||
this.options.region)
|
||||
.then((result) => {
|
||||
let outputs;
|
||||
|
||||
if (result) {
|
||||
const outputs = result.Stacks[0].Outputs;
|
||||
outputs = result.Stacks[0].Outputs;
|
||||
|
||||
// Functions
|
||||
info.functions = [];
|
||||
info.apiKeys = [];
|
||||
outputs.filter(x => x.OutputKey.match(/LambdaFunctionArn$/))
|
||||
.forEach(x => {
|
||||
const functionInfo = {};
|
||||
@ -71,21 +72,23 @@ class AwsInfo {
|
||||
info.endpoint = x.OutputValue;
|
||||
});
|
||||
|
||||
// API Keys
|
||||
outputs.filter(x => x.OutputKey.match(/^ApiGatewayApiKey/))
|
||||
.forEach(x => {
|
||||
const apiKeyInfo = {};
|
||||
apiKeyInfo.name = x.Description;
|
||||
apiKeyInfo.value = x.OutputValue;
|
||||
info.apiKeys.push(apiKeyInfo);
|
||||
});
|
||||
|
||||
// Resources
|
||||
info.resources = [];
|
||||
|
||||
// API Keys
|
||||
info.apiKeys = [];
|
||||
}
|
||||
|
||||
return BbPromise.resolve(info);
|
||||
// create a gatheredData object which can be passed around ("[call] by reference")
|
||||
const gatheredData = {
|
||||
outputs,
|
||||
info,
|
||||
};
|
||||
|
||||
return BbPromise.resolve(gatheredData);
|
||||
})
|
||||
.then((gatheredData) => this.getApiKeyValues(gatheredData))
|
||||
.then((gatheredData) => BbPromise.resolve(gatheredData.info)) // resolve the info at the end
|
||||
.catch((e) => {
|
||||
let result;
|
||||
|
||||
@ -102,6 +105,38 @@ class AwsInfo {
|
||||
});
|
||||
}
|
||||
|
||||
getApiKeyValues(gatheredData) {
|
||||
const info = gatheredData.info;
|
||||
|
||||
// check if the user has set api keys
|
||||
const apiKeyNames = this.serverless.service.provider.apiKeys || [];
|
||||
|
||||
if (apiKeyNames.length) {
|
||||
return this.sdk.request('APIGateway',
|
||||
'getApiKeys',
|
||||
{ includeValues: true },
|
||||
this.options.stage,
|
||||
this.options.region
|
||||
).then((allApiKeys) => {
|
||||
const items = allApiKeys.items;
|
||||
if (items) {
|
||||
// filter out the API keys only created for this stack
|
||||
const filteredItems = items.filter((item) => _.includes(apiKeyNames, item.name));
|
||||
|
||||
// iterate over all apiKeys and push the API key info and update the info object
|
||||
filteredItems.forEach((item) => {
|
||||
const apiKeyInfo = {};
|
||||
apiKeyInfo.name = item.name;
|
||||
apiKeyInfo.value = item.value;
|
||||
info.apiKeys.push(apiKeyInfo);
|
||||
});
|
||||
}
|
||||
return BbPromise.resolve(gatheredData);
|
||||
});
|
||||
}
|
||||
return BbPromise.resolve(gatheredData);
|
||||
}
|
||||
|
||||
/**
|
||||
* Display service information
|
||||
*/
|
||||
|
||||
@@ -163,7 +163,7 @@ describe('AwsInfo', () => {

  it('should gather with correct params', () => awsInfo.gather()
    .then(() => {
      expect(describeStackStub.calledOnce).to.equal(true);
      expect(describeStackStub.called).to.equal(true);
      expect(describeStackStub.args[0][0]).to.equal('CloudFormation');
      expect(describeStackStub.args[0][1]).to.equal('describeStacks');
      expect(describeStackStub.args[0][2].StackName)
@@ -218,23 +218,6 @@ describe('AwsInfo', () => {
      });
    });

  it('should get api keys', () => {
    const expectedApiKeys = [
      {
        name: 'first',
        value: 'xxx',
      },
      {
        name: 'second',
        value: 'yyy',
      },
    ];

    return awsInfo.gather().then((info) => {
      expect(info.apiKeys).to.deep.equal(expectedApiKeys);
    });
  });

  it("should provide only general info when stack doesn't exist (ValidationError)", () => {
    awsInfo.sdk.request.restore();

@@ -267,6 +250,85 @@ describe('AwsInfo', () => {
      });
    });

  describe('#getApiKeyValues()', () => {
    it('should return the api keys in the info object', () => {
      // TODO: implement a pattern for stub restoring to get rid of this
      awsInfo.sdk.request.restore();

      // set the API Keys for the service
      awsInfo.serverless.service.provider.apiKeys = ['foo', 'bar'];

      const gatheredData = {
        outputs: [],
        info: {
          apiKeys: [],
        },
      };

      const apiKeyItems = {
        items: [
          {
            id: '4711',
            name: 'SomeRandomIdInUsersAccount',
            value: 'ShouldNotBeConsidered',
          },
          {
            id: '1234',
            name: 'foo',
            value: 'valueForKeyFoo',
          },
          {
            id: '5678',
            name: 'bar',
            value: 'valueForKeyBar',
          },
        ],
      };

      const gatheredDataAfterKeyLookup = {
        info: {
          apiKeys: [
            { name: 'foo', value: 'valueForKeyFoo' },
            { name: 'bar', value: 'valueForKeyBar' },
          ],
        },
      };

      const getApiKeysStub = sinon
        .stub(awsInfo.sdk, 'request')
        .returns(BbPromise.resolve(apiKeyItems));

      return awsInfo.getApiKeyValues(gatheredData).then((result) => {
        expect(getApiKeysStub.calledOnce).to.equal(true);
        expect(result.info.apiKeys).to.deep.equal(gatheredDataAfterKeyLookup.info.apiKeys);

        awsInfo.sdk.request.restore();
      });
    });

    it('should resolve with the passed-in data if no API key retrieval is necessary', () => {
      awsInfo.serverless.service.provider.apiKeys = null;

      const gatheredData = {
        outputs: [],
        info: {
          apiKeys: [],
        },
      };

      const getApiKeysStub = sinon
        .stub(awsInfo.sdk, 'request')
        .returns(BbPromise.resolve());

      return awsInfo.getApiKeyValues(gatheredData).then((result) => {
        expect(getApiKeysStub.calledOnce).to.equal(false);
        expect(result).to.deep.equal(gatheredData);

        awsInfo.sdk.request.restore();
      });
    });
  });

  describe('#display()', () => {
    it('should format information message correctly', () => {
      serverless.cli = new CLI(serverless);

@@ -18,4 +18,45 @@ module.exports = {

    return BbPromise.resolve();
  },

  /**
   * Retrieved 9/27/2016 from http://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html
   * Bucket names must be at least 3 and no more than 63 characters long.
   * Bucket names must be a series of one or more labels.
   * Adjacent labels are separated by a single period (.).
   * Bucket names can contain lowercase letters, numbers, and hyphens.
   * Each label must start and end with a lowercase letter or a number.
   * Bucket names must not be formatted as an IP address (e.g., 192.168.5.4).
   * @param bucketName
   */
  validateS3BucketName(bucketName) {
    return BbPromise.resolve()
      .then(() => {
        let error;
        if (!bucketName) {
          error = 'Bucket name cannot be undefined or empty';
        } else if (bucketName.length < 3) {
          error = `Bucket name is shorter than 3 characters. ${bucketName}`;
        } else if (bucketName.length > 63) {
          error = `Bucket name is longer than 63 characters. ${bucketName}`;
        } else if (/^[^a-z0-9]/.test(bucketName)) {
          error = `Bucket name must start with a letter or number. ${bucketName}`;
        } else if (/[^a-z0-9]$/.test(bucketName)) {
          error = `Bucket name must end with a letter or number. ${bucketName}`;
        } else if (/[A-Z]/.test(bucketName)) {
          error = `Bucket name cannot contain uppercase letters. ${bucketName}`;
        } else if (!/^[a-z0-9][a-z.0-9-]+[a-z0-9]$/.test(bucketName)) {
          error = `Bucket name contains invalid characters, [a-z.0-9-] ${bucketName}`;
        } else if (/\.{2,}/.test(bucketName)) {
          error = `Bucket name cannot contain consecutive periods (.) ${bucketName}`;
        } else if (/^(?:[0-9]{1,3}\.){3}[0-9]{1,3}$/.test(bucketName)) {
          error = `Bucket name cannot look like an IPv4 address. ${bucketName}`;
        }

        if (error) {
          throw new this.serverless.classes.Error(error);
        }
        return true;
      });
  },
};

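A minimal usage sketch of the validator added above (bucket names are hypothetical; the error text comes from the messages defined in the function, and `awsPlugin` is assumed to be an object that mixes in this module, as in the tests later in this diff):

```js
// Valid name: the promise resolves to true.
awsPlugin.validateS3BucketName('my.valid-bucket-1')
  .then((result) => console.log(result)); // true

// Invalid name: the promise rejects with a descriptive error.
awsPlugin.validateS3BucketName('Invalid.Name')
  .catch((err) => console.log(err.message)); // "Bucket name cannot contain uppercase letters. Invalid.Name"
```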
@@ -14,8 +14,10 @@ module.exports = {
    this.objectsInBucket = [];

    this.serverless.cli.log('Getting all objects in S3 bucket...');
    const serviceStage = `${this.serverless.service.service}/${this.options.stage}`;
    return this.sdk.request('S3', 'listObjectsV2', {
      Bucket: this.bucketName,
      Prefix: `serverless/${serviceStage}`,
    }, this.options.stage, this.options.region).then((result) => {
      if (result) {
        result.Contents.forEach((object) => {

@@ -7,38 +7,43 @@ const Serverless = require('../../../Serverless');
const AwsSdk = require('../');

describe('AWS SDK', () => {
  let awsSdk;
  let serverless;

  beforeEach(() => {
    serverless = new Serverless();
    const options = {
      stage: 'dev',
      region: 'us-east-1',
    };
    awsSdk = new AwsSdk(serverless, options);
    awsSdk.serverless.cli = new serverless.classes.CLI();
  });

  describe('#constructor()', () => {
    it('should set AWS instance', () => {
      const serverless = new Serverless();
      const awsSdk = new AwsSdk(serverless);

      expect(typeof awsSdk.sdk).to.not.equal('undefined');
    });

    it('should set Serverless instance', () => {
      const serverless = new Serverless();
      const awsSdk = new AwsSdk(serverless);

      expect(typeof awsSdk.serverless).to.not.equal('undefined');
    });

    it('should set AWS proxy', () => {
      const serverless = new Serverless();
      process.env.proxy = 'http://a.b.c.d:n';
      const awsSdk = new AwsSdk(serverless);
      const newAwsSdk = new AwsSdk(serverless);

      expect(typeof awsSdk.sdk.config.httpOptions.agent).to.not.equal('undefined');
      expect(typeof newAwsSdk.sdk.config.httpOptions.agent).to.not.equal('undefined');

      // clear env
      delete process.env.proxy;
    });

    it('should set AWS timeout', () => {
      const serverless = new Serverless();
      process.env.AWS_CLIENT_TIMEOUT = '120000';
      const awsSdk = new AwsSdk(serverless);
      const newAwsSdk = new AwsSdk(serverless);

      expect(typeof awsSdk.sdk.config.httpOptions.timeout).to.not.equal('undefined');
      expect(typeof newAwsSdk.sdk.config.httpOptions.timeout).to.not.equal('undefined');

      // clear env
      delete process.env.AWS_CLIENT_TIMEOUT;
@@ -59,12 +64,10 @@ describe('AWS SDK', () => {
        };
      }
    }
    const serverless = new Serverless();
    const awsSdk = new AwsSdk(serverless);
    awsSdk.sdk = {
      S3: FakeS3,
    };
    serverless.service.environment = {
    awsSdk.serverless.service.environment = {
      vars: {},
      stages: {
        dev: {
@@ -75,42 +78,127 @@ describe('AWS SDK', () => {
        },
      },
    };
    serverless.service.environment.stages.dev.regions['us-east-1'] = {
      vars: {},
    };

    return awsSdk.request('S3', 'putObject', {}, 'dev', 'us-east-1').then(data => {
      expect(data.called).to.equal(true);
    });
  });

  it('should retry if error code is 429', function (done) {
    this.timeout(10000);
    let first = true;
    const error = {
      statusCode: 429,
      message: 'Testing retry',
    };
    class FakeS3 {
      constructor(credentials) {
        this.credentials = credentials;
      }

      error() {
        return {
          send(cb) {
            if (first) {
              cb(error);
            } else {
              cb(undefined, {});
            }
            first = false;
          },
        };
      }
    }
    awsSdk.sdk = {
      S3: FakeS3,
    };
    awsSdk.request('S3', 'error', {}, 'dev', 'us-east-1')
      .then(data => {
        // eslint-disable-next-line no-unused-expressions
        expect(data).to.exist;
        // eslint-disable-next-line no-unused-expressions
        expect(first).to.be.false;
        done();
      })
      .catch(done);
  });

  it('should reject errors', (done) => {
    const error = {
      statusCode: 500,
      message: 'Some error message',
    };
    class FakeS3 {
      constructor(credentials) {
        this.credentials = credentials;
      }

      error() {
        return {
          send(cb) {
            cb(error);
          },
        };
      }
    }
    awsSdk.sdk = {
      S3: FakeS3,
    };
    awsSdk.request('S3', 'error', {}, 'dev', 'us-east-1')
      .then(() => done('Should not succeed'))
      .catch(() => done());
  });

  it('should return ref to docs for missing credentials', (done) => {
    const error = {
      statusCode: 403,
      message: 'Missing credentials in config',
    };
    class FakeS3 {
      constructor(credentials) {
        this.credentials = credentials;
      }

      error() {
        return {
          send(cb) {
            cb(error);
          },
        };
      }
    }
    awsSdk.sdk = {
      S3: FakeS3,
    };
    awsSdk.request('S3', 'error', {}, 'dev', 'us-east-1')
      .then(() => done('Should not succeed'))
      .catch((err) => {
        expect(err.message).to.contain('https://git.io/viZAC');
        done();
      })
      .catch(done);
  });
});

describe('#getCredentials()', () => {
  it('should set region for credentials', () => {
    const serverless = new Serverless();
    const awsSdk = new AwsSdk(serverless);
    const credentials = awsSdk.getCredentials('testregion');
    expect(credentials.region).to.equal('testregion');
  });

  it('should get credentials from provider', () => {
    const serverless = new Serverless();
    const awsSdk = new AwsSdk(serverless);
    serverless.service.provider.profile = 'notDefault';
    const credentials = awsSdk.getCredentials();
    expect(credentials.credentials.profile).to.equal('notDefault');
  });

  it('should not set credentials if empty profile is set', () => {
    const serverless = new Serverless();
    const awsSdk = new AwsSdk(serverless);
    serverless.service.provider.profile = '';
    const credentials = awsSdk.getCredentials('testregion');
    expect(credentials).to.eql({ region: 'testregion' });
  });

  it('should not set credentials if profile is not set', () => {
    const serverless = new Serverless();
    const awsSdk = new AwsSdk(serverless);
    serverless.service.provider.profile = undefined;
    const credentials = awsSdk.getCredentials('testregion');
    expect(credentials).to.eql({ region: 'testregion' });
@@ -119,8 +207,6 @@ describe('AWS SDK', () => {

  describe('#getServerlessDeploymentBucketName', () => {
    it('should return the name of the serverless deployment bucket', () => {
      const serverless = new Serverless();
      const awsSdk = new AwsSdk(serverless);
      const options = {
        stage: 'dev',
        region: 'us-east-1',
@@ -152,9 +238,7 @@ describe('AWS SDK', () => {

  describe('#getStackName', () => {
    it('should return the stack name', () => {
      const serverless = new Serverless();
      serverless.service.service = 'myservice';
      const awsSdk = new AwsSdk(serverless);

      expect(awsSdk.getStackName('dev')).to.equal('myservice-dev');
    });

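The 429 test above exercises the SDK wrapper's throttling retry. In isolation, that behaviour could be sketched roughly like this (the single retry and the fixed delay are assumptions for illustration, not the framework's actual implementation; `sendRequest` is a hypothetical function returning a promise):

```js
const BbPromise = require('bluebird');

// Rough sketch of retry-on-throttle logic similar to what the 429 test exercises.
function requestWithRetry(sendRequest) {
  return sendRequest().catch((err) => {
    if (err.statusCode === 429) {
      // back off briefly, then try once more
      return BbPromise.delay(4000).then(() => sendRequest());
    }
    throw err;
  });
}
```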
@@ -4,7 +4,7 @@ const expect = require('chai').expect;
const validate = require('../lib/validate');
const Serverless = require('../../../Serverless');

describe('#validate()', () => {
describe('#validate', () => {
  const serverless = new Serverless();
  const awsPlugin = {};

@@ -20,50 +20,143 @@ describe('#validate()', () => {
    Object.assign(awsPlugin, validate);
  });

  it('should succeed if inside service (servicePath defined)', () => {
    expect(() => awsPlugin.validate()).to.not.throw(Error);
  });
  describe('#validate()', () => {
    it('should succeed if inside service (servicePath defined)', () => {
      expect(() => awsPlugin.validate()).to.not.throw(Error);
    });

  it('should throw error if not inside service (servicePath not defined)', () => {
    awsPlugin.serverless.config.servicePath = false;
    expect(() => awsPlugin.validate()).to.throw(Error);
  });
    it('should throw error if not inside service (servicePath not defined)', () => {
      awsPlugin.serverless.config.servicePath = false;
      expect(() => awsPlugin.validate()).to.throw(Error);
    });

  // NOTE: starting here, test order is important
    // NOTE: starting here, test order is important

  it('should default to "dev" if stage is not provided', () => {
    awsPlugin.options.stage = false;
    return awsPlugin.validate().then(() => {
      expect(awsPlugin.options.stage).to.equal('dev');
    it('should default to "dev" if stage is not provided', () => {
      awsPlugin.options.stage = false;
      return awsPlugin.validate().then(() => {
        expect(awsPlugin.options.stage).to.equal('dev');
      });
    });

    it('should use the service.defaults stage if present', () => {
      awsPlugin.options.stage = false;
      awsPlugin.serverless.service.defaults = {
        stage: 'some-stage',
      };

      return awsPlugin.validate().then(() => {
        expect(awsPlugin.options.stage).to.equal('some-stage');
      });
    });

    it('should default to "us-east-1" region if region is not provided', () => {
      awsPlugin.options.region = false;
      return awsPlugin.validate().then(() => {
        expect(awsPlugin.options.region).to.equal('us-east-1');
      });
    });

    it('should use the service.defaults region if present', () => {
      awsPlugin.options.region = false;
      awsPlugin.serverless.service.defaults = {
        region: 'some-region',
      };

      return awsPlugin.validate().then(() => {
        expect(awsPlugin.options.region).to.equal('some-region');
      });
    });
  });

  it('should use the service.defaults stage if present', () => {
    awsPlugin.options.stage = false;
    awsPlugin.serverless.service.defaults = {
      stage: 'some-stage',
    };
  describe('#validateS3BucketName()', () => {
    it('should reject an ip address as a name', () =>
      awsPlugin.validateS3BucketName('127.0.0.1')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('cannot look like an IPv4 address'))
    );

    return awsPlugin.validate().then(() => {
      expect(awsPlugin.options.stage).to.equal('some-stage');
    it('should reject names that are too long', () => {
      const bucketName = Array.from({ length: 64 }, () => 'j').join('');
      return awsPlugin.validateS3BucketName(bucketName)
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('longer than 63 characters'));
    });
  });

  it('should default to "us-east-1" region if region is not provided', () => {
    awsPlugin.options.region = false;
    return awsPlugin.validate().then(() => {
      expect(awsPlugin.options.region).to.equal('us-east-1');
    });
  });
    it('should reject names that are too short', () =>
      awsPlugin.validateS3BucketName('12')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('shorter than 3 characters'))
    );

  it('should use the service.defaults region if present', () => {
    awsPlugin.options.region = false;
    awsPlugin.serverless.service.defaults = {
      region: 'some-region',
    };
    it('should reject names that contain invalid characters', () =>
      awsPlugin.validateS3BucketName('this has b@d characters')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('contains invalid characters'))
    );

    return awsPlugin.validate().then(() => {
      expect(awsPlugin.options.region).to.equal('some-region');
    });
    it('should reject names that have consecutive periods', () =>
      awsPlugin.validateS3BucketName('otherwise..valid.name')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('cannot contain consecutive periods'))
    );

    it('should reject names that start with a dash', () =>
      awsPlugin.validateS3BucketName('-invalid.name')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('start with a letter or number'))
    );

    it('should reject names that start with a period', () =>
      awsPlugin.validateS3BucketName('.invalid.name')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('start with a letter or number'))
    );

    it('should reject names that end with a dash', () =>
      awsPlugin.validateS3BucketName('invalid.name-')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('end with a letter or number'))
    );

    it('should reject names that end with a period', () =>
      awsPlugin.validateS3BucketName('invalid.name.')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('end with a letter or number'))
    );

    it('should reject names that contain uppercase letters', () =>
      awsPlugin.validateS3BucketName('otherwise.Valid.name')
        .then(() => {
          throw new Error('Should not get here');
        })
        .catch(err => expect(err.message).to.contain('cannot contain uppercase letters'))
    );

    it('should accept valid names', () =>
      awsPlugin.validateS3BucketName('1.this.is.valid.2')
        .then(() => awsPlugin.validateS3BucketName('another.valid.name'))
        .then(() => awsPlugin.validateS3BucketName('1-2-3'))
        .then(() => awsPlugin.validateS3BucketName('123'))
        .then(() => awsPlugin.validateS3BucketName('should.be.allowed-to-mix'))
    );
  });
});

@@ -10,6 +10,7 @@ const validTemplates = [
  'aws-python',
  'aws-java-maven',
  'aws-java-gradle',
  'aws-scala-sbt',
];

const humanReadableTemplateList = `${validTemplates.slice(0, -1)

@@ -38,8 +38,6 @@ provider:

# you can add packaging information here
package:
# include:
#   - include-me.java
# exclude:
#   - exclude-me.java
  artifact: build/distributions/hello.zip

@@ -38,8 +38,6 @@ provider:

# you can add packaging information here
package:
# include:
#   - include-me.java
# exclude:
#   - exclude-me.java
  artifact: target/hello-dev.jar

@@ -38,8 +38,6 @@ provider:

# you can add packaging information here
#package:
# include:
#   - include-me.js
# exclude:
#   - exclude-me.js
# artifact: my-service-code.zip

@@ -38,8 +38,6 @@ provider:

# you can add packaging information here
#package:
# include:
#   - include-me.js
# exclude:
#   - exclude-me.js
# artifact: my-service-code.zip

lib/plugins/create/templates/aws-scala-sbt/build.sbt (new file, 21 lines)
@@ -0,0 +1,21 @@
import sbt.Keys._
import sbt._
import sbtrelease.Version

name := "hello"

resolvers += Resolver.sonatypeRepo("public")
scalaVersion := "2.11.8"
releaseNextVersion := { ver => Version(ver).map(_.bumpMinor.string).getOrElse("Error") }
assemblyJarName in assembly := "hello.jar"

libraryDependencies ++= Seq(
  "com.amazonaws" % "aws-lambda-java-events" % "1.1.0",
  "com.amazonaws" % "aws-lambda-java-core" % "1.1.0"
)

scalacOptions ++= Seq(
  "-unchecked",
  "-deprecation",
  "-feature",
  "-Xfatal-warnings")
lib/plugins/create/templates/aws-scala-sbt/event.json (new file, 5 lines)
@@ -0,0 +1,5 @@
{
  "key3": "value3",
  "key2": "value2",
  "key1": "value1"
}
@@ -0,0 +1,3 @@
resolvers += Resolver.sonatypeRepo("public")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")
@@ -0,0 +1 @@
addSbtPlugin("com.github.gseitz" % "sbt-release" % "1.0.0")
lib/plugins/create/templates/aws-scala-sbt/serverless.yml (new file, 70 lines)
@@ -0,0 +1,70 @@
# Welcome to Serverless!
#
# This file is the main config file for your service.
# It's very minimal at this point and uses default values.
# You can always add more config options for more control.
# We've included some commented out config examples here.
# Just uncomment any of them to get that config option.
#
# For full config options, check the docs:
#    docs.serverless.com
#
# Happy Coding!

service: aws-scala-sbt # NOTE: update this with your service name

provider:
  name: aws
  runtime: java8

# you can overwrite defaults here
#  stage: dev
#  region: us-east-1

# you can add statements to the Lambda function's IAM Role here
#  iamRoleStatements:
#    - Effect: "Allow"
#      Action:
#        - "s3:ListBucket"
#      Resource: { "Fn::Join" : ["", ["arn:aws:s3:::", { "Ref" : "ServerlessDeploymentBucket" } ] ] }
#    - Effect: "Allow"
#      Action:
#        - "s3:PutObject"
#      Resource:
#        Fn::Join:
#          - ""
#          - - "arn:aws:s3:::"
#            - "Ref" : "ServerlessDeploymentBucket"

# you can add packaging information here
package:
# include:
#   - include-me.java
# exclude:
#   - exclude-me.java
  artifact: target/scala-2.11/hello.jar

functions:
  hello:
    handler: hello.Handler

# you can add any of the following events
#    events:
#      - http:
#          path: users/create
#          method: get
#      - s3: ${env:bucket}
#      - schedule: rate(10 minutes)
#      - sns: greeter-topic

# you can add CloudFormation resource templates here
#resources:
#  Resources:
#    NewResource:
#      Type: AWS::S3::Bucket
#      Properties:
#        BucketName: my-new-bucket
#  Outputs:
#    NewOutput:
#      Description: "Description for the output"
#      Value: "Some output value"

@@ -0,0 +1,11 @@
package hello

import com.amazonaws.services.lambda.runtime.{Context, RequestHandler}

class Handler extends RequestHandler[Request, Response] {

  def handleRequest(input: Request, context: Context): Response = {
    return new Response("Go Serverless v1.0! Your function executed successfully!", input)
  }

}
@@ -0,0 +1,7 @@
package hello

import scala.beans.BeanProperty

class Request(@BeanProperty var key1: String, @BeanProperty var key2: String, @BeanProperty var key3: String) {
  def this() = this("", "", "")
}
@@ -0,0 +1,5 @@
package hello

import scala.beans.BeanProperty

case class Response(@BeanProperty message: String, @BeanProperty request: Request)
@@ -174,6 +174,36 @@ describe('Create', () => {
    });
  });

  it('should generate scaffolding for "aws-scala-sbt" template', () => {
    const cwd = process.cwd();
    fse.mkdirsSync(tmpDir);
    process.chdir(tmpDir);
    create.options.template = 'aws-scala-sbt';

    return create.create().then(() => {
      expect(create.serverless.utils.fileExistsSync(path.join(tmpDir, 'serverless.yml')))
        .to.be.equal(true);
      expect(create.serverless.utils.fileExistsSync(path.join(tmpDir, 'event.json')))
        .to.be.equal(true);
      expect(create.serverless.utils.fileExistsSync(path.join(tmpDir, 'build.sbt')))
        .to.be.equal(true);
      expect(create.serverless.utils.fileExistsSync(path.join(tmpDir, 'src', 'main', 'scala',
        'hello', 'Handler.scala'
      )))
        .to.be.equal(true);
      expect(create.serverless.utils.fileExistsSync(path.join(tmpDir, 'src', 'main', 'scala',
        'hello', 'Request.scala'
      )))
        .to.be.equal(true);
      expect(create.serverless.utils.fileExistsSync(path.join(tmpDir, 'src', 'main', 'scala',
        'hello', 'Response.scala'
      )))
        .to.be.equal(true);

      process.chdir(cwd);
    });
  });

  // this test should live here because of process.cwd() which might cause trouble when using
  // nested dirs like it's done here
  it('should create a renamed service in the directory if using the "path" option', () => {

@@ -9,8 +9,17 @@ This plugin creates a deployment package on a per service basis (it will zip up
It will zip the whole service directory. The artifact will be stored in the `.serverless` directory which will be created
upon zipping the service. The resulting path to the artifact will be appended to the `service.package.artifact` property.

The services `include` and `exclude` arrays are considered during zipping. At first the `exclude` will be applied. After
that the `include` will be applied to ensure that previously excluded files and folders can be included again.
Services can use `exclude` as an array. The array should be a series of
globs to be considered for exclusion.

For example in serverless.yml:

``` yaml
package:
  exclude:
    - "test/**"
    - "**/spec.js"
```

Serverless will automatically exclude `.git`, `.gitignore`, `serverless.yml`, and `.DS_Store`.

@@ -21,11 +21,6 @@ module.exports = {
    return _.union(exclude, packageExcludes, this.defaultExcludes);
  },

  getIncludedPaths(include) {
    const packageIncludes = this.serverless.service.package.include || [];
    return _.union(include, packageIncludes);
  },

  getServiceArtifactName() {
    return `${this.serverless.service.service}.zip`;
  },
@@ -57,10 +52,9 @@ module.exports = {
    const servicePath = this.serverless.config.servicePath;

    const exclude = this.getExcludedPaths();
    const include = this.getIncludedPaths();
    const zipFileName = this.getServiceArtifactName();

    return this.zipDirectory(servicePath, exclude, include, zipFileName).then(filePath => {
    return this.zipDirectory(servicePath, exclude, zipFileName).then(filePath => {
      this.serverless.service.package.artifact = filePath;
      return filePath;
    });
@@ -80,10 +74,9 @@ module.exports = {
    const servicePath = this.serverless.config.servicePath;

    const exclude = this.getExcludedPaths(funcPackageConfig.exclude);
    const include = this.getIncludedPaths(funcPackageConfig.include);
    const zipFileName = this.getFunctionArtifactName(functionObject);

    return this.zipDirectory(servicePath, exclude, include, zipFileName).then((artifactPath) => {
    return this.zipDirectory(servicePath, exclude, zipFileName).then((artifactPath) => {
      functionObject.artifact = artifactPath;
      return artifactPath;
    });

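For reference, the exclude list handed to `zipDirectory` above is built by unioning the caller's globs, the service's `package.exclude`, and the plugin's defaults. A small sketch of that combination (the concrete default globs shown here are assumptions loosely based on the plugin README above, not the plugin's exact values):

```js
const _ = require('lodash');

// Sketch of how exclude globs are combined before zipping. All values are hypothetical.
const defaultExcludes = ['.git/**', '.gitignore', '.DS_Store', 'serverless.yml'];
const packageExcludes = ['test/**', '**/spec.js']; // from a hypothetical serverless.yml `package.exclude`
const callerExcludes = ['tmp/**'];                 // hypothetical per-function excludes

const exclude = _.union(callerExcludes, packageExcludes, defaultExcludes);
console.log(exclude);
// ['tmp/**', 'test/**', '**/spec.js', '.git/**', '.gitignore', '.DS_Store', 'serverless.yml']
```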
@@ -4,13 +4,19 @@ const archiver = require('archiver');
const BbPromise = require('bluebird');
const path = require('path');
const fs = require('fs');
const glob = require('glob');

module.exports = {
  zipDirectory(servicePath, exclude, include, zipFileName) {
  zipDirectory(servicePath, exclude, zipFileName) {
    exclude.push('.serverless/**');

    const zip = archiver.create('zip');

    const artifactFilePath = path.join(servicePath,
      '.serverless', zipFileName);
    const artifactFilePath = path.join(
      servicePath,
      '.serverless',
      zipFileName
    );

    this.serverless.utils.writeFileDir(artifactFilePath);

@@ -19,22 +25,26 @@ module.exports = {
    output.on('open', () => {
      zip.pipe(output);

      this.serverless.utils.walkDirSync(servicePath).forEach((filePath) => {
        const relativeFilePath = path.relative(servicePath, filePath);
      const files = glob.sync('**', {
        cwd: servicePath,
        ignore: exclude,
        dot: true,
        silent: true,
      });

        // ensure we don't include the new zip file in our zip
        if (relativeFilePath.startsWith('.serverless')) return;
      files.forEach((filePath) => {
        const fullPath = path.resolve(
          servicePath,
          filePath
        );

        const shouldBeExcluded =
          exclude.some(value => relativeFilePath.toLowerCase().indexOf(value.toLowerCase()) > -1);
        const stats = fs.statSync(fullPath);

        const shouldBeIncluded =
          include.some(value => relativeFilePath.toLowerCase().indexOf(value.toLowerCase()) > -1);

        if (!shouldBeExcluded || shouldBeIncluded) {
          const permissions = fs.statSync(filePath).mode;

          zip.append(fs.readFileSync(filePath), { name: relativeFilePath, mode: permissions });
        if (!stats.isDirectory(fullPath)) {
          zip.append(fs.readFileSync(fullPath), {
            name: filePath,
            mode: stats.mode,
          });
        }
      });

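In isolation, the glob-based file selection introduced above works roughly like the following sketch (the service path and patterns are hypothetical; the zip step is only indicated by a comment):

```js
const fs = require('fs');
const path = require('path');
const glob = require('glob');

// Sketch of the file-selection step: list everything under the service directory
// except the excluded globs, then skip directories.
const servicePath = '/path/to/service';                     // hypothetical
const exclude = ['.serverless/**', 'test/**', '**/spec.js']; // hypothetical globs

const files = glob.sync('**', {
  cwd: servicePath,
  ignore: exclude,
  dot: true,
  silent: true,
});

files.forEach((filePath) => {
  const fullPath = path.resolve(servicePath, filePath);
  const stats = fs.statSync(fullPath);
  if (!stats.isDirectory()) {
    // a real implementation appends the file (with its mode) to the zip archive here
    console.log(filePath, stats.mode);
  }
});
```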
@@ -62,24 +62,6 @@ describe('#packageService()', () => {
    });
  });

  describe('#getIncludedPaths()', () => {
    it('should include defaults', () => {
      const include = packageService.getIncludedPaths();
      expect(include).to.deep.equal([]);
    });

    it('should return package includes', () => {
      const packageIncludes = [
        'dir', 'file.js',
      ];

      serverless.service.package.include = packageIncludes;

      const exclude = packageService.getIncludedPaths();
      expect(exclude).to.deep.equal(packageIncludes);
    });
  });

  describe('#getServiceArtifactName()', () => {
    it('should create name with time', () => {
      const name = packageService.getServiceArtifactName();
@@ -149,7 +131,6 @@ describe('#packageService()', () => {
    it('should call zipService with settings', () => {
      const servicePath = 'test';
      const exclude = ['test-exclude'];
      const include = ['test-include'];
      const artifactName = 'test-artifact.zip';
      const artifactFilePath = '/some/fake/path/test-artifact.zip';

@@ -157,8 +138,6 @@ describe('#packageService()', () => {

      const getExcludedPathsStub = sinon
        .stub(packageService, 'getExcludedPaths').returns(exclude);
      const getIncludedPathsStub = sinon
        .stub(packageService, 'getIncludedPaths').returns(include);
      const getServiceArtifactNameStub = sinon
        .stub(packageService, 'getServiceArtifactName').returns(artifactName);

@@ -167,14 +146,12 @@ describe('#packageService()', () => {

      return packageService.packageAll().then(() => {
        expect(getExcludedPathsStub.calledOnce).to.be.equal(true);
        expect(getIncludedPathsStub.calledOnce).to.be.equal(true);
        expect(getServiceArtifactNameStub.calledOnce).to.be.equal(true);

        expect(zipDirectoryStub.calledOnce).to.be.equal(true);
        expect(zipDirectoryStub.args[0][0]).to.be.equal(servicePath);
        expect(zipDirectoryStub.args[0][1]).to.be.equal(exclude);
        expect(zipDirectoryStub.args[0][2]).to.be.equal(include);
        expect(zipDirectoryStub.args[0][3]).to.be.equal(artifactName);
        expect(zipDirectoryStub.args[0][2]).to.be.equal(artifactName);

        expect(serverless.service.package.artifact).to.be.equal(artifactFilePath);
      });
@@ -187,7 +164,6 @@ describe('#packageService()', () => {
      const funcName = 'test-func';

      const exclude = ['test-exclude'];
      const include = ['test-include'];
      const artifactName = 'test-artifact.zip';
      const artifactFilePath = '/some/fake/path/test-artifact.zip';

@@ -197,8 +173,6 @@ describe('#packageService()', () => {

      const getExcludedPathsStub = sinon
        .stub(packageService, 'getExcludedPaths').returns(exclude);
      const getIncludedPathsStub = sinon
        .stub(packageService, 'getIncludedPaths').returns(include);
      const getFunctionArtifactNameStub = sinon
        .stub(packageService, 'getFunctionArtifactName').returns(artifactName);

@@ -207,14 +181,12 @@ describe('#packageService()', () => {

      return packageService.packageFunction(funcName).then((filePath) => {
        expect(getExcludedPathsStub.calledOnce).to.be.equal(true);
        expect(getIncludedPathsStub.calledOnce).to.be.equal(true);
        expect(getFunctionArtifactNameStub.calledOnce).to.be.equal(true);

        expect(zipDirectoryStub.calledOnce).to.be.equal(true);
        expect(zipDirectoryStub.args[0][0]).to.be.equal(servicePath);
        expect(zipDirectoryStub.args[0][1]).to.be.equal(exclude);
        expect(zipDirectoryStub.args[0][2]).to.be.equal(include);
        expect(zipDirectoryStub.args[0][3]).to.be.equal(artifactName);
        expect(zipDirectoryStub.args[0][2]).to.be.equal(artifactName);

        expect(filePath).to.be.equal(artifactFilePath);
      });

@@ -33,6 +33,14 @@ describe('#zipService()', () => {
        permissions: 444,
      },
    },
    'node_modules/include-me': {
      include: 'some-file-content',
      'include-aswell': 'some-file content',
    },
    'node_modules/exclude-me': {
      exclude: 'some-file-content',
      'exclude-aswell': 'some-file content',
    },
    'exclude-me': {
      'some-file': 'some-file content',
    },
@@ -86,11 +94,10 @@ describe('#zipService()', () => {

  it('should zip a whole service', () => {
    const exclude = [];
    const include = [];
    const zipFileName = getTestArtifactFileName('whole-service');

    return packageService
      .zipDirectory(servicePath, exclude, include, zipFileName).then((artifact) => {
      .zipDirectory(servicePath, exclude, zipFileName).then((artifact) => {
        const data = fs.readFileSync(artifact);

        return zip.loadAsync(data);
@@ -98,7 +105,7 @@ describe('#zipService()', () => {
        const unzippedFileData = unzippedData.files;

        expect(Object.keys(unzippedFileData)
          .filter(file => !unzippedFileData[file].dir).length).to.equal(9);
          .filter(file => !unzippedFileData[file].dir).length).to.equal(13);

        expect(unzippedFileData['handler.js'].name)
          .to.equal('handler.js');
@@ -126,15 +133,26 @@ describe('#zipService()', () => {

        expect(unzippedFileData['a-serverless-plugin.js'].name)
          .to.equal('a-serverless-plugin.js');

        expect(unzippedFileData['node_modules/include-me/include'].name)
          .to.equal('node_modules/include-me/include');

        expect(unzippedFileData['node_modules/include-me/include-aswell'].name)
          .to.equal('node_modules/include-me/include-aswell');

        expect(unzippedFileData['node_modules/exclude-me/exclude'].name)
          .to.equal('node_modules/exclude-me/exclude');

        expect(unzippedFileData['node_modules/exclude-me/exclude-aswell'].name)
          .to.equal('node_modules/exclude-me/exclude-aswell');
      });
  });

  it('should keep file permissions', () => {
    const exclude = [];
    const include = [];
    const zipFileName = getTestArtifactFileName('file-permissions');

    return packageService.zipDirectory(servicePath, exclude, include, zipFileName)
    return packageService.zipDirectory(servicePath, exclude, zipFileName)
      .then((artifact) => {
        const data = fs.readFileSync(artifact);
        return zip.loadAsync(data);
@@ -157,45 +175,15 @@ describe('#zipService()', () => {
      });
  });

  it('should exclude defined files and folders', () => {
    const exclude = ['exclude-me.js', 'exclude-me'];
    const include = [];
    const zipFileName = getTestArtifactFileName('exclude');
  it('should exclude globs', () => {
    const exclude = [
      'exclude-me*/**',
      'node_modules/exclude-me/**',
    ];

    return packageService.zipDirectory(servicePath, exclude, include, zipFileName)
      .then((artifact) => {
        const data = fs.readFileSync(artifact);

        return zip.loadAsync(data);
      }).then(unzippedData => {
        const unzippedFileData = unzippedData.files;

        expect(Object.keys(unzippedFileData)
          .filter(file => !unzippedFileData[file].dir).length).to.equal(7);

        expect(unzippedFileData['handler.js'].name)
          .to.equal('handler.js');

        expect(unzippedFileData['lib/function.js'].name)
          .to.equal('lib/function.js');

        expect(unzippedFileData['include-me.js'].name)
          .to.equal('include-me.js');

        expect(unzippedFileData['include-me/some-file'].name)
          .to.equal('include-me/some-file');

        expect(unzippedFileData['a-serverless-plugin.js'].name)
          .to.equal('a-serverless-plugin.js');
      });
  });

  it('should include a previously excluded file', () => {
    const exclude = ['exclude-me.js', 'exclude-me'];
    const include = ['exclude-me.js', 'exclude-me'];
    const zipFileName = getTestArtifactFileName('re-include');

    return packageService.zipDirectory(servicePath, exclude, include, zipFileName)
    return packageService.zipDirectory(servicePath, exclude, zipFileName)
      .then((artifact) => {
        const data = fs.readFileSync(artifact);

@@ -218,14 +206,14 @@ describe('#zipService()', () => {
        expect(unzippedFileData['include-me/some-file'].name)
          .to.equal('include-me/some-file');

        expect(unzippedFileData['exclude-me.js'].name)
          .to.equal('exclude-me.js');

        expect(unzippedFileData['exclude-me/some-file'].name)
          .to.equal('exclude-me/some-file');

        expect(unzippedFileData['a-serverless-plugin.js'].name)
          .to.equal('a-serverless-plugin.js');

        expect(unzippedFileData['node_modules/include-me/include'].name)
          .to.equal('node_modules/include-me/include');

        expect(unzippedFileData['node_modules/include-me/include-aswell'].name)
          .to.equal('node_modules/include-me/include-aswell');
      });
  });
});

npm-shrinkwrap.json (generated, 61 lines changed)
@@ -114,11 +114,6 @@
      "from": "async@>=1.5.2 <2.0.0",
      "resolved": "https://registry.npmjs.org/async/-/async-1.5.2.tgz"
    },
    "asynckit": {
      "version": "0.4.0",
      "from": "asynckit@>=0.4.0 <0.5.0",
      "resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz"
    },
    "aws-sdk": {
      "version": "2.5.3",
      "from": "aws-sdk@>=2.3.17 <3.0.0",
@@ -141,7 +136,7 @@
    },
    "bl": {
      "version": "1.1.2",
      "from": "bl@>=1.1.2 <1.2.0",
      "from": "bl@>=1.0.0 <2.0.0",
      "resolved": "https://registry.npmjs.org/bl/-/bl-1.1.2.tgz",
      "dependencies": {
        "readable-stream": {
@@ -183,7 +178,7 @@
    },
    "builtin-modules": {
      "version": "1.1.1",
      "from": "builtin-modules@>=1.1.1 <2.0.0",
      "from": "builtin-modules@>=1.0.0 <2.0.0",
      "resolved": "https://registry.npmjs.org/builtin-modules/-/builtin-modules-1.1.1.tgz"
    },
    "caller-id": {
@@ -218,7 +213,7 @@
    },
    "chalk": {
      "version": "1.1.3",
      "from": "chalk@>=1.1.0 <2.0.0",
      "from": "chalk@>=1.1.1 <2.0.0",
      "resolved": "https://registry.npmjs.org/chalk/-/chalk-1.1.3.tgz"
    },
    "circular-json": {
@@ -332,12 +327,12 @@
    },
    "debug": {
      "version": "2.2.0",
      "from": "debug@>=2.2.0 <3.0.0",
      "from": "debug@>=2.0.0 <3.0.0",
      "resolved": "https://registry.npmjs.org/debug/-/debug-2.2.0.tgz"
    },
    "decamelize": {
      "version": "1.2.0",
      "from": "decamelize@>=1.0.0 <2.0.0",
      "from": "decamelize@>=1.1.1 <2.0.0",
      "resolved": "https://registry.npmjs.org/decamelize/-/decamelize-1.2.0.tgz"
    },
    "deep-eql": {
@@ -563,7 +558,7 @@
    },
    "fs-extra": {
      "version": "0.26.7",
      "from": "fs-extra@>=0.26.4 <0.27.0",
      "from": "fs-extra@>=0.26.7 <0.27.0",
      "resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-0.26.7.tgz"
    },
    "fs.realpath": {
@@ -642,7 +637,7 @@
    },
    "har-validator": {
      "version": "2.0.6",
      "from": "har-validator@>=2.0.2 <2.1.0",
      "from": "har-validator@>=2.0.6 <2.1.0",
      "resolved": "https://registry.npmjs.org/har-validator/-/har-validator-2.0.6.tgz"
    },
    "has-ansi": {
@@ -657,7 +652,7 @@
    },
    "hawk": {
      "version": "3.1.3",
      "from": "hawk@>=3.1.0 <3.2.0",
      "from": "hawk@>=3.1.3 <3.2.0",
      "resolved": "https://registry.npmjs.org/hawk/-/hawk-3.1.3.tgz"
    },
    "hoek": {
@@ -812,7 +807,7 @@
    },
    "js-yaml": {
      "version": "3.6.1",
      "from": "js-yaml@>=3.5.5 <4.0.0",
      "from": "js-yaml@>=3.6.1 <4.0.0",
      "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-3.6.1.tgz"
    },
    "jsbn": {
@@ -895,6 +890,11 @@
      "from": "lcid@>=1.0.0 <2.0.0",
      "resolved": "https://registry.npmjs.org/lcid/-/lcid-1.0.0.tgz"
    },
    "lcov-parse": {
      "version": "0.0.6",
      "from": "lcov-parse@0.0.6",
      "resolved": "https://registry.npmjs.org/lcov-parse/-/lcov-parse-0.0.6.tgz"
    },
    "levn": {
      "version": "0.3.0",
      "from": "levn@>=0.3.0 <0.4.0",
@@ -985,6 +985,11 @@
      "from": "lodash.keys@>=3.0.0 <4.0.0",
      "resolved": "https://registry.npmjs.org/lodash.keys/-/lodash.keys-3.1.2.tgz"
    },
    "log-driver": {
      "version": "1.2.4",
      "from": "log-driver@1.2.4",
      "resolved": "https://registry.npmjs.org/log-driver/-/log-driver-1.2.4.tgz"
    },
    "lolex": {
      "version": "1.3.2",
      "from": "lolex@1.3.2",
@@ -1069,7 +1074,7 @@
    },
    "node-uuid": {
      "version": "1.4.7",
      "from": "node-uuid@>=1.4.2 <2.0.0",
      "from": "node-uuid@>=1.4.7 <1.5.0",
      "resolved": "https://registry.npmjs.org/node-uuid/-/node-uuid-1.4.7.tgz"
    },
    "nopt": {
@@ -1094,12 +1099,12 @@
    },
    "oauth-sign": {
      "version": "0.8.2",
      "from": "oauth-sign@>=0.8.0 <0.9.0",
      "from": "oauth-sign@>=0.8.1 <0.9.0",
      "resolved": "https://registry.npmjs.org/oauth-sign/-/oauth-sign-0.8.2.tgz"
    },
    "object-assign": {
      "version": "4.1.0",
      "from": "object-assign@>=4.0.1 <5.0.0",
      "from": "object-assign@>=4.1.0 <5.0.0",
      "resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.0.tgz"
    },
    "once": {
@@ -1348,7 +1353,7 @@
    },
    "semver-regex": {
      "version": "1.0.0",
      "from": "semver-regex@latest",
      "from": "semver-regex@>=1.0.0 <2.0.0",
      "resolved": "https://registry.npmjs.org/semver-regex/-/semver-regex-1.0.0.tgz"
    },
    "set-blocking": {
@@ -1420,7 +1425,7 @@
    },
    "stack-trace": {
      "version": "0.0.9",
      "from": "stack-trace@>=0.0.0 <0.1.0",
      "from": "stack-trace@>=0.0.7 <0.1.0",
      "resolved": "https://registry.npmjs.org/stack-trace/-/stack-trace-0.0.9.tgz"
    },
    "string_decoder": {
@@ -1587,6 +1592,16 @@
      "from": "uglify-to-browserify@>=1.0.0 <1.1.0",
      "resolved": "https://registry.npmjs.org/uglify-to-browserify/-/uglify-to-browserify-1.0.2.tgz"
    },
    "underscore": {
      "version": "1.7.0",
      "from": "underscore@>=1.7.0 <1.8.0",
      "resolved": "https://registry.npmjs.org/underscore/-/underscore-1.7.0.tgz"
    },
    "underscore.string": {
      "version": "2.4.0",
      "from": "underscore.string@>=2.4.0 <2.5.0",
      "resolved": "https://registry.npmjs.org/underscore.string/-/underscore.string-2.4.0.tgz"
    },
    "uri-js": {
      "version": "2.1.1",
      "from": "uri-js@>=2.1.1 <3.0.0",
@@ -1608,9 +1623,9 @@
      "resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz"
    },
    "uuid": {
      "version": "2.0.2",
      "from": "uuid@latest",
      "resolved": "https://registry.npmjs.org/uuid/-/uuid-2.0.2.tgz"
      "version": "2.0.3",
      "from": "uuid@>=2.0.2 <3.0.0",
      "resolved": "https://registry.npmjs.org/uuid/-/uuid-2.0.3.tgz"
    },
    "validate-npm-package-license": {
      "version": "3.0.1",
@@ -1639,7 +1654,7 @@
    },
    "wordwrap": {
      "version": "1.0.0",
      "from": "wordwrap@>=0.0.2",
      "from": "wordwrap@>=1.0.0 <1.1.0",
      "resolved": "https://registry.npmjs.org/wordwrap/-/wordwrap-1.0.0.tgz"
    },
    "wrap-ansi": {

@@ -67,6 +67,7 @@
    "bluebird": "^3.4.0",
    "chalk": "^1.1.1",
    "fs-extra": "^0.26.7",
    "glob": "^7.0.6",
    "https-proxy-agent": "^1.0.0",
    "js-yaml": "^3.6.1",
    "json-refs": "^2.1.5",

@@ -116,10 +116,11 @@ describe('CLI', () => {
        };
      }
    }
    const pluginMock = new PluginMock();
    const plugins = [pluginMock];
    serverless.pluginManager.addPlugin(PluginMock);

    cli.setLoadedPlugins(serverless.pluginManager.getPlugins());
    cli.setLoadedCommands(serverless.pluginManager.getCommands());

    cli.setLoadedPlugins(plugins);
    const processedInput = cli.processInput();
    const helpDisplayed = cli.displayHelp(processedInput);

@@ -180,10 +181,11 @@ describe('CLI', () => {
        };
      }
    }
    const pluginMock = new PluginMock();
    const plugins = [pluginMock];
    serverless.pluginManager.addPlugin(PluginMock);

    cli.setLoadedPlugins(serverless.pluginManager.getPlugins());
    cli.setLoadedCommands(serverless.pluginManager.getCommands());

    cli.setLoadedPlugins(plugins);
    const processedInput = cli.processInput();
    const helpDisplayed = cli.displayHelp(processedInput);

@@ -228,10 +230,11 @@ describe('CLI', () => {
        };
      }
    }
    const pluginMock = new PluginMock();
    const plugins = [pluginMock];
    serverless.pluginManager.addPlugin(PluginMock);

    cli.setLoadedPlugins(serverless.pluginManager.getPlugins());
    cli.setLoadedCommands(serverless.pluginManager.getCommands());

    cli.setLoadedPlugins(plugins);
    const processedInput = cli.processInput();
    const helpDisplayed = cli.displayHelp(processedInput);


@@ -214,10 +214,6 @@ describe('PluginManager', () => {
    expect(pluginManager.plugins.length).to.equal(0);
  });

  it('should create an empty commandsList array', () => {
    expect(pluginManager.commandsList.length).to.equal(0);
  });

  it('should create an empty commands object', () => {
    expect(pluginManager.commands).to.deep.equal({});
  });
@@ -254,81 +250,33 @@ describe('PluginManager', () => {
  it('should convert shortcuts into options when a one level deep command matches', () => {
    const cliOptionsMock = { r: 'eu-central-1', region: 'us-east-1' };
    const cliCommandsMock = ['deploy']; // command with one level deepness
    const commandsMock = {
      deploy: {
        options: {
          region: {
            shortcut: 'r',
          },
    const commandMock = {
      options: {
        region: {
          shortcut: 'r',
        },
      },
    };
    pluginManager.setCliCommands(cliCommandsMock);
    pluginManager.setCliOptions(cliOptionsMock);

    pluginManager.convertShortcutsIntoOptions(cliOptionsMock, commandsMock);
    pluginManager.convertShortcutsIntoOptions(commandMock);

    expect(pluginManager.cliOptions.region).to.equal(cliOptionsMock.r);
  });

  it('should convert shortcuts into options when a two level deep command matches', () => {
    const cliOptionsMock = { f: 'function-1', function: 'function-2' };
    const cliCommandsMock = ['deploy', 'function']; // command with two level deepness
    const commandsMock = {
      deploy: {
        commands: {
          function: {
            options: {
              function: {
                shortcut: 'f',
              },
            },
          },
        },
      },
    };
    pluginManager.setCliCommands(cliCommandsMock);
    pluginManager.setCliOptions(cliOptionsMock);

    pluginManager.convertShortcutsIntoOptions(cliOptionsMock, commandsMock);

    expect(pluginManager.cliOptions.function).to.equal(cliOptionsMock.f);
  });

  it('should not convert shortcuts into options when the command does not match', () => {
    const cliOptionsMock = { r: 'eu-central-1', region: 'us-east-1' };
    const cliCommandsMock = ['foo'];
    const commandsMock = {
      deploy: {
        options: {
          region: {
            shortcut: 'r',
          },
        },
      },
    };
    pluginManager.setCliCommands(cliCommandsMock);
    pluginManager.setCliOptions(cliOptionsMock);

    pluginManager.convertShortcutsIntoOptions(cliOptionsMock, commandsMock);

    expect(pluginManager.cliOptions.region).to.equal(cliOptionsMock.region);
  });

  it('should not convert shortcuts into options when the shortcut is not given', () => {
    const cliOptionsMock = { r: 'eu-central-1', region: 'us-east-1' };
    const cliCommandsMock = ['deploy'];
    const commandsMock = {
      deploy: {
        options: {
          region: {},
        },
    const commandMock = {
      options: {
        region: {},
      },
    };
    pluginManager.setCliCommands(cliCommandsMock);
    pluginManager.setCliOptions(cliOptionsMock);

    pluginManager.convertShortcutsIntoOptions(cliOptionsMock, commandsMock);
    pluginManager.convertShortcutsIntoOptions(commandMock);

    expect(pluginManager.cliOptions.region).to.equal(cliOptionsMock.region);
  });
@@ -344,7 +292,7 @@ describe('PluginManager', () => {
  it('should load the plugin commands', () => {
    pluginManager.addPlugin(SynchronousPluginMock);

    expect(pluginManager.commandsList[0]).to.have.property('deploy');
    expect(pluginManager.commands).to.have.property('deploy');
  });
});

@@ -438,19 +386,58 @@ describe('PluginManager', () => {
    const synchronousPluginMockInstance = new SynchronousPluginMock();
    pluginManager.loadCommands(synchronousPluginMockInstance);

    expect(pluginManager.commandsList[0]).to.have.property('deploy');
    expect(pluginManager.commands).to.have.property('deploy');
  });

  it('should merge plugin commands', () => {
    pluginManager.loadCommands({
      commands: {
        deploy: {
          lifecycleEvents: [
            'one',
          ],
          options: {
            foo: {},
          },
        },
      },
    });

    pluginManager.loadCommands({
      commands: {
        deploy: {
          lifecycleEvents: [
            'one',
            'two',
          ],
          options: {
            bar: {},
          },
          commands: {
            fn: {
            },
          },
        },
      },
    });

    expect(pluginManager.commands.deploy).to.have.property('options')
      .that.has.all.keys('foo', 'bar');
    expect(pluginManager.commands.deploy).to.have.property('lifecycleEvents')
      .that.is.an('array')
      .that.deep.equals(['one', 'two']);
    expect(pluginManager.commands.deploy.commands).to.have.property('fn');
  });
});

describe('#getEvents()', () => {
|
||||
beforeEach(function () { // eslint-disable-line prefer-arrow-callback
|
||||
const synchronousPluginMockInstance = new SynchronousPluginMock();
|
||||
pluginManager.loadCommands(synchronousPluginMockInstance);
|
||||
pluginManager.addPlugin(SynchronousPluginMock);
|
||||
});
|
||||
|
||||
it('should get all the matching events for a root level command in the correct order', () => {
|
||||
const commandsArray = ['deploy'];
|
||||
const events = pluginManager.getEvents(commandsArray, pluginManager.commands);
|
||||
const command = pluginManager.getCommand(['deploy']);
|
||||
const events = pluginManager.getEvents(command);
|
||||
|
||||
expect(events[0]).to.equal('before:deploy:resources');
|
||||
expect(events[1]).to.equal('deploy:resources');
|
||||
@ -461,8 +448,8 @@ describe('PluginManager', () => {
|
||||
});
|
||||
|
||||
it('should get all the matching events for a nested level command in the correct order', () => {
|
||||
const commandsArray = ['deploy', 'onpremises'];
|
||||
const events = pluginManager.getEvents(commandsArray, pluginManager.commands);
|
||||
const command = pluginManager.getCommand(['deploy', 'onpremises']);
|
||||
const events = pluginManager.getEvents(command);
|
||||
|
||||
expect(events[0]).to.equal('before:deploy:onpremises:resources');
|
||||
expect(events[1]).to.equal('deploy:onpremises:resources');
|
||||
@ -471,13 +458,6 @@ describe('PluginManager', () => {
|
||||
expect(events[4]).to.equal('deploy:onpremises:functions');
|
||||
expect(events[5]).to.equal('after:deploy:onpremises:functions');
|
||||
});
|
||||
|
||||
it('should return an empty events array when the command is not defined', () => {
|
||||
const commandsArray = ['foo'];
|
||||
const events = pluginManager.getEvents(commandsArray, pluginManager.commands);
|
||||
|
||||
expect(events.length).to.equal(0);
|
||||
});
|
||||
});
|
||||
|
||||
describe('#getPlugins()', () => {
@ -500,53 +480,34 @@ describe('PluginManager', () => {
});
});

describe('#validateCommands()', () => {
it('should throw an error if a first level command is not found in the commands object', () => {
pluginManager.commands = {
foo: {},
};
const commandsArray = ['bar'];

expect(() => { pluginManager.validateCommands(commandsArray); }).to.throw(Error);
});
});

describe('#validateOptions()', () => {
it('should throw an error if a required option is not set in a plain commands object', () => {
it('should throw an error if a required option is not set', () => {
pluginManager.commands = {
foo: {
options: {
bar: {
baz: {
shortcut: 'b',
required: true,
},
},
},
bar: {
options: {
baz: {
required: true,
},
},
},
};
const commandsArray = ['foo'];

expect(() => { pluginManager.validateOptions(commandsArray); }).to.throw(Error);
const foo = pluginManager.commands.foo;
const bar = pluginManager.commands.bar;

expect(() => { pluginManager.validateOptions(foo); }).to.throw(Error);
expect(() => { pluginManager.validateOptions(bar); }).to.throw(Error);
});

it('should throw an error if a required option is not set in a nested commands object', () => {
pluginManager.commands = {
foo: {
commands: {
bar: {
options: {
baz: {
required: true,
},
},
},
},
},
};
const commandsArray = ['foo', 'bar'];

expect(() => { pluginManager.validateOptions(commandsArray); }).to.throw(Error);
});

it('should throw an error if a customValidation is not set in a plain commands object', () => {
it('should throw an error if a customValidation is not met', () => {
pluginManager.setCliOptions({ bar: 'dev' });

pluginManager.commands = {
@ -561,33 +522,9 @@ describe('PluginManager', () => {
},
},
};
const commandsArray = ['foo'];
const command = pluginManager.commands.foo;

expect(() => { pluginManager.validateOptions(commandsArray); }).to.throw(Error);
});

it('should throw an error if a customValidation is not set in a nested commands object', () => {
pluginManager.setCliOptions({ baz: 100 });

pluginManager.commands = {
foo: {
commands: {
bar: {
options: {
baz: {
customValidation: {
regularExpression: /^[a-zA-z\s]+$/,
errorMessage: 'Custom Error Message',
},
},
},
},
},
};
const commandsArray = ['foo', 'bar'];

expect(() => { pluginManager.validateOptions(commandsArray); }).to.throw(Error);
expect(() => { pluginManager.validateOptions(command); }).to.throw(Error);
});

it('should succeed if a custom regex matches in a plain commands object', () => {
@ -609,30 +546,6 @@ describe('PluginManager', () => {

expect(() => { pluginManager.validateOptions(commandsArray); }).to.not.throw(Error);
});

it('should succeed if a custom regex matches in a nested commands object', () => {
pluginManager.setCliOptions({ baz: 'dev' });

pluginManager.commands = {
foo: {
commands: {
bar: {
options: {
baz: {
customValidation: {
regularExpression: /^[a-zA-z\s]+$/,
errorMessage: 'Custom Error Message',
},
},
},
},
},
},
};
const commandsArray = ['foo', 'bar'];

expect(() => { pluginManager.validateOptions(commandsArray); }).to.not.throw(Error);
});
});

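As a reading aid for the `#validateOptions()` cases above, here is a minimal sketch, under assumed object shapes, of validation logic that throws when a required option is missing or when a `customValidation` regular expression rejects the supplied value; the function and parameter names are illustrative, not the framework's API.

```js
// Hypothetical sketch of option validation for a single command object:
// - throws if an option marked `required` was not supplied on the CLI
// - throws the custom error message if a `customValidation` regex does not match
function validateOptionsSketch(command, cliOptions) {
  Object.keys(command.options || {}).forEach((key) => {
    const option = command.options[key];
    const value = cliOptions[key];

    if (option.required && (value === undefined || value === null)) {
      throw new Error(`Required option "${key}" is missing`);
    }
    if (option.customValidation
        && !option.customValidation.regularExpression.test(value)) {
      throw new Error(option.customValidation.errorMessage);
    }
  });
}
```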
describe('#run()', () => {
@ -644,6 +557,22 @@ describe('PluginManager', () => {
expect(() => { pluginManager.run(commandsArray); }).to.throw(Error);
});

it('should throw an error when the given command has no hooks', () => {
class HooklessPlugin {
constructor() {
this.commands = {
foo: {},
};
}
}

pluginManager.addPlugin(HooklessPlugin);

const commandsArray = ['foo'];

expect(() => { pluginManager.run(commandsArray); }).to.throw(Error);
});

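For contrast with the `HooklessPlugin` above, a rough sketch of what a plugin with hooks typically looks like, so `run()` has handlers to invoke; the command name and log statements are placeholders, not taken from the diff.

```js
// Illustrative plugin shape: commands declare lifecycleEvents, and hooks map
// the generated event names (including before:/after:) to handler functions.
class HookedPluginSketch {
  constructor() {
    this.commands = {
      foo: { lifecycleEvents: ['bar'] },
    };
    this.hooks = {
      'before:foo:bar': () => console.log('before foo:bar'),
      'foo:bar': () => console.log('foo:bar'),
      'after:foo:bar': () => console.log('after foo:bar'),
    };
  }
}
```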
it('should run the hooks in the correct order', () => {
class CorrectHookOrderPluginMock {
constructor() {
@ -732,7 +661,7 @@ describe('PluginManager', () => {
describe('when running a nested command', () => {
it('should run the nested command', () => {
const commandsArray = ['deploy', 'onpremises'];
pluginManager.run(commandsArray)
return pluginManager.run(commandsArray)
.then(() => expect(pluginManager.plugins[0].deployedResources)
.to.equal(1));
});
@ -750,14 +679,14 @@ describe('PluginManager', () => {
pluginManager.addPlugin(SynchronousPluginMock);
});

it('should run only the providers plugins (if the provider is specified)', () => {
it('should load only the providers plugins (if the provider is specified)', () => {
const commandsArray = ['deploy'];
pluginManager.run(commandsArray).then(() => {
return pluginManager.run(commandsArray).then(() => {
expect(pluginManager.plugins.length).to.equal(2);
expect(pluginManager.plugins[0].deployedFunctions).to.equal(1);
expect(pluginManager.plugins[1].deployedFunctions).to.equal(0);

// other, provider independent plugins should also be run
expect(pluginManager.plugins[2].deployedFunctions).to.equal(1);
expect(pluginManager.plugins[0].provider).to.equal('provider1');
expect(pluginManager.plugins[1].deployedFunctions).to.equal(1);
expect(pluginManager.plugins[1].provider).to.equal(undefined);
});
});
});

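The two `return pluginManager.run(commandsArray)` changes above matter because Mocha only waits for asynchronous work when a test returns (or awaits) a promise. A minimal illustration, using a hypothetical `someAsyncAction()` helper:

```js
// Without `return`, Mocha ends the test immediately; a failing expectation inside
// .then() never fails the test and surfaces (at best) as an unhandled rejection.
it('can pass even though the assertion fails', () => {
  someAsyncAction().then(() => expect(true).to.equal(false)); // promise not returned
});

// Returning the promise makes Mocha wait, so the assertion actually gates the test.
it('fails when the assertion fails', () =>
  someAsyncAction().then(() => expect(true).to.equal(false)));
```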
@ -151,7 +151,6 @@ describe('Serverless', () => {
google: {},
},
package: {
include: ['include-me.js'],
exclude: ['exclude-me.js'],
artifact: 'some/path/foo.zip',
},

@ -53,7 +53,6 @@ describe('Service', () => {
google: {},
},
package: {
include: ['include-me.js'],
exclude: ['exclude-me.js'],
artifact: 'some/path/foo.zip',
},
@ -69,7 +68,6 @@ describe('Service', () => {
expect(serviceInstance.resources.aws).to.deep.equal({ resourcesProp: 'value' });
expect(serviceInstance.resources.azure).to.deep.equal({});
expect(serviceInstance.resources.google).to.deep.equal({});
expect(serviceInstance.package.include[0]).to.equal('include-me.js');
expect(serviceInstance.package.exclude[0]).to.equal('exclude-me.js');
expect(serviceInstance.package.artifact).to.equal('some/path/foo.zip');
});
@ -136,7 +134,6 @@ describe('Service', () => {
google: {},
},
package: {
include: ['include-me.js'],
exclude: ['exclude-me.js'],
artifact: 'some/path/foo.zip',
},
@ -158,8 +155,6 @@ describe('Service', () => {
expect(serviceInstance.resources.aws).to.deep.equal({ resourcesProp: 'value' });
expect(serviceInstance.resources.azure).to.deep.equal({});
expect(serviceInstance.resources.google).to.deep.equal({});
expect(serviceInstance.package.include.length).to.equal(1);
expect(serviceInstance.package.include[0]).to.equal('include-me.js');
expect(serviceInstance.package.exclude.length).to.equal(1);
expect(serviceInstance.package.exclude[0]).to.equal('exclude-me.js');
expect(serviceInstance.package.artifact).to.equal('some/path/foo.zip');
@ -188,7 +183,6 @@ describe('Service', () => {
google: {},
},
package: {
include: ['include-me.js'],
exclude: ['exclude-me.js'],
artifact: 'some/path/foo.zip',
},
@ -207,6 +201,37 @@ describe('Service', () => {
});
});

it('should support Serverless file with a non-aws provider', () => {
const SUtils = new Utils();
const serverlessYaml = {
service: 'my-service',
provider: 'ibm',
functions: {
functionA: {
name: 'customFunctionName',
},
},
};

SUtils.writeFileSync(path.join(tmpDirPath, 'serverless.yaml'),
YAML.dump(serverlessYaml));

const serverless = new Serverless({ servicePath: tmpDirPath });
serviceInstance = new Service(serverless);

return serviceInstance.load().then(() => {
const expectedFunc = {
functionA: {
name: 'customFunctionName',
events: [],
},
};
expect(serviceInstance.service).to.be.equal('my-service');
expect(serviceInstance.provider.name).to.deep.equal('ibm');
expect(serviceInstance.functions).to.deep.equal(expectedFunc);
});
});

it('should support Serverless file with a .yaml extension', () => {
const SUtils = new Utils();
const serverlessYaml = {

@ -19,7 +19,7 @@ serverless create --template $template
echo "Overwriting Service Name"
sed -i.bak s/${template}/sls-test-$template-$RANDOM/g $template_folder/serverless.yml

echo "Running Compose build for Teamplate"
echo "Running Compose build for Template"
docker-compose build $template

if [ ! -z "$2" ]

@ -10,5 +10,6 @@ function integration-test {

integration-test aws-java-gradle build
integration-test aws-java-maven mvn package
integration-test aws-scala-sbt sbt assembly
integration-test aws-nodejs
integration-test aws-python