removed GITLAB_ROBOTS_OVERRIDE parameter. The default robots.txt is now overridden whenever the file at GITLAB_ROBOTS_PATH exists.

Refers to #364
Sameer Naik 2015-09-24 12:53:55 +05:30
parent e0ac6ae5cf
commit 83eb229dbe
4 changed files with 6 additions and 11 deletions
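
In effect, the init script's copy step condenses to the following (a simplified sketch of the hunks shown below, not the verbatim script):

# before: the custom robots.txt was copied only when explicitly enabled
[[ ${GITLAB_ROBOTS_OVERRIDE} == true ]] && \
  sudo -HEu ${GITLAB_USER} cp ${GITLAB_ROBOTS_PATH} public/robots.txt

# after: the copy happens whenever the file at GITLAB_ROBOTS_PATH exists,
# so a user-supplied robots.txt is picked up without an extra flag
[[ -f ${GITLAB_ROBOTS_PATH} ]] && \
  sudo -HEu ${GITLAB_USER} cp ${GITLAB_ROBOTS_PATH} public/robots.txt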

View File

@@ -11,6 +11,7 @@ This file only reflects the changes that are made in the docker image. Pleas
- added `GITLAB_NOTIFY_ON_BROKEN_BUILDS` and `GITLAB_NOTIFY_PUSHER` parameters
- added options to email IMAP and reply by email feature
- set value of `GITLAB_EMAIL` to `SMTP_USER` if defined, else default to `example@example.com`
- removed `GITLAB_ROBOTS_OVERRIDE` parameter; the default `robots.txt` is now overridden if the file at `GITLAB_ROBOTS_PATH` exists
**7.14.3**
- gitlab: upgrade to CE v.7.14.3

View File

@@ -861,8 +861,7 @@ Below is the complete list of available options that can be used to customize yo
- **AWS_BACKUP_ACCESS_KEY_ID**: AWS access key id. No defaults.
- **AWS_BACKUP_SECRET_ACCESS_KEY**: AWS secret access key. No defaults.
- **AWS_BACKUP_BUCKET**: AWS bucket for backup uploads. No defaults.
- **GITLAB_ROBOTS_OVERRIDE**: Override `robots.txt`. Defaults to `false`.
- **GITLAB_ROBOTS_PATH**: Location of `robots.txt`. See [www.robotstxt.org](http://www.robotstxt.org) for examples. Defaults to `robots.txt` which [prevents robots scanning gitlab](http://www.robotstxt.org/faq/prevent.html).
- **GITLAB_ROBOTS_PATH**: Location of a custom `robots.txt`. If no file exists at this path, GitLab's default `robots.txt` is used. See [www.robotstxt.org](http://www.robotstxt.org) for examples.
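
For example, a custom `robots.txt` can be bind-mounted into the container and referenced via `GITLAB_ROBOTS_PATH` (a minimal sketch; the image name, host path and mount point are illustrative, not taken from this commit):

docker run --name gitlab -d \
  -v /path/to/robots.txt:/custom/robots.txt:ro \
  -e 'GITLAB_ROBOTS_PATH=/custom/robots.txt' \
  sameersbn/gitlab

At startup the init script copies the file to `public/robots.txt` only if it exists, so omitting the variable (or the mount) leaves GitLab's bundled `robots.txt` in place.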
# Maintenance

View File

@@ -1,2 +0,0 @@
User-Agent: *
Disallow: /

View File

@@ -180,8 +180,7 @@ GOOGLE_ANALYTICS_ID=${GOOGLE_ANALYTICS_ID:-}
PIWIK_URL=${PIWIK_URL:-}
PIWIK_SITE_ID=${PIWIK_SITE_ID:-}
GITLAB_ROBOTS_OVERRIDE=${GITLAB_ROBOTS_OVERRIDE:-false}
GITLAB_ROBOTS_PATH=${GITLAB_ROBOTS_PATH:-$SYSCONF_TEMPLATES_DIR/gitlabhq/robots.txt}
GITLAB_ROBOTS_PATH=${GITLAB_ROBOTS_PATH:-${USERCONF_TEMPLATES_DIR}/gitlabhq/robots.txt}
# is a mysql or postgresql database linked?
# requires that the mysql or postgresql containers have exposed
@@ -357,11 +356,6 @@ sudo -HEu ${GITLAB_USER} cp ${SYSCONF_TEMPLATES_DIR}/gitlabhq/unicorn.rb
sudo -HEu ${GITLAB_USER} cp ${SYSCONF_TEMPLATES_DIR}/gitlabhq/rack_attack.rb config/initializers/rack_attack.rb
[[ ${SMTP_ENABLED} == true ]] && \
sudo -HEu ${GITLAB_USER} cp ${SYSCONF_TEMPLATES_DIR}/gitlabhq/smtp_settings.rb config/initializers/smtp_settings.rb
# allow to override robots.txt to block bots
[[ ${GITLAB_ROBOTS_OVERRIDE} == true ]] && \
sudo -HEu ${GITLAB_USER} cp ${GITLAB_ROBOTS_PATH} public/robots.txt
[[ ${IMAP_ENABLED} == true ]] && \
sudo -HEu ${GITLAB_USER} cp ${SYSCONF_TEMPLATES_DIR}/gitlabhq/mail_room.yml config/mail_room.yml
@@ -389,6 +383,9 @@ esac
[[ ${IMAP_ENABLED} == true ]] && \
[[ -f ${USERCONF_TEMPLATES_DIR}/gitlabhq/mail_room.yml ]] && sudo -HEu ${GITLAB_USER} cp ${USERCONF_TEMPLATES_DIR}/gitlabhq/mail_room.yml config/mail_room.yml
# override robots.txt if a user configuration exists
[[ -f ${GITLAB_ROBOTS_PATH} ]] && sudo -HEu ${GITLAB_USER} cp ${GITLAB_ROBOTS_PATH} public/robots.txt
if [[ -f ${SSL_CERTIFICATE_PATH} || -f ${CA_CERTIFICATES_PATH} ]]; then
echo "Updating CA certificates..."
[[ -f ${SSL_CERTIFICATE_PATH} ]] && cp "${SSL_CERTIFICATE_PATH}" /usr/local/share/ca-certificates/gitlab.crt
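
With the new robots.txt handling in place, the effective file can be checked over HTTP once the container is running (a hedged example; the published port 10080 is an assumption, adjust to your setup):

curl http://localhost:10080/robots.txt
# prints the file supplied via GITLAB_ROBOTS_PATH if it existed at startup,
# otherwise GitLab's bundled robots.txt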