Updating versions and removing deprecated items (#8750)

* Removed deprecated Spark pieces.

* Bumped HashiCorp stack versions to current as of commit date

* Bumped versions of HashiCorp stack tools

* Bumped versions, added VAULT_ADDR in GCP, removed refs to Spark in shared README
Gale Fagan committed 2020-09-02 10:14:47 -07:00 (committed by GitHub)
parent 9d7d9d8b4d · commit 10fa09943c
6 changed files with 42 additions and 59 deletions


@@ -1,4 +1,4 @@
-# Provision a Nomad cluster in the Cloud
+# Provision a Nomad cluster in the cloud
Use this repo to easily provision a Nomad sandbox environment on AWS, Azure, or GCP with
[Packer](https://packer.io) and [Terraform](https://terraform.io).
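At a high level, Packer first bakes a machine image with the HashiCorp tools preinstalled, and Terraform then provisions a cluster from that image. A rough sketch of the workflow, with illustrative (not repo-verified) template names:

```console
# Illustrative flow only; template names and variables differ per cloud directory.
packer build packer.json    # bake a machine image with Nomad, Consul, and Vault preinstalled
terraform init              # fetch the required providers
terraform apply             # provision the cluster from that image
```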
@@ -6,8 +6,7 @@ Use this repo to easily provision a Nomad sandbox environment on AWS, Azure, or
[Vault](https://www.vaultproject.io/intro/index.html) are also installed
(colocated for convenience). The intention is to allow easy exploration of
Nomad and its integrations with the HashiCorp stack. This is *not* meant to be
-a production ready environment. A demonstration of [Nomad's Apache Spark
-integration](examples/spark/README.md) is included.
+a production ready environment.
## Setup
@@ -79,11 +78,3 @@ Use the following links to get started with Nomad and its HashiCorp integrations
* [Vault integration](https://www.nomadproject.io/docs/vault-integration/index.html)
* [consul-template integration](https://www.nomadproject.io/docs/job-specification/template.html)
-## Apache Spark integration
-Nomad is well-suited for analytical workloads, given its performance
-characteristics and first-class support for batch scheduling. Apache Spark is a
-popular data processing engine/framework that has been architected to use
-third-party schedulers. The Nomad ecosystem includes a [fork that natively
-integrates Nomad with Spark](https://github.com/hashicorp/nomad-spark). A
-detailed walkthrough of the integration is included [here](examples/spark/README.md).


@@ -23,7 +23,7 @@ Includes:
Download the latest version of [Nomad](https://www.nomadproject.io/) from HashiCorp's website by copying and pasting this snippet in the terminal:
```console
curl "https://releases.hashicorp.com/nomad/0.12.0/nomad_0.12.0_linux_amd64.zip" -o nomad.zip
curl "https://releases.hashicorp.com/nomad/0.12.3/nomad_0.12.3_linux_amd64.zip" -o nomad.zip
unzip nomad.zip
sudo mv nomad /usr/local/bin
nomad --version
@@ -34,7 +34,7 @@ nomad --version
Download the latest version of [Consul](https://www.consul.io/) from HashiCorp's website by copying and pasting this snippet in the terminal:
```console
curl "https://releases.hashicorp.com/consul/1.8.0/consul_1.8.0_linux_amd64.zip" -o consul.zip
curl "https://releases.hashicorp.com/consul/1.8.3/consul_1.8.3_linux_amd64.zip" -o consul.zip
unzip consul.zip
sudo mv consul /usr/local/bin
consul --version
@@ -45,7 +45,7 @@ consul --version
Download the latest version of [Vault](https://www.vaultproject.io/) from HashiCorp's website by copying and pasting this snippet in the terminal:
```console
curl "https://releases.hashicorp.com/vault/1.4.3/vault_1.4.3_linux_amd64.zip" -o vault.zip
curl "https://releases.hashicorp.com/vault/1.5.3/vault_1.5.3_linux_amd64.zip" -o vault.zip
unzip vault.zip
sudo mv vault /usr/local/bin
vault --version
@@ -56,7 +56,7 @@ vault --version
Download the latest version of [Packer](https://www.packer.io/) from HashiCorp's website by copying and pasting this snippet in the terminal:
```console
curl "https://releases.hashicorp.com/packer/1.6.0/packer_1.6.0_linux_amd64.zip" -o packer.zip
curl "https://releases.hashicorp.com/packer/1.6.2/packer_1.6.2_linux_amd64.zip" -o packer.zip
unzip packer.zip
sudo mv packer /usr/local/bin
packer --version
@@ -67,7 +67,7 @@ packer --version
Download the latest version of [Terraform](https://www.terraform.io/) from HashiCorp's website by copying and pasting this snippet in the terminal:
```console
curl "https://releases.hashicorp.com/terraform/0.12.28/terraform_0.12.28_linux_amd64.zip" -o terraform.zip
curl "https://releases.hashicorp.com/terraform/0.13.1/terraform_0.13.1_linux_amd64.zip" -o terraform.zip
unzip terraform.zip
sudo mv terraform /usr/local/bin
terraform --version
@@ -75,7 +75,7 @@ terraform --version
### Install and Authenticate the GCP SDK Command Line Tools
-**If you are using [Google Cloud](https://cloud.google.com/shell), you already have `gcloud` setup. So, you can safely skip this step.**
+**If you are using [Google Cloud](https://cloud.google.com/shell), you already have `gcloud` set up, and you can safely skip this step.**
To install the GCP SDK Command Line Tools, follow the installation instructions for your specific operating system:
@@ -233,19 +233,39 @@ If you're **not** using Cloud Shell, you can use any of these links:
* [Vault](http://127.0.0.1:8200)
* [Consul](http://127.0.0.1:8500)
+In case you want to try out any of the optional steps with the Vault CLI later on, set this helper variable:
+```
+export VAULT_ADDR=http://localhost:8200
+```
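With the variable set, a quick way to confirm the Vault CLI can reach the server (assuming the forwarded port above is active) is:

```console
vault status
```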
## Next Steps
+You have deployed a Nomad cluster to GCP! 🎉
+Click [here](https://github.com/hashicorp/nomad/blob/master/terraform/README.md#test) for next steps.
+> ### After You Finish
+> Come back here when you're done exploring Nomad and the HashiCorp stack. In the next section, you'll learn how to clean up and destroy the demo infrastructure you've created.
+## Conclusion
-You have deployed a Nomad cluster to GCP!
+You have deployed a Nomad cluster to GCP!
-### Destroy Infrastrucure
+### Destroy Infrastructure
-To destroy all the demo infrastrucure:
+To destroy all the demo infrastructure:
```console
terraform destroy -force -var="project=${GOOGLE_PROJECT}" -var="credentials=${GOOGLE_APPLICATION_CREDENTIALS}"
```
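As an aside, `-force` is the legacy spelling of this flag: on Terraform 0.12 and later the documented equivalent is `-auto-approve`, which likewise skips the interactive confirmation:

```console
terraform destroy -auto-approve -var="project=${GOOGLE_PROJECT}" -var="credentials=${GOOGLE_APPLICATION_CREDENTIALS}"
```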
### Delete the Project
Finally, to completely delete the project:
```console
gcloud projects delete $GOOGLE_PROJECT
```
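If you want to skip gcloud's confirmation prompt (for example, in a cleanup script), the global `--quiet` flag suppresses it:

```console
gcloud projects delete $GOOGLE_PROJECT --quiet
```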
> ### Alternative: Use the GUI
>
> If you prefer to delete the project using GCP's Cloud Console, follow this link to GCP's [Cloud Resource Manager](https://console.cloud.google.com/cloud-resource-manager).


@@ -45,4 +45,5 @@
"script": "../shared/scripts/setup.sh"
}
]
}
}


@@ -7,8 +7,6 @@ CONFIGDIR=/ops/shared/config
CONSULCONFIGDIR=/etc/consul.d
NOMADCONFIGDIR=/etc/nomad.d
CONSULTEMPLATECONFIGDIR=/etc/consul-template.d
-HADOOP_VERSION=hadoop-2.7.7
-HADOOPCONFIGDIR=/usr/local/$HADOOP_VERSION/etc/hadoop
HOME_DIR=ubuntu
# Wait for network
@@ -68,9 +66,6 @@ echo "nameserver $DOCKER_BRIDGE_IP_ADDRESS" | sudo tee /etc/resolv.conf.new
cat /etc/resolv.conf | sudo tee --append /etc/resolv.conf.new
sudo mv /etc/resolv.conf.new /etc/resolv.conf
-# Hadoop config file to enable HDFS CLI
-sudo cp $CONFIGDIR/core-site.xml $HADOOPCONFIGDIR
# Move examples directory to $HOME
sudo mv /ops/examples /home/$HOME_DIR
sudo chown -R $HOME_DIR:$HOME_DIR /home/$HOME_DIR/examples
@@ -80,8 +75,3 @@ sudo chmod -R 775 /home/$HOME_DIR/examples
echo "export VAULT_ADDR=http://$IP_ADDRESS:8200" | sudo tee --append /home/$HOME_DIR/.bashrc
echo "export NOMAD_ADDR=http://$IP_ADDRESS:4646" | sudo tee --append /home/$HOME_DIR/.bashrc
echo "export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/jre" | sudo tee --append /home/$HOME_DIR/.bashrc
-# Update PATH
-echo "export PATH=$PATH:/usr/local/bin/spark/bin:/usr/local/$HADOOP_VERSION/bin" | sudo tee --append /home/$HOME_DIR/.bashrc


@@ -8,8 +8,6 @@ CONSULCONFIGDIR=/etc/consul.d
VAULTCONFIGDIR=/etc/vault.d
NOMADCONFIGDIR=/etc/nomad.d
CONSULTEMPLATECONFIGDIR=/etc/consul-template.d
-HADOOP_VERSION=hadoop-2.7.7
-HADOOPCONFIGDIR=/usr/local/$HADOOP_VERSION/etc/hadoop
HOME_DIR=ubuntu
# Wait for network
@@ -83,9 +81,6 @@ echo "nameserver $DOCKER_BRIDGE_IP_ADDRESS" | sudo tee /etc/resolv.conf.new
cat /etc/resolv.conf | sudo tee --append /etc/resolv.conf.new
sudo mv /etc/resolv.conf.new /etc/resolv.conf
-# Hadoop
-sudo cp $CONFIGDIR/core-site.xml $HADOOPCONFIGDIR
# Move examples directory to $HOME
sudo mv /ops/examples /home/$HOME_DIR
sudo chown -R $HOME_DIR:$HOME_DIR /home/$HOME_DIR/examples
@@ -97,6 +92,3 @@ echo "export CONSUL_HTTP_ADDR=$IP_ADDRESS:8500" | sudo tee --append /home/$HOME_
echo "export VAULT_ADDR=http://$IP_ADDRESS:8200" | sudo tee --append /home/$HOME_DIR/.bashrc
echo "export NOMAD_ADDR=http://$IP_ADDRESS:4646" | sudo tee --append /home/$HOME_DIR/.bashrc
echo "export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/jre" | sudo tee --append /home/$HOME_DIR/.bashrc
-# Update PATH
-echo "export PATH=$PATH:/usr/local/bin/spark/bin:/usr/local/$HADOOP_VERSION/bin" | sudo tee --append /home/$HOME_DIR/.bashrc


@@ -9,37 +9,31 @@ cd /ops
CONFIGDIR=/ops/shared/config
-CONSULVERSION=1.6.0
+CONSULVERSION=1.8.3
CONSULDOWNLOAD=https://releases.hashicorp.com/consul/${CONSULVERSION}/consul_${CONSULVERSION}_linux_amd64.zip
CONSULCONFIGDIR=/etc/consul.d
CONSULDIR=/opt/consul
-VAULTVERSION=1.0.3
+VAULTVERSION=1.5.3
VAULTDOWNLOAD=https://releases.hashicorp.com/vault/${VAULTVERSION}/vault_${VAULTVERSION}_linux_amd64.zip
VAULTCONFIGDIR=/etc/vault.d
VAULTDIR=/opt/vault
-NOMADVERSION=0.9.0
+NOMADVERSION=0.12.3
NOMADDOWNLOAD=https://releases.hashicorp.com/nomad/${NOMADVERSION}/nomad_${NOMADVERSION}_linux_amd64.zip
NOMADCONFIGDIR=/etc/nomad.d
NOMADDIR=/opt/nomad
-CONSULTEMPLATEVERSION=0.20.0
+CONSULTEMPLATEVERSION=0.25.1
CONSULTEMPLATEDOWNLOAD=https://releases.hashicorp.com/consul-template/${CONSULTEMPLATEVERSION}/consul-template_${CONSULTEMPLATEVERSION}_linux_amd64.zip
CONSULTEMPLATECONFIGDIR=/etc/consul-template.d
CONSULTEMPLATEDIR=/opt/consul-template
-HADOOP_VERSION=2.7.7
# Dependencies
sudo apt-get install -y software-properties-common
sudo apt-get update
sudo apt-get install -y unzip tree redis-tools jq curl tmux
-# Numpy (for Spark)
-sudo apt-get install -y python-setuptools
-sudo easy_install pip
-sudo pip install numpy
# Disable the firewall
@@ -105,6 +99,7 @@ sudo chmod 755 $CONSULTEMPLATECONFIGDIR
sudo mkdir -p $CONSULTEMPLATEDIR
sudo chmod 755 $CONSULTEMPLATEDIR
# Docker
distro=$(lsb_release -si | tr '[:upper:]' '[:lower:]')
sudo apt-get install -y apt-transport-https ca-certificates gnupg2
@@ -113,6 +108,7 @@ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/${di
sudo apt-get update
sudo apt-get install -y docker-ce
+# Needs testing, updating and fixing
if [[ ! -z ${INSTALL_NVIDIA_DOCKER+x} ]]; then
# Install official NVIDIA driver package
sudo apt-key adv --fetch-keys http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub
@@ -133,7 +129,9 @@ if [[ ! -z ${INSTALL_NVIDIA_DOCKER+x} ]]; then
fi
# rkt
-VERSION=1.29.0
+# Note: rkt development has ended and the project is archived. This should likely be removed.
+# See https://github.com/rkt/rkt/issues/4024
+VERSION=1.30.0
DOWNLOAD=https://github.com/rkt/rkt/releases/download/v${VERSION}/rkt-v${VERSION}.tar.gz
function install_rkt() {
@@ -171,12 +169,3 @@ sudo add-apt-repository -y ppa:openjdk-r/ppa
sudo apt-get update
sudo apt-get install -y openjdk-8-jdk
JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:bin/java::")
-# Spark
-sudo wget -P /ops/examples/spark https://nomad-spark.s3.amazonaws.com/spark-2.2.0-bin-nomad-0.7.0.tgz
-sudo tar -xf /ops/examples/spark/spark-2.2.0-bin-nomad-0.7.0.tgz --directory /ops/examples/spark
-sudo mv /ops/examples/spark/spark-2.2.0-bin-nomad-0.7.0 /usr/local/bin/spark
-sudo chown -R root:root /usr/local/bin/spark
-# Hadoop (to enable the HDFS CLI)
-wget -O - http://apache.mirror.iphh.net/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz | sudo tar xz -C /usr/local/