chore: update defunct github runner job
feat(cloudflared): add a standalone cloudflare tunnel job
docs(README): extend README

Signed-off-by: Bruce Becker <[email protected]>
1 parent fd10b77 · commit 2dba64d
Showing 5 changed files with 133 additions and 21 deletions.
README.md

```diff
@@ -1,3 +1,41 @@
 # Hashi at Home Nomad jobs
 
-These are the Nomad jobs for [Hashi at Home](https://hashiatho.me)
+These are the Nomad jobs for [Hashi@Home](https://hashiatho.me)
+
+Many of them are inspired by the work of others, including the HashiCorp Nomad tutorials themselves, so I make no claim over their authorship.
+
+These are not quite ready for production.
+
+## Hashi Stack integration
+
+Hashi@Home is a playground of mostly Raspberry Pis, created so that I could experiment with and learn the HashiCorp stack (Vault, Consul, Nomad) and perfect my usage of their tools (Terraform, Packer, Waypoint, _etc_.).
+Since the clients in the cluster are of mixed architectures, the jobs use constraints or dynamic interpolation to retrieve the relevant artifacts.
+
+### Consul and Vault
+
+Several of the jobs use templates with either Consul or Vault functions.
+The Consul functions either look up services or nodes in the Consul catalogue, or template configuration files from Consul KV data.
+The Nomad servers and clients are configured to use Nomad workload identities, which are exchanged for Vault tokens so that jobs can consume secrets.
+
+### Terraform
+
+Where jobs require backing services outside of the cluster, they are implemented with Terraform.
+Terraform creates the backing resources (DNS entries, S3 buckets, _etc_.) as well as the Nomad job itself.
+This is especially useful when Nomad job descriptions need to be templated, and serves in those cases as a kind of replacement for Nomad Pack.
+The Terraform backend used in these cases is typically Consul, for the reasons given above.
+
+## Using
+
+You will, of course, need a working Nomad cluster; on its own this is sufficient for many of the jobs described here.
+However, as mentioned above, jobs that rely on service discovery or on Consul or Vault templating will also require those services.
+If your cluster has ACLs enabled, you will need to set `NOMAD_TOKEN` appropriately.
+
+## Jobs
+
+Jobs live in the main directory and are mostly schedulable with Nomad itself, using the usual `nomad plan` and `nomad run`, while a select few are deployed directly with Terraform.
+A few notable examples are described in more depth below:
+
+* [Container Storage Interface](csi/README.md) - attempts to deploy CSI plugins
+* [jenkins](jenkins/README.md) - Jenkins controller with configuration as code
+* [loki](loki/README.md) - Grafana Loki deployment with DigitalOcean Spaces storage
+* [monitoring](monitoring/README.md) - unified Grafana monitoring stack (Prometheus, Loki, Grafana, Promtail, node-exporter)
```
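The README mentions that the mixed-architecture clients are handled with constraints or dynamic interpolation. A minimal sketch of both mechanisms, assuming a hypothetical prebuilt binary; the example.com URL and the arm64 pin are illustrative, not from this repository:

```hcl
job "arch-aware" {
  datacenters = ["dc1"]

  group "app" {
    task "binary" {
      driver = "exec"

      # Option 1: constrain the task to clients of a given architecture
      constraint {
        attribute = "${attr.cpu.arch}"
        value     = "arm64"
      }

      # Option 2: interpolate the client's architecture into the artifact URL
      # (hypothetical URL, for illustration only)
      artifact {
        source = "https://example.com/app_linux_${attr.cpu.arch}.zip"
      }

      config {
        command = "local/app"
      }
    }
  }
}
```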
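The Consul and Vault section mentions two kinds of Consul template functions: catalogue lookups and KV reads. A minimal template sketch of both, where the `loki` service name and the KV path are assumptions:

```hcl
# Inside a task: render configuration from the Consul catalogue and KV store
template {
  destination = "local/app.conf"
  data        = <<EOH
{{- /* Catalogue lookup: address and port of each instance of a service */ -}}
{{ range service "loki" }}
loki_addr = "{{ .Address }}:{{ .Port }}"
{{ end }}
{{- /* KV lookup: a configuration value (hypothetical path) */ -}}
log_level = "{{ key "app/config/log_level" }}"
EOH
}
```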
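The workload-identity setup the README describes lives in the Nomad agent configuration rather than in the jobs. A sketch of what that configuration might look like, assuming Nomad 1.7+ with a Vault JWT auth method mounted at the default `jwt-nomad` path; the Vault address and TTL are assumptions:

```hcl
# Nomad server configuration (sketch): attach a default Vault workload
# identity to every job that declares a vault block
vault {
  enabled = true
  address = "https://vault.service.consul:8200"

  default_identity {
    aud = ["vault.io"]
    ttl = "1h"
  }
}

# Nomad client configuration (sketch): clients exchange the workload
# identity for a Vault token via the JWT auth method
vault {
  enabled               = true
  address               = "https://vault.service.consul:8200"
  jwt_auth_backend_path = "jwt-nomad"
}
```

With this in place, a job only needs an empty `vault {}` block to receive a token, as the tunnel job below demonstrates.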
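For the Terraform-deployed jobs, a minimal sketch of the pattern the Terraform section describes, using the `hashicorp/nomad` provider with a Consul state backend; the resource name, template file name, and KV path are assumptions:

```hcl
terraform {
  # State is stored in Consul KV, consistent with the rest of the stack
  backend "consul" {
    path = "terraform/nomad-jobs" # hypothetical KV path
  }

  required_providers {
    nomad = {
      source = "hashicorp/nomad"
    }
  }
}

# Render the job description from a template and register it with Nomad,
# serving as a lightweight replacement for Nomad Pack
resource "nomad_job" "templated" {
  jobspec = templatefile("${path.module}/job.nomad.tpl", {
    datacenter = "dc1"
  })
}
```

A single `terraform apply` then both creates the backing resources and registers the rendered job.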
New file (77 additions): the standalone Cloudflare Tunnel job

```hcl
job "cloudflare-tunnel" {
  datacenters = ["dc1"]
  type        = "service"

  group "app" {
    network {
      # Dynamically allocated ports for the tunnel and its metrics endpoint
      port "metrics" {}
      port "cloudflared" {}
      mode = "bridge"
    }

    service {
      name = "cloudflared-test"
      tags = ["cloudflared", "test"]
      port = "cloudflared"

      // check {
      //   name     = "alive"
      //   type     = "tcp"
      //   interval = "10s"
      //   timeout  = "2s"
      // }

      check {
        name     = "metrics"
        type     = "http"
        interval = "10s"
        timeout  = "2s"
        port     = "metrics"
        path     = "/metrics"
      }
    }

    reschedule {
      attempts       = 1
      interval       = "1m"
      unlimited      = false
      delay_function = "constant"
    }

    restart {
      attempts = 1
      interval = "2m"
      delay    = "15s"
      mode     = "delay"
    }

    task "tunnel" {
      # Request a Vault token for this task via Nomad workload identity
      vault {}

      # Render the tunnel token from Vault into the task environment
      template {
        data        = <<EOH
TUNNEL_TOKEN="{{ with secret "hashiatho.me-v2/data/cloudflare" }}{{- .Data.data.cloudflare_access_test_token -}}{{ end }}"
EOH
        destination = "secrets/.env"
        env         = true
      }

      # cloudflared is configured through TUNNEL_* environment variables
      env {
        TUNNEL_METRICS  = "0.0.0.0:${NOMAD_PORT_metrics}"
        TUNNEL_LOGLEVEL = "info"
      }

      restart {
        interval = "1m"
        attempts = 1
        delay    = "5s"
        mode     = "fail"
      }

      driver = "docker"

      config {
        image = "cloudflare/cloudflared"
        args  = ["tunnel", "run"]
      }

      resources {
        cpu    = 100
        memory = 64
      }
    }
  }
}
```
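Since cloudflared reads its settings from `TUNNEL_*` environment variables, the template only has to render `TUNNEL_TOKEN` into `secrets/.env` with `env = true`; the dynamically allocated `metrics` port is passed in through `TUNNEL_METRICS`, and the HTTP health check scrapes that same port at `/metrics`.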
Diffs for the remaining changed files are not shown (some generated files are not rendered by default).