This script provides an easy way to access private EC2 instances sitting behind a jump host.
Most deployments on AWS include public and private subnets.
A jump host runs on the public subnet as an EC2 instance and is publicly accessible
through a pre-built trust mechanism such as SSH key pairs.
On the private subnets, the services run inside Docker containers on EC2 instances.
While troubleshooting an issue, one often has to log in to these machines and run a few commands, for example to check whether the containers are up or to inspect the Docker logs. Since this is a mundane task, a script that automates it is desirable.
Hence this script, which does exactly that.
Note: the script fetches the SSH user ID from the Git account.
An AWS profile with a valid access key and secret key in the "~/.aws/credentials" file.
The SSH public key installed on both the jump host and the private EC2 instance.
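For reference, the credentials file follows the standard AWS CLI INI format. The profile name and key values below are placeholders, not real credentials:

```ini
# ~/.aws/credentials
[profile-name]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```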
./jump-host.sh "profile-name" "ec2-public-instance-tag-value" "ec2-private-instance-tag-value" "aws-region" "command"
"profile-name": Name of the profile stored in the "~/.aws/credentials" or "~/.aws/config" file
"ec2-public-instance-tag-value": Tag value of the publicly accessible EC2 instance (the jump host)
"ec2-private-instance-tag-value": Tag value of the private EC2 instance
"aws-region": AWS region of the account
"command": Command to be executed, such as "ls", "grep", or "docker ps"
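Under the hood, a script like this can be sketched as follows. This is a minimal illustration, not the actual implementation: the tag filter ("tag:Name"), the default SSH user ("ec2-user"), and the IPs in the example are assumptions. The tag-to-IP lookup uses `aws ec2 describe-instances`, and the hop through the jump host uses SSH's `-J` (ProxyJump) option:

```shell
#!/usr/bin/env bash
# Sketch of what a jump-host script does internally (illustrative only).
set -euo pipefail

# Resolve an instance IP from its Name tag. `field` selects either
# PublicIpAddress (jump host) or PrivateIpAddress (target instance).
resolve_ip() {
  local profile="$1" region="$2" tag="$3" field="$4"
  aws ec2 describe-instances \
    --profile "$profile" --region "$region" \
    --filters "Name=tag:Name,Values=$tag" "Name=instance-state-name,Values=running" \
    --query "Reservations[].Instances[].$field" --output text
}

# Build the SSH hop: -J (ProxyJump) tunnels through the jump host
# to reach the private instance, then runs the given command there.
build_ssh_cmd() {
  local user="$1" jump_ip="$2" target_ip="$3" cmd="$4"
  printf 'ssh -J %s@%s %s@%s %s\n' "$user" "$jump_ip" "$user" "$target_ip" "$cmd"
}

# Example of the command the script would end up running for "docker ps"
# (IPs here are documentation placeholders):
build_ssh_cmd ec2-user 203.0.113.10 10.0.1.25 "docker ps"
```

The real script would call `resolve_ip` for both tag values, then execute the command built by `build_ssh_cmd`.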