Fabric release task commands for use with AWS Elastic Beanstalk via boto. Integrates with git by creating aliases for pushing code directly to Beanstalk. django-storages is an optional dependency; the package includes utilities for setting up your static and media backends on S3.
The tool was inspired by the AWS Elastic Beanstalk command line tool; see eb.
Dependencies:
- Fabric
- Django
- Boto
- prettytable
- git
This tool is best suited for use with AMI ami-35792c5c. It is a legacy AMI, but the ebextensions file generated by this tool installs all required packages as an instance is built. Now updated to the 2014.09 stack (ebextensions changed); see the .ext files.
The only fully supported database backend is PostgreSQL/PostGIS using the psycopg2 driver. The problem is that MySQL does not support all spatial operations; see the Django docs.
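For reference, a minimal sketch of a matching Django DATABASES setting, assuming the PostGIS backend above; the database name, user, and environment variable fallbacks are hypothetical examples:

```python
# Hypothetical settings.py fragment: PostGIS backend pointed at an RDS
# host read from the environment (names and credentials are examples).
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'myproject',                         # example database name
        'USER': 'myuser',                            # example user
        'PASSWORD': os.environ.get('DB_PASSWORD', ''),
        'HOST': os.environ.get('DB_HOST', 'localhost'),
        'PORT': '5432',
    }
}
```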
Ensure the version of git you are using includes the points-at command (git 1.8+). See Git Install.
[sudo] pip install git+https://github.com/radlws/django-awseb-tasks.git
Alternatively, install it somewhere on your PYTHONPATH, e.g. in ../lib. This method lets you keep the module as part of your repository.
cd ../lib
pip install --target . -U git+https://github.com/radlws/django-awseb-tasks.git
First set the required environment variables in your fabfile, then import the tasks:
import os
os.environ['PROJECT_NAME'] = os.getcwd().split('/')[-1]  # Set before importing the tasks, as it is used there.
os.environ['DEFAULT_REGION'] = 'us-east-1'
os.environ['DB_HOST'] = 'prod.your-db-url.us-east-1.rds.amazonaws.com' # RDS DB URL, update accordingly
from awseb_fab_tasks import tasks as aws
- list_environments - Shows all available environments
- status - Runs list_environments
- instances - Returns an SSH connection string to an available instance
- leader - Returns an SSH connection string to the leader instance
- list_instances - Shows all instances for an environment
- deploy - Deploys a release to the specified AWS Elastic Beanstalk environment; requires the site name and tag (release)
- dump_bucket - Downloads an S3 bucket, given the bucket name
- manage - Runs a manage.py command remotely; needs a host, which you can get from the leader task, and the appropriate cert
- sw_creds - Switches credential files for boto and eb if they exist in your home dir, to quickly switch accounts, e.g. kct and baker
- eb_init - Creates the aws.push and aws.config commands used by deploy
- new_creds
- generate_app_config - Generates a .ebextensions/app.config file in the root of the project, based on PROJECT_NAME
- environment_status - Returns the environment health and status
TODO: eb_init -> automate db creation for RDS
(OPTIONAL) Using the S3 backend for media and static files (requires a django-storages install). Add this to your settings file:
DEFAULT_FILE_STORAGE = 'aws_tasks.storage_backends.MediaS3Storage'
STATICFILES_STORAGE = 'aws_tasks.storage_backends.StaticS3Storage'
THUMBNAIL_DEFAULT_STORAGE = DEFAULT_FILE_STORAGE
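The S3 backends above also need the usual django-storages credentials and bucket settings. A hedged sketch of what that might look like; the AWS_* names are standard django-storages (S3BotoStorage) settings, while the bucket name and URL layout are assumptions you should adapt:

```python
# Hypothetical companion settings for the S3 storage backends;
# the bucket name and URL scheme are examples, not project defaults.
import os

AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID', '')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY', '')
AWS_STORAGE_BUCKET_NAME = 'myproject-assets'  # example bucket

STATIC_URL = 'https://%s.s3.amazonaws.com/static/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'https://%s.s3.amazonaws.com/media/' % AWS_STORAGE_BUCKET_NAME
```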
- Log in to [AWS](http://aws.amazon.com) and set up an account if you don't have one.
- Go to the Beanstalk Console.
- Create a new application and give it the same name as your Django project.
- Create a new environment, e.g. staging, and follow the wizard, using these settings:
    - Environment Tier: Web Server
    - Predefined Configuration: Python
    - Environment Type: use Load Balanced for production; otherwise Single Instance is fine
    - Deploy with the Sample Application on your first run
    - Environment name: for staging this would be project-staging; for the URL, anything that is available
    - Do not create an RDS in Beanstalk; an RDS created there is terminated along with the environment, so unless you want that to happen, provide an existing RDS of your own later.
    - Select a micro instance and the default SSH key
- Launch the environment. Now you can set up your RDS and deploy your code using fab aws.deploy.
- Update the environment variables used in your settings.py file. You can edit them in your environment dashboard under Configuration -> Software Configuration.
- For troubleshooting, pull the logs via the environment dashboard and see what errors occurred, if any.
Assuming the tasks are imported as aws, you can deploy, migrate and collectstatic like this:
fab aws.deploy aws.leader aws.manage:migrate aws.manage:collectstatic
Or just deploy, then migrate and collectstatic later:
fab aws.deploy
fab aws.leader aws.manage:migrate aws.manage:collectstatic
The 'leader' task stores the leader instance in env.hosts; manage then makes an SSH connection, which requires the ssh private key that was used to start the instance. Make sure the key is added to your ssh agent so it gets picked up:
chmod 600 ~/.ssh/id_rsa_aws
ssh-add ~/.ssh/id_rsa_aws
The project name is used as the name of the application in Elastic Beanstalk. The sites, e.g. live, staging, content, are created in EB as project-site.
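As an illustration of that naming convention, a tiny sketch; the project name here is an example (the tool itself reads PROJECT_NAME from the environment, as set in the fabfile above):

```python
# Illustrative sketch of the convention above: the EB application is
# named after the project, and each site becomes "<project>-<site>".
PROJECT_NAME = 'myproject'  # example value; normally os.environ['PROJECT_NAME']

def environment_name(site):
    """Return the Beanstalk environment name for a site such as 'staging'."""
    return '%s-%s' % (PROJECT_NAME, site)

for site in ('live', 'staging', 'content'):
    print(environment_name(site))
```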