Old Setup Guide Docker
If you're just attaching to a queue, go to the run a compute worker section.
You'll need to create a storage container (AWS S3 or Azure) and configure a .env
file, which will store settings like storage keys, SSL certificate paths, and the ports that are open.
NOTE: If you are upgrading from an older version of Codalab that doesn't use Docker, please make sure you remove your settings/local.py
file!
Clone the project and make a copy of .env_sample
(or .env_production_sample
if you're creating a production server) called .env
in the root of the project directory:
- Install Docker CE (Community Edition). For Mac, use the installer here, which comes complete with docker-compose.
- For Ubuntu, follow the command line installation instructions here. Note: on Ubuntu you'll also need to manually install docker-compose. If you have Pip installed on your system, you can do this with
pip install docker-compose
git clone https://github.com/codalab/codalab-competitions.git
cd codalab-competitions
cp .env_sample .env
TODO describe how to install on Windows
Open .env
in your preferred editor; you'll fill it in with your storage keys, SSL information, etc.
Be sure to replace passwords/secrets!
# Here are some important settings to consider:
# =============================================
# Always set the secret key!
DJANGO_SECRET_KEY=something-random
# For storage you'll want to choose AWS or Azure, and comment out or delete the other one
# DEFAULT_FILE_STORAGE=storages.backends.s3boto.S3BotoStorage
# ... and rest of AWS settings
#
# or
# DEFAULT_FILE_STORAGE=codalab.azure_storage.AzureStorage
# ... and rest of Azure storage settings
# Change the secrets, passwords, and ports if you're going to open up the rest,
# but with just the default setup and port 80 open the only other thing you may have
# to worry about is SSL. The bottom of this doc has a section on SSL
Codalab gives you the option of using AWS or Azure as your storage provider. Whichever vendor you use, comment out the settings for the other one in the .env
file.
Sign in or create an AWS account here, then create a private and a public bucket.
- You don't have to do this if you've already set up Azure Blob Storage!
- Sign in to the AWS Management Console and open the Amazon S3 console here
- Type S3 into the Services search bar and navigate to the S3 console
- Create 2 buckets, one named "whatever-public" and another "whatever-private"
- Click Create Bucket
- In the Create a Bucket dialog box, in the Bucket Name box, enter a bucket name. You'll need two buckets, one public and one private. Let's create the public bucket first. Name it whatever-public
- In the Region box, select US West 2.
- Click Create
- Create another bucket: click Create Bucket
- In the Create a Bucket dialog box, in the Bucket Name box, enter a name for your private bucket, e.g. whatever-private
- Put these bucket names in your .env under AWS_STORAGE_BUCKET_NAME and AWS_STORAGE_PRIVATE_BUCKET_NAME
- Make sure the DEFAULT_FILE_STORAGE option in .env is set to storages.backends.s3boto.S3BotoStorage
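Once the buckets exist, the storage section of your .env might look like this (the bucket names are the hypothetical examples used above; substitute your own):

```
DEFAULT_FILE_STORAGE=storages.backends.s3boto.S3BotoStorage
AWS_STORAGE_BUCKET_NAME=whatever-public
AWS_STORAGE_PRIVATE_BUCKET_NAME=whatever-private
```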
With your bucket selected, click the Properties button, then the Permissions tab, and edit CORS.
In both buckets, set CORS as follows:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
<AllowedOrigin>*</AllowedOrigin>
<AllowedMethod>PUT</AllowedMethod>
<AllowedMethod>POST</AllowedMethod>
<AllowedMethod>GET</AllowedMethod>
<MaxAgeSeconds>3000</MaxAgeSeconds>
<AllowedHeader>*</AllowedHeader>
</CORSRule>
</CORSConfiguration>
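If you prefer the command line, the same rules can be applied with the AWS CLI, which takes a JSON version of this configuration (a sketch; the CLI translates it to the XML above for you):

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["*"],
      "AllowedMethods": ["PUT", "POST", "GET"],
      "MaxAgeSeconds": 3000,
      "AllowedHeaders": ["*"]
    }
  ]
}
```

Save this as cors.json and run `aws s3api put-bucket-cors --bucket whatever-public --cors-configuration file://cors.json`, once per bucket.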
If you don't already have a user (via Identity and Access Management, IAM), you'll need to create one here and put its key/secret values into your .env
under AWS_ACCESS_KEY_ID
and AWS_SECRET_ACCESS_KEY
.
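After creating the IAM user, the credentials section of your .env should look something like this (the values below are AWS's documentation placeholders, not real keys):

```
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```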
If you're new to IAM you'll need to complete the following steps:
- Add User
- Set the user to have programmatic access
- Your permissions don't need to be set. Continue to the next step.
- Copy your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY for this user.
- Copy your username; you'll need it for setting up policies on your public and private buckets.
- Copy down your AWS account ID by looking at the "Finding your AWS account ID" section from this guide, which has been pasted below:
To view your AWS account ID when signed in as a root account user
Use your AWS account email address and password to sign in to the AWS Management Console.
If you previously signed in to the console with IAM user credentials, your browser might open your IAM user sign-in page. You can't use the IAM user sign-in page to sign in with your AWS account credentials. Instead, choose Sign-in using root account credentials to go to the AWS account sign-in page.
In the top right of the console, choose your account name or number. Then choose My Security Credentials.
If necessary, in the dialog box, choose Continue to Security Credentials. You can choose the box next to Don’t show me this message again to stop the dialog box from appearing in the future.
Expand the Account Identifiers section to view your AWS account ID.
To view your AWS account ID when signed in as a federated user
Sign in to the AWS Management Console as a federated user.
Select Support in the upper right corner of the console and choose Support Center. If necessary, in the dialog box, choose Continue to Security Credentials. You can choose the box next to Don't show me this message again to stop the dialog box from appearing in the future.
Your AWS account ID appears in the upper right. The account ID for an AWS account is the same for the root account and its IAM users. For more information, see Your AWS Account ID and Its Alias.
Public
The following policy will allow anyone to download this data, like competition logos.
{
"Version": "2008-10-17",
"Statement": [
{
"Sid": "PublicReadForGetBucketObjects",
"Effect": "Allow",
"Principal": {
"AWS": "*"
},
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::<your-bucket-local>/*"
},
{
"Sid": "",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::<account-number-without-hyphens>:user/<your-username-here>"
},
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::<your-bucket-local>",
"arn:aws:s3:::<your-bucket-local>/*"
]
}
]
}
Private
The following policy will prevent people from downloading competition and submission data, while retaining access for your root account and a separate IAM user (user/<username-here>
):
{
"Version": "2008-10-17",
"Statement": [
{
"Sid": "AllowOnlyMe",
"Effect": "Allow",
"Principal": {
"AWS": [
"arn:aws:iam::<account-number-without-hyphens>:user/<your-username-here>",
"arn:aws:iam::<account-number-without-hyphens>:root"
]
},
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::<your-bucket-private>",
"arn:aws:s3:::<your-bucket-private>/*"
]
},
{
"Sid": "DenyAllButMe",
"Effect": "Deny",
"NotPrincipal": {
"AWS": [
"arn:aws:iam::<account-number-without-hyphens>:user/<your-username-here>",
"arn:aws:iam::<account-number-without-hyphens>:root"
]
},
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::<your-bucket-private>",
"arn:aws:s3:::<your-bucket-private>/*"
]
}
]
}
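Like the CORS rules, bucket policies can also be applied from the command line rather than the console. A sketch, assuming each policy JSON above has been saved to a local file and the AWS CLI is configured with your credentials:

```shell
# Apply the saved bucket policies with the AWS CLI.
# File names are hypothetical; use whatever you saved the JSON as.
aws s3api put-bucket-policy --bucket whatever-public --policy file://public-policy.json
aws s3api put-bucket-policy --bucket whatever-private --policy file://private-policy.json
```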
For more details on Bucket Policy syntax and structure, refer to this guide.
You may sign up for an Azure account here, then follow the directions below.
- You do not have to do this if you've already set up S3!
- Log on to the Azure Portal.
- From the Dashboard, click Storage accounts on the left.
- Click Add at the top of the page to create a new storage account.
- If you don't already have a subscription, create one now. The free trial requires a credit card, and deletes all your storage containers after 90 days unless you upgrade to a different plan, e.g. 'Pay as You Go'.
- Select the Classic storage account. Refer to the image below for settings.
- In the dashboard, click All Resources/All Subscriptions and then click on your username. Click Access Keys and copy your account name and access key into .env under AZURE_ACCOUNT_NAME and AZURE_ACCOUNT_KEY.
- Within that same user account, click on Containers and Add a new container.
- Create a new container named "bundles". Set the Access to "Private".
- Add another container named "public". Set the Access to "Public Blob".
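With the account and containers created, the Azure side of .env should look roughly like this (the account name and key below are placeholders):

```
DEFAULT_FILE_STORAGE=codalab.azure_storage.AzureStorage
AZURE_ACCOUNT_NAME=yourstorageaccount
AZURE_ACCOUNT_KEY=your-base64-access-key
```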
Make sure your domain is set properly in your .env. If you are running locally, CODALAB_SITE_DOMAIN=localhost is fine. For our main website we use CODALAB_SITE_DOMAIN=codalab.org.
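Before starting the server, it can help to sanity-check that the settings discussed so far actually made it into .env. A minimal sketch (the list of keys here is just a sample, not the full set Codalab requires):

```shell
#!/bin/sh
# Report whether the given env file contains each required setting,
# uncommented, at the start of a line.
check_env() {
  for key in DJANGO_SECRET_KEY DEFAULT_FILE_STORAGE CODALAB_SITE_DOMAIN; do
    grep -q "^${key}=" "$1" || { echo "Missing ${key} in $1"; return 1; }
  done
  echo "All required settings present."
}
# Usage: check_env .env
```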
To run the server, navigate to the root directory of the git repo (where the docker-compose.yml
file lives) and start docker:
docker-compose up -d
Read here to auto-start docker when your OS boots.
To change where logs are kept, modify LOG_DIR
in your .env
configuration file.
Place your certs in the certs/
folder and specify them in your .env
. For example, if you put a cert named localhost
into your certs
folder, you'd change .env
to:
SSL_CERTIFICATE=/app/certs/localhost.crt
SSL_CERTIFICATE_KEY=/app/certs/localhost.key
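For local testing only (not production), you can generate a self-signed certificate with those exact file names using openssl; browsers will warn about it, but it lets you exercise the SSL setup end to end:

```shell
# Generate a self-signed cert/key pair for localhost (testing only).
mkdir -p certs
openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
  -subj "/CN=localhost" \
  -keyout certs/localhost.key -out certs/localhost.crt
```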
Change your RabbitMQ ports to the SSL versions which are 5671
and 15671
:
RABBITMQ_PORT=5671
RABBITMQ_MANAGEMENT_PORT=15671
Make sure you have ports 80
and 443
open on the machine.
This is only for users running workers for the "Worker Management Queues":
- Install docker (mac or Ubuntu)
git clone git@github.com:codalab/codalab-competitions.git
cd codalab-competitions
cp .env_sample .env
Edit .env
and set your BROKER_URL
from the worker management screen:
Now your configuration file should look something like this:
BROKER_URL=pyamqp://cd980e2d-78f9-4707-868d-bdfdd071...
Then you can run the worker:
docker-compose start worker_compute