
Old Setup Guide Docker

Setup Guide - Docker

To run a worker to process submissions

If you're just attaching a worker to a queue, skip ahead to the Compute worker (only) setup section at the bottom of this page.

To run an entire instance of the Codalab website

You'll need to create a storage container (AWS S3 or Azure) and configure a .env file, which stores your settings: storage keys, SSL certificate paths, open ports, and so on.

Complete website setup

1. Environment

NOTE: If you are upgrading from an older version of Codalab that doesn't use Docker, please make sure you remove your settings/local.py file!

- On Mac/Linux

Clone the project and make a copy of .env_sample (or .env_production_sample if you're creating a production server) called .env in the root of the project directory:

  1. Install Docker CE (Community Edition); the Mac installer already includes docker-compose (a quick version check follows this list).
  2. git clone https://github.com/codalab/codalab-competitions.git
  3. cd codalab-competitions
  4. cp .env_sample .env
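
Before continuing, it's worth a quick sanity check that both tools are on your PATH (a minimal sketch; exact version numbers will vary):

# Confirm Docker and docker-compose are installed
docker --version
docker-compose --version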

- On Windows

TODO describe how to install on Windows

- Editing project variables

Open .env in your preferred editor; you'll fill this in with your storage keys, SSL information, etc.

Be sure to replace passwords/secrets!

# Here are some important settings to consider:
# =============================================
# Always set the secret key!
DJANGO_SECRET_KEY=something-random

# For storage you'll want to choose AWS S3 or Azure, and comment out or delete the other one
# DEFAULT_FILE_STORAGE=storages.backends.s3boto.S3BotoStorage
# ... and rest of AWS settings
#
# or
# DEFAULT_FILE_STORAGE=codalab.azure_storage.AzureStorage  
# ... and rest of Azure storage settings

# Change the secrets, passwords, and ports if you're going to open up additional ports,
# but with just the default setup and port 80 open, the only other thing you may need
# to worry about is SSL. The bottom of this doc has a section on SSL.
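
One hedged way to generate a suitably random value for DJANGO_SECRET_KEY (assuming Python 3.6+ is available on the host; any long random string works):

# Print a random URL-safe string to paste into DJANGO_SECRET_KEY
python3 -c "import secrets; print(secrets.token_urlsafe(50))"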

2. Storage

Codalab gives you the option of using AWS S3 or Azure as your storage backend. Whichever vendor you choose, comment out the settings for the one you are not using in the .env file.

- AWS S3

Sign in to or create an AWS account, then create a public and a private bucket.

Creating buckets

  1. You don't have to do this if you've already set up Azure Blob Storage!
  2. Sign in to the AWS Management Console and type S3 into the Services search bar to open the Amazon S3 console.
  3. You'll need two buckets, one public and one private. Let's create the public bucket first: click Create Bucket.
  4. In the Create a Bucket dialog box, in the Bucket Name box, enter a name for the public bucket, e.g. whatever-public.
  5. In the Region box, select US West 2.
  6. Click Create.
  7. Repeat the same steps for the private bucket, naming it something like whatever-private.
  8. Put these bucket names in your .env under AWS_STORAGE_BUCKET_NAME and AWS_STORAGE_PRIVATE_BUCKET_NAME (see the sketch after this list).
  9. Make sure the DEFAULT_FILE_STORAGE .env option is set to storages.backends.s3boto.S3BotoStorage
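
A minimal sketch of the relevant .env lines, assuming the example bucket names above (replace them with your own):

# AWS S3 storage settings (bucket names are placeholders)
DEFAULT_FILE_STORAGE=storages.backends.s3boto.S3BotoStorage
AWS_STORAGE_BUCKET_NAME=whatever-public
AWS_STORAGE_PRIVATE_BUCKET_NAME=whatever-private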

Setting CORS

Make sure your bucket is selected, click the Properties button, then click the Permissions tab. From there you can edit the CORS configuration.

In both buckets set CORS as follows:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>GET</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>

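If you prefer the command line, the same rules can also be applied with the AWS CLI (a hedged alternative, assuming the CLI is installed and configured; note it takes a JSON form of the configuration rather than the XML above). Save this as cors.json:

{
    "CORSRules": [
        {
            "AllowedOrigins": ["*"],
            "AllowedMethods": ["PUT", "POST", "GET"],
            "AllowedHeaders": ["*"],
            "MaxAgeSeconds": 3000
        }
    ]
}

Then apply it to both buckets (bucket names are placeholders):

aws s3api put-bucket-cors --bucket whatever-public --cors-configuration file://cors.json
aws s3api put-bucket-cors --bucket whatever-private --cors-configuration file://cors.json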

Setting Bucket Policies

If you don't already have a user (via Identity and Access Management -- IAM), you'll need to create one and put the key/secret values into your .env under AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.

If you're new to IAM you'll need to complete the following steps:

  1. Add a new user.

  2. Set the user to have programmatic access.

  3. Your permissions don't need to be set. Continue to the next step.

  4. Copy the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY for this user.

  5. Copy your username; you'll need it for setting up policies on your public and private buckets.

  6. Copy down your AWS account ID by looking at the "Finding your AWS account ID" section from the AWS documentation, which has been pasted below:

Get your AWS account ID

To view your AWS account ID when signed in as a root account user

Use your AWS account email address and password to sign in to the AWS Management Console.

If you previously signed in to the console with IAM user credentials, your browser might open your IAM user sign-in page. You can't use the IAM user sign-in page to sign in with your AWS account credentials. Instead, choose Sign-in using root account credentials to go to the AWS account sign-in page.

In the top right of the console, choose your account name or number. Then choose My Security Credentials.

If necessary, in the dialog box, choose Continue to Security Credentials. You can choose the box next to Don’t show me this message again to stop the dialog box from appearing in the future.

Expand the Account Identifiers section to view your AWS account ID.

To view your AWS account ID when signed in as a federated user

Sign in to the AWS Management Console as a federated user.

Select Support in the upper right corner of the console and choose Support Center.

Your AWS account ID appears in the upper right. The account ID for an AWS account is the same for the root account and its IAM users. For more information, see Your AWS Account ID and Its Alias.
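
If you have the AWS CLI configured, a quicker (hedged) way to look up the same account ID from the command line:

# Print the 12-digit account ID for the currently configured credentials
aws sts get-caller-identity --query Account --output text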

Edit Your Bucket Policy

Public

The following policy will allow anyone to download this data, like competition logos.

Replace only the text enclosed in angle brackets, such as <your-username-here>.

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<your-bucket-local>/*"
        },
        {
            "Sid": "",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<account-number-without-hyphens>:user/<your-username-here>"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<your-bucket-local>",
                "arn:aws:s3:::<your-bucket-local>/*"
            ]
        }
    ]
}

Private

The following policy will disallow people from downloading competition and submission data, but will retain access for your root account and for the separate IAM user (user/<your-username-here>):

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "DenyAllButMe",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::<account-number-without-hyphens>:user/<your-username-here>",
                    "arn:aws:iam::<account-number-without-hyphens>:root"
                ]
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<your-bucket-private>",
                "arn:aws:s3:::<your-bucket-private>/*"
            ]
        },
        {
            "Sid": "DenyAllButMe",
            "Effect": "Deny",
            "NotPrincipal": {
                "AWS": [
                    "arn:aws:iam::<account-number-without-hyphens>:user/<your-username-here>",
                    "arn:aws:iam::<account-number-without-hyphens>:root"
                ]
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<your-bucket-private>",
                "arn:aws:s3:::<your-bucket-private>/*"
            ]
        }
    ]
}

For more details on Bucket Policy syntax and structure, refer to the AWS bucket policy documentation.
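
As with CORS, these policies can also be applied from the command line (a hedged sketch, assuming the AWS CLI is configured and the JSON above is saved to local files):

# Apply the public and private policies (file and bucket names are placeholders)
aws s3api put-bucket-policy --bucket whatever-public --policy file://public-policy.json
aws s3api put-bucket-policy --bucket whatever-private --policy file://private-policy.json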

- Azure blob storage

You may sign up for an Azure account, then follow the directions below.

  1. You do not have to do this if you've already set up S3!
  2. Log on to the Azure Portal.
  3. From the Dashboard, click Storage accounts on the left.
  4. Click Add at the top of the page to create a new storage account.
  5. If you don't already have a subscription, create one now. The free trial requires a credit card and deletes all your storage containers after 90 days unless you upgrade to a different plan, e.g. 'Pay as You Go'.
  6. Select the Classic storage account type.
  7. In the dashboard, click All Resources/All Subscriptions and then click on your account. Click Access Keys and copy your account name and access key into .env under AZURE_ACCOUNT_NAME and AZURE_ACCOUNT_KEY.
  8. Within that same account, click on Containers and add a new container.
  9. Create a new container named "bundles". Set the Access to "Private".
  10. Add another container named "public". Set the Access to "Public Blob".
  11. Make sure the DEFAULT_FILE_STORAGE .env option is set to codalab.azure_storage.AzureStorage (see the sketch after this list).
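
A minimal sketch of the corresponding .env lines, with placeholder values for the account name and key copied from the portal:

# Azure blob storage settings (values are placeholders)
DEFAULT_FILE_STORAGE=codalab.azure_storage.AzureStorage
AZURE_ACCOUNT_NAME=your-account-name
AZURE_ACCOUNT_KEY=your-access-key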

3. Running docker

Make sure your domain is set properly in your .env. If you are running locally, CODALAB_SITE_DOMAIN=localhost is fine. For our main website we use CODALAB_SITE_DOMAIN=codalab.org.

To run the server, navigate to the root directory of the git repo (where the docker-compose.yml file lives) and start Docker:

docker-compose up -d

See the Docker documentation for how to auto-start Docker when your OS boots.
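
Once the containers are up, a couple of hedged sanity checks using standard docker-compose commands (run from the same directory):

# List the Codalab containers and their current state
docker-compose ps

# Follow the logs of the whole stack (Ctrl+C to stop following)
docker-compose logs -f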

4. Extras

- Logging

To change where logs are kept, modify LOG_DIR in your .env configuration file.

- SSL

Place your certs in the certs/ folder and specify them in your .env, e.g. if you put a cert and key named localhost.crt and localhost.key into your certs folder, you'd change .env to:

SSL_CERTIFICATE=/app/certs/localhost.crt
SSL_CERTIFICATE_KEY=/app/certs/localhost.key
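
For local testing only, one hedged way to generate a self-signed certificate matching the paths above (assumes openssl is installed; browsers will warn about the untrusted cert):

# Generate a self-signed cert/key pair for local testing (not for production)
openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
  -subj "/CN=localhost" \
  -keyout certs/localhost.key -out certs/localhost.crt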

Change your RabbitMQ ports to the SSL versions which are 5671 and 15671:

RABBITMQ_PORT=5671
RABBITMQ_MANAGEMENT_PORT=15671

Make sure you have ports 80 and 443 open on the machine.

Compute worker (only) setup

This is only for users running workers for the "Worker Management Queues":

  • Install Docker (Mac or Ubuntu)
  • git clone git@github.com:codalab/codalab-competitions.git
  • cd codalab-competitions
  • cp .env_sample .env

Edit .env and set your BROKER_URL to the value shown on the worker management screen.

Now your configuration file should look something like this:

BROKER_URL=pyamqp://cd980e2d-78f9-4707-868d-bdfdd071...

Then you can run the worker:

$ docker-compose start worker_compute
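
To verify the worker picked up the queue, a hedged check of its logs using standard docker-compose usage:

# Follow the compute worker's logs to confirm it connects to the broker
docker-compose logs -f worker_compute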