Installation: lookit-api (Django project)

lookit-api is the codebase for Experimenter and Lookit, excluding the studies themselves. Any functionality you see as a researcher or a participant (e.g., signing up, adding a child, editing or deploying a study, downloading data) is part of the lookit-api repo. This project is built using Django and PostgreSQL. (The studies themselves use Ember.js; see the Ember portion of the codebase, ember-lookit-frameplayer.) It was initially developed by the Center for Open Science.

If you install only the lookit-api project locally, you will be able to edit any functionality that does not require actual study participation. For instance, you could contribute an improvement to how studies are displayed to participants or create a new CSV format for downloading data as a researcher.


These instructions are for macOS. Installing on another OS? Please consider documenting the exact steps you take and submitting a PR to the lookit-api repo to update this documentation! For notes on Linux installation, there may be helpful information in a previous, invoke-based version of these instructions.


Prerequisites

Before you can begin setting up your local environment, you’ll need to install a couple of things.

  1. Install mkcert.
  2. Install Docker Desktop.

Local Environment

Now that the prerequisites have been installed and Docker is running, we can set up the local environment.

  1. Clone lookit-api and change directory:

    git clone https://github.com/lookit/lookit-api.git
    cd lookit-api
  2. Create environment file:

    cp env_dist .env
  3. Make local CA and certificates for HTTPS:

    make local-certs
  4. Run DB migrations and add an entry to the Site table:

    make migrate
    make site
  5. The Celery worker needs some permissions set up to run correctly. To set up these permissions, we’ll first start the Docker services:

    make serve

    Once the services are up and the worker has exited due to a permission constraint, we’ll set the permissions in the container and restart the worker:

    make broker-perms

    From time to time, the container will need to be recreated; when this happens, you may need to run “make broker-perms” again.

At this point, the services should all be up and running. Direct your browser to https://localhost:8000 to see the local environment. In the future, to start the services you will only need to run “make serve”.
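For reference, the .env file created in step 2 is a plain KEY=VALUE file whose values override defaults for your local instance. The sketch below only illustrates that format: the keys shown (DEBUG, ALLOWED_HOSTS) are illustrative assumptions rather than the actual contents of env_dist, and read_env is a hypothetical demo helper, not part of lookit-api (the running project loads .env itself).

```python
# Demo parser for the simple KEY=VALUE format used by .env files.
# The keys below are illustrative stand-ins, not the real env_dist contents.
import tempfile


def read_env(path):
    """Parse KEY=VALUE lines, skipping blank lines and # comments."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values


# Write a stand-in .env file and read it back:
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("# local overrides\nDEBUG=True\nALLOWED_HOSTS=localhost\n")
    demo_path = f.name

print(read_env(demo_path))  # {'DEBUG': 'True', 'ALLOWED_HOSTS': 'localhost'}
```

Editing .env (rather than env_dist) keeps your local settings out of version control.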


Django

Here are a few Django-related tasks that might come up every now and then.

To migrate the existing database:

make migrate

If you need to create an entry in the Site table:

make site

To create a superuser:

make superuser

To run tests:

make tests


Broker

The broker should come up with the rest of the Docker services. If you get a Celery worker error due to permissions, you can run the following command to resolve the issue and restart the worker service:

make broker-perms


Database

Here are a couple of commands that might be useful for managing the local database.

To access the database shell:

make dbshell

To import a SQL file into a fresh database (one where migrations haven’t been run):

cat /location/of/sql/file | make dbpipe


Accounts

You can create participant and researcher accounts through the regular signup flow on your local instance. To access Experimenter, you will need to add two-factor authentication to your account by following the prompts. To access the admin interface (https://localhost:8000/__CTRL__), which provides a convenient way to view and edit records, you will need to log in as the superuser you created earlier with “make superuser”.
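Two-factor authentication here assumes an authenticator-app flow, which is typically implemented as time-based one-time passwords. As background (this is the standard RFC 6238 algorithm that authenticator apps implement, not lookit-api's own code), a TOTP code can be computed like this:

```python
# Sketch of the RFC 6238 TOTP algorithm used by authenticator apps.
# Shown for background only; in practice your authenticator app does this.
import hashlib
import hmac
import struct


def totp(secret: bytes, timestamp: float, digits: int = 6, step: int = 30) -> str:
    """Compute a time-based one-time password (HMAC-SHA1, 30-second steps)."""
    counter = int(timestamp // step)          # which 30-second window we are in
    msg = struct.pack(">Q", counter)          # counter as 8-byte big-endian int
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).rjust(digits, "0")


# RFC 6238 test vector: secret "12345678901234567890", time 59 -> "94287082"
print(totp(b"12345678901234567890", 59, digits=8))  # -> 94287082
```

The code changes every 30 seconds, which is why a freshly scanned QR secret and a correct device clock are both needed for login to succeed.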

Handling video

This project includes an incoming webhook handler for an event generated by the Pipe video recording service (used by ember-lookit-frameplayer) when video is transferred to our S3 storage. The handler requires a webhook key for authentication; the key can be generated via our Pipe account and, for local testing, stored in .env under PIPE_WEBHOOK_KEY.

Pipe needs to be told where to send the webhook. First, you need to expose your local /exp/renamevideo hook. You can use Ngrok to generate a public URL for your local instance during testing:

ngrok http https://localhost:8000

Then, based on the assigned URL, you will need to manually edit the webhook on the dev environment of Pipe to send the video_copied_s3 event to your public URL followed by the hook path (for example, <your ngrok URL>/exp/renamevideo).