David Zhang

How to install the web crawler app locally using Docker


A crawl app with a Django backend and a Go crawler.

Getting Started


You need to have Docker and docker-compose installed.
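You can confirm both prerequisites are available before continuing:

```shell
# Print the installed versions of the prerequisites
docker --version
docker-compose --version
```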


Running locally

Start the backend, frontend, and Postgres locally:

make dev

The app will start listening on localhost:8000.
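Once `make dev` is up, a quick way to confirm the backend is responding (assuming the default port above; the root route may return a redirect or 404 depending on the URL configuration):

```shell
# Check that the backend answers HTTP requests on the default port
curl -i http://localhost:8000/
```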

Creating a superuser

Open a shell in the running backend container:

docker exec -it web-crawl-app_backend_1 bash

Use the Django management command to create a superuser:

./manage.py createsuperuser
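For a non-interactive setup (e.g. in scripts), Django 3.0+ can read the superuser credentials from environment variables; a sketch, assuming the same container name as above and placeholder credentials:

```shell
# Non-interactive variant (requires Django >= 3.0); credential values are examples only.
docker exec \
  -e DJANGO_SUPERUSER_USERNAME=admin \
  -e DJANGO_SUPERUSER_EMAIL=admin@example.com \
  -e DJANGO_SUPERUSER_PASSWORD=changeme \
  web-crawl-app_backend_1 ./manage.py createsuperuser --noinput
```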

Deploying to staging

  1. Install the requirements:

pip3 install -r deploy/requirements.txt

  2. Authenticate with AWS.

  3. Deploy to staging:

make staging


Authentication

We're using django-rest-auth for the authentication endpoints. See backend/server/urls.py for the mapping of the endpoint URLs.
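The mapping typically looks something like the following sketch; the actual patterns live in backend/server/urls.py, and the prefixes here are illustrative rather than copied from the repo:

```python
# Illustrative sketch only -- see backend/server/urls.py for the real mapping.
from django.urls import include, path

urlpatterns = [
    # django-rest-auth login/logout/password endpoints
    path("auth/", include("rest_auth.urls")),
    # registration endpoints
    path("auth/registration/", include("rest_auth.registration.urls")),
]
```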

Google login

  1. Add a Google social application in the Django admin, using the dev OAuth client from the Google Developer Console. Only the client id and secret key are needed. You also need to move the site from the "Available sites" list to the "Chosen sites" list on the right.

  2. The login URL is /auth/google/login/. You cannot log in with an email that is already registered in the system.
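With django-rest-auth's social login, the endpoint generally accepts a Google OAuth access token in a POST body; a hedged sketch, where the token value is a placeholder you would obtain from Google's OAuth flow:

```shell
# Hypothetical example: exchange a Google access token for an app auth token.
curl -X POST http://localhost:8000/auth/google/login/ \
  -H "Content-Type: application/json" \
  -d '{"access_token": "<google-access-token>"}'
```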