Price Tracker Backend

A Django API that aggregates items and their current, up-to-date prices from different online retailers into one platform. Scrapy is used to periodically scrape the latest prices, which are stored in a PostgreSQL database and made available via the API.
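The project's actual models are not included in this README, but the shape of the data can be sketched with plain dataclasses (hypothetical names — the real Django models may differ):

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class PriceQuote:
    """One scraped price for an item at a given retailer (hypothetical shape)."""
    retailer: str
    price: float
    currency: str
    scraped_at: datetime


@dataclass
class Item:
    """A tracked product with its price history across retailers."""
    name: str
    quotes: list = field(default_factory=list)

    def latest_price(self, retailer: str):
        """Most recent quote from the given retailer, or None."""
        matches = [q for q in self.quotes if q.retailer == retailer]
        return max(matches, key=lambda q: q.scraped_at) if matches else None
```

In the real project these would be Django models backed by PostgreSQL, with the scraper appending a new quote on each crawl.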


Note: This project is a work in progress.

Setup Instructions / Installation

Getting Started

You will need the following:

  • Python and pip (I am currently using 3.9.7; any version above 3.7 should work)
  • Git installed on your machine
  • Code editor/ IDE
  • PostgreSQL installed on your machine

Installation and Running the App

  1. Clone the GitHub repository

    git clone 
  2. Change into the folder

    cd price_tracker_backend
  3. Create a virtual environment

    python3 -m venv venv
    • Activate the virtual environment
    source venv/bin/activate
  • If you are using pyenv:

    3a. Create a virtualenv

    pyenv virtualenv price_tracker_backend

    3b. Activate the virtualenv

    pyenv activate price_tracker_backend
  4. Create a .env file and add your credentials

    touch .env 

    OR Copy the included example

    cp .env-example .env 
  5. Create a PostgreSQL database

    psql -U postgres
    • Create a database
    CREATE DATABASE price_tracker_db;
    • Create a user
    CREATE USER price_tracker_user WITH PASSWORD 'password';
    • Grant privileges to the user
    GRANT ALL PRIVILEGES ON DATABASE price_tracker_db TO price_tracker_user;
    • Exit the psql session
    \q
  6. Add your credentials to the .env file

    Alternatively, export the database credentials directly as environment variables:

    export DATABASE_URL=postgres://user:password@localhost:5432/price_tracker
  7. Install the required dependencies

    pip install -r requirements.txt
  8. Migrate your database

    python manage.py migrate
  9. Make the shell script executable

    chmod a+x ./
  10. Run the app


    The app should be available on

    OR run with python:

    python manage.py runserver
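Django does not read DATABASE_URL on its own; projects usually parse it into the DATABASES setting, often via the dj-database-url package. A standard-library-only sketch of that parsing, matching the URL format used above:

```python
from urllib.parse import urlparse


def parse_database_url(url: str) -> dict:
    """Split a postgres://user:password@host:port/name URL into the
    pieces Django's DATABASES['default'] setting expects (sketch)."""
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username or "",
        "PASSWORD": parts.password or "",
        "HOST": parts.hostname or "localhost",
        "PORT": parts.port or 5432,
    }
```

In practice dj-database-url (or django-environ) does this in one call; the sketch just shows what that call produces.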

To run the Scrapy spider

 cd price_scraper 
 scrapy crawl jumia_spider
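Scraped price text is rarely a clean number, so a spider's parse step usually normalizes it. A hypothetical helper (not necessarily how jumia_spider does it) that strips currency symbols and thousands separators:

```python
import re
from typing import Optional


def parse_price(text: str) -> Optional[float]:
    """Extract a numeric price from scraped text like 'KSh 1,299'.
    Returns None when no number is present."""
    match = re.search(r"\d[\d,]*(?:\.\d+)?", text)
    if not match:
        return None
    # Drop thousands separators before converting.
    return float(match.group().replace(",", ""))
```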

To run Scrapyd and use its web interface

cd price_scraper
scrapyd

To run ScrapyRT and get a real-time API

cd price_scraper
scrapyrt -p 9080
  • Test out the API:
 curl "http://localhost:9080/crawl.json?spider_name=jumia_spider&url="

Technologies used

  • Python 3.9.7
  • Django web framework
  • PostgreSQL
  • Scrapy
