
Steps to build a full-fledged RESTful API in Python (using FastAPI)

  • STEP 1 : posts.py

    Create path operations for the following (CRUD Operations)

    • Create / View / Update / Delete posts
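
    A minimal sketch of a couple of these path operations, using a toy in-memory dict as a stand-in until the real database is wired up in STEPS 2-4 (all names here are illustrative):

        from fastapi import FastAPI, HTTPException
        from pydantic import BaseModel

        app = FastAPI()
        posts = {}  # toy in-memory store, replaced by Postgres in later steps

        class PostCreate(BaseModel):
            title: str
            content: str

        @app.post("/posts", status_code=201)
        def create_post(post: PostCreate):
            post_id = len(posts) + 1
            posts[post_id] = post.dict()
            return {"id": post_id, **posts[post_id]}

        @app.get("/posts/{id}")
        def get_post(id: int):
            if id not in posts:
                raise HTTPException(status_code=404, detail="post not found")
            return posts[id]

    Run it with uvicorn main:app --reload; FastAPI serves interactive docs for these routes at /docs.
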
  • STEP 2 :

    Connect to POSTGRESQL -> the target database
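
    Before the ORM is added in STEP 3, a direct connection can be made with the psycopg2 driver; the credentials below are placeholders (they move into environment variables in STEP 11):

        import psycopg2
        from psycopg2.extras import RealDictCursor

        conn = psycopg2.connect(
            host="localhost",
            database="fastapi",        # placeholder database name
            user="postgres",
            password="password",       # placeholder password
            cursor_factory=RealDictCursor,  # rows come back as dicts instead of tuples
        )
        cursor = conn.cursor()
        cursor.execute("SELECT * FROM posts;")
        print(cursor.fetchall())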

  • STEP 3 :

    Using SQLALCHEMY -> the DB access library

    (generates the SQL queries from the python code, so they don't have to be written by hand)
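
    A rough sketch of the SQLAlchemy setup (the database URL and model columns are illustrative):

        from sqlalchemy import create_engine, Column, Integer, String
        from sqlalchemy.orm import sessionmaker, declarative_base

        SQLALCHEMY_DATABASE_URL = "postgresql://postgres:password@localhost/fastapi"  # placeholder

        engine = create_engine(SQLALCHEMY_DATABASE_URL)
        SessionLocal = sessionmaker(autoflush=False, bind=engine)
        Base = declarative_base()

        class Post(Base):
            __tablename__ = "posts"
            id = Column(Integer, primary_key=True, nullable=False)
            title = Column(String, nullable=False)
            content = Column(String, nullable=False)

        # dependency that hands a session to each request and closes it afterwards
        def get_db():
            db = SessionLocal()
            try:
                yield db
            finally:
                db.close()

    Path operations then take db: Session = Depends(get_db) and query through the ORM, e.g. db.query(Post).all(), instead of hand-written SQL.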

  • STEP 4 :

    Creating pydantic models using pydantic PYTHON MODULE

    (to define the structure of the request and response data exchanged between the client and the API)
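
    For example, separate request and response schemas might look roughly like this (field names are illustrative; shown in pydantic v1 style):

        from datetime import datetime
        from pydantic import BaseModel

        class PostCreate(BaseModel):
            # shape of the request body when creating a post
            title: str
            content: str
            published: bool = True

        class PostResponse(BaseModel):
            # shape of the response sent back to the client
            id: int
            title: str
            content: str
            created_at: datetime

            class Config:
                orm_mode = True  # allows building the response from a SQLAlchemy object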

  • STEP 5 : users.py

    Creating functionality for the users to login and register.

    • Create / login users
  • STEP 6 :

    Hashing passwords before storing them in the db, using the PASSLIB and BCRYPT PYTHON MODULES
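
    A minimal sketch of the usual passlib CryptContext wrapper:

        from passlib.context import CryptContext

        # tell passlib to hash with the bcrypt algorithm
        pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

        def hash_password(password: str) -> str:
            return pwd_context.hash(password)

        def verify_password(plain: str, hashed: str) -> bool:
            return pwd_context.verify(plain, hashed)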

  • STEP 7 : AUTHENTICATION

    Using the JWT PYTHON MODULE for JWT token authentication, which checks whether the user is logged in

    • Generation of Token and fetching data :

      1. client -------- /login(uid+pwd) ---> API
      
      2. if the credentials exist,
         API --------- [TOKEN] ----------> client
      
      3. if the user wants to see the posts,
         client --------- /posts [TOKEN] ----> API
      
      4. if the token sent along with the header is valid,
         API ---------- DATA -------------> client
      
    • A JWT token has 3 parts :

      • header (algo & token type)
      • payload (data)
      • signature (header + payload + secret key)
    • How is the token validated?

      • The API takes the header, payload and secret key to generate a test signature

      • If the test signature is the same as the signature received in the token
        --> token is valid
        else
        --> token is invalid

    • How is the token generated?

      • User enters the email and pwd.

      • The email and pwd are sent from the client to the API.

      • The API fetches the user's credentials from the database by searching on the email.

      • The entered pwd is hashed.

      • If the hashed pwd matches the one present in the database
        --> a token is generated by the API and sent to the client

          ✔ take the User ID and expiration time as the payload
        
          ✔ set a secret_key (a random hex string, e.g. generated with `openssl rand -hex 32`)
        
          ✔ encode them into a token using the `jwt.encode(data, secret_key, algorithm)` function
        
          ✔ return the generated token
        
      • else
        --> token generation fails
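
    A hedged sketch of the token-creation step, here using the python-jose implementation of JWT (PyJWT has an almost identical call); the secret key is a placeholder:

        from datetime import datetime, timedelta
        from jose import jwt

        SECRET_KEY = "replace-with-output-of-openssl-rand-hex-32"  # placeholder
        ALGORITHM = "HS256"
        ACCESS_TOKEN_EXPIRE_MINUTES = 30

        def create_access_token(data: dict) -> str:
            to_encode = data.copy()  # e.g. {"user_id": 7}
            expire = datetime.utcnow() + timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
            to_encode.update({"exp": expire})
            return jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)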

  • STEP 8 : AUTHORIZATION

    Setting up authorization for all the endpoints / path operations (auth.py)
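
    A sketch of how each protected path operation can require a valid token; the token URL, secret key and payload key are illustrative:

        from fastapi import Depends, HTTPException, status
        from fastapi.security import OAuth2PasswordBearer
        from jose import JWTError, jwt

        SECRET_KEY = "same-placeholder-key-as-in-step-7"  # placeholder
        ALGORITHM = "HS256"

        # the /login endpoint is the one that issues the token
        oauth2_scheme = OAuth2PasswordBearer(tokenUrl="login")

        def get_current_user(token: str = Depends(oauth2_scheme)):
            try:
                payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
                return payload["user_id"]
            except (JWTError, KeyError):
                raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED,
                                    detail="could not validate credentials")

        # a path operation then opts in by declaring the dependency:
        # def create_post(..., current_user: int = Depends(get_current_user)): ...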

  • STEP 9 : RELATIONSHIP BETWEEN TABLES

    Create a relationship between the users and posts tables (foreign key) via sqlalchemy to display usernames along with the posts created by them.
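
    A sketch of the foreign key and relationship on the SQLAlchemy models (column names are illustrative):

        from sqlalchemy import Column, ForeignKey, Integer, String
        from sqlalchemy.orm import declarative_base, relationship

        Base = declarative_base()

        class User(Base):
            __tablename__ = "users"
            id = Column(Integer, primary_key=True, nullable=False)
            email = Column(String, nullable=False, unique=True)

        class Post(Base):
            __tablename__ = "posts"
            id = Column(Integer, primary_key=True, nullable=False)
            title = Column(String, nullable=False)
            # each post points back at the user who created it
            owner_id = Column(Integer, ForeignKey("users.id", ondelete="CASCADE"), nullable=False)
            owner = relationship("User")  # lets the response include the owner's details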

  • STEP 10 : QUERY PARAMETERS

    Adding query parameters to the path routes (search by)
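
    For example, a list endpoint might accept limit / skip / search query parameters (defaults chosen only for illustration):

        from typing import Optional
        from fastapi import FastAPI

        app = FastAPI()

        # requested as: GET /posts?limit=5&skip=10&search=fastapi
        @app.get("/posts")
        def get_posts(limit: int = 10, skip: int = 0, search: Optional[str] = ""):
            # with SQLAlchemy this typically becomes something like:
            # db.query(Post).filter(Post.title.contains(search)).limit(limit).offset(skip).all()
            return {"limit": limit, "skip": skip, "search": search}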

  • STEP 11 : .env

    Setting up ENVIRONMENT VARIABLES (in .env file) so that confidential data is not exposed directly in code
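
    One common pattern is a pydantic settings class that reads the .env file (variable names are illustrative; BaseSettings lives in the separate pydantic-settings package from pydantic v2 onwards):

        from pydantic import BaseSettings

        class Settings(BaseSettings):
            database_url: str
            secret_key: str
            access_token_expire_minutes: int = 30

            class Config:
                env_file = ".env"  # values are read from this file or from real env variables

        settings = Settings()
        # usage elsewhere: create_engine(settings.database_url)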

  • STEP 12 : votes.py

    Setting up the liking system using the votes table

    • Vote / unvote on posts
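
    The votes table is typically an association table with a composite primary key, roughly like this (column names are illustrative):

        from sqlalchemy import Column, ForeignKey, Integer
        from sqlalchemy.orm import declarative_base

        Base = declarative_base()

        class Vote(Base):
            __tablename__ = "votes"
            # composite primary key: a user can vote on a given post at most once
            user_id = Column(Integer, ForeignKey("users.id", ondelete="CASCADE"), primary_key=True)
            post_id = Column(Integer, ForeignKey("posts.id", ondelete="CASCADE"), primary_key=True)
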
  • STEP 13 : ALEMBIC

    Setting up a database migration tool :

    • When changes are made to the models in the python code, they won't be reflected in the database unless the tables are deleted and recreated.

    • That is, if the tables already exist, changes don't get reflected in them.

    • In order to deal with this problem, a database migration tool is used.

    ALEMBIC -> database migration toolkit

    Alembic is hosted on github under the SQLAlchemy organization

    • Basic alembic setup :

      1. Create an environment for alembic in the working directory:

        alembic init [name]
        

        Example :

        alembic init alembic
        
      2. Edit the [name].ini file (alembic.ini)

      3. Create a migration script

        create revision -m "some-title"
        

        Example:

        create revision -m "create posts table"
        
      4. Edit the migration script :

        • give the schema changes for the upgrade() and downgrade() functions (see the sketch at the end of this step)
      5. Running 1st migration:

        alembic upgrade head
        

        or

        alembic upgrade [upgrade revision number]
        

        Running 2 migrations ahead of the current:

        alembic upgrade +2
        
      6. Going 1 migration down:

        alembic downgrade -1
        

        Going all the way back down:

        alembic downgrade base
        
      7. Getting info:

        alembic current
        
      8. Auto generation of migrations:

        • Modify the env.py file so that it has access to the MetaData object that describes the target tables.

        • Suppose our application has a declarative base in myapp.mymodel.
          This base contains a MetaData object which contains Table objects defining our database.

          in env.py change,

          target_metadata = None

          to

          from myapp.mymodel import Base
          target_metadata = Base.metadata
        • Command:

          alembic revision --autogenerate -m "some-title"
          

    Once alembic is set up, the database tables are created and updated automatically every time the migrations are run.
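
    For reference, the migration script edited in step 4 above might look roughly like this (revision ids and columns are illustrative):

        """create posts table"""
        import sqlalchemy as sa
        from alembic import op

        # revision identifiers, normally filled in by alembic
        revision = "1a2b3c4d5e6f"   # placeholder id
        down_revision = None

        def upgrade():
            op.create_table(
                "posts",
                sa.Column("id", sa.Integer(), primary_key=True, nullable=False),
                sa.Column("title", sa.String(), nullable=False),
            )

        def downgrade():
            op.drop_table("posts")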

  • STEP 14 : DEPLOYMENT

    Deployment methods :

    • Heroku (free)
    • Ubuntu server

    HEROKU:

    • Heroku is a PaaS (platform as a service) for building and running software applications in the cloud.

    • Steps to deploy in Heroku :

      • Setup the heroku CLI in local computer

      • Create a procfile
        The Procfile tells heroku the command needed for starting the application.

        web: uvicorn app.main:app --host=0.0.0.0 --port=${PORT:-5000}
        
      • Create a Postgres instance. Heroku provides a free Postgres instance that can be made publicly accessible; hobby-dev is the name of the free plan that Heroku offers.

      • Set the environment variables in Config Vars of the heroku app

      • restart the heroku instance

        heroku ps:restart
        
      • Create tables in the postgres instance.

        heroku run alembic upgrade head
        
      • Restart the heroku instance again

    • The API is successfully deployed to heroku

    UBUNTU:

    • Prepare the application
      SSH into the ubuntu server, create and navigate to the directory that you want your application to be stored into. Say it's /var/www/myapp.

      $ mkdir /var/www/myapp 
      $ cd /var/www/myapp
      
    • Setup the virtual env

      $ virtualenv -p python3.8 venv
      $ mkdir src
      $ . venv/bin/activate
      (venv) $ cd src
      
    • Pull the source code and install all dependencies

      (venv) $ git init
      (venv) $ git remote add origin <your-repo-url>
      (venv) $ git pull origin <your-branch-name>
      (venv) $ pip install -r requirements.txt
      (venv) $ pip install gunicorn uvicorn
      
    • Configure Nginx

      In nginx file:

      server {
          server_name <your-site-name>;

          location / {
              include proxy_params;
              proxy_pass http://127.0.0.1:8000;
          }
      }
      
    • Start the nginx service

      $ sudo systemctl restart nginx.service
      
    • Start the application

      $ gunicorn -w 4 -k uvicorn.workers.UvicornWorker main:app
      
    • Create a service for the Gunicorn server so that it is always running and it automatically starts when the server is rebooted.

      In gunicorn.service file :

      [Unit]
      Description=Gunicorn instance to serve MyApp
      After=network.target
      
      [Service]
      User=<username>
      Group=www-data
      WorkingDirectory=/var/www/myapp/src
      Environment="PATH=/var/www/myapp/venv/bin"
      ExecStart=/var/www/myapp/venv/bin/gunicorn -w 4 -k uvicorn.workers.UvicornWorker main:app
      
      [Install]
      WantedBy=multi-user.target
      

    This will start the application and the server will be running in the background.

  • STEP 15 : DOCKERIZING (optional step)

    • Docker is an open source containerization platform.

    • It enables developers to package applications into containers—standardized executable components combining application source code with the operating system (OS) libraries and dependencies required to run that code in any environment.

    • Containers simplify delivery of distributed applications.

    • Docker is essentially a toolkit that enables developers to build, deploy, run, update, and stop containers using simple commands and work-saving automation through a single API.

    • Steps to deploy in docker :

      1. Check if docker is setup in the machine

        docker --version
        
      2. Create a Dockerfile to build an image that contains the application and the command to run on startup

      Example :

      • the usual commands run on a local machine:

        python -m venv venv
        
        source venv/bin/activate
        
        pip install -r requirements.txt
        
        # Start the server on localhost:8080
        myapp api --host 0.0.0.0 --port 8080 --storage ./storage
        
      • dockerfile to do the same process :

         # start from the official python image
         FROM python:3.9.7

         WORKDIR /app

         COPY requirements.txt ./

         RUN pip install -r requirements.txt

         COPY . .

         # command run when the container starts
         CMD ["myapp", "api", "--host", "0.0.0.0", "--port", "8080"]
    3. Build the image :

      docker build -t [name] [dir]
      

      Example :

      docker build -t fastapi .
      
    4. To view all the docker images :

      docker image ls
      
    5. docker run can be used to start the image from the CLI, but docker-compose is even better: all the options used to run the image are saved in a docker-compose file, so there is no need to remember those long commands and type them over and over again.

      Command to do a docker-run:

      docker run -i -t -p 80:8000 fastapi
      

      Create the docker-compose file with all the environment variables and directories specified

    6. Run the docker-compose command to start the containers

      docker-compose up -d (if filename is docker-compose.yml)
      

      or

      docker-compose -f [filename] up -d
      

      Example:

      docker-compose -f docker-compose-dev.yml up -d
      
    7. To stop the containers :

      docker-compose -f docker-compose-dev.yml down 
      
    8. View the logs :

      docker logs fastapi-tutorial_api_1
      
    9. Open an interactive shell inside the running container :

      docker exec -it fastapi bash
      
  • STEP 16 : TESTING

    • testing is done using the PYTEST MODULE

    • set up a separate tests folder in the main directory

    • create an empty __init__.py file

    • set up conftest.py to hold all the fixtures shared across the entire test suite

    • create different functions to test different scenarios possible with all the path operations

    • in order to run the tests :

      pytest -p no:faulthandler -v -s
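
    A sketch of one such test using FastAPI's TestClient (the import path and endpoint are placeholders; in practice the client usually comes from a fixture in conftest.py):

      from fastapi.testclient import TestClient

      from app.main import app  # placeholder import path

      client = TestClient(app)

      def test_get_posts_requires_login():
          # without a token, a protected endpoint should reject the request
          res = client.get("/posts")
          assert res.status_code == 401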
      
  • STEP 17 : CI/CD - Continuous integration / continuous delivery

    • Manual Process involves :

      Make changes to code   
              |  
              v  
       commit changes  
              |  
              v   
      run all automated tests   
              |  
              v  
        build the app   
              |  
              v  
           deploy  
      
    • Automated CI/CD :

      Make changes to code 
              |
              v
       commit changes --> Continuous integration phase
      
                          1. pull source code
                          2. install dependencies
                          3. run automated tests
                          4. build the app/image ---> Continuous Delivery phase
      
                                                      1. grab image/code
                                                      2. update production
      
    • Common CI/CD tools:

      • Travis CI
      • CircleCI
      • Jenkins
      • GitHub Actions
    • What a CI/CD tool does:

      • provides a runner -> a computer / VM that runs the set of commands specified by the user
      • the commands are configured via either a YAML/JSON file or a GUI
      • the commands -> are all the actions the pipeline will perform
      • the commands are triggered by some event (in the case of github -> a git push/merge)
    • Steps :

      • create a folder .github/workflows
      • create a yaml file -> build-deploy.yml
      • push the code to the github repo
      • the github action will be triggered
      • the application is successfully setup for automated CI/CD