This repository contains the backend for the MindLogger service.
The web application is powered by:

- Python 3.10+
- Pipenv
- FastAPI
- PostgreSQL
- Redis
- Docker
- Pydantic
- SQLAlchemy
Code quality tools are run via pre-commit hooks (see `.pre-commit-config.yaml` in the repository root).

Clone the repository:

```bash
git clone git@github.com:ChildMindInstitute/mindlogger-backend-refactor.git
```
The project is configured via environment variables. You have to export them into the session from which you run the application, either locally or via Docker.

All default values are set so that the application can be run via Docker in a few clicks.

You can find all of them in `.env.default`.
| Key | Default value | Description |
| --- | --- | --- |
| PYTHONPATH | src/ | Makes all folders inside `src/` reachable at runtime. NOTE: you don't need to set this if you use Docker, since it is hardcoded in the Dockerfile |
| DATABASE__HOST | postgres | Database host |
| DATABASE__USER | postgres | User name for the PostgreSQL database user |
| DATABASE__PASSWORD | postgres | Password for the PostgreSQL database user |
| DATABASE__DB | mindlogger_backend | Database name |
| CORS__ALLOW_ORIGINS | * | Comma-separated list of allowed origins. Sets the Access-Control-Allow-Origin header. Example: `https://dev.com,http://localhost:8000` |
| CORS__ALLOW_CREDENTIALS | true | Sets the Access-Control-Allow-Credentials header |
| CORS__ALLOW_METHODS | * | Sets the Access-Control-Allow-Methods header |
| CORS__ALLOW_HEADERS | * | Sets the Access-Control-Allow-Headers header |
| AUTHENTICATION__SECRET_KEY | e51bcf5f4cb8550ff3f6a8bb4dfe112a | Access token's salt |
| AUTHENTICATION__REFRESH_SECRET_KEY | 5da342d54ed5659f123cdd1cefe439c5aaf7e317a0aba1405375c07d32e097cc | Refresh token's salt |
| AUTHENTICATION__ALGORITHM | HS256 | The JWT algorithm |
| AUTHENTICATION__ACCESS_TOKEN_EXPIRATION_TIME | 30 | Time in minutes after which the access token expires |
| AUTHENTICATION__REFRESH_TOKEN_EXPIRATION_TIME | 30 | Time in minutes after which the refresh token expires |
| ADMIN_DOMAIN | - | Admin panel domain |
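To make the purpose of the AUTHENTICATION__* variables concrete, here is a minimal sketch of how such settings are typically combined to issue a JWT access token. The use of python-jose and the `create_access_token` helper are assumptions for illustration; the project's actual token service may differ.

```python
from datetime import datetime, timedelta

from jose import jwt  # python-jose is an assumption; any JWT library works the same way

SECRET_KEY = "e51bcf5f4cb8550ff3f6a8bb4dfe112a"  # AUTHENTICATION__SECRET_KEY
ALGORITHM = "HS256"                               # AUTHENTICATION__ALGORITHM
ACCESS_TOKEN_EXPIRATION_TIME = 30                 # minutes


def create_access_token(subject: str) -> str:
    """Sign a short-lived access token for the given subject (hypothetical helper)."""
    expires_at = datetime.utcnow() + timedelta(minutes=ACCESS_TOKEN_EXPIRATION_TIME)
    return jwt.encode({"sub": subject, "exp": expires_at}, SECRET_KEY, algorithm=ALGORITHM)
```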
You may notice that some environment variable names use a double underscore (`__`) instead of a single `_`. Since pydantic supports nested settings models, the double underscore is used as a nesting delimiter, which keeps the settings code cleaner.
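As a minimal sketch of how this nesting works (the class and field names below are illustrative, not the project's actual settings module):

```python
from pydantic import BaseModel, BaseSettings  # pydantic v1 style settings


class DatabaseSettings(BaseModel):
    host: str = "postgres"
    user: str = "postgres"
    password: str = "postgres"
    db: str = "mindlogger_backend"


class Settings(BaseSettings):
    database: DatabaseSettings = DatabaseSettings()

    class Config:
        # "__" splits an environment variable into nested model fields,
        # so DATABASE__HOST populates settings.database.host
        env_nested_delimiter = "__"


settings = Settings()  # with DATABASE__HOST=localhost exported, settings.database.host == "localhost"
```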
It is highly recommended to create a `.env` file, since it is needed for setting up the project with both the local and the Docker approach.

```bash
cp .env.default .env
```
- Linux
- macOS
```bash
docker-compose up -d redis
```
Pipenv is used as the default dependency manager.

```bash
# Activate your environment
pipenv shell

# Install all deps from Pipfile.lock
# (to create the virtualenv inside the project directory, use `export PIPENV_VENV_IN_PROJECT=1`)
pipenv sync --dev
```
NOTE: if you don't use pipenv for some reason, remember that the variables from your `.env` file will not be exported automatically (see the Pipenv docs). In that case you have to export them manually:
```bash
# Manual exporting on Unix (like this)
export PYTHONPATH=src/
export BASIC_AUTH__PASSWORD=1234
...
```

...or using a one-line shell snippet:

```bash
set -o allexport; source .env; set +o allexport
```
NOTE: Please do not forget about environment variables! All environment variables for the Postgres database that runs in Docker are already passed to `docker-compose.yaml` from the `.env` file.
It is good practice to use Git hooks to produce better commits. For increased security during development, install git-secrets to scan the code for AWS keys: https://github.com/awslabs/git-secrets#installing-git-secrets
`.pre-commit-config.yaml` is placed in the root of the repository.
Once you have installed git-secrets and pre-commit, simply run the following command:

```bash
make aws-scan
```

After that, all your staged changes will be checked via Git hooks on every `git commit`.
NOTE: Don't forget to set the `PYTHONPATH` environment variable, e.g. `export PYTHONPATH=src/`.
In this project we use a simplified import style: `from apps.application_name import class_name, function_name, module_name`. For this to work, the `src/` folder must be on the Python path. P.S. You don't need this additional step if you run the application via a Docker container.
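For illustration, with `PYTHONPATH=src/` exported, imports resolve relative to `src/` (the module and object names below are hypothetical placeholders, not actual project modules):

```python
# With PYTHONPATH=src/, packages under src/ are importable without a "src." prefix.
# These names are placeholders for illustration only.
from apps.users import services              # would resolve to src/apps/users/services.py
from infrastructure.database import session  # would resolve to src/infrastructure/database/session.py
```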
```bash
uvicorn src.main:app --proxy-headers --port {PORT} --reload
```
The pytest framework is used for writing unit tests. PostgreSQL is currently used as the test database; the test run configuration is defined in `pyproject.toml`.
```bash
DATABASE__HOST=localhost
DATABASE__PORT=5432
DATABASE__PASSWORD=test
DATABASE__USER=test
DATABASE__DB=test
```
```bash
# Connect to the database with Docker
docker-compose exec postgres psql -U postgres postgres

# Or connect to the database locally
psql -U postgres postgres
```

```sql
-- Create the test database
create database test;

-- Create the arbitrary test database
create database test_arbitrary;

-- Create the test user
create user test;

-- Set a password for the user
alter user test with password 'test';
```
To calculate test coverage correctly, run coverage with the `--concurrency=thread,gevent` parameter:

```bash
coverage run --concurrency=thread,gevent -m pytest
coverage report -m
```
```bash
docker-compose build
```
Make sure that you have filled in the `.env` file. It is used by default in `docker-compose.yaml` for building.
Check the build with the `docker images` command. You should see an entry for `fastapi_service`.
If you would like to debug the application inside the Docker container, make sure you set `COMPOSE_FILE=docker-compose.dev.yaml` in `.env`. That compose file keeps stdin open and allocates a tty.
```bash
docker-compose up
```
Additional `docker-compose up` flags that might be useful for development:

```bash
-d              # Run the containers as daemons (in the background)
--no-recreate   # If containers already exist, don't recreate them
```
```bash
docker-compose down
```
Additional `docker-compose down` flags that might be useful for development:

```bash
-v   # Remove the containers together with all volumes
```
You may skip this step only if you want to set up the Git hooks inside your Docker container and burn in hell. For the rest of the audience it is recommended to:

- not install pre-commit hooks inside the container
- use the Makefile to run all commands in the Docker container
Usage:

```bash
# Run the application in the background
# NOTE: it is mandatory to run the commands inside the container
docker-compose up -d

# Check the code quality
make dcq

# Run the tests
make dtest

# Check everything in one go
make dcheck
```
You can use the `Makefile` to work with the project (run the application, code quality tools, tests, ...).

For local usage:

```bash
# Run the application
make run

# Check the code quality
make cq

# Run the tests
make test

# Check everything in one go
make check

...
```
If you want to run the web app locally, you can use the following commands.

NOTE: the storages must be started before running these commands. If you want, you can start them with:

```bash
make run_storages
```

Run the web app locally (don't forget to activate the environment):

```bash
make run
```
By default, the CORS policy accepts all connections.
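As a minimal sketch of how the CORS__* variables map onto FastAPI's CORSMiddleware (the exact wiring in this project may differ):

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Values mirror the CORS__* defaults from .env.default;
# the project's settings object may expose them differently.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],      # CORS__ALLOW_ORIGINS
    allow_credentials=True,   # CORS__ALLOW_CREDENTIALS
    allow_methods=["*"],      # CORS__ALLOW_METHODS
    allow_headers=["*"],      # CORS__ALLOW_HEADERS
)
```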
```bash
# Generate a new migration file
alembic revision --autogenerate -m "Add a new field"

# Apply the migrations
alembic upgrade head
```
To roll back to a specific revision:

```bash
# The hash is taken from the generated file in the migrations folder
alembic downgrade 0e43c346b90d
```
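For reference, the revision hash comes from the identifiers at the top of each generated migration file. A sketch of such a file is shown below; the table and column names are hypothetical, only the `revision`/`down_revision` identifiers matter here.

```python
"""Add a new field

Revision ID: 0e43c346b90d
Revises: <previous revision hash>
"""
import sqlalchemy as sa
from alembic import op

# These identifiers are what `alembic upgrade` / `alembic downgrade <hash>` refer to.
revision = "0e43c346b90d"
down_revision = None  # or the hash of the previous revision


def upgrade() -> None:
    # Hypothetical column, for illustration only
    op.add_column("users", sa.Column("nickname", sa.String(), nullable=True))


def downgrade() -> None:
    op.drop_column("users", "nickname")
```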
Do not forget that alembic saves the migration version in the database. To reset it:

```sql
delete from alembic_version;
```

To run migrations for the arbitrary database configuration:

```bash
alembic -c alembic_arbitrary.ini upgrade head
```
```mermaid
erDiagram
User_applet_accesses ||--o{ Applets: ""
User_applet_accesses {
int id
datetime created_at
datetime updated_at
int user_id FK
int applet_id FK
string role
}
Users {
int id
datetime created_at
datetime updated_at
boolean is_deleted
string email
string full_name
string hashed_password
}
Users ||--o{ Applets : ""
Applets {
int id
datetime created_at
datetime updated_at
boolean is_deleted
string display_name
jsonb description
jsonb about
string image
string watermark
int theme_id
string version
int creator_id FK
text report_server_id
text report_public_key
jsonb report_recipients
boolean report_include_user_id
boolean report_include_case_id
text report_email_body
}
Applet_histories }o--|| Users: ""
Applet_histories {
int id
datetime created_at
datetime updated_at
boolean is_deleted
jsonb description
jsonb about
string image
string watermark
int theme_id
string version
int account_id
text report_server_id
text report_public_key
jsonb report_recipients
boolean report_include_user_id
boolean report_include_case_id
text report_email_body
string id_version
string display_name
int creator_id FK
}
Answers_activity_items }o--|| Applets: ""
Answers_activity_items }o--|| Users: ""
Answers_activity_items }o--|| Activity_item_histories: ""
Answers_activity_items {
int id
datetime created_at
datetime updated_at
jsonb answer
int applet_id FK
int respondent_id FK
int activity_item_history_id_version FK
}
Answers_flow_items }o--|| Applets: ""
Answers_flow_items }o--|| Users: ""
Answers_flow_items ||--o{ Flow_item_histories: ""
Answers_flow_items {
int id
datetime created_at
datetime updated_at
jsonb answer
int applet_id FK
int respondent_id FK
int flow_item_history_id_version FK
}
Activities }o--|| Applets: ""
Activities {
int id
datetime created_at
datetime updated_at
boolean is_deleted
UUID guid
string name
jsonb description
text splash_screen
text image
boolean show_all_at_once
boolean is_skippable
boolean is_reviewable
boolean response_is_editable
int ordering
int applet_id FK
}
Activity_histories }o--|| Applets: ""
Activity_histories {
int id
datetime created_at
datetime updated_at
boolean is_deleted
UUID guid
string name
jsonb description
text splash_screen
text image
boolean show_all_at_once
boolean is_skippable
boolean is_reviewable
boolean response_is_editable
int ordering
int applet_id FK
}
Activity_item_histories }o--|| Activity_histories: ""
Activity_item_histories {
int id
datetime created_at
datetime updated_at
boolean is_deleted
jsonb question
string response_type
jsonb answers
text color_palette
int timer
boolean has_token_value
boolean is_skippable
boolean has_alert
boolean has_score
boolean is_random
boolean is_able_to_move_to_previous
boolean has_text_response
int ordering
string id_version
int activity_id FK
}
Activity_items }o--|| Activities: ""
Activity_items {
int id
datetime created_at
datetime updated_at
jsonb question
string response_type
jsonb answers
text color_palette
int timer
boolean has_token_value
boolean is_skippable
boolean has_alert
boolean has_score
boolean is_random
boolean is_able_to_move_to_previous
boolean has_text_response
int ordering
int activity_id FK
}
Flows }o--|| Applets: ""
Flows {
int id
datetime created_at
datetime updated_at
boolean is_deleted
string name
UUID guid
jsonb description
boolean is_single_report
boolean hide_badge
int ordering
int applet_id FK
}
Flow_items }o--|| Flows: ""
Flow_items }o--|| Activities: ""
Flow_items {
int id
datetime created_at
datetime updated_at
boolean is_deleted
int ordering
int activity_flow_id FK
int activity_id FK
}
Flow_item_histories }o--|| Flow_histories: ""
Flow_item_histories }o--|| Activity_histories: ""
Flow_item_histories {
int id
datetime created_at
datetime updated_at
boolean is_deleted
string id_version
int activity_flow_id FK
int activity_id FK
}
Flow_histories }o--|| Applet_histories: ""
Flow_histories {
int id
datetime created_at
datetime updated_at
boolean is_deleted
string name
UUID guid
jsonb description
boolean is_single_report
boolean hide_badge
int ordering
string id_version
int applet_id FK
}
```
You can connect an arbitrary file storage and database by filling in special fields in the `user_workspaces` table.

Add your database connection string to `database_uri` in the following format:

```
postgresql+asyncpg://<username>:<password>@<hostname>:<port>/<database>
```
For an AWS S3 bucket, the following fields are required: `storage_region`, `storage_bucket`, `storage_access_key`, `storage_secret_key`.
In the case of Azure Blob storage, put your connection string in the `storage_secret_key` field.