A Python API with a PostgreSQL database, built with the FastAPI framework.
Core requirements:
Main Python packages (managed by Docker):
- Python 3.12
- FastAPI <1.0.0
- Tortoise ORM >=0.19.3
- Gunicorn <21.0.0
- uvicorn <1.0.0
- NLTK <4.0
- spaCy <4.0.0
- spacy-lefff >=0.5.1
For the complete list of dependencies and their versions, see `pyproject.toml`.
- `/docs`: Displays the API documentation, with the available endpoints and parameters, and provides a testing interface. Method: `GET`
- `/{lang}/generate`: Generates a new word that doesn't exist and stores it in the DB. Available `lang`: `en`, `fr`, `it`, `es`. Method: `GET`
- `/{lang}/get`: Gets a random word that doesn't exist from the DB of generated words. Available `lang`: `en`, `fr`, `it`, `es`. Method: `GET`
- `/{lang}/alter`: Alters a text with random non-existing words. Available `lang`: `en`, `fr`. Other parameters: `text`, `percentage`. Method: `GET`
- `/{lang}/definition`: Generates a random fake/altered dictionary definition. Available `lang`: `en`, `fr`. Method: `GET`
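As an illustration, request URLs for these endpoints can be built as follows (the base URL assumes the default Docker setup described below; the sample `text` value is arbitrary):

```python
from urllib.parse import urlencode

BASE = "http://localhost:8000"  # default address when running via Docker

# Simple GET endpoints take only the language in the path
generate_url = f"{BASE}/en/generate"

# /alter also takes query parameters (text, percentage)
params = urlencode({"text": "The quick brown fox", "percentage": 30})
alter_url = f"{BASE}/en/alter?{params}"

print(generate_url)  # http://localhost:8000/en/generate
print(alter_url)     # http://localhost:8000/en/alter?text=The+quick+brown+fox&percentage=30
```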
The easiest way to run the application is using Docker:
# Build and start all services
docker compose up --build
# Run in background
docker compose up -d
# Stop services
docker compose down
# View logs
docker compose logs -f
The API will be available at http://localhost:8000.
- Copy the example environment file:
cp .env.example .env
- Update the `.env` file with your configuration values. See `.env.example` for all available options and their descriptions.
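The authoritative list of variables is in `.env.example`; a typical set for a PostgreSQL-backed app might look like the following (these names are illustrative, not necessarily the project's actual keys):

```
POSTGRES_HOST=db
POSTGRES_PORT=5432
POSTGRES_USER=app
POSTGRES_PASSWORD=change-me
POSTGRES_DB=words
```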
If you prefer to run the application locally without Docker:
Create a virtual environment and install the dependencies with uv:
uv sync
For the French language, you need to download the spaCy NLP data:
python3 -m spacy download fr_core_news_sm
or, with uv:
uv run python -m spacy download fr_core_news_sm
If there is any issue installing the `fr_core_news_sm` model, you can install it manually with:
wget https://github.com/explosion/spacy-models/releases/download/fr_core_news_sm-3.5.0/fr_core_news_sm-3.5.0-py3-none-any.whl -P ./assets
unzip assets/fr_core_news_sm-3.5.0-py3-none-any.whl -d ./.venv/lib/python3.12/site-packages && chmod -R 777 ./.venv/lib/python3.12/site-packages/fr_core_news_sm
If there is any issue with pip in the venv for spaCy:
python3 -m ensurepip --default-pip
If spacy-lefff doesn't work, try installing it manually with pip rather than uv, inside the venv:
pip install spacy-lefff
or, with uv:
uv run pip install spacy-lefff
Launch the web server with:
uv run uvicorn api:app --reload
Or, inside the venv:
uvicorn api:app --reload
The project uses a simple migration system in the `migrations` directory. To run a migration:
# In Docker:
docker compose exec api uv run python migrations/YYYYMMDD_migration_name.py
# Locally:
python migrations/YYYYMMDD_migration_name.py
Before contributing to the repository, it is necessary to initialize the pre-commit hooks:
pre-commit install
Once this is done, code formatting and linting, as well as import sorting, will be automatically checked before each commit.
Lint and format with:
uv run ruff check --fix && uv run ruff format
- `build_proba_file.py` + language: Create the probability file for the Markov chain
- `batch_generate.py` + language: Generate a batch of words (500 by default) and save them in the DB
- `classify_db_generated.py` + language: Update the generated words in the DB with their tense, conjugation, gender, number, etc.
- `classify_db_real.py` + language (from a dictionary TXT file): Update the real words in the DB with their tense, conjugation, gender, number, etc.
- `tweet.py` + language + optional `--dry-run`
To run the commands, use for example:
# In Docker:
docker compose exec api python -m commands.build_proba_file en
# Locally:
python3 -m commands.build_proba_file en
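For context on what `build_proba_file.py` feeds into: generating non-existing words with a Markov chain means learning letter-transition probabilities from real words, then sampling a random walk through them. A minimal, self-contained sketch of the idea (function names and the training word list are illustrative, not the project's actual implementation):

```python
import random
from collections import defaultdict

def build_proba(words):
    """Count letter-to-letter transitions; '^' marks word start, '$' marks word end."""
    counts = defaultdict(lambda: defaultdict(int))
    for word in words:
        chain = "^" + word + "$"
        for prev, nxt in zip(chain, chain[1:]):
            counts[prev][nxt] += 1
    # Normalize counts into probabilities per preceding letter
    return {
        prev: {nxt: n / sum(followers.values()) for nxt, n in followers.items()}
        for prev, followers in counts.items()
    }

def generate(proba, rng, max_len=12):
    """Walk the chain from '^' until the end marker '$' or max_len letters."""
    out, cur = [], "^"
    while len(out) < max_len:
        choices = proba[cur]
        cur = rng.choices(list(choices), weights=list(choices.values()))[0]
        if cur == "$":
            break
        out.append(cur)
    return "".join(out)

proba = build_proba(["banana", "bandana", "cabana"])
print(generate(proba, random.Random(0)))  # a plausible-looking pseudo-word
```

The real command presumably trains on a full dictionary per language and persists the probability table to a file, but the sampling principle is the same.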