Cookiecutter template for Flask-RESTful, including blueprints, an application factory, and more
This cookiecutter is a simple boilerplate for starting a REST API using Flask, Flask-RESTful, marshmallow, SQLAlchemy and JWT. It comes with a basic project structure and configuration, including blueprints, an application factory and basic unit tests.
Features
- Simple Flask application using an application factory and blueprints
- Flask command line interface integration
- Simple CLI implementation with basic commands (init, run, etc.)
- Flask Migrate included in entry point
- Authentication using Flask-JWT-Extended including access token and refresh token management
- Simple pagination utils
- Unit tests using pytest and factoryboy
- Configuration using environment variables
- OpenAPI json file and swagger UI
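The application-factory and blueprint layout at the heart of the template can be sketched like this (blueprint and route names here are illustrative, not the template's actual ones):

```python
from flask import Blueprint, Flask, jsonify

# Illustrative blueprint; the generated project wires Flask-RESTful
# resources onto blueprints in the same spirit.
api = Blueprint("api", __name__, url_prefix="/api/v1")

@api.route("/ping")
def ping():
    return jsonify(status="ok")

def create_app(testing=False):
    """Application factory: build and configure the app on demand."""
    app = Flask(__name__)
    app.config["TESTING"] = testing
    app.register_blueprint(api)
    return app
```

Because the app is built on demand, tests can create isolated instances with their own configuration.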
Used packages:
- Flask
- Flask-RESTful
- Flask-Migrate
- Flask-SQLAlchemy
- Flask-Marshmallow
- Flask-JWT-Extended
- marshmallow-sqlalchemy
- passlib
- tox
- pytest
- factoryboy
- dotenv
- apispec
- Installation
- Configuration
- Authentication
- Running tests
- WSGI Server
- Flask CLI
- Using Celery
- Using Docker
- Makefile
- APISpec and swagger
- Changelog
For the examples below, let's say you named your app myapi and your project myproject.
Once the project is generated with cookiecutter, you can install it using pip:
cd myproject
pip install -r requirements.txt
pip install -e .
You now have access to the CLI commands and can initialize your project:
myapi db upgrade
myapi init
To list all commands
myapi --help
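Under the hood the `myapi` entry point is a click command group; a minimal sketch of the idea (the command body here is hypothetical — the real `init` seeds the database):

```python
import click

@click.group()
def cli():
    """myapi management commands."""

@cli.command()
def init():
    """Initialize the project (sketch; the real command creates a default user)."""
    click.echo("init done")

if __name__ == "__main__":
    cli()
```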
Configuration is handled by environment variables; for development purposes you just
need to update or add entries in the .flaskenv file.
It is filled by default with the following content:
FLASK_ENV=development
FLASK_APP="myapi.app:create_app"
SECRET_KEY=changeme
DATABASE_URI="sqlite:////tmp/myapi.db"
CELERY_BROKER_URL=amqp://guest:guest@localhost/ # only present when celery is enabled
CELERY_RESULT_BACKEND_URL=amqp://guest:guest@localhost/ # only present when celery is enabled
Available configuration keys:

- `FLASK_ENV`: Flask configuration key; enables `DEBUG` if set to `development`
- `SECRET_KEY`: your application secret key
- `DATABASE_URI`: SQLAlchemy connection string
- `CELERY_BROKER_URL`: URL to use for the Celery broker, only when Celery is enabled
- `CELERY_RESULT_BACKEND_URL`: URL to use for the Celery result backend (e.g. `redis://localhost`)
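A sketch of how these variables typically land on the Flask config (attribute names mirror the keys above; the generated config.py may differ in detail):

```python
import os

class Config:
    """Maps environment variables onto Flask / extension settings."""
    SECRET_KEY = os.getenv("SECRET_KEY", "changeme")
    SQLALCHEMY_DATABASE_URI = os.getenv("DATABASE_URI", "sqlite:////tmp/myapi.db")
    SQLALCHEMY_TRACK_MODIFICATIONS = False
    # Only meaningful when the Celery option was enabled at generation time
    CELERY_BROKER_URL = os.getenv("CELERY_BROKER_URL")
    CELERY_RESULT_BACKEND_URL = os.getenv("CELERY_RESULT_BACKEND_URL")
```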
To access protected resources, you will need an access token. You can generate
an access token and a refresh token using the /auth/login endpoint; example using curl:

curl -X POST -H "Content-Type: application/json" -d '{"username": "admin", "password": "admin"}' http://localhost:5000/auth/login

This will return something like this:
{
"access_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ0eXBlIjoiYWNjZXNzIiwiaWRlbnRpdHkiOjEsImlhdCI6MTUxMDAwMDQ0MSwiZnJlc2giOmZhbHNlLCJqdGkiOiI2OTg0MjZiYi00ZjJjLTQ5MWItYjE5YS0zZTEzYjU3MzFhMTYiLCJuYmYiOjE1MTAwMDA0NDEsImV4cCI6MTUxMDAwMTM0MX0.P-USaEIs35CSVKyEow5UeXWzTQTrrPS_YjVsltqi7N4",
"refresh_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZGVudGl0eSI6MSwiaWF0IjoxNTEwMDAwNDQxLCJ0eXBlIjoicmVmcmVzaCIsImp0aSI6IjRmMjgxOTQxLTlmMWYtNGNiNi05YmI1LWI1ZjZhMjRjMmU0ZSIsIm5iZiI6MTUxMDAwMDQ0MSwiZXhwIjoxNTEyNTkyNDQxfQ.SJPsFPgWpZqZpHTc4L5lG_4aEKXVVpLLSW1LO7g4iU0"
}

You can use access_token to access protected endpoints:
curl -X GET -H "Content-Type: application/json" -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ0eXBlIjoiYWNjZXNzIiwiaWRlbnRpdHkiOjEsImlhdCI6MTUxMDAwMDQ0MSwiZnJlc2giOmZhbHNlLCJqdGkiOiI2OTg0MjZiYi00ZjJjLTQ5MWItYjE5YS0zZTEzYjU3MzFhMTYiLCJuYmYiOjE1MTAwMDA0NDEsImV4cCI6MTUxMDAwMTM0MX0.P-USaEIs35CSVKyEow5UeXWzTQTrrPS_YjVsltqi7N4" http://127.0.0.1:5000/api/v1/users

You can use the refresh token to retrieve a new access_token using the /auth/refresh endpoint:
curl -X POST -H "Content-Type: application/json" -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZGVudGl0eSI6MSwiaWF0IjoxNTEwMDAwNDQxLCJ0eXBlIjoicmVmcmVzaCIsImp0aSI6IjRmMjgxOTQxLTlmMWYtNGNiNi05YmI1LWI1ZjZhMjRjMmU0ZSIsIm5iZiI6MTUxMDAwMDQ0MSwiZXhwIjoxNTEyNTkyNDQxfQ.SJPsFPgWpZqZpHTc4L5lG_4aEKXVVpLLSW1LO7g4iU0" http://127.0.0.1:5000/auth/refresh

This will only return a new access token:
{
"access_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ0eXBlIjoiYWNjZXNzIiwiaWRlbnRpdHkiOjEsImlhdCI6MTUxMDAwMDYxOCwiZnJlc2giOmZhbHNlLCJqdGkiOiIzODcxMzg4Ni0zNGJjLTRhOWQtYmFlYS04MmZiNmQwZjEyNjAiLCJuYmYiOjE1MTAwMDA2MTgsImV4cCI6MTUxMDAwMTUxOH0.cHuNf-GxVFJnUZ_k9ycoMMb-zvZ10Y4qbrW8WkXdlpw"
}

The simplest way to run tests is to use tox: it will create a virtualenv for the tests, install all dependencies and run pytest:
tox
But you can also run pytest manually; you just need to install the test dependencies first:
pip install pytest pytest-runner pytest-flask pytest-factoryboy factory_boy
pytest
With docker-compose and the Makefile:

make tests

WARNING: you will need to set env variables
This project provides a simple WSGI entry point to run gunicorn or uwsgi, for example.
For gunicorn you only need to run the following commands:
pip install gunicorn
gunicorn myapi.wsgi:app
And that's it! Gunicorn is running on port 8000.
If you chose gunicorn as your WSGI server, the proper commands should already be in your docker-compose file.
It's pretty much the same as gunicorn here:
pip install uwsgi
uwsgi --http 127.0.0.1:5000 --module myapi.wsgi:app
And that's it! uWSGI is running on port 5000.
If you chose uwsgi as your WSGI server, the proper commands should already be in your docker-compose file.
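The entry point itself is tiny: it just instantiates the app so the server has a module-level `app` object to import. A self-contained stand-in (the generated `myapi/wsgi.py` imports `create_app` from `myapi.app` rather than defining it inline):

```python
from flask import Flask

def create_app():
    # Stand-in for myapi.app.create_app
    return Flask("myapi")

# Both `gunicorn myapi.wsgi:app` and `uwsgi --module myapi.wsgi:app`
# look up this module-level `app` object.
app = create_app()
```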
This cookiecutter is fully compatible with the default Flask CLI and uses a .flaskenv file to set the correct env variables to bind the application factory.
Note that we also set FLASK_ENV to development to enable the debugger.
This cookiecutter has an optional Celery integration that lets you choose whether or not to use it in your project. If you choose to use Celery, additional code and files will be generated to get you started with it.
This code includes a dummy task located in yourproject/yourapp/tasks/example.py that only returns "OK", and a celery_app file used to start your Celery workers.
In your project path, once dependencies are installed, you can just run
celery worker -A myapi.celery_app:app --loglevel=info
If you have updated your configuration for the broker / result backend, your workers should start and you should see the example task as available:
[tasks]
. myapi.tasks.example.dummy_task
To run a task you can either import it and call it
>>> from myapi.tasks.example import dummy_task
>>> result = dummy_task.delay()
>>> result.get()
'OK'

Or use the celery extension:
>>> from myapi.extensions import celery
>>> celery.send_task('myapi.tasks.example.dummy_task').get()
'OK'

WARNING: both Dockerfile and docker-compose.yml are NOT suited for production; use them for development only or as a starting point.
This template offers simple Docker support to help you get started, and it comes with both a Dockerfile and a docker-compose.yml. Please note that docker-compose is mostly useful when using Celery,
since it takes care of running rabbitmq, redis, your web API and Celery workers at the same time, but it also works if you don't use Celery at all.
The Dockerfile intentionally has no entrypoint, allowing you to run any command from it (server, shell, init, celery, ...).
Note that you still need to init your app on first start, even when using compose.
docker build -t myapp .
...
docker run --env-file=.flaskenv myapp myapi init
docker run --env-file=.flaskenv -p 5000:5000 myapp myapi run -h 0.0.0.0
* Serving Flask app "myapi.app:create_app" (lazy loading)
* Environment: development
* Debug mode: on
* Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 214-619-010

With compose:
docker-compose up
...
docker exec -it <container_id> myapi init

With docker-compose and the Makefile:

make init

The Makefile exposes the following targets:

- `make init`: initialize the environment
- `make build`: build the containers
- `make run`: run the containers
- `make db-migrate`: create a new database migration
- `make db-upgrade`: apply database migrations
- `make test`: run tests

This boilerplate comes with pre-configured APISpec and swagger endpoints. Using the default configuration you have two endpoints available:
- `/swagger.json`: returns the OpenAPI specification file in JSON format
- `/swagger-ui`: swagger UI configured to hit the OpenAPI JSON file
This comes with a very simple extension that allows you to override basic APISpec settings in your config.py file:

- `APISPEC_TITLE`: title for your spec, defaults to `{{cookiecutter.project_name}}`
- `APISPEC_VERSION`: version of your API, defaults to `1.0.0`
- `OPENAPI_VERSION`: OpenAPI version of your spec, defaults to `3.0.2`
- `SWAGGER_JSON_URL`: URL for your JSON specification, defaults to `/swagger.json`
- `SWAGGER_UI_URL`: URL for swagger-ui, defaults to `/swagger-ui`
- `SWAGGER_URL_PREFIX`: URL prefix to use for the swagger blueprint, defaults to `None`
- Added python 3.8 support
- Upgraded to marshmallow 3
- Added `lint` and `tests` envs to tox
- Added black support
- Improved travis tests
- Updated Makefile to handle tests with celery
- Updated tox to handle env variables for celery when running tests
- Added initial db migration instead of relying on `db.create_all()`
- Added new step to create database in README
- Various cleanup
- Added apispec dependencies
- Registered `users` endpoints into swagger
- New `apispec` extension
- Added two new routes `/swagger.json` and `/swagger-ui` (configurable urls)
- Added swagger html template
- Add travis file
- Added docker and docker-compose support
- Update configuration to only use env variables; `.flaskenv` has been updated too
- Add unit tests for celery
- Add flake8 to tox
- Configuration file can no longer be overridden by the `MYAPP_CONFIG` env variable
- Various cleanups (unused imports, removed `configtest.py` file, flake8 errors)