Environment variables
- `KNOWLEDGE_API_KEY` (required)
- `OPENAI_API_KEY` (optional)
- `POSTHOG_PROJECT_API_KEY` (optional)
- `AZURE_OPENAI_API_KEY` (optional)
- `AZURE_OPENAI_ENDPOINT` (optional)
- `AZURE_OPENAI_DEPLOYMENT` (optional)
- `AZURE_OPENAI_DEPLOYMENT_EMBEDDINGS` (optional)
- `WEB_IMAGE` (optional), default: `ai-ml-web:latest` - you can skip building your local image and use the pre-built no-code image `bojanrajh/python:latest`
Examples
```bash
$ export KNOWLEDGE_API_KEY="s0m3-r4nd0m-4p1-k3y-h34d3r-f0r-b4s1c-s3c00r1ty"
$ export OPENAI_API_KEY="sk-..."
$ export POSTHOG_PROJECT_API_KEY="phc_..."
$ export AZURE_OPENAI_API_KEY="..."
$ export AZURE_OPENAI_ENDPOINT="..."
$ export AZURE_OPENAI_DEPLOYMENT="..."
$ export AZURE_OPENAI_DEPLOYMENT_EMBEDDINGS="..."
$ export WEB_IMAGE="bojanrajh/python:latest"
```

Or prefix `docker stack deploy` / `docker compose` commands with `env WEB_IMAGE="bojanrajh/python:latest"`.
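For example, combined with the swarm deployment command from below:

```bash
$ env WEB_IMAGE="bojanrajh/python:latest" docker stack deploy ai-ml -c docker-compose.yml --prune
```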
Optionally, build a local docker image with:

- `python` v3.10
- `poetry` (+ install dependencies)
- `uvicorn` server
```bash
$ docker build -t ai-ml-web:latest -f ./Dockerfile-web .
# or for local version with tensorflow and pytest
$ docker build -t ai-ml-web:latest -f ./Dockerfile-web-nocode .
```

Init docker swarm and deploy the stack.
```bash
$ docker swarm init
$ docker stack deploy ai-ml -c docker-compose.yml --prune
```

Or if you would like to use standard non-swarm docker compose:
```bash
$ docker compose up
```

Or if you would like to manually run a single container:
```bash
$ docker run -dit \
-p 10002:80 \
-v "$PWD:/code" \
-v "$PWD/data:/data/docs" \
-v "$PWD/db:/data/db" \
-v "$PWD/cache:/data/cache" \
ai-ml-web:latest
```

Open your browser: http://172.18.0.1:10002/ or http://localhost:10002/
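You can also check from the command line that the container responds (this just prints the HTTP status code of the root path):

```bash
$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:10002/
```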
If the above links do not work, check the IPs returned by this command:
```bash
$ docker container exec -it $(docker ps -f name=ai-ml_web --format "{{.ID}}") hostname -I
```

Run tests.
```bash
$ docker container exec -it $(docker ps -f name=ai-ml_web --format "{{.ID}}") pytest
```

Install dependencies with poetry.
```bash
$ poetry install
```

Enter isolated poetry shell.
```bash
$ poetry shell
```

Run uvicorn web server.
```bash
$ uvicorn web.main:app --host 0.0.0.0 --port 80 --reload
```

Activate direnv from your project root:
```bash
$ direnv allow
```
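direnv loads variables from an `.envrc` file in the project root. A minimal sketch, using the variable names from the list above (values are placeholders, adjust to your setup):

```bash
# .envrc - hypothetical example
export KNOWLEDGE_API_KEY="s0m3-r4nd0m-4p1-k3y-h34d3r-f0r-b4s1c-s3c00r1ty"
export OPENAI_API_KEY="sk-..."
```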
Run uvicorn web server.

```bash
$ uvicorn web.main:app --host 0.0.0.0 --port 80 --reload
```

Upload a .zip containing .md files.
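If you need to produce such an archive from a folder of Markdown files, something like this works (the `docs/` folder and archive name are hypothetical):

```bash
$ zip -r test.zip docs -i '*.md'
```

Then upload it: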
```bash
$ curl \
-v \
-F [email protected] \
-F collection=test \
-H "X-Shopware-Api-Key: your-api-key" \
https://ai-ml.fly.dev/upload-input
```

Ingest uploaded documents.
```bash
$ curl \
-X POST \
--data '{"collection":"test"}' \
-H "Content-Type: application/json" \
-H "X-Shopware-Api-Key: your-api-key" \
https://ai-ml.fly.dev/ingest
```

Search a collection.
```bash
$ curl \
-X POST \
--data '{"search":"keywords","collection":"test"}' \
-H "Content-Type: application/json" \
https://ai-ml.fly.dev/query
```
Find the nearest neighbours of a document in a collection.
```bash
$ curl \
-X POST \
-H "Content-Type: application/json" \
--data '{"query":"document/identifier/foo","collection":"test"}' \
https://ai-ml.fly.dev/neighbours
```
Ask the AI engine to generate an answer to a question.
```bash
$ curl \
-X POST \
--data '{"q":"What is Shopware?","collection":"test"}' \
https://ai-ml.fly.dev/question
```

- Install Chrome extension ModHeader or similar.
- Set the `X-Shopware-Api-Key` header.
- Download the remote database (`.faiss` and `.pkl`) to your local computer: https://ai-ml.fly.dev/download/db/{collection}
- Extract it to your local `/data/db-{collection}` directory.
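A sketch of the download step with curl, assuming the same `X-Shopware-Api-Key` header as the other endpoints, a collection named `test`, and a .zip archive (the actual archive format is not documented here):

```bash
$ curl \
  -H "X-Shopware-Api-Key: your-api-key" \
  -o db-test.zip \
  https://ai-ml.fly.dev/download/db/test
# extract into your local db directory (adjust the target to your /data/db-{collection} layout)
$ unzip db-test.zip -d ./data/db-test
```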
Notes:
- auto-reload is supported with the `--reload` parameter in the `uvicorn` entrypoint
Fly.io deployment:
- See ./.github/workflows/test.yml
- `fly auth docker --access-token ...`
- `fly deploy -i ai-ml-server:latest` - push local image to fly.io, then deploy
- `fly secrets set OPENAI_API_KEY="..."` - or fallback to tensorflow
- `fly secrets set KNOWLEDGE_API_KEY="..."` - required
- `fly secrets set POSTHOG_PROJECT_API_KEY="..."` - optional
- `fly secrets set AZURE_OPENAI_API_KEY="..."` - optional
- `fly secrets set AZURE_OPENAI_ENDPOINT="..."` - optional
- `fly secrets set AZURE_OPENAI_DEPLOYMENT="..."` - optional
- `fly secrets set AZURE_OPENAI_DEPLOYMENT_EMBEDDINGS="..."` - optional
- `fly volumes create data --region ams --size 1` + see ./fly.toml
- `fly autoscale set min=2 max=4`
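Put together, one possible sequence (only the commands listed above, assuming the Fly.io app already exists and a local image tagged `ai-ml-server:latest` has been built):

```bash
$ fly auth docker --access-token "..."
$ fly secrets set KNOWLEDGE_API_KEY="..."
$ fly secrets set OPENAI_API_KEY="..."
$ fly volumes create data --region ams --size 1
$ fly deploy -i ai-ml-server:latest
$ fly autoscale set min=2 max=4
```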