A semi-local chatbot implementation using the DeepSeek 1.5B language model, with cloud tunneling via Ngrok. The project runs the model on a server (a Google Colab notebook) and makes it accessible to a simple local client through an Ngrok tunnel. You can substitute another model if you prefer.
## Features

- Uses the DeepSeek 1.5B language model
- Cloud tunneling with Ngrok for remote access
- Simple and intuitive GUI client
- Server-side implementation in Google Colab
- Local Python client application
## Requirements

### Server (Google Colab)

- Google Colab account
- Ngrok authentication token
- Python packages:
- flask
- flask-cors
- requests
- pyngrok
- transformers
- accelerate
 
### Client

- Python 3.7+
- Required packages:
- tkinter
- requests
 
## Setup

### Server

- Open `server-side.ipynb` in Google Colab
- Run the installation cells to set up dependencies
- Replace `YOUR-NGROK-AUTH-TOKEN` with your actual Ngrok token
- Run the server initialization cell
- Copy the Ngrok URL provided in the output
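For orientation, the server cells boil down to roughly the following. This is a hedged sketch, not the notebook's exact code: the `/chat` route, the `{"message": ...}` payload shape, the model ID, and the `serve()` helper are all illustrative assumptions.

```python
# Sketch of a Flask server exposed through Ngrok; route name, payload
# shape, and model ID are assumptions, not the notebook's exact code.
from flask import Flask, request, jsonify

app = Flask(__name__)

try:
    from flask_cors import CORS  # listed in the server requirements
    CORS(app)                    # allow cross-origin requests from the client
except ImportError:
    pass                         # sketch still runs without flask-cors

generator = None  # language-model pipeline, loaded lazily on first request

def load_generator():
    from transformers import pipeline
    # "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B" is an assumed checkpoint;
    # swap in whichever 1.5B model you actually use.
    return pipeline(
        "text-generation",
        model="deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B",
        device_map="auto",
    )

@app.route("/chat", methods=["POST"])
def chat():
    global generator
    if generator is None:
        generator = load_generator()
    prompt = request.get_json(force=True).get("message", "")
    reply = generator(prompt, max_new_tokens=256)[0]["generated_text"]
    return jsonify({"response": reply})

def serve():
    """Open the Ngrok tunnel and start Flask (call this in the last cell)."""
    from pyngrok import ngrok
    ngrok.set_auth_token("YOUR-NGROK-AUTH-TOKEN")  # placeholder from the steps above
    print("Public URL:", ngrok.connect(5000))      # copy this into the client
    app.run(port=5000)
```

Loading the model lazily keeps the cell quick to re-run; the first chat request pays the model-load cost instead.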
### Client

- Open `client-side.py`
- Replace `YOUR_SERVER_URL_HERE` with the Ngrok URL from the server
- Run the client application: `python client-side.py`

## Usage

- Start the server in Google Colab
- Launch the client application
- Type your message in the input field
- Press Enter or click "Kirim" (Send) to send your message
- The bot will respond in the chat display
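Under the hood, each message is a single HTTP POST to the server. A minimal sketch of the client's request logic, assuming a `/chat` endpoint and a `{"message": ...}` / `{"response": ...}` JSON shape (assumptions, not the project's confirmed API):

```python
# Sketch of the client's request helper; the /chat path, JSON shape, and
# SERVER_URL placeholder are assumptions about the project's API.
import requests

SERVER_URL = "YOUR_SERVER_URL_HERE"  # paste the Ngrok URL printed by the server

def ask_server(message, server_url=None, post=requests.post):
    """POST a chat message to the server and return the bot's reply text."""
    url = (server_url or SERVER_URL).rstrip("/") + "/chat"
    resp = post(url, json={"message": message}, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]
```

In the tkinter GUI, a helper like this would be bound to both the Enter key and the "Kirim" button, with the returned text appended to the chat display.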
## Notes

- The Ngrok URL changes each time you restart the server
- Keep the Colab notebook running while using the chatbot
- Free Ngrok tunnels have usage limitations
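Because the Ngrok URL changes on every restart, editing `client-side.py` each time gets tedious. One option (a sketch, not part of the project) is to read the URL from an environment variable; `CHATBOT_SERVER_URL` is a hypothetical name chosen for illustration:

```python
import os

def get_server_url(default="YOUR_SERVER_URL_HERE"):
    # CHATBOT_SERVER_URL is a hypothetical variable, not used by the project;
    # trailing slashes are stripped so paths like "/chat" can be appended cleanly.
    return os.environ.get("CHATBOT_SERVER_URL", default).rstrip("/")
```

With this, you would run `export CHATBOT_SERVER_URL="https://abc.ngrok-free.app"` (or the Windows equivalent) before launching the client, instead of editing the source.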
## License

This project is open source and available under the MIT License.
## Contributing

Feel free to open issues and submit pull requests to improve the project.
## Acknowledgments

- DeepSeek for their amazing language model
- Ollama for the model serving infrastructure
- Ngrok for providing the tunneling service
Made with ❤️ for fun





