Chat API in Docker using LangChain and FastAPI
Uses Ollama with the Mistral model, so the host needs enough resources to run the LLM locally.
Prerequisites
- Docker installed
- A .env file with the environment variables the container expects (passed via --env-file)
- Enough RAM and CPU to run the Mistral model locally through Ollama
Setup and Installation
- Clone the repository: git clone https://github.com/your-username/takehome-python.git
- Enter the project directory: cd LangChain
- Build the image: docker build -t chat-api .
- Run the container: docker run -p 8000:8000 --env-file .env chat-api
- Test the chat endpoint: curl -X POST http://localhost:8000/chat/ -H "Content-Type: application/json" -d "{\"message\": \"Qual a capital da França?\"}"
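The curl call above can also be issued from Python. This is a minimal client sketch: the endpoint path and the JSON shape ({"message": ...}) come from the curl example; the function names and the assumption that the container is running on localhost:8000 are mine.

```python
import json
import urllib.request

# Assumed local deployment from the docker run step above.
API_URL = "http://localhost:8000/chat/"

def build_request(message: str) -> urllib.request.Request:
    """Build a POST request matching the JSON shape used in the curl example."""
    payload = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(message: str) -> str:
    """Send the message to the chat endpoint and return the raw response body.

    Requires the chat-api container to be running locally.
    """
    with urllib.request.urlopen(build_request(message)) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    print(ask("Qual a capital da França?"))
```

The response format depends on how the FastAPI endpoint is implemented, so the client returns the raw body rather than assuming a particular JSON schema.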