# Chat Docker image using LangChain and FastAPI

Uses Ollama with the Mistral model; running the LLM locally requires adequate machine resources.

## Prerequisites

- Git
- Docker

## Setup and Installation

1. Clone the repository:
   `git clone https://github.com/your-username/takehome-python.git`
2. Enter the project directory:
   `cd LangChain`
3. Build the Docker image:
   `docker build -t chat-api .`
4. Run the container with your environment file (see the example `.env` below):
   `docker run -p 8000:8000 --env-file .env chat-api`
5. Send a test request:
   `curl -X POST http://localhost:8000/chat/ -H "Content-Type: application/json" -d "{\"message\": \"What is the capital of France?\"}"`
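
Step 4 passes an `.env` file to the container, but its contents are not shown here. A plausible sketch, assuming the app reads the Ollama endpoint and model name from the environment (both variable names are illustrative, not taken from the repository):

```
# Example .env -- variable names are assumptions; adjust to what the app actually reads
OLLAMA_BASE_URL=http://host.docker.internal:11434
MODEL_NAME=mistral
```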
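
For orientation, here is a minimal sketch of how a FastAPI `/chat/` endpoint could be wired to Mistral through LangChain's Ollama integration. This is an assumption about the app's structure, not the repository's actual code; package, model, and environment-variable names are illustrative.

```python
# Minimal sketch of a /chat/ endpoint (assumes fastapi and langchain-ollama are installed
# and an Ollama server is reachable at OLLAMA_BASE_URL).
import os

from fastapi import FastAPI
from pydantic import BaseModel
from langchain_ollama import ChatOllama

app = FastAPI()

# Model name and base URL come from the environment; the defaults below are illustrative.
llm = ChatOllama(
    model=os.getenv("MODEL_NAME", "mistral"),
    base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
)

class ChatRequest(BaseModel):
    message: str

@app.post("/chat/")
async def chat(request: ChatRequest):
    # Forward the user's message to the local Mistral model via Ollama.
    response = await llm.ainvoke(request.message)
    return {"response": response.content}
```

With a setup like this, the `curl` command in step 5 posts a JSON body with a `message` field and receives the model's reply as JSON.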