
# Dockerized chat API using LangChain and FastAPI

It uses Ollama with the Mistral model, so the host machine needs enough resources to run the LLM locally.
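The container wires a FastAPI app to a local Mistral model served by Ollama through LangChain. The snippet below is only a minimal sketch of that wiring, assuming the `langchain-ollama` integration and the endpoint shape implied by the curl examples further down; it is not the repository's actual implementation.

```python
# Minimal sketch (assumptions: langchain-ollama package, endpoint names
# taken from the curl examples below; not the repository's real code).
from fastapi import FastAPI
from pydantic import BaseModel
from langchain_ollama import ChatOllama

app = FastAPI()
llm = ChatOllama(model="mistral")  # talks to a locally running Ollama server

class ChatRequest(BaseModel):
    message: str

@app.post("/chat/")
def chat(session_id: str, body: ChatRequest):
    # session_id arrives as a query parameter and the message as a JSON body,
    # mirroring the curl call shown in the Setup section.
    reply = llm.invoke(body.message)
    return {"session_id": session_id, "response": reply.content}
```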

## Prerequisites

- Git
- Docker

## Setup and Installation

1. Clone the repository: `git clone https://github.com/Galomortal47/LangChain.git`

2. Enter the project directory: `cd LangChain`

3. Build the image and start the container: `docker build -t chat-api . && docker run -p 8000:8000 --env-file .env chat-api`

4. Test the API (a Python equivalent of these calls is sketched after this list):

   `curl -X POST "http://localhost:8000/chat/?session_id=test_session" -H "Content-Type: application/json" -d "{\"message\": \"Qual a capital da França?\"}"`

   or

   `curl -X POST http://localhost:8000/ask/ -F "session_id=test_session" -F "question=What is the capital of France?" -F "file=@E:/test.txt"`

   or

   `curl http://localhost:8000/health`
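
If you prefer Python over curl, the sketch below calls the same endpoints with the `requests` library. The paths and field names simply mirror the curl commands above; the local file name and the assumption that responses are JSON are illustrative.

```python
# Hedged example: exercising the running container from Python instead of curl.
import requests

BASE_URL = "http://localhost:8000"

# Plain chat message (equivalent to the first curl command).
resp = requests.post(
    f"{BASE_URL}/chat/",
    params={"session_id": "test_session"},
    json={"message": "Qual a capital da França?"},
)
print(resp.json())

# Question about an uploaded file (equivalent to the second curl command).
with open("test.txt", "rb") as f:  # any local text file
    resp = requests.post(
        f"{BASE_URL}/ask/",
        data={"session_id": "test_session", "question": "What is the capital of France?"},
        files={"file": f},
    )
print(resp.json())

# Health check (equivalent to the third curl command).
print(requests.get(f"{BASE_URL}/health").json())
```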