A smart web application that helps answer weather-related clothing questions like "Do I need a coat?" or "Should I bring an umbrella?" Built with FastAPI, HTMX, and Ollama for AI-powered recommendations in a clean dark theme.
- Simple ZIP code-based weather lookup
- AI-powered clothing recommendations using local LLM
- Graceful fallback to standard recommendations when Ollama is unavailable
- Automatically resumes AI features when Ollama becomes available
- 24-hour temperature forecast visualization
- Mobile-first dark theme
- Clean, modern UI with HTMX for smooth interactions
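The graceful-fallback behavior can be sketched as below. This is an illustrative sketch, not the app's actual code: `get_recommendation`, `rule_based_recommendation`, the prompt text, and the clothing rules are all assumptions; only the Ollama `/api/generate` endpoint shape is real.

```python
import json
import urllib.error
import urllib.request

OLLAMA_HOST = "http://localhost:11434"
OLLAMA_MODEL = "mistral"

def rule_based_recommendation(temp_c: float, raining: bool) -> str:
    """Standard fallback used whenever the LLM is unreachable (rules are illustrative)."""
    advice = []
    if temp_c < 10:
        advice.append("Bring a coat.")
    if raining:
        advice.append("Bring an umbrella.")
    return " ".join(advice) or "No special clothing needed."

def get_recommendation(temp_c: float, raining: bool) -> str:
    """Try Ollama first; fall back to rules if it is unavailable.

    Because the attempt is made on every request, AI-powered answers
    resume automatically once Ollama comes back up.
    """
    payload = json.dumps({
        "model": OLLAMA_MODEL,
        "prompt": f"It is {temp_c}C and {'raining' if raining else 'dry'}. "
                  "What should I wear? Answer in one sentence.",
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.loads(resp.read())["response"]
    except OSError:  # connection refused, timeout, DNS failure, etc.
        return rule_based_recommendation(temp_c, raining)
```

The key design point is that the fallback is decided per request rather than at startup, which is what makes recovery automatic.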
1. Install Docker and Docker Compose on your system.
2. Copy the example environment file and update it with your values:
   ```shell
   cp .env.example .env
   ```
3. Get an API key:
   - Sign up for a free API key at OpenWeatherMap
   - Add your API key and default ZIP code to `.env`
4. Start the application using Docker Compose:
   ```shell
   docker-compose up -d
   ```
5. Visit http://localhost:8083 in your browser
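A minimal `docker-compose.yml` for this setup might look like the following. This is a sketch, not the repository's actual file: the service names, port mapping, and Ollama wiring are assumptions.

```yaml
services:
  web:
    build: .
    ports:
      - "8083:8000"   # host 8083 -> container 8000 (uvicorn default)
    env_file: .env
    environment:
      - OLLAMA_HOST=http://ollama:11434   # reach Ollama via the compose network
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models across restarts
volumes:
  ollama_data:
```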
1. Create a virtual environment (do not commit it to git):
   ```shell
   uv venv
   ```
2. Install Ollama:
   - Follow the installation instructions at Ollama.ai
   - Start the Mistral model:
     ```shell
     ollama run mistral
     ```
3. Copy the example environment file and update it with your values:
   ```shell
   cp .env.example .env
   ```
4. Install dependencies:
   ```shell
   uv sync
   ```
5. Get an API key:
   - Sign up for a free API key at OpenWeatherMap
   - Add your API key and default ZIP code to `.env`
   - Optionally configure Ollama settings in `.env`
6. Run the application:
   ```shell
   uvicorn app.main:app --reload
   ```
7. Visit http://localhost:8000 in your browser
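A filled-in `.env` for local development might look like this; the values are placeholders, and the variable names match the configuration options documented below.

```shell
# .env — local development settings (values are placeholders)
OPENWEATHER_API_KEY=your_openweather_api_key
DEFAULT_ZIP_CODE=10001
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=mistral
```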
- FastAPI - Modern Python web framework
- HTMX - Simple and powerful frontend interactions
- OpenWeatherMap API - Weather data
- Ollama - Local LLM for smart recommendations
- Chart.js - Temperature forecast visualization
- Jinja2 - Template rendering
- `OPENWEATHER_API_KEY` - Your OpenWeather API key
- `DEFAULT_ZIP_CODE` - Default ZIP code for weather lookup
- `OLLAMA_HOST` - Ollama server URL (default: http://localhost:11434)
- `OLLAMA_MODEL` - Ollama model to use (default: mistral)
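Loading these variables with their documented defaults can be sketched as follows; `load_settings` is a hypothetical helper (the real app may use pydantic settings or similar), and the empty-string default for `DEFAULT_ZIP_CODE` is an assumption.

```python
import os

def load_settings() -> dict:
    """Read configuration from the environment, applying the documented defaults."""
    api_key = os.environ.get("OPENWEATHER_API_KEY")
    if not api_key:
        # The API key has no sensible default, so fail fast with a clear message.
        raise RuntimeError("OPENWEATHER_API_KEY is required")
    return {
        "openweather_api_key": api_key,
        "default_zip_code": os.environ.get("DEFAULT_ZIP_CODE", ""),  # no documented default
        "ollama_host": os.environ.get("OLLAMA_HOST", "http://localhost:11434"),
        "ollama_model": os.environ.get("OLLAMA_MODEL", "mistral"),
    }
```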