Context
Lecture learning often fails at the moment of capture: slides move quickly, spoken details are lost, and personal notes become fragmented.
I needed a workflow that helped me understand in real time, not only summarize after class.
Intent
The goal was to build a local AI assistant that integrates lecture audio, slide content, and my own notes into a coherent study stream.
Privacy and local execution were important so sensitive course material could remain on-device.
Build
I designed and iterated on a workflow combining speech capture, content parsing, and synthesis prompts aligned with how I actually study.
The system emphasized targeted notes over raw transcripts, with support for identifying unclear topics for follow-up.
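The capture-then-synthesize loop can be sketched roughly as below. This is a minimal illustration, not the actual implementation: the names (`TranscriptSegment`, `flag_unclear`, `synthesize_notes`) and the marker-based heuristic for spotting unclear points are all hypothetical stand-ins for whatever the real system uses.

```python
# Hypothetical sketch: merge transcript, slides, and personal notes
# into a targeted study entry, surfacing unclear points for follow-up.
from dataclasses import dataclass

@dataclass
class TranscriptSegment:
    text: str
    start_s: float  # offset into the lecture, in seconds

# Illustrative heuristic: phrases that signal an unclear point.
UNCERTAINTY_MARKERS = ("not sure", "unclear", "revisit", "?")

def flag_unclear(segments):
    """Return segments containing an uncertainty marker."""
    return [s for s in segments
            if any(m in s.text.lower() for m in UNCERTAINTY_MARKERS)]

def synthesize_notes(segments, slide_bullets, personal_notes):
    """Produce targeted notes rather than a raw transcript dump."""
    flagged = flag_unclear(segments)
    return {
        "summary_points": slide_bullets + personal_notes,
        "follow_up": [f"{s.start_s:.0f}s: {s.text}" for s in flagged],
    }

segments = [
    TranscriptSegment("Backprop computes gradients layer by layer.", 120.0),
    TranscriptSegment("Not sure why the chain rule applies here?", 185.0),
]
notes = synthesize_notes(segments, ["Chain rule"], ["Review example 3"])
print(notes["follow_up"])
```

The key design point is that the output keeps only summary material plus timestamped follow-up items, which is what made post-class review directed rather than a re-listen of the whole lecture.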
Outcome
The tool improved focus during lectures and reduced post-class cleanup time. It made study sessions more directed because uncertain points were surfaced earlier.
As a personal system, it proved genuinely useful and yielded reusable patterns for future education workflows.
Lessons
Personal productivity tools succeed when they map to real behavior, not ideal behavior. Integration and timing matter more than feature count.
Local-first AI can be practical when workflow boundaries are clear and latency remains predictable.