Lesson Complete!
Remember the Conversation
What you learned
- Why LLMs are stateless and why follow-up questions fail without conversation history
- How to format prior turns into a readable text block with `format_history`
- How to add a `history` parameter to `build_prompt` so every prompt includes prior context
- How to accumulate user and assistant messages in `chat_loop` after each exchange
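The pieces above fit together as in this minimal sketch. Only the names `format_history`, `build_prompt`, and `chat_loop` come from the lesson; the exact implementations, the `(role, text)` tuple shape for history entries, and the `ask_llm` callback are assumptions for illustration.

```python
def format_history(history):
    """Render prior (role, text) turns as a readable text block."""
    return "\n".join(f"{role.capitalize()}: {text}" for role, text in history)


def build_prompt(user_message, history):
    """Build a prompt that includes prior context, so follow-ups make sense."""
    prompt = ""
    if history:
        prompt += f"Conversation so far:\n{format_history(history)}\n\n"
    prompt += f"User: {user_message}\nAssistant:"
    return prompt


def chat_loop(ask_llm):
    """Read user input, call the model, and accumulate both sides of each exchange."""
    history = []  # grows by two entries per exchange: user turn, then assistant turn
    while True:
        user_message = input("> ")
        if not user_message:
            break
        reply = ask_llm(build_prompt(user_message, history))
        print(reply)
        history.append(("user", user_message))
        history.append(("assistant", reply))
```

Because `history` is rebuilt into every prompt, the stateless model still "sees" earlier turns, which is what makes follow-up questions work.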
Next up
In the next lesson, you will add streaming so the assistant prints tokens as they arrive instead of waiting for the full response — and add slash commands so you can control the session from the terminal.