Tell the Model What It Knows

Inject the list of indexed files into the prompt so the model can answer meta questions

A question the model can't answer yet

Some questions are about the content of your files. Others are about the index itself — "What files do you have?" or "How many documents did you index?" These are meta questions: questions about what the assistant knows, not what it contains.

Try asking your assistant: "What files do you have?" Right now it answers "I don't know" — because the file list isn't in the retrieved chunks. The model only sees what search retrieves, and a meta question about the index retrieves nothing useful.

The fix is to inject the file list as a system-level line at the top of every prompt. The model then has that information available for every question, not just the ones that happen to retrieve a relevant chunk.

Why a separate parameter

Adding file_list=None as an optional parameter keeps build_prompt reusable. If no file list is passed, the function behaves exactly as before. The if file_list: guard means None or an empty list produces no extra line.

The cost is small: a short comma-separated list of filenames adds a few tokens per call, but it makes the assistant dramatically more useful for users who want to know what they can ask about.
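The optional-parameter-plus-guard pattern in miniature, using a hypothetical describe helper (the helper and its names are illustrative, not part of the assistant's code):

```python
def describe(items, label=None):
    # Same pattern as file_list=None: an optional parameter guarded by
    # a truthiness check. Passing None (or "") adds nothing, so callers
    # written before the parameter existed are unaffected.
    prefix = f"{label}: " if label else ""
    return prefix + ", ".join(items)

print(describe(["a.md", "b.md"]))           # old behavior, no prefix
print(describe(["a.md", "b.md"], "Files"))  # new behavior, prefixed
```

Because the guard is a truthiness check rather than an explicit None comparison, an empty list behaves exactly like no list at all.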

Instructions

  1. Add file_list=None to build_prompt's signature after context_chunks.
  2. Inside build_prompt, on the line after context = ..., add files_line = "".
  3. On the next line, add if file_list: — this checks whether any files were indexed. If the list contains items, set files_line = f"You have access to these files: {', '.join(file_list)}\n".
  4. In chat_loop, before the while True: line, add file_list = sorted(set(chunk["source"] for chunk in chunks)).
  5. Update the build_prompt call inside the loop to pass file_list as the third argument.
  6. Back in build_prompt, replace return prompt with a return that concatenates the string in multi-line format:

return (
    f"{files_line}"
    "You are a helpful assistant. Answer the question using only the context below.\n"
    "If the answer is not in the context, say \"I don't know.\"\n\n"
    f"Context:\n{context}\n\n"
    f"Question:\n{question}"
)
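Assembled, the steps above look roughly like this sketch (the sample chunks and their filenames are made up for illustration; only the "source" key matches the lesson's data shape):

```python
def build_prompt(question, context_chunks, file_list=None):
    # Join retrieved chunks into one context block.
    context = "\n\n".join(context_chunks)
    # Meta-information line: empty unless a non-empty file list was passed.
    files_line = ""
    if file_list:
        files_line = f"You have access to these files: {', '.join(file_list)}\n"
    return (
        f"{files_line}"
        "You are a helpful assistant. Answer the question using only the context below.\n"
        "If the answer is not in the context, say \"I don't know.\"\n\n"
        f"Context:\n{context}\n\n"
        f"Question:\n{question}"
    )

# In chat_loop, before the while True: line. Each chunk carries the
# filename it came from, so a sorted set yields one entry per file.
chunks = [{"source": "notes.md"}, {"source": "todo.md"}, {"source": "notes.md"}]
file_list = sorted(set(chunk["source"] for chunk in chunks))
# file_list is ["notes.md", "todo.md"] -- duplicates collapsed, order stable

prompt = build_prompt("What files do you have?", ["..."], file_list)
```

With file_list passed as the third argument, every prompt opens with the files line, so the model can answer meta questions even when retrieval surfaces nothing relevant.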