Commit 28af5f6
phase 3 day 1: RLV harness skeleton + project doc + integration gate
Project documentation:
- docs/phase3_rlv_challenge.md (canonical source of truth, 350+ lines)
- Problem framing (RAG silent hallucination + long-context cliff)
- 5-stage architecture mapped from human cognitive retrieval pattern
- Why this is feasible *now* with quant.cpp specifically
- Day 1-7 plan with Karpathy gates per day
- Files-and-directory layout
- Reading order if a future Claude Code session loads this project
- Day 1 Karpathy log section
- ~/.claude/projects/.../memory/project_phase3_rlv.md (memory entry)
- MEMORY.md updated with Phase 3 RLV as the top item
Harness skeleton (bench/rlv/):
- README.md pointing back to the project doc
- rlv_orchestrator.py: 5-stage flow controller, end-to-end answer_question
- stages/_llm.py: shared HTTP client for quant-server
- start_server() / stop_server() lifecycle
- check_cliff_budget() enforcing the cliff invariant
- llm_call() via /v1/chat/completions with tolerant system prompt
- stages/gist.py: Stage 1 — chunked summarisation (~500 char chunks)
- stages/locator.py: Stage 2 — outline + question -> chunk pointer
- stages/lookup.py: Stage 3 — region + question -> answer (minimal prompt)
- stages/verifier.py: Stage 4 — gist + answer -> {confident, unsure, contradicted}
- stages/researcher.py: Stage 5 — retry with different region (max 3)
- tests/smoke_test.py: D1 gate — orchestrator on a 4-section synthetic doc
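The 5-stage flow above can be sketched as a plain control loop. This is a hypothetical illustration, not the real `bench/rlv` API: the stage callables, dict keys, and return shape are all placeholder names, and stages are injected so the loop can run without a model.

```python
# Hypothetical sketch of the 5-stage RLV control flow; names are
# illustrative, not the real rlv_orchestrator.py API.

def answer_question(doc, question, stages, max_retries=3):
    gists = stages["gist"](doc)                        # Stage 1: chunk summaries
    tried = set()
    answer, verdict, attempt = None, "unsure", 0
    for attempt in range(1, max_retries + 1):
        region = stages["locate"](gists, question, tried)  # Stage 2: pick a chunk
        answer = stages["lookup"](doc, region, question)   # Stage 3: read + answer
        verdict = stages["verify"](gists, answer)          # Stage 4: check vs gist
        if verdict == "confident":
            break
        tried.add(region)                                  # Stage 5: retry elsewhere
    return {"answer": answer, "verdict": verdict, "attempts": attempt}

# Dummy stages to exercise the loop without an LLM:
dummy = {
    "gist":   lambda doc: {0: doc[:100]},
    "locate": lambda gists, q, tried: 0,
    "lookup": lambda doc, region, q: "stub answer",
    "verify": lambda gists, ans: "confident",
}
result = answer_question("some document text", "who is the CFO?", dummy)
```

Injecting the stages keeps the retry logic testable in isolation, which is what the D1 smoke test exercises.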
D1 gate result:
- Integration ✅: pipeline runs end-to-end without crashing
- Accuracy ❌: the smoke test picks the wrong entity because the gist
  summaries are too vague for the locator to discriminate between
  sections. The model returns the first entity mentioned in the read
  region (Maria Santos / CEO) instead of the question target (John
  Williams / CFO). This is exactly the Phase 2B primacy-bias failure
  mode, and it's what RLV is supposed to fix by isolating each read
  to a single-section chunk.
Lessons surfaced and embedded in code comments:
- When spawning subprocesses, pass stderr=subprocess.STDOUT so stderr is
  merged into stdout in the same order as bash's 2>&1; otherwise the
  model-output parser cannot find the --- delimiters.
- Llama-3.2-3B-Q4 in chat mode emits "## Step 1: ..." reasoning chains
unless given a short, direct system prompt. Fighting structured
formats (TOPICS:/CHUNK:/VERDICT:) is counterproductive — use direct
natural-language questions and tolerant parsers instead.
- Server-based architecture (quant-server) is mandatory: per-call
subprocess start = ~50s model reload overhead = 5 min per question
for a 5-stage pipeline. With the server it's ~10s per call.
- Primacy bias kicks in at sub-cliff sizes too. Chunking even small
docs to ~500 chars is necessary for RLV's locator to have anything
to choose between.
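The subprocess lesson above can be sketched as follows. The child command here is an illustrative stand-in for the real quant.cpp invocation:

```python
import subprocess
import sys

# Merge stderr into stdout (the Python equivalent of bash's 2>&1) so
# delimiter lines and any stderr noise arrive in one stream, in order.
proc = subprocess.run(
    [sys.executable, "-c",
     "import sys; print('---'); print('noise', file=sys.stderr); print('---')"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,  # without this, stderr text never reaches the parser
    text=True,
)
parts = proc.stdout.split("---")  # the parser can now find both delimiters
```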
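The tolerant-parser lesson can be sketched as: accept the structured label if the model happened to emit it, otherwise scan the free-form text. This is a minimal illustration using the verifier's verdict vocabulary; the helper name and regexes are assumptions, not the real stages/verifier.py code.

```python
import re

# Verdict vocabulary from the verifier stage.
VERDICTS = {"confident", "unsure", "contradicted"}

def parse_verdict(text: str) -> str:
    """Accept 'VERDICT: x' if present, else fall back to scanning free text."""
    m = re.search(r"VERDICT:\s*(\w+)", text, re.IGNORECASE)
    if m and m.group(1).lower() in VERDICTS:
        return m.group(1).lower()
    # Fallback: look for any verdict word anywhere in the answer, which
    # survives "## Step 1: ..." style reasoning preambles.
    for word in re.findall(r"[a-z]+", text.lower()):
        if word in VERDICTS:
            return word
    return "unsure"  # default to the safe middle verdict
```

The fallback path is what makes the parser robust to the 3B model ignoring the requested format.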
D2 plan (next): pivot the locator to use first-100-char chunk snippets
as the index instead of model-written summaries. Direct text extraction
beats LLM summarization for the indexing signal that the locator needs.
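The D2 pivot could look something like the sketch below: slice to ~500-char chunks, then index each chunk by its first 100 characters instead of a model-written summary. Function names and the fixed-size slicing are illustrative assumptions; a real chunker might prefer paragraph boundaries.

```python
def chunk_doc(doc: str, size: int = 500) -> list[str]:
    """Split a document into ~size-char chunks (naive fixed-size slicing;
    a real chunker might snap to paragraph boundaries)."""
    return [doc[i:i + size] for i in range(0, len(doc), size)]

def snippet_index(chunks: list[str], snippet_len: int = 100) -> list[str]:
    """First-N-chars of each chunk: a cheap locator index that needs no
    LLM call and preserves the chunk's own discriminative wording."""
    return [c[:snippet_len].replace("\n", " ") for c in chunks]

doc = "Section A. " * 60 + "Section B. " * 60   # ~1320 chars of toy text
index = snippet_index(chunk_doc(doc))
```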
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1 parent: 7646711
11 files changed: 1295 additions, 0 deletions
File tree:
- bench/rlv
  - stages
  - tests
- docs