
analyze_response crashes on raw JSONL stream after productive timeout #250

@PeakCondition

Bug Description

When Claude Code times out (productive timeout — files were changed), analyze_response in response_analyzer.sh crashes/hangs trying to parse the full JSONL stream output (~4MB, 12K+ lines).

Root Cause

In ralph_loop.sh around line 1616, after a timeout the stream output has no "type":"result" line (Claude was killed mid-stream). The code correctly:

  1. Logs a warning ("Could not find result message in stream output")
  2. Extracts the session ID from the system message as a fallback (the fix for #198, "False timeout on successful Claude execution + stream output parsing failure")
  3. Preserves the stream output in _stream.log

But it does not replace output_file — the variable still points at the full 3.9MB JSONL stream. Then at line 1934, analyze_response "$output_file" is called with that raw stream.

Inside response_analyzer.sh at line 352:

local output_content=$(cat "$output_file")

This loads the entire 3.9MB into a bash variable. Then parse_json_response calls jq empty on it, which fails because it's JSONL (one object per line), not a single JSON object. The function then falls through to text-based analysis on 12K+ lines and either hangs or crashes.
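The shape mismatch is easy to detect cheaply: a properly extracted result file is a single JSON object on one line, while the raw stream is thousands of lines. A minimal sketch of that heuristic (the helper name is hypothetical, not from response_analyzer.sh):

```shell
# Hypothetical helper: treat any multi-line file as a raw JSONL stream.
# A correctly extracted result file is a single JSON object on one line.
# (A truncated final line without a trailing newline undercounts by one,
# which doesn't matter for a 12K-line stream.)
looks_like_jsonl_stream() {
    local file="$1"
    local line_count
    line_count=$(wc -l < "$file")
    [[ "$line_count" -gt 1 ]]
}
```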

Steps to Reproduce

  1. Start Ralph with --live mode and session continuity enabled
  2. Let Claude Code time out (productive work happening, just runs past the timeout)
  3. The stream output will have no "type":"result" line
  4. analyze_response receives the full stream file and hangs
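The failure input can be simulated without running Claude at all. A sketch (the file contents are illustrative, not real stream output):

```shell
# Fabricate a timeout-truncated stream: system and assistant lines,
# but no "type":"result" line (the process was killed mid-stream).
stream_log=$(mktemp)
printf '%s\n' \
    '{"type":"system","session_id":"abc-123"}' \
    '{"type":"assistant","message":{"content":"working..."}}' \
    > "$stream_log"

# The check ralph_loop.sh effectively performs: no result line found.
if ! grep -q '"type":"result"' "$stream_log"; then
    echo "WARN: Could not find result message in stream output"
fi
```

Without Fix 1, this is the file that ends up in `output_file` and gets handed to analyze_response.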

Environment

  • Ralph: latest commit (2265569)
  • OS: Ubuntu/WSL2
  • Claude Code: stream-json output mode

Suggested Fix

Fix 1: ralph_loop.sh — Write synthetic result when no result line found

In the "no result line found" branch (around line 1631), after extracting the fallback session ID, replace output_file with a minimal synthetic result JSON. Use jq -n for safe JSON construction (avoids shell injection if session ID contains special characters). Include both sessionId (camelCase, for parse_json_response) and session_id (snake_case, for save_claude_session):

if [[ -n "$fallback_session_id" ]]; then
    jq -n --arg sid "$fallback_session_id" \
        '{"type":"result","result":"(timeout - no result captured)","is_error":false,"sessionId":$sid,"session_id":$sid}' \
        > "$output_file"
else
    echo '{"type":"result","result":"(timeout - no result captured)","is_error":false}' > "$output_file"
fi
log_status "INFO" "Replaced raw stream with synthetic result for downstream analysis"

The full stream is already preserved in _stream.log for debugging.
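As a quick sanity check, even the fallback (no-session-ID) branch produces a file that is trivially safe for analyze_response: one short line instead of a multi-megabyte stream (paths here are temporary, for illustration):

```shell
# Sanity check of the no-session-ID branch: a single small JSON line,
# well under the <10KB a properly extracted result file stays beneath.
output_file=$(mktemp)
echo '{"type":"result","result":"(timeout - no result captured)","is_error":false}' > "$output_file"
line_count=$(wc -l < "$output_file")
byte_count=$(wc -c < "$output_file")
echo "lines=$line_count bytes=$byte_count"
rm -f "$output_file"
```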

Fix 2: response_analyzer.sh — Defense-in-depth size guard

Add a file size check before line 352. A properly extracted result file is a single JSON object (<10KB). If the file is over 1MB, it's a raw stream and should be skipped:

local file_size_bytes
file_size_bytes=$(wc -c < "$output_file" 2>/dev/null || echo "0")
if [[ "$file_size_bytes" -gt 1048576 ]]; then
    echo "WARN: Output file is ${file_size_bytes} bytes (likely raw JSONL stream). Skipping analysis." >&2
    return 1
fi

Both callers in ralph_loop.sh already handle non-zero returns correctly (skip signal updates, clear stale analysis, continue looping).
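The guard can be exercised in isolation. A sketch with the check wrapped in a standalone function (the wrapper name is hypothetical; the threshold matches the guard above):

```shell
# Hypothetical standalone wrapper around the Fix 2 size guard.
check_output_size() {
    local output_file="$1"
    local file_size_bytes
    file_size_bytes=$(wc -c < "$output_file" 2>/dev/null || echo "0")
    if [[ "$file_size_bytes" -gt 1048576 ]]; then
        echo "WARN: Output file is ${file_size_bytes} bytes (likely raw JSONL stream). Skipping analysis." >&2
        return 1
    fi
}

small=$(mktemp); echo '{"type":"result"}' > "$small"
big=$(mktemp);   head -c 2097152 /dev/zero > "$big"   # 2 MB dummy file

check_output_size "$small" && echo "small file: analyzed"
check_output_size "$big"   || echo "big file: skipped"
rm -f "$small" "$big"
```

Note that `[[ "$file_size_bytes" -gt 1048576 ]]` tolerates the leading whitespace BSD `wc` emits, so the guard behaves the same on Linux and macOS.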

Why both fixes?

Fix 1 prevents the crash. Fix 2 is defense-in-depth — it catches edge cases where Fix 1 might not fire (e.g., result line found by grep but fails jq validation, causing the raw stream to be restored via cp at line 1628).
