When Claude Code times out (productive timeout — files were changed), `analyze_response` in `response_analyzer.sh` crashes or hangs trying to parse the full JSONL stream output (~4MB, 12K+ lines).
Root Cause
In `ralph_loop.sh` around line 1616, after a timeout the stream output has no `"type":"result"` line (Claude was killed mid-stream). The code correctly:
- Logs a warning ("Could not find result message in stream output")
- Preserves the full stream in `_stream.log`

But it does not replace `output_file` — the variable still points at the full 3.9MB JSONL stream. Then at line 1934, `analyze_response "$output_file"` is called with that raw stream.
Inside `response_analyzer.sh` at line 352:

```bash
local output_content=$(cat "$output_file")
```

This loads the entire 3.9MB into a bash variable. Then `parse_json_response` calls `jq empty` on it, which fails because it's JSONL (one object per line), not a single JSON object. The function then falls through to text-based analysis on 12K+ lines and either hangs or crashes.
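A minimal illustration of the validation failure (the stream content here is invented, and this is not the actual `parse_json_response` code): because Claude was killed mid-stream, the file is neither a single JSON object nor even complete JSONL, and `jq empty` rejects it.

```bash
# Simulate a stream killed mid-write: the second line is truncated.
tmp=$(mktemp)
printf '{"type":"system","subtype":"init"}\n{"type":"assistant","mess' > "$tmp"

# Whole-file validation fails with a parse error.
if jq empty "$tmp" 2>/dev/null; then
  verdict="valid"
else
  verdict="not valid JSON"
fi
echo "$verdict"
rm -f "$tmp"
```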
Steps to Reproduce
1. Start Ralph with `--live` mode and session continuity enabled
2. Let Claude Code time out (productive work happening, it just runs past the timeout)
3. The stream output will have no `"type":"result"` line
4. `analyze_response` receives the full stream file and hangs
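Steps 3–4 can be reproduced in miniature. A stream with no result line still carries a session ID on earlier lines, which is what the fallback extraction described below relies on (the stream shape and the jq filter here are assumptions for illustration, not Ralph's actual code):

```bash
# A toy stream: system and assistant events, but no "type":"result" line.
tmp=$(mktemp)
printf '{"type":"system","session_id":"abc-123"}\n' > "$tmp"
printf '{"type":"assistant","message":{}}\n' >> "$tmp"

# Recover a fallback session ID from the first event that carries one.
fallback_session_id=$(jq -r 'select(.session_id? != null) | .session_id' "$tmp" | head -n 1)
echo "$fallback_session_id"
rm -f "$tmp"
```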
Fix 1: `ralph_loop.sh` — Write synthetic result when no result line found
In the "no result line found" branch (around line 1631), after extracting the fallback session ID, replace `output_file` with a minimal synthetic result JSON. Use `jq -n` for safe JSON construction (avoids shell injection if the session ID contains special characters). Include both `sessionId` (camelCase, for `parse_json_response`) and `session_id` (snake_case, for `save_claude_session`):
```bash
if [[ -n "$fallback_session_id" ]]; then
    jq -n --arg sid "$fallback_session_id" \
        '{"type":"result","result":"(timeout - no result captured)","is_error":false,"sessionId":$sid,"session_id":$sid}' \
        > "$output_file"
else
    echo '{"type":"result","result":"(timeout - no result captured)","is_error":false}' > "$output_file"
fi
log_status "INFO" "Replaced raw stream with synthetic result for downstream analysis"
```
The full stream is already preserved in `_stream.log` for debugging.
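The injection-safety claim for `jq -n --arg` can be checked in isolation (this demo is not part of the patch itself): a session ID containing a quote would break naive string interpolation, but survives here.

```bash
# A hostile-looking session ID that would break naive interpolation.
sid='abc"123'

# jq -n --arg escapes the value, producing valid JSON.
json=$(jq -nc --arg sid "$sid" \
  '{"type":"result","result":"(timeout - no result captured)","is_error":false,"sessionId":$sid,"session_id":$sid}')

# The round-trip preserves the quote character intact.
echo "$json" | jq -e '.sessionId == "abc\"123"' > /dev/null && echo "valid"
```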
Fix 2: `response_analyzer.sh` — Defense-in-depth size guard
Add a file size check before line 352. A properly extracted result file is a single JSON object (<10KB). If the file is over 1MB, it's a raw stream and should be skipped:
```bash
local file_size_bytes
file_size_bytes=$(wc -c < "$output_file" 2>/dev/null || echo "0")
if [[ "$file_size_bytes" -gt 1048576 ]]; then
    echo "WARN: Output file is ${file_size_bytes} bytes (likely raw JSONL stream). Skipping analysis." >&2
    return 1
fi
```
Both callers in `ralph_loop.sh` already handle non-zero returns correctly (skip signal updates, clear stale analysis, continue looping).
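The size check above can be exercised standalone (the wrapper function name here is just for the demo; the threshold, message, and return code match the snippet):

```bash
check_output_size() {
  local output_file=$1 file_size_bytes
  file_size_bytes=$(wc -c < "$output_file" 2>/dev/null || echo "0")
  if [[ "$file_size_bytes" -gt 1048576 ]]; then
    echo "WARN: Output file is ${file_size_bytes} bytes (likely raw JSONL stream). Skipping analysis." >&2
    return 1
  fi
}

# Simulate a ~2MB raw stream; the guard should reject it.
tmp=$(mktemp)
head -c 2097152 /dev/zero > "$tmp"
check_output_size "$tmp" 2>/dev/null && result="analyzed" || result="skipped"
echo "$result"
rm -f "$tmp"
```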
Why both fixes?
Fix 1 prevents the crash. Fix 2 is defense-in-depth — it catches edge cases where Fix 1 might not fire (e.g., a result line found by grep that fails jq validation, causing the raw stream to be restored via `cp` at line 1628).
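That edge case can be seen in miniature (the truncated line below is an invented example): grep matches a line containing `"type":"result"`, but the line is not valid standalone JSON, so jq validation fails.

```bash
# A result line truncated mid-write, e.g. by the timeout kill.
line='{"type":"result","result":"partial'

grep_result="no match"; jq_result="valid"

# grep's substring match succeeds...
echo "$line" | grep -q '"type":"result"' && grep_result="match"

# ...but jq rejects the line as a JSON document.
echo "$line" | jq empty 2>/dev/null || jq_result="invalid"

echo "$grep_result / $jq_result"
```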