Summary
In `tui/src/pages/model_observations_page.rs` (lines 401-402), the sparkline bucketing for temporal attention weights hardcodes 10 buckets:

```rust
let buckets = 10;
let bucket_size = len / buckets;
```
When `len < 10`, integer division makes `bucket_size = 0`. This causes:

- Degenerate sparkline: all non-last buckets iterate over empty slices (`weights_f[0..0]`), producing `sum = 0`. Only the last bucket (which uses `end = len`) contains any data. The result is 9 zero bars and one bar holding all the data: a meaningless visualization.
- Potential NaN: if all weights happen to be zero, `bucket_max = 0.0` and the normalization `v / bucket_max` produces `NaN`. The subsequent `NaN.round() as usize` is well-defined since Rust 1.45 (float-to-int `as` casts saturate, and `NaN` casts to 0), but it silently renders a zero bar.
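A standalone sketch of the bucketing loop (variable names follow the report; the exact code in the page may differ) demonstrates the degenerate case for a short input:

```rust
// Minimal reproduction of the hardcoded 10-bucket sparkline logic.
fn bucketize(weights_f: &[f32]) -> Vec<f32> {
    let len = weights_f.len();
    let buckets = 10;
    let bucket_size = len / buckets; // integer division: 0 when len < 10

    (0..buckets)
        .map(|i| {
            let start = i * bucket_size;
            // The last bucket uses `end = len`; every other bucket is
            // `weights_f[0..0]` when bucket_size == 0.
            let end = if i == buckets - 1 { len } else { start + bucket_size };
            weights_f[start..end].iter().sum::<f32>()
        })
        .collect()
}

fn main() {
    let sums = bucketize(&[1.0, 2.0, 3.0, 4.0]); // len = 4 < 10
    // Nine empty buckets, then one bucket with all the data.
    assert!(sums[..9].iter().all(|&s| s == 0.0));
    assert_eq!(sums[9], 10.0);
    println!("{sums:?}");
}
```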
Reproduction
Any `observations.json` in which an inner `attention_weights` array has fewer than 10 elements will trigger this. The current model uses `SEQ_LEN = 4000`, which makes this unlikely in normal operation, but the code should be defensive since the data comes from a JSON file.
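For illustration, a file of roughly this shape would reproduce it (the enclosing structure is assumed; only the `attention_weights` field name comes from the code):

```json
{
  "attention_weights": [
    [0.1, 0.3, 0.2, 0.4]
  ]
}
```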
Suggested Fix
Dynamically reduce the bucket count: `let buckets = len.min(10);`

This ensures `bucket_size >= 1` whenever `len > 0` (the empty case is already guarded by the `!weights_f.is_empty()` check). Also add a guard for `bucket_max == 0.0`, which avoids the `0.0 / 0.0 = NaN` normalization when all weights are zero.
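Putting both guards together, a hedged sketch of the corrected bucketing (names follow the report; the `max_level` bar scale and the surrounding rendering code are assumptions):

```rust
// Corrected bucketing: never more buckets than samples, and no NaN
// when every weight is zero.
fn sparkline_levels(weights_f: &[f32], max_level: f32) -> Vec<usize> {
    if weights_f.is_empty() {
        return Vec::new();
    }
    let len = weights_f.len();
    let buckets = len.min(10); // dynamic bucket count
    let bucket_size = len / buckets; // >= 1, since len >= buckets >= 1

    let sums: Vec<f32> = (0..buckets)
        .map(|i| {
            let start = i * bucket_size;
            // Last bucket absorbs the remainder of the integer division.
            let end = if i == buckets - 1 { len } else { start + bucket_size };
            weights_f[start..end].iter().sum::<f32>()
        })
        .collect();

    let bucket_max = sums.iter().cloned().fold(0.0f32, f32::max);
    sums.iter()
        .map(|&v| {
            if bucket_max == 0.0 {
                0 // all-zero weights: skip normalization entirely
            } else {
                (v / bucket_max * max_level).round() as usize
            }
        })
        .collect()
}

fn main() {
    // 4 samples now yield 4 buckets, one sample each.
    assert_eq!(sparkline_levels(&[1.0, 2.0, 3.0, 4.0], 8.0), vec![2, 4, 6, 8]);
    // All-zero weights: no NaN, all bars at zero.
    assert_eq!(sparkline_levels(&[0.0; 5], 8.0), vec![0; 5]);
}
```

Note that `slice::chunks(bucket_size)` would yield an extra, smaller trailing chunk whenever `bucket_size` does not divide `len`; the explicit last-bucket handling keeps the bar count at exactly `buckets`.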