
[FEAT] Implemented conversation caching and invalidating on new messages#1480

Merged
kyegomez merged 5 commits into kyegomez:master from adichaudhary:fix/conversation-cache
Mar 23, 2026

Conversation

@adichaudhary
Contributor

@adichaudhary adichaudhary commented Mar 20, 2026

Description

Conversation.get_str() was rebuilding the full conversation string on every call by iterating over all messages from scratch, even when nothing had changed between calls. In AgentRearrange this method is called once per agent step, so with 10 concurrent agents sharing the same conversation, the same string gets rebuilt 10 times redundantly per step.

I implemented a string cache (_str_cache) on the Conversation class that stores the result of get_str() and returns it on subsequent calls without rebuilding. The cache is invalidated only when the conversation is actually mutated: via add, delete, update, clear, batch_add, load_from_json, load_from_yaml, or truncate_memory_with_tokenizer.
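The caching pattern described above can be sketched as follows. The class and method names mirror the PR description, but the bodies here are illustrative simplifications, not the actual swarms implementation:

```python
class Conversation:
    """Minimal sketch of a conversation with a cached string view."""

    def __init__(self):
        self.messages = []
        self._str_cache = None  # None means "cache invalid"
        self._cache_hits = 0
        self._cache_misses = 0

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})
        self._str_cache = None  # any mutation invalidates the cache

    def get_str(self):
        # Fast path: return the cached string if nothing changed.
        if self._str_cache is not None:
            self._cache_hits += 1
            return self._str_cache
        # Slow path: rebuild from all messages, then cache.
        self._cache_misses += 1
        self._str_cache = "\n".join(
            f"{m['role']}: {m['content']}" for m in self.messages
        )
        return self._str_cache
```

With this shape, repeated get_str() calls between mutations hit the cache, and every mutating method simply resets _str_cache to None, the same invalidation hook the PR wires into add, delete, update, and the other mutators.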

Additionally, I added get_cache_stats(), which returns live hit/miss counts, hit rate, and cached token count, and included an interactive example (examples/conversation_cache_interactive.py) that spins up a real agent so you can chat and observe the cache stats update in real time after every response.

I also added multiple fixes so that all 31 conversation tests now pass, up from the 29 that passed previously:

  • get_cache_stats(): returns hits, misses, cached token count, total tokens, and hit rate
  • to_yaml(): missing method that the test suite expected
  • list_cached_conversations(): missing classmethod that the test suite expected
  • cache_enabled as a parameter alias for caching
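The stats helper described above could take roughly this shape. The field names follow the PR text, but the function signature and the whitespace-split token count are assumptions for illustration, not the library's API:

```python
def get_cache_stats(hits, misses, cached_str):
    """Hypothetical sketch: summarize cache behavior as a dict."""
    total = hits + misses
    return {
        "hits": hits,
        "misses": misses,
        # Guard against division by zero before any get_str() call.
        "hit_rate": hits / total if total else 0.0,
        # Naive token count of the cached string (real code would
        # likely use the model's tokenizer).
        "cached_tokens": len(cached_str.split()) if cached_str else 0,
    }
```

For example, 3 hits and 1 miss yield a hit rate of 0.75, which is the kind of live figure the interactive example prints after each response.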

Explanation Video

https://drive.google.com/file/d/12Jr5trKBu8NSdoMdRpt4fqultO3-5wCZ/view?usp=sharing

Issue

#1460

Dependencies

None

Tag maintainer

@kyegomez

Twitter handle

@akc__2025


📚 Documentation preview 📚: https://swarms--1480.org.readthedocs.build/en/1480/

@adichaudhary adichaudhary changed the title Fix/conversation cache [FEAT] Implemented conversation caching and invalidating on new messages Mar 21, 2026
@adichaudhary adichaudhary marked this pull request as ready for review March 21, 2026 01:45
Copilot AI review requested due to automatic review settings March 21, 2026 01:45
Contributor

Copilot AI left a comment


Pull request overview

Adds caching to Conversation.get_str() to avoid rebuilding the full conversation string on repeated calls, plus a few missing convenience APIs expected by tests.

Changes:

  • Implemented a _str_cache for Conversation.get_str() with invalidation on conversation mutations.
  • Added cache statistics (get_cache_stats()), YAML serialization (to_yaml()), and a classmethod to list saved conversation names (list_cached_conversations()).
  • Added an interactive example script to manually observe cache behavior.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 5 comments.

File Description
  • swarms/structs/conversation.py: Adds cached get_str() with invalidation hooks and new helper APIs (cache stats, YAML export, listing saved conversations).
  • examples/conversation_cache_interactive.py: New interactive script to manually exercise and observe the caching behavior.


adichaudhary and others added 2 commits March 20, 2026 21:51
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@kyegomez kyegomez merged commit 45712f1 into kyegomez:master Mar 23, 2026
6 of 16 checks passed
