Description
Problem Statement
Bidirectional streaming agents produce extremely verbose console output, printing a "Preview:" line for every single token during speech generation. This causes several problems:
- Console logs are completely unreadable - hundreds of print statements flood the console
- Production deployment is impractical - there is no way to control or disable the verbose output
- Debugging is extremely difficult - real errors get lost in the token spam
- No programmatic access - there is no way to hook into streaming events for custom processing
Example output when asking a simple question:
```
Preview: I
Preview: can
Preview: do
Preview: a
Preview: lot
Preview: of
Preview: things
Preview: to
Preview: help
Preview: you
...
```
(hundreds more lines of single-token previews)
Currently, users must either tolerate the verbose output or modify the SDK source code.
Proposed Solution
1. Environment Variable Control (Quick Win)
Add a STRANDS_BIDI_VERBOSE environment variable:
```bash
# Default behavior - minimal output (only completed responses)
STRANDS_BIDI_VERBOSE=false  # or unset

# Debug mode - show token previews (current behavior)
STRANDS_BIDI_VERBOSE=true
```
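For illustration, a minimal sketch of how the flag could be read inside the bidi I/O layer before each print; the helper name and its placement are assumptions, not part of the proposal's API:
```python
# Sketch only: reading the proposed flag; the helper name is illustrative.
import os

def _bidi_verbose_enabled() -> bool:
    """Treat STRANDS_BIDI_VERBOSE as off unless explicitly set to a truthy value."""
    return os.environ.get("STRANDS_BIDI_VERBOSE", "false").strip().lower() in ("1", "true", "yes")

# Existing print sites would then be gated, e.g.:
# if _bidi_verbose_enabled():
#     print(f"Preview: {token}")
```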
2. Callback Handler System (Recommended - Main Feature)
Add callback handlers similar to those in the main Strands SDK for consistency and extensibility:
```python
import logging

from strands.experimental.bidi.callbacks import BidiCallbackHandler

class CustomHandler(BidiCallbackHandler):
    def on_token(self, token: str, is_preview: bool = True):
        """Called for each token - preview or final"""
        if not is_preview:  # Only log final tokens
            print(token, end="", flush=True)

    def on_complete(self, full_text: str):
        """Called when response is complete"""
        print(f"\n[Response complete: {len(full_text)} chars]")

    def on_tool_call(self, tool_name: str, args: dict):
        """Called when tool is invoked"""
        print(f"[Tool: {tool_name}]")

    def on_error(self, error: Exception):
        """Called on errors"""
        logging.error(f"Error: {error}")

# Use in agent
agent = BidiAgent(
    model=model,
    tools=[calculator],
    callback_handler=CustomHandler()  # Optional
)
```
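For backward compatibility, a default handler could reproduce today's output when no callback_handler is supplied. The sketch below is an assumption about how that might look (DefaultVerboseHandler and the dispatch snippet are hypothetical, reusing the BidiCallbackHandler base class imported above), not existing SDK code:
```python
# Sketch only: a hypothetical default handler that keeps today's verbose behavior,
# so omitting callback_handler could remain backward compatible.
class DefaultVerboseHandler(BidiCallbackHandler):
    def on_token(self, token: str, is_preview: bool = True):
        if is_preview:
            print(f"Preview: {token}")

    def on_complete(self, full_text: str):
        print(full_text)

# Inside the agent, the streaming loop would route events through the handler
# instead of printing directly (illustrative, not actual BidiAgent internals):
# handler = self.callback_handler or DefaultVerboseHandler()
# handler.on_token(token, is_preview=True)
```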
3. Logging Configuration (Bonus)
Replace print statements with proper logging:
```python
import logging

# Users control via standard logging
logging.getLogger("strands.experimental.bidi").setLevel(logging.WARNING)
```
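The same mechanism works in the other direction during development; a minimal sketch, assuming the proposed logger name strands.experimental.bidi:
```python
# Turn bidi logs up while debugging, with no SDK code changes
import logging

logging.basicConfig(format="%(asctime)s %(name)s %(levelname)s %(message)s")
logging.getLogger("strands.experimental.bidi").setLevel(logging.DEBUG)
```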
Use Case
Use Case 1: Production Deployment
Current: Cannot deploy to production - logs are flooded and unreadable.
Proposed: Set STRANDS_BIDI_VERBOSE=false for clean logs, use callback handlers to send metrics to monitoring systems.
Use Case 2: Custom UI Integration
Current: No way to process streaming tokens programmatically.
Proposed: Implement custom callback handler to update UI components in real-time without console spam.
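A possible shape for this use case with the proposed interface; the queue-based hand-off and class name are illustrative only:
```python
# Sketch only: forwarding final tokens to a UI consumer via a queue.
import queue

class UiStreamingHandler(BidiCallbackHandler):
    def __init__(self):
        self.updates: "queue.Queue[str]" = queue.Queue()

    def on_token(self, token: str, is_preview: bool = True):
        if not is_preview:           # only forward final tokens to the UI
            self.updates.put(token)

    def on_complete(self, full_text: str):
        self.updates.put("\n")       # signal end of turn to the UI consumer
```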
Use Case 3: Metrics and Monitoring
Current: Cannot track token counts, response times, or tool usage.
Proposed: Use callback handlers to collect metrics and send to observability platforms (DataDog, New Relic, etc.).
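A possible shape for this use case; BidiCallbackHandler is the proposed class from above, and report_metrics is a placeholder for whatever observability client is in use:
```python
# Sketch only: counting tokens and timing a response with the proposed hooks.
import time

class MetricsHandler(BidiCallbackHandler):
    def __init__(self):
        self.token_count = 0
        self.started_at = time.monotonic()

    def on_token(self, token: str, is_preview: bool = True):
        self.token_count += 1

    def on_complete(self, full_text: str):
        elapsed = time.monotonic() - self.started_at
        report_metrics(  # placeholder for a DataDog/New Relic/etc. client call
            tokens=self.token_count,
            response_seconds=elapsed,
            response_chars=len(full_text),
        )
```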
Use Case 4: Debugging
Current: Real errors get lost in hundreds of "Preview:" lines.
Proposed: Debug mode enabled only when needed via STRANDS_BIDI_VERBOSE=true, clean logs by default.
Use Case 5: Custom Logging Systems
Current: Output goes to stdout only, cannot integrate with existing logging infrastructure.
Proposed: Implement callback handler to send events to company's logging system (Splunk, ELK, CloudWatch, etc.).
Alternative Solutions
Alternative 1: Add verbose parameter to BidiAgent
```python
agent = BidiAgent(model=model, tools=[calculator], verbose=False)
```
Pros: Simple to implement
Cons: Less flexible than callbacks, doesn't enable custom processing
Alternative 2: Use logging.getLogger() only
```python
# Users control verbosity
logging.getLogger("strands.experimental.bidi.io").setLevel(logging.WARNING)
```
Pros: Standard Python pattern
Cons: No programmatic access to streaming events for custom logic
Alternative 3: Silent by default, opt-in verbose
```python
# Reverse current behavior - silent by default
agent = BidiAgent(model=model, verbose=True)  # Opt-in to verbose output
```
Pros: Production-friendly by default
Cons: Breaking change for existing users
Recommendation: Implement callbacks (most flexible) + environment variable (quick migration path)
Additional Context
Bidirectional streaming should follow the same callback-handler pattern as the main Strands SDK for consistency:
```python
# Proposed bidi pattern (similar)
from strands.experimental.bidi.callbacks import BidiCallbackHandler

class MyBidiHandler(BidiCallbackHandler):
    def on_token(self, token: str, is_preview: bool): pass
    def on_complete(self, text: str): pass
```
Files to Modify
- src/strands/experimental/bidi/callbacks.py - New callback system (create)
- src/strands/experimental/bidi/io/audio.py - Remove/gate print statements
- src/strands/experimental/bidi/io/text.py - Remove/gate print statements
- src/strands/experimental/bidi/agent/agent.py - Integrate callbacks
Affected Models
- BidiOpenAIRealtimeModel
- BidiGeminiLiveModel
- BidiNovaSonicModel
Impact
- ✅ Makes bidirectional streaming production-ready
- ✅ Enables custom integrations (UI, metrics, logging)
- ✅ Improves debugging experience
- ✅ Consistent with main SDK patterns
- ✅ Backward compatible (the current default behavior can be preserved initially, with a deprecation warning)