dspy.streamify
```python
dspy.streamify(program: Module, status_message_provider: Optional[StatusMessageProvider] = None, stream_listeners: Optional[List[StreamListener]] = None, include_final_prediction_in_output_stream: bool = True) -> Callable[[Any, Any], Awaitable[Any]]
```
Wrap a DSPy program so that it streams its outputs incrementally rather than returning them all at once. The wrapper also emits status messages to indicate the program's progress; users can implement their own status message provider to customize which modules generate status messages and what those messages say.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `program` | `Module` | The DSPy program to wrap with streaming functionality. | *required* |
| `status_message_provider` | `Optional[StatusMessageProvider]` | A custom status message generator to use instead of the default one. Users can implement their own generator to customize which modules produce status messages and what those messages say. | `None` |
| `stream_listeners` | `Optional[List[StreamListener]]` | A list of stream listeners that capture the streaming output of specific fields of sub-predicts in the program. When provided, only the target fields of the target predict are streamed to the user. | `None` |
| `include_final_prediction_in_output_stream` | `bool` | Whether to include the final prediction in the output stream, only useful when … | `True` |
Returns:

| Type | Description |
|---|---|
| `Callable[[Any, Any], Awaitable[Any]]` | A function that takes the same arguments as the original program but returns an async generator that yields the program's outputs incrementally. |
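The returned callable behaves like an async generator factory: intermediate values (status messages or streamed chunks) are yielded first, and the final prediction arrives as the last item. The consumption pattern can be sketched in plain Python without dspy; here `fake_program` and the values it yields are illustrative stand-ins, not part of the dspy API:

```python
import asyncio

async def fake_program(q):
    # Stand-in for a streamified DSPy program: yields intermediate
    # chunks first, then the final result as the last item.
    for chunk in ["Thinking...", "Drafting answer..."]:
        yield chunk
    yield {"answer": f"echo: {q}"}  # final "prediction"

async def consume():
    # Mirrors the docs' pattern: collect chunks, keep the final value.
    final = None
    chunks = []
    async for value in fake_program("hi"):
        if isinstance(value, dict):  # in dspy: isinstance(value, dspy.Prediction)
            final = value
        else:
            chunks.append(value)
    return chunks, final

chunks, final = asyncio.run(consume())
print(chunks)
print(final)
```

The key design point this mirrors is that the stream interleaves two kinds of values, so consumers must type-check each item rather than assume every yield is a token.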
Example:

```python
import asyncio

import dspy

dspy.settings.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# Create the program and wrap it with streaming functionality
program = dspy.streamify(dspy.Predict("q->a"))

# Use the program with streaming output
async def use_streaming():
    output = program(q="Why did a chicken cross the kitchen?")
    return_value = None
    async for value in output:
        if isinstance(value, dspy.Prediction):
            return_value = value
        else:
            print(value)
    return return_value

output = asyncio.run(use_streaming())
print(output)
```
Example with custom status message provider:

```python
import asyncio

import dspy

dspy.settings.configure(lm=dspy.LM("openai/gpt-4o-mini"))

class MyStatusMessageProvider(dspy.streaming.StatusMessageProvider):
    def module_start_status_message(self, instance, inputs):
        return "Predicting..."

    def tool_end_status_message(self, outputs):
        return f"Tool calling finished with output: {outputs}!"

# Create the program and wrap it with streaming functionality
program = dspy.streamify(dspy.Predict("q->a"), status_message_provider=MyStatusMessageProvider())

# Use the program with streaming output
async def use_streaming():
    output = program(q="Why did a chicken cross the kitchen?")
    return_value = None
    async for value in output:
        if isinstance(value, dspy.Prediction):
            return_value = value
        else:
            print(value)
    return return_value

output = asyncio.run(use_streaming())
print(output)
```
Example with stream listeners:

```python
import asyncio

import dspy

dspy.settings.configure(lm=dspy.LM("openai/gpt-4o-mini", cache=False))

# Create the program and wrap it with streaming functionality
predict = dspy.Predict("question->answer, reasoning")
stream_listeners = [
    dspy.streaming.StreamListener(signature_field_name="answer"),
    dspy.streaming.StreamListener(signature_field_name="reasoning"),
]
stream_predict = dspy.streamify(predict, stream_listeners=stream_listeners)

async def use_streaming():
    output = stream_predict(
        question="why did a chicken cross the kitchen?",
        include_final_prediction_in_output_stream=False,
    )
    return_value = None
    async for value in output:
        if isinstance(value, dspy.Prediction):
            return_value = value
        else:
            print(value)
    return return_value

output = asyncio.run(use_streaming())
print(output)
```
You should see the streaming chunks (as `dspy.streaming.StreamResponse` objects) in the console output.
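Because each streamed chunk identifies which signature field it belongs to, a consumer can route tokens from multiple listeners to separate buffers. Below is a minimal sketch of that routing using a stand-in dataclass rather than `dspy.streaming.StreamResponse` (the stand-in's field names are an assumption for illustration only):

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    # Stand-in for a streamed chunk; the attribute names here are
    # illustrative, not the exact StreamResponse schema.
    signature_field_name: str
    chunk: str

def route_chunks(chunks):
    # Accumulate the streamed text separately for each signature field.
    buffers: dict[str, str] = {}
    for c in chunks:
        buffers[c.signature_field_name] = buffers.get(c.signature_field_name, "") + c.chunk
    return buffers

stream = [
    Chunk("reasoning", "The chicken "),
    Chunk("reasoning", "was hungry."),
    Chunk("answer", "To get to "),
    Chunk("answer", "the fridge."),
]
buffers = route_chunks(stream)
print(buffers)
```

This is the same per-field separation that attaching one `StreamListener` per signature field gives you, expressed without a live LM call.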
Source code in dspy/streaming/streamify.py