Implement the equivalent of what we have in workflows:
```python
handler = workflow.run()
async for event in handler.stream_events():
    # do something with event
    pass
```
for Llama Deploy through the API Server, something like:

```
GET http://apiserver.local/my-deployment/events
```
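As a rough illustration of what a consumer of that endpoint could look like, here is a minimal client-side sketch. The URL shape comes from the issue text, but the newline-delimited JSON framing, the `iter_ndjson` helper, and the `stream_events` function are all assumptions for illustration, not a settled API.

```python
import json
import urllib.request
from typing import Iterable, Iterator


def iter_ndjson(lines: Iterable[bytes]) -> Iterator[dict]:
    # Decode one JSON event per non-empty line (assumed framing).
    for raw in lines:
        line = raw.strip()
        if line:
            yield json.loads(line)


def stream_events(base_url: str, deployment: str) -> Iterator[dict]:
    # Hypothetical: GET {base_url}/{deployment}/events and yield
    # events as they arrive, mirroring handler.stream_events().
    url = f"{base_url}/{deployment}/events"
    with urllib.request.urlopen(url) as resp:
        # Iterating an HTTP response object reads it line by line.
        yield from iter_ndjson(resp)
```

A caller would then loop over `stream_events("http://apiserver.local", "my-deployment")` much like the `async for` loop above.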
Sub Tasks

- `deployments` resource to get the event stream using the client's `get_task_result_stream` method. #296
- `WorkflowService.process_call()` so that it knows to wait for manually produced Events (e.g., once it sees an `InputRequiredEvent` it knows it needs to wait for a `HumanResponseEvent`). #297
- `send_event` method on `SessionClient` (to enable a user to do the equivalent of `ctx.send_event(ev)`). #298
- `SessionClient.send_event` #300
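The wait-for-response behavior described in #297 can be sketched with toy stand-ins. The `InputRequiredEvent` / `HumanResponseEvent` names come from the issue; everything else (the dataclass fields, the queue-based plumbing, the `process_call` stand-in) is a simplified assumption, not the real `WorkflowService` implementation.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class InputRequiredEvent:
    prefix: str


@dataclass
class HumanResponseEvent:
    response: str


async def process_call(events_out: asyncio.Queue, events_in: asyncio.Queue) -> str:
    # Toy stand-in for the desired behavior: emit an InputRequiredEvent,
    # then block until the matching HumanResponseEvent arrives from the
    # client (e.g., via the proposed SessionClient.send_event).
    await events_out.put(InputRequiredEvent(prefix="Name? "))
    reply = await events_in.get()
    assert isinstance(reply, HumanResponseEvent)
    return f"hello {reply.response}"
```

On the client side, a user would read the `InputRequiredEvent` from the event stream and answer it with `send_event`, the equivalent of `ctx.send_event(ev)` inside a workflow.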