# Prompty Observability
## Using Tracing in Prompty
Prompty supports tracing to help you understand the execution of your prompts. This functionality is customizable and can be used to trace the execution of your prompts in a way that makes sense to you. Prompty has two tracers built in: `console_tracer` and `PromptyTracer`. The `console_tracer` writes the trace to the console, and the `PromptyTracer` writes the trace to a JSON file. You can also create your own tracer by supplying your own hook.
```python
import prompty
# import invoker
import prompty.azure
from prompty.tracer import trace, Tracer, console_tracer, PromptyTracer

# add console tracer
Tracer.add("console", console_tracer)

# add PromptyTracer
json_tracer = PromptyTracer(output_dir="path/to/output")
Tracer.add("PromptyTracer", json_tracer.tracer)

# execute the prompt
response = prompty.execute("path/to/prompty/file")

print(response)
```

You can also bring your own tracer by creating your own tracing hook. The `console_tracer` is the simplest example of a tracer: it writes the trace to the console.
This is what it looks like:
```python
import contextlib
import json
from typing import Any, Callable, Iterator

@contextlib.contextmanager
def console_tracer(name: str) -> Iterator[Callable[[str, Any], None]]:
    try:
        print(f"Starting {name}")
        yield lambda key, value: print(f"{key}:\n{json.dumps(value, indent=4)}")
    finally:
        print(f"Ending {name}")
```

It uses a context manager to define the start and end of the trace so you can do whatever setup and teardown you need. The `yield` statement returns a function that you can use to write the trace; the `console_tracer` writes each key/value pair to the console using `print`.
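To see the hook protocol in isolation, you can drive the context manager by hand. This is a hypothetical snippet (the span name and keys are placeholders, not part of the Prompty API):

```python
# drive the tracer hook manually; "example-span" and the keys are illustrative
with console_tracer("example-span") as log:
    log("inputs", {"question": "What is Prompty?"})
    log("result", {"answer": "A specification and runtime for prompt assets."})
```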
The `PromptyTracer` is a more complex example of a tracer that manages its internal state using a full class. Here's an example of the class-based approach that writes each function trace to a JSON file:
```python
import contextlib
import json
import os
from typing import Any, Callable, Iterator

class SimplePromptyTracer:
    def __init__(self, output_dir: str):
        self.output_dir = output_dir
        self.tracer = self._tracer

    @contextlib.contextmanager
    def _tracer(self, name: str) -> Iterator[Callable[[str, Any], None]]:
        trace = {}
        try:
            yield lambda key, value: trace.update({key: value})
        finally:
            with open(os.path.join(self.output_dir, f"{name}.json"), "w") as f:
                json.dump(trace, f, indent=4)
```

The tracing mechanism is supported for all of the Prompty runtime internals and can be used to trace the execution of the prompt along with all of its parameters. There is also a `@trace` decorator that can be used to trace the execution of any function external to the runtime. This is provided as a facility to trace the execution of the prompt together with whatever supporting code you have:
```python
import prompty
# import invoker
import prompty.azure
from prompty.tracer import trace, Tracer, PromptyTracer

json_tracer = PromptyTracer(output_dir="path/to/output")
Tracer.add("PromptyTracer", json_tracer.tracer)

@trace
def get_customer(customerId):
    return {"id": customerId, "firstName": "Sally", "lastName": "Davis"}

@trace
def get_response(customerId, question, prompt):
    customer = get_customer(customerId)

    result = prompty.execute(
        prompt,
        inputs={"question": question, "customer": customer},
    )
    return {"question": question, "answer": result}
```

In this case, whenever this code is executed, a `.tracy` file will be created in the `path/to/output` directory. This file will contain the trace of the execution of the `get_response` function, the execution of the `get_customer` function, and the Prompty internals that generated the response.
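A hypothetical invocation might look like this (the customer id, question, and prompty path are placeholders):

```python
# placeholder arguments; the prompty file is assumed to accept
# "question" and "customer" inputs
response = get_response("C-123", "What did I order last month?", "path/to/prompty/file")
print(response["answer"])
```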
## OpenTelemetry Tracing
You can add OpenTelemetry tracing to your application using the same hook mechanism. In your application, you might create something like `trace_span` to trace the execution of your prompts:
```python
import contextlib
import json

from opentelemetry import trace as oteltrace

_tracer = "prompty"

@contextlib.contextmanager
def trace_span(name: str):
    tracer = oteltrace.get_tracer(_tracer)
    with tracer.start_as_current_span(name) as span:
        yield lambda key, value: span.set_attribute(
            key, json.dumps(value).replace("\n", "")
        )

# add this hook to the prompty runtime
Tracer.add("OpenTelemetry", trace_span)
```

This will produce spans during the execution of the prompt that can be sent to an OpenTelemetry collector for further analysis.
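Note that spans are only exported if your application configures a tracer provider; the hook above assumes that setup happens elsewhere in your application. A minimal sketch using the console exporter from the `opentelemetry-sdk` package (swap in an OTLP exporter to send spans to a collector):

```python
from opentelemetry import trace as oteltrace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# configure a provider and exporter so the spans emitted by trace_span go somewhere
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
oteltrace.set_tracer_provider(provider)
```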
To get started with observability, refer to Debugging Prompty in the Getting Started section.
Want to Contribute To the Project? - Updated Guidance Coming Soon.