Weights & Biases
Module haystack_integrations.components.connectors.weave.weave_connector
WeaveConnector
Collects traces from your pipeline and sends them to Weights & Biases.
Add this component to your pipeline to integrate with the Weights & Biases Weave framework for tracing and monitoring your pipeline components.
Note that you need to have the WANDB_API_KEY environment variable set to your Weights & Biases API key.
NOTE: If you don't have a Weights & Biases account, you will be prompted interactively to set one up, and your credentials will then be stored in ~/.netrc.
In addition, you need to set the HAYSTACK_CONTENT_TRACING_ENABLED environment variable to true in order to
enable Haystack tracing in your pipeline.
To use this connector, simply add it to your pipeline without any connections; it will automatically start sending traces to Weights & Biases.
Example:
import os

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.connectors import WeaveConnector

# Content tracing must be enabled before the pipeline runs
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder())
pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))
pipe.connect("prompt_builder.prompt", "llm.messages")

# The connector needs no connections; adding it to the pipeline is enough
connector = WeaveConnector(pipeline_name="test_pipeline")
pipe.add_component("weave", connector)

messages = [
    ChatMessage.from_system(
        "Always respond in German even if some input data is in other languages."
    ),
    ChatMessage.from_user("Tell me about {{location}}"),
]

response = pipe.run(
    data={
        "prompt_builder": {
            "template_variables": {"location": "Berlin"},
            "template": messages,
        }
    }
)
print(response["llm"]["replies"][0])
You should then head to https://wandb.ai/<user_name>/projects and see the complete trace for your pipeline under the pipeline name you specified when creating the WeaveConnector.
WeaveConnector.__init__
Initialize WeaveConnector.
Arguments:
- pipeline_name: The name of the pipeline you want to trace.
- weave_init_kwargs: Additional arguments to pass to the WeaveTracer client.
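Example (a minimal construction sketch; the weave_init_kwargs value below is only a placeholder, since the accepted keys depend on your installed weave client):
from haystack_integrations.components.connectors import WeaveConnector

# pipeline_name is the name the trace will appear under in Weights & Biases
connector = WeaveConnector(
    pipeline_name="my_pipeline",
    weave_init_kwargs={},  # placeholder: forwarded to the WeaveTracer client
)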
WeaveConnector.warm_up
Initialize the WeaveTracer.
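Example (a sketch; warm_up is normally invoked for you when the pipeline runs, so a manual call is rarely needed):
connector = WeaveConnector(pipeline_name="my_pipeline")
connector.warm_up()  # creates the underlying WeaveTracer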
WeaveConnector.to_dict
Serializes the component to a dictionary.
Returns:
Dictionary with all the necessary information to recreate this component.
WeaveConnector.from_dict
Deserializes the component from a dictionary.
Arguments:
data: Dictionary to deserialize from.
Returns:
Deserialized component.
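Example (a round-trip sketch using the two methods above):
connector = WeaveConnector(pipeline_name="my_pipeline")
data = connector.to_dict()  # dictionary with the component type and init parameters
restored = WeaveConnector.from_dict(data)  # an equivalent component rebuilt from that dictionary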
Module haystack_integrations.tracing.weave.tracer
WeaveSpan
A bridge between Haystack's Span interface and Weave's Call object.
Stores metadata about a component execution and its inputs and outputs, and manages the attributes/tags that describe the operation.
WeaveSpan.set_tag
Set a tag by adding it to the call's inputs.
Arguments:
- key: The tag key.
- value: The tag value.
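Example (a sketch assuming a span obtained from WeaveTracer.trace, described below; the operation name and tag key are illustrative):
with tracer.trace("my_operation") as span:
    span.set_tag("my.custom.tag", "some value")  # stored in the Weave call's inputs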
WeaveSpan.raw_span
Access to the underlying Weave Call object.
WeaveSpan.get_correlation_data_for_logs
Correlation data for logging.
WeaveSpan.set_content_tag
Set a single tag containing content information.
Content is sensitive information such as
- the content of a query
- the content of a document
- the content of an answer
By default, this behavior is disabled. To enable it:
- set the environment variable HAYSTACK_CONTENT_TRACING_ENABLED to true, or
- override the set_content_tag method in a custom tracer implementation.
Arguments:
- key: the name of the tag.
- value: the value of the tag.
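Example (a sketch; content tags are normally set by Haystack's own tracing code once content tracing is enabled, and the tag key shown is illustrative):
import os

os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"  # must be set before tracing starts

with tracer.trace("haystack.component.run") as span:
    span.set_content_tag("haystack.component.input", {"query": "What is Weave?"})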
WeaveTracer
Implements Haystack's Tracer interface to integrate with Weights & Biases Weave.
It's responsible for creating and managing Weave calls, and for converting Haystack spans to Weave spans. It creates spans for each Haystack component run.
WeaveTracer.__init__
Initialize the WeaveTracer.
Arguments:
- project_name: The name of the project to trace; this will be the name appearing in the Weave project.
- weave_init_kwargs: Additional arguments to pass to the Weave client.
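Example (a minimal sketch; enabling the tracer globally through haystack.tracing is one way to use it, and the project name shown is illustrative):
from haystack import tracing
from haystack_integrations.tracing.weave.tracer import WeaveTracer

tracer = WeaveTracer(project_name="my_project")
tracing.enable_tracing(tracer)  # route Haystack spans through Weights & Biases Weave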
WeaveTracer.current_span
Get the current active span.
WeaveTracer.trace
@contextlib.contextmanager
def trace(operation_name: str,
          tags: Optional[dict[str, Any]] = None,
          parent_span: Optional[WeaveSpan] = None) -> Iterator[WeaveSpan]
A context manager that creates and manages spans for tracking operations in Weights & Biases Weave.
It has two main workflows:
A) For regular operations (operation_name != "haystack.component.run"):
- Creates a Weave Call immediately
- Creates a WeaveSpan with this call
- Sets any provided tags
- Yields the span for use in the with block
- When the block ends, updates the call with pipeline output data
B) For component runs (operation_name == "haystack.component.run"):
- Creates a WeaveSpan WITHOUT a call initially (deferred creation)
- Sets any provided tags
- Yields the span for use in the with block
- Creates the actual Weave Call only at the end, when all component information is available
- Updates the call with component output data
This distinction is important because Weave's calls can't be updated once created, but the content tags are only set on the Span at a later stage. To get the inputs on call creation, we need to create the call after we yield the span.
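Example (a usage sketch based on the signature above; the outer operation name and tag keys are illustrative, while "haystack.component.run" is the special operation name described above):
from haystack_integrations.tracing.weave.tracer import WeaveTracer

tracer = WeaveTracer(project_name="my_project")

# Workflow A: a regular operation; its Weave Call is created immediately
with tracer.trace("my_operation", tags={"my.tag": "value"}) as parent_span:
    # Workflow B: a component run nested under the outer span; its Weave Call is
    # created only when the block ends, once all component information is available
    with tracer.trace("haystack.component.run", parent_span=parent_span) as component_span:
        component_span.set_tag("haystack.component.name", "llm")
    # both calls are updated with output data as their blocks end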