- Fixed issue with passing context in the async case (#150)
- Added lambda processor (#148)
- Set up package-level init scripts to make the Monocle import simpler (#147); see the import sketch after this list
- Boto attributes and test cleanup (#146)
- OpenAI workflow (#142)
- Add input/output for OpenAI embedding (#141)
- Async method and scope fix (#140)
- Bug fix for LangChain and LangGraph helpers (#137)
- Package main script to run any app with monocle instrumentation (#132)
- Add OpenAI API metamodel (#131)
- Support the notion of scopes to group traces/spans into logical constructs (#130); see the scope sketch after this list
- Add LlamaIndex ReAct agent (#127)
- LangChain input fix and S3 exporter prefix support (#126)
- Use standard AWS credential envs (#123); see the environment sketch after this list
- Check additional attributes for Azure OpenAI model and consolidate common method in utils (#122)
- Bug fix for accessor (#121)
- Bug fix for empty response (#120)
- Bug fix for inference endpoint (#119)
- OpenDAL exporter for S3 and Blob (#117)
- Handle specific ModuleNotFoundError exceptions gracefully (#115)
- Add support for console and memory exporters to the list of Monocle exporters (#113)
- Add trace id propagation for a constant trace id and from the request (#111)
- Restructure Monocle code for easy extensibility (#109)
- Update S3 filename prefix (#98)
- Update inference span for botocore sagemaker (#93)
- Capture inference output and token metadata for Bedrock (#82)
- Add dev dependency for Mistral AI integration (#81)
- Add VectorStore deployment URL capture support (#80)
- Clean up cloud exporter implementation (#79)
- Capture inference span input/output events attributes (#77)
- Add release automation workflows (#76)
- Fix gaps in Monocle SDK implementation (#72)
- Add kwargs and return value handling in Accessor (#71)
- Update workflow name formatting (#69)
- Implement Haystack metamodel support (#68)
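
Import sketch: a minimal example of the simpler import enabled by the package-level init scripts (#147). The short `from monocle_apptrace import ...` path and the `workflow_name` parameter are assumptions drawn from common Monocle usage, not confirmed by these notes:

```python
# Minimal sketch, assuming the package-level __init__ re-exports the setup helper.
# The import path and parameter name below are assumptions, not confirmed API.
from monocle_apptrace import setup_monocle_telemetry

# Instrument the process under a workflow name before the application code runs.
setup_monocle_telemetry(workflow_name="example_app")
```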
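
Scope sketch: a hypothetical illustration of how scopes (#130) might group traces/spans into a logical construct such as a conversation. The `start_scope`/`stop_scope` names are illustrative assumptions, not the confirmed Monocle API:

```python
# Hypothetical sketch: start_scope/stop_scope are assumed names, used only to
# illustrate grouping spans under a named scope value.
from monocle_apptrace import setup_monocle_telemetry, start_scope, stop_scope  # assumed exports

setup_monocle_telemetry(workflow_name="example_app")

token = start_scope("conversation", "session-42")  # tag spans created while active
try:
    pass  # application code that produces traces would run here
finally:
    stop_scope(token)  # close the scope so later spans are not grouped under it
```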
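
Environment sketch: the standard AWS credential variables referenced in #123 are the usual boto3/botocore ones; the `MONOCLE_EXPORTER` selector shown alongside them is an assumption about how the console/S3/Blob exporters (#113, #117) are chosen:

```python
import os

# Standard AWS credential environment variables (picked up by boto3/botocore).
os.environ["AWS_ACCESS_KEY_ID"] = "<access-key-id>"
os.environ["AWS_SECRET_ACCESS_KEY"] = "<secret-access-key>"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

# Assumed exporter selector; the variable name is an illustration, not confirmed.
os.environ["MONOCLE_EXPORTER"] = "s3"
```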