Comment by calebkaiser 3 days ago
Good question! It mostly came down to implementation speed, along with some uncertainty about performance and overhead. We will be releasing OpenTelemetry-compatible ingestion endpoints in the near future, but since Opik has so many features that aren't related to OpenTelemetry, we decided to move forward without it for the initial release. It's a great project, though, and something we will be implementing soon. It will be especially useful for building out integrations with frameworks that are already OpenTelemetry compatible.
Have you seen the two "prototype" standards for LLM telemetry? One is openllmetry, maintained by the folks at TraceLoop; it seems to be the more popular of the two. The other is openinference, IIRC maintained by Arize AI.