fix: support minimal llama-index installations #2516

Merged Mar 9, 2024 · 4 commits · Changes from 2 commits
55 changes: 43 additions & 12 deletions src/phoenix/trace/llama_index/callback.py
@@ -1,7 +1,6 @@
 import logging
 from importlib.metadata import PackageNotFoundError, version
-from importlib.util import find_spec
-from typing import Any
+from typing import Any, Tuple
 
 from opentelemetry import trace as trace_api
 from opentelemetry.sdk import trace as trace_sdk
@@ -18,21 +17,45 @@


 def _check_instrumentation_compatibility() -> bool:
-    if find_spec("llama_index") is None:
-        raise PackageNotFoundError("Missing `llama-index`. Install with `pip install llama-index`.")
-    # split the version string into a tuple of integers
-    llama_index_version_str = version("llama-index")
-    llama_index_version = tuple(map(int, llama_index_version_str.split(".")[:3]))
+    llama_index_version_str = None
+    try:
+        llama_index_version_str = version("llama-index")
+    except PackageNotFoundError:
+        pass
+
+    llama_index_core_version_str = None
+    try:
+        llama_index_core_version_str = version("llama-index-core")
+    except PackageNotFoundError:
+        pass
Contributor:

Suggested change
-    llama_index_version_str = None
-    try:
-        llama_index_version_str = version("llama-index")
-    except PackageNotFoundError:
-        pass
-
-    llama_index_core_version_str = None
-    try:
-        llama_index_core_version_str = version("llama-index-core")
-    except PackageNotFoundError:
-        pass
+    llama_index_version_str = None
+    llama_index_core_version_str = None
+    try:
+        llama_index_version_str = version("llama-index")
+    except PackageNotFoundError:
+        pass
+    try:
+        llama_index_core_version_str = version("llama-index-core")
+    except PackageNotFoundError:
+        pass

alternatively you could just make a function `get_package_version(package_name: str) -> Optional[str]` that does this. I think that would read cleaner.
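A minimal sketch of the helper the reviewer is proposing; the name `get_package_version` is the reviewer's hypothetical and does not appear in the PR's code:

```python
from importlib.metadata import PackageNotFoundError, version
from typing import Optional


def get_package_version(package_name: str) -> Optional[str]:
    # Return the installed version string, or None if the
    # distribution is not installed, instead of raising.
    try:
        return version(package_name)
    except PackageNotFoundError:
        return None
```

With this helper, both lookups in the diff collapse to one line each, e.g. `llama_index_version_str = get_package_version("llama-index")`.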


     instrumentation_version_str = version("openinference-instrumentation-llama-index")
-    instrumentation_version = tuple(map(int, instrumentation_version_str.split(".")[:3]))
-    # check if the llama_index version is compatible with the instrumentation version
+    instrumentation_version = _parse_semantic_version(instrumentation_version_str)

+    if llama_index_version_str is None:
+        if llama_index_core_version_str is None:
+            raise PackageNotFoundError(
+                "Missing `llama_index`. "
+                "Install with `pip install llama-index` or "
+                "`pip install llama-index-core` for a minimal installation."
+            )
Contributor:

why not just combine these conditionals with an `or`?

Contributor Author:

I could combine them with an `and`, but then I would need an additional conditional to handle the case where llama-index-core is installed but the instrumentation is out of date.

There are four cases I was checking for:

  • The user has neither llama-index nor llama-index-core installed.
  • The user has only llama-index-core installed, but an old version of the instrumentation.
  • The user has an old version of llama-index installed and a modern version of the instrumentation.
  • The user has a modern version of llama-index installed and an old version of the instrumentation.
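A standalone sketch of that four-case decision table. The version thresholds come from the PR's error messages ("at least 0.10.0" for llama-index, "at least 1.0.0" for the instrumentation); `is_compatible` and the plain `RuntimeError`s are illustrative stand-ins for the real function and exception types:

```python
from typing import Optional, Tuple

LLAMA_INDEX_MODERN_VERSION = (0, 10, 0)      # per the PR's error messages
INSTRUMENTATION_MODERN_VERSION = (1, 0, 0)   # per the PR's error messages


def is_compatible(
    llama_index: Optional[Tuple[int, int, int]],
    llama_index_core: Optional[Tuple[int, int, int]],
    instrumentation: Tuple[int, int, int],
) -> bool:
    # Case 1: neither llama-index nor llama-index-core is installed.
    if llama_index is None and llama_index_core is None:
        raise RuntimeError("missing llama-index")
    # Case 2: only llama-index-core, but an old instrumentation version.
    if llama_index is None:
        if instrumentation < INSTRUMENTATION_MODERN_VERSION:
            raise RuntimeError("upgrade the instrumentation to >=1.0.0")
        return True  # core-only install with modern instrumentation
    # Case 3: old llama-index alongside a modern instrumentation version.
    if (
        llama_index < LLAMA_INDEX_MODERN_VERSION
        and instrumentation >= INSTRUMENTATION_MODERN_VERSION
    ):
        raise RuntimeError("migrate llama-index or downgrade the instrumentation")
    # Case 4: modern llama-index alongside an old instrumentation version.
    if (
        llama_index >= LLAMA_INDEX_MODERN_VERSION
        and instrumentation < INSTRUMENTATION_MODERN_VERSION
    ):
        raise RuntimeError("upgrade the instrumentation to >=1.0.0")
    return True
```

Tuple comparison gives the right ordering here because Python compares version tuples element by element, so `(0, 9, 48) < (0, 10, 0)`.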

+        if instrumentation_version < INSTRUMENTATION_MODERN_VERSION:
+            raise IncompatibleLibraryVersionError(
+                f"llama-index-core v{llama_index_core_version_str} is not compatible with "
+                f"openinference-instrumentation-llama-index v{instrumentation_version_str}. "
+                "Please upgrade openinference-instrumentation-llama-index to at least 1.0.0 via "
+                "`pip install 'openinference-instrumentation-llama-index>=1.0.0'`."
+            )
Contributor:

maybe I'm reading the indentation wrong, but I feel this is in the wrong place?

Contributor Author:

This condition checks the case where llama-index-core is installed but llama-index is not. In that case, I check whether the current version of the instrumentation is compatible.

Contributor:

I see, yeah, the conditionals are a bit confusing. I think it would be good to explain your logic here, as it's hard to map the conditionals to the cases.

+        return True
Contributor:

Explain this `return True`, e.g. that the existence of `llama-index-core` means the instrumentation is modern "enough".

Contributor Author (@axiomofjoy, Mar 9, 2024):

This return statement corresponds to the case where only llama-index-core is installed along with a modern version of our instrumentation.


+    llama_index_version = _parse_semantic_version(llama_index_version_str)
     if (
         llama_index_version < LLAMA_INDEX_MODERN_VERSION
         and instrumentation_version >= INSTRUMENTATION_MODERN_VERSION
     ):
         raise IncompatibleLibraryVersionError(
             f"llama-index v{llama_index_version_str} is not compatible with "
-            f"openinference-instrumentation-llama-index v{instrumentation_version_str}."
+            f"openinference-instrumentation-llama-index v{instrumentation_version_str}. "
+            "Please either migrate llama-index to at least 0.10.0 or downgrade "
+            "openinference-instrumentation-llama-index via "
+            "`pip install 'openinference-instrumentation-llama-index<1.0.0'`."
@@ -43,14 +66,22 @@ def _check_instrumentation_compatibility() -> bool:
     ):
         raise IncompatibleLibraryVersionError(
             f"llama-index v{llama_index_version_str} is not compatible with "
-            f"openinference-instrumentation-llama-index v{instrumentation_version_str}."
-            "Please upgrade openinference-instrumentation-llama-index to at least 1.0.0"
+            f"openinference-instrumentation-llama-index v{instrumentation_version_str}. "
+            "Please upgrade openinference-instrumentation-llama-index to at least 1.0.0 via "
+            "`pip install 'openinference-instrumentation-llama-index>=1.0.0'`."
         )
     # if the versions are compatible, return True
     return True


+def _parse_semantic_version(semver_string: str) -> Tuple[int, int, int]:
+    """
+    Parse a semantic version string into a tuple of integers.
+    """
+    major, minor, patch = semver_string.split(".")[:3]
+    return int(major), int(minor), int(patch)


 if _check_instrumentation_compatibility():
     from openinference.instrumentation.llama_index._callback import (
         OpenInferenceTraceCallbackHandler as _OpenInferenceTraceCallbackHandler,