
You forgot about ChatMLX.bind_tools #28692

Closed
5 tasks done
svnv-svsv-jm opened this issue Dec 12, 2024 · 1 comment
Labels
Ɑ: models Related to LLMs or chat model modules

Comments


svnv-svsv-jm commented Dec 12, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_community.chat_models.mlx import ChatMLX
from langgraph.prebuilt import create_react_agent

tools = ...

agent_executor = create_react_agent(ChatMLX(...), tools)
# Raises: NotImplementedError
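(`tools` is elided in the report; for reproduction, any non-empty list of tool definitions triggers the same code path, since `create_react_agent` calls `model.bind_tools(tools)` under the hood, which is what raises here. As a dependency-free illustration, this is roughly the OpenAI-style schema that LangChain tools get converted to before binding — the `get_weather` tool here is hypothetical:)

```python
# Hypothetical tool in the OpenAI function-calling schema that
# `bind_tools` / `convert_to_openai_tool` ultimately produce.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name."},
            },
            "required": ["city"],
        },
    },
}
```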

Error Message and Stack Trace (if applicable)

No response

Description

You forgot to implement the bind_tools method for the ChatMLX class.

I was getting a NotImplementedError when trying to create a react agent:

from langchain_community.chat_models.mlx import ChatMLX
from langgraph.prebuilt import create_react_agent

agent_executor = create_react_agent(ChatMLX(...), tools)

Since agent_executor = create_react_agent(ChatHuggingFace(...), tools) works, I copied the ChatHuggingFace.bind_tools method into a subclass of ChatMLX, and not only was I able to create the agent, but also to .invoke() it!

See:

import typing as ty

from langchain_community.chat_models.mlx import ChatMLX as ChatMLX_
from langchain_core.language_models import LanguageModelInput
from langchain_core.messages import BaseMessage
from langchain_core.runnables import Runnable
from langchain_core.tools import BaseTool
from langchain_core.utils.function_calling import convert_to_openai_tool

class ChatMLX(ChatMLX_):
    """We patch the original class by adding the `bind_tools` method."""

    # THIS IS COPIED FROM `ChatHuggingFace.bind_tools`
    def bind_tools(
        self,
        tools: ty.Sequence[ty.Union[dict[str, ty.Any], type, ty.Callable, BaseTool]],
        *,
        tool_choice: ty.Optional[ty.Union[dict, str, ty.Literal["auto", "none"], bool]] = None,
        **kwargs: ty.Any,
    ) -> Runnable[LanguageModelInput, BaseMessage]:
        """Bind tool-like objects to this chat model.

        Assumes model is compatible with OpenAI tool-calling API.

        Args:
            tools: A list of tool definitions to bind to this chat model.
                Supports any tool definition handled by
                :meth:`langchain_core.utils.function_calling.convert_to_openai_tool`.
            tool_choice: Which tool to require the model to call.
                Must be the name of the single provided function or
                "auto" to automatically determine which function to call
                (if any), or a dict of the form:
                {"type": "function", "function": {"name": <<tool_name>>}}.
            **kwargs: Any additional parameters to pass to the
                :class:`~langchain.runnable.Runnable` constructor.
        """
        formatted_tools = [convert_to_openai_tool(tool) for tool in tools]
        if tool_choice:
            if len(formatted_tools) != 1:
                raise ValueError(
                    "When specifying `tool_choice`, you must provide exactly one "
                    f"tool. Received {len(formatted_tools)} tools."
                )
            if isinstance(tool_choice, str):
                if tool_choice not in ("auto", "none"):
                    tool_choice = {
                        "type": "function",
                        "function": {"name": tool_choice},
                    }
            elif isinstance(tool_choice, bool):
                tool_choice = formatted_tools[0]
            elif isinstance(tool_choice, dict):
                if formatted_tools[0]["function"]["name"] != tool_choice["function"]["name"]:
                    raise ValueError(
                        f"Tool choice {tool_choice} was specified, but the only "
                        f"provided tool was {formatted_tools[0]['function']['name']}."
                    )
            else:
                raise ValueError(
                    "Unrecognized `tool_choice` type. Expected str, bool or dict. "
                    f"Received: {tool_choice}"
                )
            kwargs["tool_choice"] = tool_choice
        return super().bind(tools=formatted_tools, **kwargs)
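The `tool_choice` handling above normalizes several accepted input shapes (str, bool, dict) into the one form sent to the model. A standalone sketch of that normalization, outside LangChain and purely for illustration (the `get_weather` name in the usage is hypothetical):

```python
def normalize_tool_choice(tool_choice, formatted_tools):
    """Standalone mirror of the tool_choice branching in bind_tools above.

    formatted_tools: list of OpenAI-style tool dicts, i.e.
    {"type": "function", "function": {"name": ...}}.
    """
    if not tool_choice:
        # None, False, or "" -> no tool_choice is sent at all.
        return None
    if len(formatted_tools) != 1:
        raise ValueError(
            "When specifying `tool_choice`, you must provide exactly one "
            f"tool. Received {len(formatted_tools)} tools."
        )
    if isinstance(tool_choice, str):
        if tool_choice in ("auto", "none"):
            return tool_choice  # passed through verbatim
        # A tool name -> wrap it in the dict form.
        return {"type": "function", "function": {"name": tool_choice}}
    if isinstance(tool_choice, bool):
        return formatted_tools[0]  # True -> force the single provided tool
    if isinstance(tool_choice, dict):
        if formatted_tools[0]["function"]["name"] != tool_choice["function"]["name"]:
            raise ValueError(
                f"Tool choice {tool_choice} names a tool other than the one "
                f"provided ({formatted_tools[0]['function']['name']})."
            )
        return tool_choice
    raise ValueError(
        "Unrecognized `tool_choice` type. Expected str, bool or dict. "
        f"Received: {tool_choice}"
    )
```

For example, `normalize_tool_choice("get_weather", tools)` and `normalize_tool_choice(True, tools)` both resolve to the single tool's dict form, while `"auto"` and `"none"` pass through unchanged.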

I think a small PR adding this method to the ChatMLX class (ideally without duplicating the ChatHuggingFace code) would fix this.

I don't have time to open it myself, but the owners of this repo could do it very quickly.

System Info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 24.1.0: Thu Oct 10 21:05:14 PDT 2024; root:xnu-11215.41.3~2/RELEASE_ARM64_T8103
> Python Version:  3.12.7 (main, Nov 20 2024, 14:24:14) [Clang 16.0.0 (clang-1600.0.26.4)]

Package Information
-------------------
> langchain_core: 0.3.21
> langchain: 0.3.9
> langchain_community: 0.3.9
> langsmith: 0.1.147
> langchain_huggingface: 0.1.2
> langchain_ollama: 0.2.1
> langchain_openai: 0.2.11
> langchain_text_splitters: 0.3.2
> langgraph_sdk: 0.1.43

Optional packages not installed
-------------------------------
> langserve

Other Dependencies
------------------
> aiohttp: 3.11.9
> async-timeout: Installed. No version info available.
> dataclasses-json: 0.6.7
> httpx: 0.27.2
> httpx-sse: 0.4.0
> huggingface-hub: 0.24.7
> jsonpatch: 1.33
> langsmith-pyo3: Installed. No version info available.
> numpy: 1.26.4
> ollama: 0.4.2
> openai: 1.56.2
> orjson: 3.10.12
> packaging: 24.2
> pydantic: 2.10.3
> pydantic-settings: 2.6.1
> PyYAML: 6.0.2
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> sentence-transformers: 2.7.0
> SQLAlchemy: 2.0.36
> tenacity: 8.4.2
> tiktoken: 0.8.0
> tokenizers: 0.20.3
> transformers: 4.46.3
> typing-extensions: 4.12.2
@dosubot dosubot bot added the Ɑ: models Related to LLMs or chat model modules label Dec 12, 2024
ccurme pushed a commit that referenced this issue Dec 16, 2024
…ream` (#28743)

- **Description:** Added support for `bind_tools` as requested in the
issue. Two issues in `_stream` were also fixed:
    - Corrected the positional argument passing for `generate_step`
    - Handled the case where the `token` returned by `generate_step` is an integer
- **Issue:** #28692

ccurme commented Dec 16, 2024

Should be addressed in #28743

@ccurme ccurme closed this as completed Dec 16, 2024