
[Investigate] Custom llama.dll Dependency Resolution Issues on Windows #12

Closed
MillionthOdin16 opened this issue Apr 3, 2023 · 8 comments

Comments

@MillionthOdin16
Contributor

This is a note about using a custom llama.dll build on Windows. I ran into dependency resolution issues when loading my own llama.dll compiled with BLAS support and some extra hardware-specific optimization flags. No matter what I do, it can't seem to locate all of its dependencies, even though I've tried placing them in system paths and even in the same directory as the DLL.

My current workaround is using the default llama.dll that llama-cpp-python builds, but it doesn't have the hardware optimizations and BLAS support that I enabled in my custom build. So I'm still trying to figure out what my issue is. Maybe it's something Python-specific that I'm missing...
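
For anyone debugging the same thing, a minimal sketch of one thing worth checking: on Python 3.8+, Windows no longer uses PATH (or the current directory) to resolve a DLL's dependencies when loading through ctypes, so the directories containing the dependent DLLs have to be registered explicitly with os.add_dll_directory. The paths below are placeholders.

# Minimal sketch (Windows, Python 3.8+): register the directories that hold
# llama.dll's dependent DLLs so the loader can resolve them. Paths are placeholders.
import ctypes
import os

custom_dll = r"C:\path\to\custom\llama.dll"      # placeholder: your custom build
dependency_dirs = [
    r"C:\path\to\custom",                        # directory containing llama.dll
    r"C:\path\to\openblas\bin",                  # directory containing the BLAS DLLs
]

for d in dependency_dirs:
    if os.path.isdir(d):
        os.add_dll_directory(d)                  # searched for load-time dependencies

llama = ctypes.CDLL(custom_dll)                  # should now resolve its imports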

I'm dropping this issue here just in case anyone else runs into something similar. If you have any ideas or workarounds, let me know. I'll keep trying to figure it out until I get it resolved haha :)

@geocine

geocine commented Apr 3, 2023

For me, I could not even get past this:

# (imports needed for this snippet)
import pathlib
from itertools import chain

# TODO: fragile, should fix
_base_path = pathlib.Path(__file__).parent
(_lib_path,) = chain(
    _base_path.glob("*.so"), _base_path.glob("*.dylib"), _base_path.glob("*.dll")
)

I get this error

(_lib_path,) = chain(
ValueError: not enough values to unpack (expected 1, got 0)

So I just commented it out and did this instead

_lib_path = "C:\path\to\my\llama.dll"

@MillionthOdin16
Contributor Author

Okay, yep. I actually just made a PR that should make the libs easier to work with. It will be better soon :) Thanks for sharing.

@abetlen
Owner

abetlen commented Apr 4, 2023

@geocine I've merged in @MillionthOdin16's fix for library loading; does the default installation path work for you now?

@geocine

geocine commented Apr 4, 2023

Thanks @abetlen, I will try again later.

@MillionthOdin16
Contributor Author

I think there might be a combination of two things that make the library setup confusing at the moment:

  1. I don't know at which point llama-cpp-python compiles the llama library. When I first got set up, I think I had to go and compile it myself after searching around, so in that case it might not actually exist yet.
  2. When bringing a custom-built 'llama' lib for extra functionality (like BLAS), the loader detects the llama lib itself but can't find its other dependencies the way it normally would. I'm still trying to figure out whether this is a Windows-only thing or somehow related to Python (see the sketch at the end of this comment).

*Just a note: the CMakeLists for llama.cpp is less developed on the Windows side than on the macOS/Linux side (which the devs are more focused on), so on Windows it often needs manual tweaks to work correctly. This is one reason my first inclination was to bring my own lib. I just want to make sure people know, in case Windows users have more trouble.
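
For item 2, one way to narrow down which dependency is actually missing is to probe the likely dependent DLLs individually with ctypes before loading llama.dll itself; the first failure points at the library Windows can't resolve. The DLL names below are only examples.

# Sketch: probe likely dependencies one at a time so the first OSError shows
# which DLL is actually missing. The names/paths here are examples only.
import ctypes

probes = [
    r"C:\path\to\openblas\bin\libopenblas.dll",  # example BLAS dependency
    "vcruntime140.dll",                          # example MSVC runtime
    r"C:\path\to\custom\llama.dll",              # the custom build itself
]

for name in probes:
    try:
        ctypes.CDLL(name)
        print(f"OK      {name}")
    except OSError as exc:
        print(f"FAILED  {name}: {exc}")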

@abetlen
Owner

abetlen commented Apr 4, 2023

@MillionthOdin16 ahh, that makes sense. I currently only have access to Linux and one Mac system, so I haven't had a chance to test on Windows.

The shared library is supposed to be built when you install the package from pip; I'm using scikit-build to accomplish this. In theory, what should happen is that libllama.so / llama.dll is built and placed inside your Python site-packages alongside the Python files. This is where llama-cpp-python checks for the shared library by default.
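
As a quick sanity check (a sketch, assuming the package imports as llama_cpp), you can print the installed package directory and confirm that a shared library actually landed next to the Python files:

# Sketch: confirm the shared library was built into site-packages next to llama_cpp.
import pathlib
import llama_cpp

pkg_dir = pathlib.Path(llama_cpp.__file__).parent
print("package directory:", pkg_dir)
for pattern in ("*.so", "*.dylib", "*.dll"):
    for lib in pkg_dir.glob(pattern):
        print("found shared library:", lib.name)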

It seems scikit-build allows you to inject build arguments into CMake (I assume at build time, after you determine the platform); see the scikit-build documentation. This may be a solution for Windows users, but unfortunately I can't really help much on this front.
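
If that route pans out, here is a sketch of what the reinstall could look like from Python. Both the CMAKE_ARGS environment variable and the LLAMA_OPENBLAS flag name are assumptions here, so check the scikit-build docs and llama.cpp's CMakeLists.txt for the options that actually apply.

# Sketch: force a from-source reinstall with an extra CMake flag forwarded via
# the environment. Both CMAKE_ARGS handling and the flag name are assumptions.
import os
import subprocess
import sys

env = dict(os.environ)
env["CMAKE_ARGS"] = "-DLLAMA_OPENBLAS=on"        # assumed flag name, may differ

subprocess.check_call(
    [sys.executable, "-m", "pip", "install",
     "--force-reinstall", "--no-binary", ":all:", "llama-cpp-python"],
    env=env,
)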

@MillionthOdin16
Contributor Author

I describe the behavior in #30 (comment). Once we resolve that, this will be resolved as well.

@MillionthOdin16
Contributor Author

Resolved in 0fd3204
