LlamaSharp won't run on legacy .NET Framework 4.8 #582
-
I think this is probably the same problem that was reported in #508. If you can work out what exactly it doesn't like, I'm sure we can get a workaround in. My guess is something in …
-
The call I mentioned is generating the following exception message: "Method's type signature is not PInvoke compatible." If you look at https://github.com/dotnet/runtime/issues/10973, it looks like the same limitation/bug. I will see if I can spend some time on this later to do a more "manual" marshalling, so I have control over the process and can avoid the bug. I think it is worth it, as there is a lot of legacy (.NET Framework 4.8) code out there that could benefit from a great wrapper like LlamaSharp!
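To illustrate the direction (a rough sketch only: the struct layout, field names, and library name below are assumptions, not LlamaSharp's actual declarations), "manual" marshalling here would mean re-declaring the entry point against a fully blittable struct and converting the fields by hand afterwards:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical mirror of the native parameter struct using only blittable
// fields (delegates flattened to IntPtr, bools to sbyte) so the .NET
// Framework 4.8 marshaller accepts the signature.
[StructLayout(LayoutKind.Sequential)]
internal struct ModelParamsBlittable
{
    public int n_gpu_layers;                    // illustrative fields only
    public IntPtr progress_callback;            // delegate exposed as a raw pointer
    public IntPtr progress_callback_user_data;
    public sbyte vocab_only;                    // bool flattened to one byte
}

internal static class ManualNative
{
    // Library name and calling convention are assumptions for this sketch.
    [DllImport("libllama", CallingConvention = CallingConvention.Cdecl,
               EntryPoint = "llama_model_default_params")]
    public static extern ModelParamsBlittable llama_model_default_params();
}
```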
-
Glad to do it. The only thing is the targets now are NETStandard;net6.0;net7.0;net8.0, so I would need to add the #if for NETSTANDARD.
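For example, the .NET Framework path could be fenced off roughly like this (a sketch only; the fallback helper is hypothetical):

```csharp
#if NETSTANDARD
// Compiled into the NETStandard assembly, which is the one a
// .NET Framework 4.8 application ends up loading.
var modelParams = BuildDefaultParamsManually();   // hypothetical workaround
#else
var modelParams = NativeApi.llama_model_default_params();
#endif
```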
-
Tested successfully with .NET Framework 4.8 and .NET 8.0 with: …
-
The call NativeApi.llama_model_default_params() fails when running LlamaSharp on .NET Framework 4.8 due to a limitation/bug in the .NET Framework 4.8 interop (P/Invoke) code.
Just for testing, I added some simple code to skip the call to NativeApi.llama_model_default_params() when running on .NET Framework 4.8, and everything else executed flawlessly.
I haven't had time to look into this yet, but I am certain there is a possible workaround for calling llama_model_default_params() that would avoid the bug.
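For reference, a minimal sketch of that kind of guard (the fallback helper here is hypothetical, not part of LlamaSharp):

```csharp
using System.Runtime.InteropServices;

// Sketch only: skip the problematic native call when running on .NET Framework 4.8.
bool onNetFramework =
    RuntimeInformation.FrameworkDescription.StartsWith(".NET Framework");

var modelParams = onNetFramework
    ? BuildDefaultParamsManually()              // hypothetical fallback, fields set by hand
    : NativeApi.llama_model_default_params();   // works fine on .NET 6/7/8
```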
One last comment: .NET Framework 4.8/C# 7.3 does not support asynchronous streams, so I had to use the following little trick to make the IAsyncEnumerable work and use the various async methods provided by LlamaSharp:
```csharp
// Manual enumeration replaces `await foreach`, which is not available in C# 7.3.
// On .NET Framework 4.8 the IAsyncEnumerable<T>/IAsyncEnumerator<T> interfaces
// come from the Microsoft.Bcl.AsyncInterfaces package.
IAsyncEnumerable<string> async_enum = executor.InferAsync(prompt, inferenceParams);
IAsyncEnumerator<string> enumerator = async_enum.GetAsyncEnumerator();
try
{
    while (await enumerator.MoveNextAsync())
    {
        string text = enumerator.Current;
        Console.Write(text);
    }
}
finally
{
    await enumerator.DisposeAsync();
}
```
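For comparison, on C# 8.0 or later (for example the net8.0 target) the same loop collapses to an `await foreach`:

```csharp
// Equivalent consumption on C# 8.0+, where asynchronous streams are supported.
await foreach (string text in executor.InferAsync(prompt, inferenceParams))
{
    Console.Write(text);
}
```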