
Python 3.12 crashed #311

Open
ZWijger opened this issue Sep 18, 2024 · 4 comments
Labels: bug (Something isn't working)

Comments

ZWijger commented Sep 18, 2024

Describe the bug

System: Fedora 40 (KDE spin), 16 GB RAM, Intel Core i5-10400 CPU @ 2.90 GHz (12 threads), NVIDIA GTX 1650 Super.
When I use Alpaca, which is installed as a Flatpak, the following message appears: We're sorry, it looks like “python3.12” crashed. Please contact the developer if you want to report the issue.
The mouse pointer is also much larger, even though accessibility is disabled on my system (KDE Plasma 6.1.4, KDE Frameworks 6.5.0, Qt 6.7.2, kernel 6.10.9-200, Wayland).

Expected behavior
Alpaca runs without crashing.

Screenshots
I can't take a screenshot: the program suddenly closes.

Debugging information
This information was collected when Alpaca was still working.

INFO [main.py | main] Alpaca version: 2.0.3
INFO [connection_handler.py | start] Starting Alpaca's Ollama instance...
INFO [connection_handler.py | start] Started Alpaca's Ollama instance
2024/09/19 00:53:36 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:true OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11435 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/username_/.var/app/com.jeffser.Alpaca/data/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-09-19T00:53:36.603+02:00 level=INFO source=images.go:753 msg="total blobs: 5"
time=2024-09-19T00:53:36.603+02:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-09-19T00:53:36.603+02:00 level=INFO source=routes.go:1172 msg="Listening on 127.0.0.1:11435 (version 0.3.9)"
time=2024-09-19T00:53:36.604+02:00 level=INFO source=payload.go:30 msg="extracting embedded files" dir=/home/bkerste/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama2197306919/runners
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/x86_64/cpu/bin/libggml.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/x86_64/cpu/bin/libllama.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/x86_64/cpu/bin/ollama_llama_server.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx file=build/linux/x86_64/cpu_avx/bin/libggml.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx file=build/linux/x86_64/cpu_avx/bin/libllama.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx file=build/linux/x86_64/cpu_avx/bin/ollama_llama_server.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx2 file=build/linux/x86_64/cpu_avx2/bin/libggml.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx2 file=build/linux/x86_64/cpu_avx2/bin/libllama.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx2 file=build/linux/x86_64/cpu_avx2/bin/ollama_llama_server.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/x86_64/cuda_v11/bin/libggml.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/x86_64/cuda_v11/bin/libllama.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/x86_64/cuda_v11/bin/ollama_llama_server.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v12 file=build/linux/x86_64/cuda_v12/bin/libggml.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v12 file=build/linux/x86_64/cuda_v12/bin/libllama.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v12 file=build/linux/x86_64/cuda_v12/bin/ollama_llama_server.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=rocm_v60102 file=build/linux/x86_64/rocm_v60102/bin/libggml.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=rocm_v60102 file=build/linux/x86_64/rocm_v60102/bin/libllama.so.gz
time=2024-09-19T00:53:36.604+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=rocm_v60102 file=build/linux/x86_64/rocm_v60102/bin/ollama_llama_server.gz
INFO [connection_handler.py | start] client version is 0.3.9
INFO [connection_handler.py | request] GET : http://127.0.0.1:11435/api/tags
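
For anyone triaging this: a minimal sketch that reproduces the final GET /api/tags request from the log above, useful to confirm the bundled Ollama instance is still reachable. It assumes only the default local endpoint shown in the log (http://127.0.0.1:11435):

```python
import json
import urllib.request

# Alpaca's bundled Ollama listens on 127.0.0.1:11435 (see OLLAMA_HOST in
# the server config above); /api/tags lists the locally installed models.
with urllib.request.urlopen("http://127.0.0.1:11435/api/tags", timeout=5) as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])
```

If this succeeds while the app still crashes, the problem is likely in Alpaca's own Python process rather than the Ollama backend.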

ZWijger added the bug label on Sep 18, 2024

Jeffser (Owner) commented Sep 21, 2024

Hi, thanks for your report. I'll take a look at it.

> The mouse pointer is also much larger, even though accessibility is disabled on my system (KDE Plasma 6.1.4, KDE Frameworks 6.5.0, Qt 6.7.2, kernel 6.10.9-200, Wayland).

As far as I know, this is a general issue with GTK apps on KDE. I believe I watched a Brodie Robertson video where he mentioned it, but I can't recall which video it was.

Jeffser (Owner) commented Sep 29, 2024

Hi, I updated the app a couple of days ago. Can you confirm whether the error is fixed? Thanks.

ZWijger (Author) commented Oct 1, 2024

I've reinstalled the app on a versioned installation of the Fedora KDE spin. The mouse cursor behavior within the window has significantly improved; it now reacts the same as in all KDE apps. However, the Python 3.12 error message hasn't disappeared.

Since the KDE version is a spin of the original distribution, I subsequently performed a clean installation, this time with the GNOME version. Here too, the app crashes with the message that Python 3.12 has crashed.

I'm now unsure whether this is a specific Python problem or an issue within the app itself; I don't know if the Python environment is packaged with the Flatpak. The error messages I get don't provide any detail beyond the mention of the crash. I'm just a simple end user, but I hope you can make something of this information, and I'm always willing to provide more details.
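
One way to get more detail than the crash dialog shows is to launch the Flatpak from a terminal and capture its output; a crashing Python process normally prints its traceback to stderr. A minimal sketch of that, assuming nothing beyond the app ID com.jeffser.Alpaca visible in the log paths above:

```python
import subprocess

# Launch Alpaca's Flatpak directly so that the Python traceback, which
# the desktop crash dialog hides, can be read from the process output.
result = subprocess.run(
    ["flatpak", "run", "com.jeffser.Alpaca"],
    capture_output=True,
    text=True,
)

# After the crash, stderr usually contains the full traceback.
print(result.stderr)
```

Running `flatpak run com.jeffser.Alpaca` directly in a terminal gives the same output interactively.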

Jeffser (Owner) commented Oct 9, 2024

Hi, sorry for not getting back to you this past week; I was busy with college. I pushed an update yesterday that changes the way Alpaca is launched, which probably fixes this problem. Could you please tell me if it works fine now? Thanks.
