**Issue: GGUF Load by URL Is Broken**
=====================================
**Problem Statement**
---------------------
GGUF load by URL is broken. The `calculate_sha256` function now expects a file path (plus a chunk size) rather than an open stream, but `download_file_stream` still passes it a stream, so uploading a GGUF model by URL fails.
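The mismatch can be sketched in a few lines. The definition below is a hypothetical stand-in (the real helpers live in the Open WebUI backend and may differ in detail), but it reproduces the failure mode:

```python
import io

# Hypothetical stand-in for the new-style signature: a path on disk
# plus a chunk size, rather than an open stream.
def calculate_sha256(file_path, chunk_size):
    ...

# download_file_stream still holds an open stream and calls the helper
# the old way, with a single argument:
file = io.BytesIO(b"fake gguf bytes")
try:
    calculate_sha256(file)
except TypeError as err:
    print(err)  # calculate_sha256() missing 1 required positional argument: 'chunk_size'
```

The call fails at the hashing step, after the download itself has finished, which is why the progress bar reaches 100% before anything goes wrong.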
**Background**
--------------
The GGUF load by URL feature lets users pull a model into Open WebUI from a URL, after which it is processed and made available for use. Since the recent change to the `calculate_sha256` signature, this feature no longer works.
**Steps to Reproduce**
----------------------
To reproduce this issue, follow these steps:
- Ensure you are using the latest version of Open WebUI.
- Try to upload a GGUF model by URL using the Open WebUI application.
- Observe that the upload process gets stuck at 100%.
**Expected Behavior**
---------------------
The expected behavior is that the GGUF load by URL feature should work as usual, allowing users to upload models from a URL and process them successfully.
**Actual Behavior**
-------------------
The actual behavior is that the GGUF load by URL feature is not working, causing the upload process to get stuck at 100%.
**Logs and Screenshots**
------------------------
The browser console logs show the following error:
```
+layout.svelte:472 Backend config: Object
+layout.svelte:76 connected 8HXwP-g--S3-N6xSAAAT
+layout.svelte:95 user-list Object
+layout.svelte:100 usage Object
+layout.svelte:95 user-list Object
General.svelte:61 Object
General.svelte:64 false
/ollama/models/download/0:1
Failed to load resource: net::ERR_INCOMPLETE_CHUNKED_ENCODING
ManageOllama.svelte:405 Uncaught (in promise) TypeError: network error
```
The Open WebUI logs show the following error:
```
2025-03-12 18:01:49.136 | INFO | open_webui.routers.ollama:get_all_models:300 - get_all_models() - {}
2025-03-12 18:01:49.138 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.20.211:0 - "GET /api/models/base HTTP/1.1" 200 - {}
2025-03-12 18:01:51.158 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.20.211:0 - "GET /ollama/api/tags/0 HTTP/1.1" 200 - {}
2025-03-12 18:02:06.112 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.20.211:0 - "POST /ollama/models/download/0 HTTP/1.1" 200 - {}
Exception in ASGI application
+ Exception Group Traceback (most recent call last):
|   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 76, in collapse_excgroups
|     yield
|   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 178, in __call__
|     async with anyio.create_task_group() as task_group:
|   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 767, in __aexit__
|     raise BaseExceptionGroup(
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
  | Traceback (most recent call last):
  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 409, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
  |     return await self.app(scope, receive, send)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 112, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
  |     raise exc
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/app/backend/open_webui/utils/audit.py", line 178, in __call__
  |     await self.app(scope, receive_wrapper, send_wrapper)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 144, in simple_response
  |     await self.app(scope, receive, send)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 177, in __call__
  |     with recv_stream, send_stream, collapse_excgroups():
  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
  |     self.gen.throw(typ, value, traceback)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
  |     raise exc
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 180, in __call__
  |     await response(scope, wrapped_receive, send)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 215, in __call__
  |     async for chunk in self.body_iterator:
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 169, in body_stream
  |     raise app_exc
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 141, in coro
  |     await self.app(scope, receive_or_disconnect, send_no_error)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 177, in __call__
  |     with recv_stream, send_stream, collapse_excgroups():
  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
  |     self.gen.throw(typ, value, traceback)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
  |     raise exc
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 180, in __call__
  |     await response(scope, wrapped_receive, send)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 215, in __call__
  |     async for chunk in self.body_iterator:
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 169, in body_stream
  |     raise app_exc
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 141, in coro
  |     await self.app(scope, receive_or_disconnect, send_no_error)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 177, in __call__
  |     with recv_stream, send_stream, collapse_excgroups():
  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
  |     self.gen.throw(typ, value, traceback)
  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
  |     raise exc
  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 264, in wrap
  |     await func()
  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 245, in stream_response
  |     async for chunk in self.body_iterator:
  |   File "/app/backend/open_webui/routers/ollama.py", line 1486, in download_file_stream
  |     hashed = calculate_sha256(file)
  |              ^^^^^^^^^^^^^^^^^^^^^^
  | TypeError: calculate_sha256() missing 1 required positional argument: 'chunk_size'
```
**Q&A: GGUF Load by URL is Broken**
=====================================
**Q: What is the issue with GGUF load by URL?**
------------------------------------------------
A: The `calculate_sha256` function now requires a file path (and a chunk size) instead of a stream, but `download_file_stream` still hands it an open stream, so the GGUF upload by URL fails with a `TypeError`.
**Q: What is the expected behavior of GGUF load by URL?**
------------------------------------------------------
A: The expected behavior is that the GGUF load by URL feature should work as usual, allowing users to upload models from a URL and process them successfully.
**Q: What is the actual behavior of GGUF load by URL?**
------------------------------------------------------
A: The actual behavior is that the GGUF load by URL feature is not working, causing the upload process to get stuck at 100%.
**Q: What are the logs and screenshots showing?**
------------------------------------------------
A: The browser console shows the request to `/ollama/models/download/0` failing with `net::ERR_INCOMPLETE_CHUNKED_ENCODING`, followed by `ManageOllama.svelte:405 Uncaught (in promise) TypeError: network error`. The Open WebUI server log (quoted in full above) shows the streaming response dying inside `download_file_stream` (`open_webui/routers/ollama.py`, line 1486) when `hashed = calculate_sha256(file)` raises `TypeError: calculate_sha256() missing 1 required positional argument: 'chunk_size'`.
**Q: What is the solution to this issue?**
------------------------------------------------
A: Update `download_file_stream` so that it finishes writing the download to disk and then calls `calculate_sha256` with the resulting file path (and a chunk size), matching the function's new signature, instead of passing it the open stream.
**Q: How can I update the `download_file_stream` function?**
---------------------------------------------------------
A: Replace the stream handed to `calculate_sha256` with the downloaded file's path: once `download_file_stream` has written all chunks to disk, call the helper with the file path and a chunk size rather than with the open file object.
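A sketch of that shape (simplified and hypothetical: the real `download_file_stream` streams progress events to the client and talks to the upstream server, which is omitted here; the chunk source is abstracted as any iterable of bytes):

```python
import hashlib

def calculate_sha256(file_path, chunk_size):
    # Sketch of the new-style helper: hash a file on disk in
    # fixed-size chunks.
    sha = hashlib.sha256()
    with open(file_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            sha.update(chunk)
    return sha.hexdigest()

def download_file_stream(chunks, file_path, chunk_size=1024 * 1024):
    # Write the incoming chunks to disk first...
    with open(file_path, "wb") as f:
        for chunk in chunks:
            f.write(chunk)
    # ...then hash by path, matching the new signature, instead of
    # handing calculate_sha256 an open stream.
    return calculate_sha256(file_path, chunk_size)
```

With this shape the hash is computed only after the file is fully on disk, so the helper and its caller agree on the signature and the download never has to be held in memory.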
**Q: How can I update the `calculate_sha256` function?**
---------------------------------------------------------
A: Give `calculate_sha256` a `file_path` parameter (alongside `chunk_size`), `open` the file in binary mode, and feed it to the hasher in fixed-size chunks.
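One way that could look, as a hedged sketch (the parameter names and default chunk size here are assumptions, not the actual Open WebUI code):

```python
import hashlib

def calculate_sha256(file_path: str, chunk_size: int = 1024 * 1024) -> str:
    # Open the file by path and hash it in fixed-size chunks so that a
    # multi-gigabyte GGUF file never has to be read into memory at once.
    sha = hashlib.sha256()
    with open(file_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            sha.update(chunk)
    return sha.hexdigest()
```

The chunked read via `iter(f.read, b"")` is the standard idiom for hashing large files; any chunk size yields the same digest.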
**Q: What are the benefits of updating the `download_file_stream` and `calculate_sha256` functions?**
---------------------------------------------------------
A: Once both functions work with file paths, the hash is computed from the completed file on disk in fixed-size chunks, the caller and the `calculate_sha256` signature agree again, and the GGUF upload by URL no longer stalls at 100%.