openai-python v1.2.3 #4292
Does the library work if you use the patcher like you do in the raw request example?

```python
import pyodide_http
pyodide_http.patch_all()
```
No:

```python
import micropip
import pyodide_http
pyodide_http.patch_all()
await micropip.install('openai', keep_going=True)
await micropip.install("ssl")
import openai
from openai import OpenAI

client = OpenAI(
    api_key="API KEY",
)
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="gpt-3.5-turbo",
)
```

Gives the same:
You may be interested in this issue: koenvo/pyodide-http#37. It mentions that it might be fairly simple to add support for httpx.
I'm not very familiar with this aspect of httpx, but it might not even require monkey patching; you might be able to pass in a custom transport.
Is it possible to schedule a meeting with you guys? I'm not sure I completely understand how easy it is to implement this, or the resources required to do something like this.
Could you check the url passed into
This would be the feasible option for now. Or you may consider adding
For the CORS error, I follow these steps:
So, no - I don't get the CORS error. I don't get the 404 error - the link works. The only reason the rollback option fails right now is because of the BadZipFile error - can you check if you get the same?
For httpx - because the comment states nothing other than https://sans-io.readthedocs.io/ - could you direct me to a process that enables me to support it? How do I go about doing it?
httpx is written with custom IO in mind, so there isn't anything specific you need to do to "enable" it; you can directly override the default transport layer with your own custom transport that forwards the request to Pyodide's fetch instead. Here are some docs you might find helpful: https://www.python-httpx.org/advanced/#custom-transports. The actual implementation internals would likely look very similar to the existing requests patch: https://github.com/koenvo/pyodide-http/blob/main/pyodide_http/_requests.py
Also, what is the reason for the BadZipFile error? Sometimes wheels install after uploading them directly; other times only when they are uploaded to GitHub and downloaded from there. But in this case both fail - ideally, I would have expected v0.28.1 to work without this error.
I guess the problem is the URL you used. Could you try this URL instead?
Note that this URL sets CORS headers, so you don't need to use the CORS extension.
Thanks, this works - it now fails for the multidict library, saying: ValueError: Can't find a pure Python 3 wheel for: 'multidict<5.0,>=4.5'. Can this be resolved if I compile a wheel and install it before installing the openai library? Just to understand another thing: uploading the wheel file to JupyterLite and installing it also results in the same BadZipFile error for me - this shouldn't be the case, right?
Yes.
That's a separate case, but the cause is probably the same.
After resolving the multidict issue, I am currently stuck because of this: aio-libs/aiohttp#7803
Currently, I have resolved all the issues with the other dependencies - but there is an issue with requests: max retries exceeded. Code for recreation on JupyterLite:

```python
import micropip
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/multidict/multidict-4.7.6-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/frozenlist/frozenlist-1.4.0-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/aiohttp/aiohttp-4.0.0a2.dev0-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/openai_wheel/openai-0.28.1-py3-none-any.whl', keep_going=True)
await micropip.install("ssl")
import ssl
import pyodide_http
pyodide_http.patch_all()
import openai
openai.api_key = "API KEY"
chat_completion = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hello world"}])
```

Error:
OSError Traceback (most recent call last)
File /lib/python3.11/site-packages/urllib3/connection.py:203, in HTTPConnection._new_conn(self)
202 try:
--> 203 sock = connection.create_connection(
204 (self._dns_host, self.port),
205 self.timeout,
206 source_address=self.source_address,
207 socket_options=self.socket_options,
208 )
209 except socket.gaierror as e:
File /lib/python3.11/site-packages/urllib3/util/connection.py:85, in create_connection(address, timeout, source_address, socket_options)
84 try:
---> 85 raise err
86 finally:
87 # Break explicitly a reference cycle
File /lib/python3.11/site-packages/urllib3/util/connection.py:67, in create_connection(address, timeout, source_address, socket_options)
66 # If provided, set socket level options before connecting.
---> 67 _set_socket_options(sock, socket_options)
69 if timeout is not _DEFAULT_TIMEOUT:
File /lib/python3.11/site-packages/urllib3/util/connection.py:100, in _set_socket_options(sock, options)
99 for opt in options:
--> 100 sock.setsockopt(*opt)
OSError: [Errno 50] Protocol not available
The above exception was the direct cause of the following exception:
NewConnectionError Traceback (most recent call last)
File /lib/python3.11/site-packages/urllib3/connectionpool.py:790, in HTTPConnectionPool.urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, preload_content, decode_content, **response_kw)
789 # Make the request on the HTTPConnection object
--> 790 response = self._make_request(
791 conn,
792 method,
793 url,
794 timeout=timeout_obj,
795 body=body,
796 headers=headers,
797 chunked=chunked,
798 retries=retries,
799 response_conn=response_conn,
800 preload_content=preload_content,
801 decode_content=decode_content,
802 **response_kw,
803 )
805 # Everything went great!
File /lib/python3.11/site-packages/urllib3/connectionpool.py:491, in HTTPConnectionPool._make_request(self, conn, method, url, body, headers, retries, timeout, chunked, response_conn, preload_content, decode_content, enforce_content_length)
490 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
--> 491 raise new_e
493 # conn.request() calls http.client.*.request, not the method in
494 # urllib3.request. It also calls makefile (recv) on the socket.
File /lib/python3.11/site-packages/urllib3/connectionpool.py:467, in HTTPConnectionPool._make_request(self, conn, method, url, body, headers, retries, timeout, chunked, response_conn, preload_content, decode_content, enforce_content_length)
466 try:
--> 467 self._validate_conn(conn)
468 except (SocketTimeout, BaseSSLError) as e:
File /lib/python3.11/site-packages/urllib3/connectionpool.py:1096, in HTTPSConnectionPool._validate_conn(self, conn)
1095 if conn.is_closed:
-> 1096 conn.connect()
1098 if not conn.is_verified:
File /lib/python3.11/site-packages/urllib3/connection.py:611, in HTTPSConnection.connect(self)
610 sock: socket.socket | ssl.SSLSocket
--> 611 self.sock = sock = self._new_conn()
612 server_hostname: str = self.host
File /lib/python3.11/site-packages/urllib3/connection.py:218, in HTTPConnection._new_conn(self)
217 except OSError as e:
--> 218 raise NewConnectionError(
219 self, f"Failed to establish a new connection: {e}"
220 ) from e
222 # Audit hooks are only available in Python 3.8+
NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x2b9c368>: Failed to establish a new connection: [Errno 50] Protocol not available
The above exception was the direct cause of the following exception:
MaxRetryError Traceback (most recent call last)
File /lib/python3.11/site-packages/requests/adapters.py:486, in HTTPAdapter.send(self, request, stream, timeout, verify, cert, proxies)
485 try:
--> 486 resp = conn.urlopen(
487 method=request.method,
488 url=url,
489 body=request.body,
490 headers=request.headers,
491 redirect=False,
492 assert_same_host=False,
493 preload_content=False,
494 decode_content=False,
495 retries=self.max_retries,
496 timeout=timeout,
497 chunked=chunked,
498 )
500 except (ProtocolError, OSError) as err:
File /lib/python3.11/site-packages/urllib3/connectionpool.py:874, in HTTPConnectionPool.urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, preload_content, decode_content, **response_kw)
871 log.warning(
872 "Retrying (%r) after connection broken by '%r': %s", retries, err, url
873 )
--> 874 return self.urlopen(
875 method,
876 url,
877 body,
878 headers,
879 retries,
880 redirect,
881 assert_same_host,
882 timeout=timeout,
883 pool_timeout=pool_timeout,
884 release_conn=release_conn,
885 chunked=chunked,
886 body_pos=body_pos,
887 preload_content=preload_content,
888 decode_content=decode_content,
889 **response_kw,
890 )
892 # Handle redirect?
File /lib/python3.11/site-packages/urllib3/connectionpool.py:874, in HTTPConnectionPool.urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, preload_content, decode_content, **response_kw)
871 log.warning(
872 "Retrying (%r) after connection broken by '%r': %s", retries, err, url
873 )
--> 874 return self.urlopen(
875 method,
876 url,
877 body,
878 headers,
879 retries,
880 redirect,
881 assert_same_host,
882 timeout=timeout,
883 pool_timeout=pool_timeout,
884 release_conn=release_conn,
885 chunked=chunked,
886 body_pos=body_pos,
887 preload_content=preload_content,
888 decode_content=decode_content,
889 **response_kw,
890 )
892 # Handle redirect?
File /lib/python3.11/site-packages/urllib3/connectionpool.py:844, in HTTPConnectionPool.urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, preload_content, decode_content, **response_kw)
842 new_e = ProtocolError("Connection aborted.", new_e)
--> 844 retries = retries.increment(
845 method, url, error=new_e, _pool=self, _stacktrace=sys.exc_info()[2]
846 )
847 retries.sleep()
File /lib/python3.11/site-packages/urllib3/util/retry.py:515, in Retry.increment(self, method, url, response, error, _pool, _stacktrace)
514 reason = error or ResponseError(cause)
--> 515 raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
517 log.debug("Incremented Retry for (url='%s'): %r", url, new_retry)
MaxRetryError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x2b9c368>: Failed to establish a new connection: [Errno 50] Protocol not available'))
During handling of the above exception, another exception occurred:
ConnectionError Traceback (most recent call last)
File /lib/python3.11/site-packages/openai/api_requestor.py:606, in APIRequestor.request_raw(self, method, url, params, supplied_headers, files, stream, request_id, request_timeout)
605 try:
--> 606 result = _thread_context.session.request(
607 method,
608 abs_url,
609 headers=headers,
610 data=data,
611 files=files,
612 stream=stream,
613 timeout=request_timeout if request_timeout else TIMEOUT_SECS,
614 proxies=_thread_context.session.proxies,
615 )
616 except requests.exceptions.Timeout as e:
File /lib/python3.11/site-packages/requests/sessions.py:589, in Session.request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
588 send_kwargs.update(settings)
--> 589 resp = self.send(prep, **send_kwargs)
591 return resp
File /lib/python3.11/site-packages/requests/sessions.py:703, in Session.send(self, request, **kwargs)
702 # Send the request
--> 703 r = adapter.send(request, **kwargs)
705 # Total elapsed time of the request (approximately)
File /lib/python3.11/site-packages/requests/adapters.py:519, in HTTPAdapter.send(self, request, stream, timeout, verify, cert, proxies)
517 raise SSLError(e, request=request)
--> 519 raise ConnectionError(e, request=request)
521 except ClosedPoolError as e:
ConnectionError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x2b9c368>: Failed to establish a new connection: [Errno 50] Protocol not available'))
The above exception was the direct cause of the following exception:
APIConnectionError Traceback (most recent call last)
Cell In[2], line 9
7 import openai
8 openai.api_key = "API KEY"
----> 9 chat_completion = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hello world"}])
File /lib/python3.11/site-packages/openai/api_resources/chat_completion.py:25, in ChatCompletion.create(cls, *args, **kwargs)
23 while True:
24 try:
---> 25 return super().create(*args, **kwargs)
26 except TryAgain as e:
27 if timeout is not None and time.time() > start + timeout:
File /lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py:155, in EngineAPIResource.create(cls, api_key, api_base, api_type, request_id, api_version, organization, **params)
129 @classmethod
130 def create(
131 cls,
(...)
138 **params,
139 ):
140 (
141 deployment_id,
142 engine,
(...)
152 api_key, api_base, api_type, api_version, organization, **params
153 )
--> 155 response, _, api_key = requestor.request(
156 "post",
157 url,
158 params=params,
159 headers=headers,
160 stream=stream,
161 request_id=request_id,
162 request_timeout=request_timeout,
163 )
165 if stream:
166 # must be an iterator
167 assert not isinstance(response, OpenAIResponse)
File /lib/python3.11/site-packages/openai/api_requestor.py:289, in APIRequestor.request(self, method, url, params, headers, files, stream, request_id, request_timeout)
278 def request(
279 self,
280 method,
(...)
287 request_timeout: Optional[Union[float, Tuple[float, float]]] = None,
288 ) -> Tuple[Union[OpenAIResponse, Iterator[OpenAIResponse]], bool, str]:
--> 289 result = self.request_raw(
290 method.lower(),
291 url,
292 params=params,
293 supplied_headers=headers,
294 files=files,
295 stream=stream,
296 request_id=request_id,
297 request_timeout=request_timeout,
298 )
299 resp, got_stream = self._interpret_response(result, stream)
300 return resp, got_stream, self.api_key
File /lib/python3.11/site-packages/openai/api_requestor.py:619, in APIRequestor.request_raw(self, method, url, params, supplied_headers, files, stream, request_id, request_timeout)
617 raise error.Timeout("Request timed out: {}".format(e)) from e
618 except requests.exceptions.RequestException as e:
--> 619 raise error.APIConnectionError(
620 "Error communicating with OpenAI: {}".format(e)
621 ) from e
622 util.log_debug(
623 "OpenAI API response",
624 path=abs_url,
(...)
627 request_id=result.headers.get("X-Request-Id"),
628 )
629 # Don't read the whole stream for debug logging unless necessary.
APIConnectionError: Error communicating with OpenAI: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x2b9c368>: Failed to establish a new connection: [Errno 50] Protocol not available'))
Is there any update on the issue?
@psymbio No, I am not working on this issue currently. Did you try
Yes, I have patched pyodide-http in the above code - but it gives this error. For the newer version of the openai library (that uses httpx), I'm working on building the custom transport layer as well. |
Updates: There is currently some work going on to support urllib3 on Pyodide, which will be really helpful. It's in the testing phase, so I've added test cases for the PR on urllib3: urllib3/urllib3#3195. In the new release of the library, OpenAI has implemented "configuring http clients" (https://github.com/openai/openai-python/blob/main/README.md#configuring-the-http-client). So essentially, using the fix for the Transport API (see: https://www.python-httpx.org/advanced/#custom-transports, https://gist.github.com/florimondmanca/d56764d78d748eb9f73165da388e546e (copy-paste this into a Python file), and urllib3/urllib3#2073), the working code would look something like:

```python
import micropip
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/multidict/multidict-4.7.6-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/frozenlist/frozenlist-1.4.0-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/aiohttp/aiohttp-4.0.0a2.dev0-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/openai_wheel/openai-1.3.7-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/urllib3/urllib3-2.1.0-py3-none-any.whl', keep_going=True)
# although here httpcore needs to be the newest version and https://gist.github.com/florimondmanca/d56764d78d748eb9f73165da388e546e needs to be updated
await micropip.install('https://raw.githubusercontent.com/psymbio/tiktoken_rust_wasm/main/packages/httpcore/httpcore-0.12.3-py3-none-any.whl', keep_going=True)
await micropip.install("ssl")
import ssl
import pyodide_http
pyodide_http.patch_all()
import openai
import httpx
from openai import OpenAI
import urllib3
from urllib3_transport import URLLib3Transport

urllib3_transport_client = httpx.Client(transport=URLLib3Transport())
client = OpenAI(
    base_url="https://api.openai.com/v1/completions",
    api_key="xxx",
    http_client=urllib3_transport_client
)
```

But the problem is that the transport support was for httpcore 0.12.x, and httpcore is now at 1.0.2; 0.12.x had the implementation for it.

Possible solutions:
I'll probably mention this in the other related issues to keep track of it.
Updates: Commented on the gist and did the custom transport update: encode/httpx#2994. Closed the previous ticket and opened a new ticket on the openai-python repo.
Solution:

```python
import micropip
await micropip.install('https://raw.githubusercontent.com/psymbio/pyodide_wheels/main/multidict/multidict-4.7.6-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/pyodide_wheels/main/frozenlist/frozenlist-1.4.0-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/pyodide_wheels/main/aiohttp/aiohttp-4.0.0a2.dev0-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/pyodide_wheels/main/openai/openai-1.3.7-py3-none-any.whl', keep_going=True)
await micropip.install('https://raw.githubusercontent.com/psymbio/pyodide_wheels/main/urllib3/urllib3-2.1.0-py3-none-any.whl', keep_going=True)
await micropip.install("ssl")
import ssl
await micropip.install("httpx", keep_going=True)
import httpx
await micropip.install('https://raw.githubusercontent.com/psymbio/pyodide_wheels/main/urllib3/urllib3-2.1.0-py3-none-any.whl', keep_going=True)
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
import json

class URLLib3Transport(httpx.BaseTransport):
    def __init__(self):
        self.pool = urllib3.PoolManager()

    def handle_request(self, request: httpx.Request):
        payload = json.loads(request.content.decode("utf-8").replace("'", '"'))
        # Convert httpx.URL to string
        urllib3_response = self.pool.request(request.method, str(request.url), headers=request.headers, json=payload)
        content = json.loads(urllib3_response.data.decode('utf-8'))  # Decode the data and load as JSON
        stream = httpx.ByteStream(json.dumps(content).encode("utf-8"))  # Convert back to JSON and encode
        headers = [(b"content-type", b"application/json")]
        return httpx.Response(200, headers=headers, stream=stream)

client = httpx.Client(transport=URLLib3Transport())

from openai import OpenAI
openai_client = OpenAI(
    base_url="https://api.openai.com/v1/",
    api_key="xxx",
    http_client=client
)
response = openai_client.chat.completions.with_raw_response.create(
    messages=[{
        "role": "user",
        "content": "sing me a song",
    }],
    model="gpt-3.5-turbo",
    max_tokens=30,
    temperature=0.7
)
completion = response.parse()
print(completion)
```

Output:

```
ChatCompletion(id='chatcmpl-8U6JZbqVEFJp5t9lh4jZrADf4dKP2', choices=[Choice(finish_reason='length', index=0, message=ChatCompletionMessage(content="(Verse 1)\nIn a world full of wonder, let's sing a melody,\nWhere love and joy intertwine, and hearts can be set", role='assistant', function_call=None, tool_calls=None))], created=1702184805, model='gpt-3.5-turbo-0613', object='chat.completion', system_fingerprint=None, usage=CompletionUsage(completion_tokens=30, prompt_tokens=11, total_tokens=41))
```

Thanks @RobertCraigie and @ryanking13!
Amazing @psymbio! Thanks for sharing your solution :) |
I gave the example a try, and I get further than previously, when I would fail at the:
Now I get down to:
See also #4549, which would add openai to the package set. |
So, OpenAI in the latest version of their library openai-python shifted from requests to httpx (openai/openai-python#742 and https://github.com/openai/openai-python/blob/main/README.md). I wanted to understand the best possible way to make this work with Pyodide.
Should I consider rolling back to v0.28.0 which uses requests?
Create the wheel:

```shell
git clone --branch release-v0.28.1 https://github.com/openai/openai-python.git
cd openai-python/
python3 setup.py bdist_wheel
```
Use it on JupyterLite:
I get the `BadZipFile` error for this. Should I consider patching the current library to use requests wherever httpx is used? (This is a lot of places - I'm not sure if it's possible to do; also, if their library updates, this means I will have to as well.)
Will this work as-is if I create a wheel and install it?
Currently this fails for me; to recreate this, run the following:

```shell
git clone https://github.com/openai/openai-python.git
cd openai-python/
python -m pip install build
python -m build
```
This creates a .whl file in the `dist/` directory, which I then upload to https://jupyter.org/try-jupyter/lab/ and write the following code. If this gives a `BadZipFile` error, try this instead: run the following code on JupyterLite.
However, I still get the same `BadZipFile` error. I'm not entirely sure what triggers it; for some packages (like https://github.com/psymbio/tiktoken_rust_wasm/blob/main/packages/build_emscripten_cached_weights/tiktoken-0.5.1-cp311-cp311-emscripten_3_1_45_wasm32.whl), downloading the wheel from GitHub instead of placing the wheel in JupyterLite seems to be the fix. Another way to get it to work is:
This surfaces the real issue - the package doesn't work:
Doing something simpler like calling the APIs myself/creating a simpler library that does this:
Also, I'm not able to tell whether this issue should just be titled a `BadZipFile` error instead. Thanks in advance for looking into this.

🐍 Package Request
"httpx>=0.23.0, <1",
"pydantic>=1.9.0, <3",
"typing-extensions>=4.5, <5",
"anyio>=3.5.0, <4",
"distro>=1.7.0, <2",
"tqdm > 4"