Can I replace OpenAI with the Azure Cognitive Services API, and what modifications would be necessary for this migration? #64
Comments
@DhananjayanOnline Yes, I think it'd be fairly easy to replace, since the LlamaIndex framework has an implementation of the generic LLM interface for Azure's OpenAI service; see the LlamaIndex docs on how to set this up. I think the main places where you'd need to make changes in the codebase are in …
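For reference, here is a minimal sketch of the configuration such a switch involves. The endpoint, API version, and deployment names below are placeholders (assumptions, not values from this repo), and the commented LlamaIndex wiring at the bottom is an untested outline whose class path may differ by LlamaIndex version:

```python
import os

# Placeholder values -- all of these are assumptions; substitute your own
# Azure OpenAI resource endpoint and deployment names.
AZURE_API_BASE = "https://<your-resource>.openai.azure.com/"
AZURE_API_VERSION = "2023-07-01-preview"
LLM_DEPLOYMENT = "my-gpt-35-turbo"

def azure_llm_kwargs() -> dict:
    """Build the kwargs an Azure-flavored OpenAI LLM wrapper needs.

    The key difference from plain OpenAI: Azure addresses models by
    *deployment name* (passed as `engine`), in addition to the model name.
    """
    return {
        "engine": LLM_DEPLOYMENT,
        "model": "gpt-35-turbo",
        "api_key": os.environ.get("OPENAI_API_KEY", ""),
        "api_base": AZURE_API_BASE,
        "api_version": AZURE_API_VERSION,
    }

# The actual wiring would then look roughly like (untested sketch):
#   from llama_index.llms import AzureOpenAI
#   llm = AzureOpenAI(**azure_llm_kwargs())
```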
@sourabhdesai I've made the changes in the code as per your previous suggestion, but I'm getting a response that says, "Sorry, I either couldn't comprehend your question or I don't have an answer for it." It appears that the engine is returning an empty response.
I'm experiencing the same issue and the same behavior when switching to AzureOpenAI. Looking at the code, I see that the verification of function-call support is here, but I'm not sure why this is happening.
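One plausible cause (an assumption, not confirmed against this repo's code): frameworks often decide whether a model supports function calling by looking up its model name in an allow-list, and under Azure the configured name is frequently a custom deployment name, which won't match. A rough illustration of how that kind of check can silently fail:

```python
# Hypothetical allow-list illustrating how a function-calling check can
# fail under Azure: the lookup is keyed by OpenAI model name, but an
# Azure config may carry a custom deployment name instead.
FUNCTION_CALLING_MODELS = {"gpt-4", "gpt-4-32k", "gpt-3.5-turbo", "gpt-35-turbo"}

def supports_function_calling(model_name: str) -> bool:
    """Return True only if the name is a known function-calling model."""
    return model_name in FUNCTION_CALLING_MODELS
```

With a check like this, a stock model name passes but an Azure deployment name such as "my-gpt4-deployment" is treated as not supporting function calling, which could explain the empty-response fallback.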
I am currently facing an error when using the AzureOpenAI library. The error message I am receiving is as follows:

```
Traceback (most recent call last):
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-D3oLmLlb-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-D3oLmLlb-py3.11/lib/python3.11/site-packages/llama_index/embeddings/openai.py", line 166, in get_embeddings
    data = openai.Embedding.create(input=list_of_text, model=engine, **kwargs).data
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-D3oLmLlb-py3.11/lib/python3.11/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-D3oLmLlb-py3.11/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 151, in create
    ) = cls.__prepare_create_request(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jay/.cache/pypoetry/virtualenvs/llama-app-backend-D3oLmLlb-py3.11/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 85, in __prepare_create_request
    raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.embedding.Embedding'>
```

I've made the necessary changes in the code as per your previous suggestion.
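The error above means the Azure endpoint was reached without a deployment identifier: with `api_type = "azure"`, every embedding request must carry an `engine` or `deployment_id` parameter. The helper below is only an illustrative sketch of injecting that parameter into the request kwargs; the deployment name is a placeholder assumption, and in LlamaIndex the equivalent fix is usually to pass a deployment name when constructing the embedding model rather than patching kwargs by hand:

```python
def with_azure_deployment(request_kwargs: dict, deployment: str) -> dict:
    """Return a copy of the embedding-request kwargs with the
    'deployment_id' parameter Azure requires, leaving any value the
    caller already set untouched."""
    patched = dict(request_kwargs)
    patched.setdefault("deployment_id", deployment)
    return patched

# Sketch of use with the pre-1.0 openai module (untested):
#   openai.Embedding.create(
#       input=texts,
#       model="text-embedding-ada-002",
#       **with_azure_deployment({}, "my-ada-002"),  # placeholder deployment
#   )
```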
I am also facing the same issue with empty responses when using the … I've confirmed that my parameters are correct, as I have valid embeddings being generated, and I can get valid responses by calling … Has anyone had success with AzureOpenAI so far?