<@ULVA73B9P> How can I use azure openai in marvin?...
# marvin-in-the-wild
s
@Marvin How can I use azure openai in marvin? The endpoint URI does NOT have prefix "openai" ( it was omitted by my load balancer).
m
It appears that there was an error while fetching the docs, and no relevant GitHub issues were found. Given your requirements, I can suggest a general approach to configuring a custom endpoint URI for Azure OpenAI:

1. Ensure custom endpoint configuration: Modify your configuration to use the custom endpoint provided by your load balancer. This usually means setting the endpoint URL in your environment variables or directly in the code where you initialize the Azure OpenAI client.
2. Environment variable setup: Ensure that your environment variables for the Azure OpenAI service correctly reflect the custom endpoint. An example of setting these variables in a Python script is provided below:
```python
import os

os.environ["AZURE_OPENAI_ENDPOINT"] = "https://your-custom-endpoint.com/"
os.environ["AZURE_OPENAI_KEY"] = "your-azure-openai-key"
```
3. Initializing Azure OpenAI with a custom endpoint: When initializing the Azure OpenAI client, use the environment variables and the custom endpoint:

```python
from openai import AzureOpenAI  # the official openai package's Azure client
import os

endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
api_key = os.getenv("AZURE_OPENAI_KEY")

client = AzureOpenAI(
    azure_endpoint=endpoint,
    api_key=api_key,
    api_version="2024-02-01",  # use the API version your deployment supports
)
response = client.completions.create(
    model="your-deployment-name",  # the name of your Azure deployment
    prompt="Once upon a time...",
    max_tokens=50,
)
print(response.choices[0].text)
```
4. Prefect configuration: If you're running this within a Prefect flow, make sure the flow's environment or runtime configuration uses the correct endpoint:

```python
from prefect import flow, task
from openai import AzureOpenAI
import os

@task
def call_openai():
    client = AzureOpenAI(
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
        api_key=os.getenv("AZURE_OPENAI_KEY"),
        api_version="2024-02-01",  # match your deployment's API version
    )
    response = client.completions.create(
        model="your-deployment-name",  # your Azure deployment name
        prompt="Once upon a time...",
        max_tokens=50,
    )
    return response.choices[0].text

@flow
def azure_openai_flow():
    result = call_openai()
    print(result)

if __name__ == "__main__":
    azure_openai_flow()
```
If any adjustments or more specific details are required, please provide them so I can offer a more tailored solution.
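One detail worth sketching for the stripped-prefix case: recent versions of the official `openai` package append `/openai` to whatever `azure_endpoint` you pass (and accept `base_url` as an escape hatch that is used verbatim), so it helps to decide up front which URL the client should receive. A stdlib-only sketch, with a hypothetical helper name:

```python
def choose_base_url(endpoint: str, prefix_already_routed: bool) -> str:
    """Return the base URL to hand to the SDK.

    Hypothetical helper: if the load balancer already routes to the
    '/openai' path, pass the endpoint through unchanged (via `base_url`);
    otherwise append the '/openai' prefix the service expects.
    """
    base = endpoint.rstrip("/")
    return base if prefix_already_routed else f"{base}/openai"

# A load balancer that omits the prefix: pass the URL through as-is.
print(choose_base_url("https://lb.example.com/", True))
# A direct Azure endpoint: the '/openai' prefix is appended.
print(choose_base_url("https://myresource.openai.azure.com", False))
```

The helper's result would then go to `AzureOpenAI(base_url=..., ...)` rather than `azure_endpoint=...`, so the SDK does not re-add the prefix.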
s
@Marvin great! How can I use the custom client with the `@ai_fn` decorator?
m
It seems there are no specific resources for configuring a custom Azure OpenAI client with the `@ai_fn` decorator. However, we can deduce a general approach based on typical patterns:

### Custom Client in the `@ai_fn` Decorator

If you're using a custom client with the `@ai_fn` decorator in Marvin, you'll typically need to configure the client manually and integrate it into your function. Here's an example of how you might do it:

1. Define the custom client: Create a client that uses the custom endpoint and API key:
```python
from openai import AzureOpenAI
import os

def get_custom_openai_client() -> AzureOpenAI:
    endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
    api_key = os.getenv("AZURE_OPENAI_KEY")
    return AzureOpenAI(
        azure_endpoint=endpoint,
        api_key=api_key,
        api_version="2024-02-01",  # match your deployment's API version
    )
```
2. Integrate with `@ai_fn`: Define the function that will use the `@ai_fn` decorator and incorporate the custom client:
```python
from prefect import flow
from marvin import ai_fn  # ai_fn comes from marvin, not a separate package
import os

@ai_fn
def generate_text(prompt: str) -> str:
    client = get_custom_openai_client()
    response = client.completions.create(
        model="your-deployment-name",  # your Azure deployment name
        prompt=prompt,
        max_tokens=50,
    )
    return response.choices[0].text

@flow
def azure_openai_flow(prompt: str):
    result = generate_text(prompt)
    print(result)

if __name__ == "__main__":
    os.environ["AZURE_OPENAI_ENDPOINT"] = "https://your-custom-endpoint.com/"
    os.environ["AZURE_OPENAI_KEY"] = "your-azure-openai-key"
    azure_openai_flow("Once upon a time...")
```
### Explanation

- Custom client function: `get_custom_openai_client` sets the custom endpoint and credential.
- Decorated function: `generate_text` uses the custom client to generate text from the given prompt.
- Main flow: `azure_openai_flow` calls the decorated function and prints the result.

By following these steps, you can use a custom Azure OpenAI client within a function decorated with `@ai_fn`. Please let me know if you need more specific details!
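Both answers above hinge on the two environment variables being present before the client is built, so a small stdlib guard can fail fast with a clear message (the helper name here is hypothetical):

```python
import os

def read_azure_config() -> dict:
    """Hypothetical helper: collect and validate the two settings the
    examples above rely on, raising early if either is missing."""
    cfg = {
        "endpoint": os.getenv("AZURE_OPENAI_ENDPOINT"),
        "api_key": os.getenv("AZURE_OPENAI_KEY"),
    }
    missing = [name for name, value in cfg.items() if not value]
    if missing:
        raise RuntimeError(f"missing Azure OpenAI settings: {', '.join(missing)}")
    return cfg

# Demonstration with placeholder values set in-process.
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://lb.example.com/"
os.environ["AZURE_OPENAI_KEY"] = "test-key"
print(read_azure_config()["endpoint"])
```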
s
@Marvin No, I meant that `@ai_fn` can be called with some arguments, and one of them is named `client`. So I wonder if I can use that argument to pass our custom client.
m
Unfortunately, the search results did not provide specific information about the `@ai_fn` decorator and its arguments, particularly how to use a custom client. However, based on general usage patterns, you can typically pass custom clients or configurations as arguments to decorators. Here's a guided way to use the `@ai_fn` decorator with a custom Azure OpenAI client:

### Using `@ai_fn` With a Custom Client

1. Define the custom client: You have already created a function that returns the custom Azure OpenAI client built from the custom endpoint.
```python
from openai import AzureOpenAI
import os

def get_custom_openai_client() -> AzureOpenAI:
    endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
    api_key = os.getenv("AZURE_OPENAI_KEY")
    return AzureOpenAI(
        azure_endpoint=endpoint,
        api_key=api_key,
        api_version="2024-02-01",  # match your deployment's API version
    )
```
2. Define the function with `@ai_fn`: Use the `client` parameter to pass the custom client to the decorator.
```python
from prefect import flow
from marvin import ai_fn  # ai_fn comes from marvin, not a separate package
import os

# Set these before the client is built (a real deployment would set them
# in the environment rather than in code).
os.environ.setdefault("AZURE_OPENAI_ENDPOINT", "https://your-custom-endpoint.com/")
os.environ.setdefault("AZURE_OPENAI_KEY", "your-azure-openai-key")

custom_client = get_custom_openai_client()

@ai_fn(client=custom_client)
def generate_text(prompt: str) -> str:
    response = custom_client.completions.create(
        model="your-deployment-name",  # your Azure deployment name
        prompt=prompt,
        max_tokens=50,
    )
    return response.choices[0].text

@flow
def azure_openai_flow(prompt: str):
    result = generate_text(prompt)
    print(result)

if __name__ == "__main__":
    azure_openai_flow("Once upon a time...")
```
### Explanation

- Custom client function: `get_custom_openai_client` configures and returns the custom Azure OpenAI client with the custom endpoint.
- Decorated function: `@ai_fn(client=custom_client)` passes the custom Azure OpenAI client to the decorated function `generate_text`.
- Main flow: `azure_openai_flow` calls the decorated function and prints the generated text.

This way, you can leverage a custom Azure OpenAI client with the `@ai_fn` decorator by passing it through the `client` argument. Feel free to adjust the details for your specific use case and let me know if you need further assistance!
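The decorator-with-`client` pattern itself can be demonstrated without any SDK. This stub (all names illustrative, not Marvin's actual implementation) shows how a decorator factory can bind a preconfigured client so the wrapped function never touches configuration:

```python
class StubClient:
    """Stand-in for a configured SDK client (illustrative only)."""

    def complete(self, prompt: str) -> str:
        return f"stub-completion for: {prompt}"

def ai_fn_like(client):
    """Toy decorator factory mirroring the @ai_fn(client=...) shape."""

    def decorator(fn):
        def wrapper(prompt: str) -> str:
            # The decorator, not the function body, decides which client runs.
            return client.complete(prompt)
        return wrapper

    return decorator

@ai_fn_like(client=StubClient())
def generate_text(prompt: str) -> str:
    ...

print(generate_text("hello"))  # → stub-completion for: hello
```

Swapping `StubClient` for a real Azure client is then a one-line change at the decorator call site.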
s
Thank you, let me check.
@Marvin I did but it was not resolved.
m
Given that the `@ai_fn` decorator does not work as expected when trying to pass a custom client directly, we need another approach to incorporate the custom Azure OpenAI client into your decorated function. Here's an alternative that ensures the custom client is correctly utilized within the function:

### Custom Wrapper for `@ai_fn`

1. Define the custom client loader: Ensure you have the custom client loader.
```python
from openai import AzureOpenAI
import os

def get_custom_openai_client() -> AzureOpenAI:
    endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
    api_key = os.getenv("AZURE_OPENAI_KEY")
    return AzureOpenAI(
        azure_endpoint=endpoint,
        api_key=api_key,
        api_version="2024-02-01",  # match your deployment's API version
    )
```
2. Define Decorated Function with Wrapper: Use a wrapper to ensure the custom client is available within the decorated function.
```python
from prefect import flow
import functools
import os

def custom_ai_fn(fn=None, *, client=None):
    """Wrap a text-generating function so a client is always available.

    Note: this calls the Azure client directly rather than delegating to
    marvin's @ai_fn, which generates results from the function signature
    instead of executing the body.
    """
    if fn is None:  # called with arguments, e.g. @custom_ai_fn(client=...)
        return functools.partial(custom_ai_fn, client=client)

    @functools.wraps(fn)
    def wrapper(prompt: str) -> str:
        active_client = client or get_custom_openai_client()
        response = active_client.completions.create(
            model="your-deployment-name",  # your Azure deployment name
            prompt=prompt,
            max_tokens=50,
        )
        return response.choices[0].text

    return wrapper

@custom_ai_fn
def generate_text(prompt: str) -> str:
    ...

@flow
def azure_openai_flow(prompt: str):
    result = generate_text(prompt)
    print(result)

if __name__ == "__main__":
    os.environ["AZURE_OPENAI_ENDPOINT"] = "https://your-custom-endpoint.com/"
    os.environ["AZURE_OPENAI_KEY"] = "your-azure-openai-key"
    azure_openai_flow("Once upon a time...")
```
### Explanation

- Custom client loader: `get_custom_openai_client()` loads the custom Azure OpenAI client.
- Wrapper function: `custom_ai_fn` wraps the decorated function and ensures the custom client is passed or used by default.
- Main flow: `azure_openai_flow` calls the wrapped function and prints the result.

This way, the custom client is always available within the function, ensuring the custom endpoint is used correctly. Does this approach resolve the issue?
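The default-client fallback in the wrapper above can be exercised without any SDK. This stdlib-only sketch (all names hypothetical) shows the same shape: the wrapper injects a default client unless the caller supplies one:

```python
import functools

DEFAULT_CLIENT = {"name": "custom-default"}

def with_client(fn):
    """Inject a default client unless the caller supplies one."""

    @functools.wraps(fn)
    def wrapper(*args, client=None, **kwargs):
        return fn(*args, client=client or DEFAULT_CLIENT, **kwargs)

    return wrapper

@with_client
def describe(prompt, client):
    return f"{client['name']}: {prompt}"

print(describe("hi"))                               # default client used
print(describe("hi", client={"name": "override"}))  # caller override wins
```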
s
no thank you. close this ticket.
t
Hi @syakesaba, have you heard of ControlFlow? It's a new library that is largely a successor to the Marvin library. The docs for configuring LLMs for ControlFlow are here.