# marvin-ai
b
I see in the docs that we can use LangChain-supported models, but if I have my GCP credentials in a Prefect Block, what is the most direct way to change the LLM model to one on Vertex AI? Are there any examples of a basic flow?
I was able to sort this out. Curious if there is another, more "direct" path, but the setup below appears to have done the trick:
```python
import controlflow as cf  # note: emits an informational warning on import
from google.oauth2.service_account import Credentials
from langchain_google_vertexai import ChatVertexAI
from prefect_gcp import GcpCredentials

# load the credentials from the Prefect Block
gcp_credentials_block = GcpCredentials.load("MY_GCP_PROJECT_ID")

# build google-auth credentials from the service-account info
# (get_secret_value() returns a dict, not a JSON string)
service_account_info = gcp_credentials_block.service_account_info.get_secret_value()
credentials = Credentials.from_service_account_info(service_account_info)

# create the Vertex AI chat model
model = ChatVertexAI(model="gemini-1.5-flash", credentials=credentials)

# set the Vertex AI model as ControlFlow's default
cf.defaults.model = model
```
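On the "more direct path" question: one alternative worth sketching is to let clients pick the credentials up via Application Default Credentials instead of passing a `credentials` object around. This is a hedged, stdlib-only sketch (the `export_adc` helper name and the dummy payload are mine, not from any library); in a real flow you would pass the dict from the Prefect Block:

```python
import json
import os
import tempfile

def export_adc(service_account_info: dict) -> str:
    """Write service-account JSON to a temp file and point
    GOOGLE_APPLICATION_CREDENTIALS at it (hypothetical helper).
    Google clients that support ADC will then find it automatically."""
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(service_account_info, f)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path

# Dummy payload for illustration; in a real flow, pass
# gcp_credentials_block.service_account_info.get_secret_value() here.
path = export_adc({"type": "service_account", "project_id": "demo"})
```

Trade-off: this writes secrets to disk, so the explicit `credentials=` kwarg shown above is arguably cleaner for a single model; ADC is handier when several Google clients in the flow all need the same identity.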
With the above setup, a small example from the quickstart:
```python
emails = [
    "Hello, I need an update on the project status.",
    "Subject: Exclusive offer just for you!",
    "Urgent: Project deadline moved up by one week.",
]

reply = cf.run(
    "Write a polite reply to an email",
    context=dict(email=emails[0]),
)

print(reply)
```
Yields:
```
>>> print(reply)
Hello, thank you for your email. I will provide an update on the project status as soon as possible.
```
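To extend the quickstart to all three emails, a simple loop works. This is only a sketch: the `reply_to_emails` helper is mine, and I've made the runner a parameter so the pattern reads generically; in practice you'd pass `cf.run`:

```python
def reply_to_emails(emails, run):
    """Collect one reply per email.
    `run` is the task runner; with ControlFlow, pass `cf.run`."""
    return [
        run("Write a polite reply to an email", context=dict(email=email))
        for email in emails
    ]

emails = [
    "Hello, I need an update on the project status.",
    "Subject: Exclusive offer just for you!",
    "Urgent: Project deadline moved up by one week.",
]

# with ControlFlow (assumes the `cf` setup above):
# replies = reply_to_emails(emails, cf.run)
```

Each call is an independent task here; if the replies should share context (e.g. a common tone), a single task over the whole list may be the better shape.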
j
I’m glad that works! GCP is always tough with creds; an agent block would be great to have in the future.