If you see a request for an OpenAI API key but haven’t explicitly configured OpenAI, it’s because Agno uses OpenAI models by default in several places, including:
- The default model for an `Agent` when none is specified
- The default embedder for vector databases, which is `OpenAIEmbedder` unless another is specified
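If you do intend to use the OpenAI defaults, the fix is simply to make your key available in the environment. A minimal sketch (the key value is a placeholder):

```shell
# Make the key available to any process started from this shell.
# Replace the placeholder with your actual key.
export OPENAI_API_KEY="sk-..."
```

Otherwise, configure a different provider explicitly, as shown below.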
It is best to specify the agent's model explicitly; otherwise it defaults to `OpenAIChat`. For example, to use Google's Gemini instead of OpenAI:
```python
from agno.agent import Agent, RunResponse
from agno.models.google import Gemini

agent = Agent(
    model=Gemini(id="gemini-1.5-flash"),
    markdown=True,
)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story.")
```
For more details on configuring different model providers, see our models documentation.
The same applies to embeddings. If you want to use a different embedder instead of `OpenAIEmbedder`, configure it explicitly. For example, to use Google's Gemini for embeddings, use `GeminiEmbedder`:
```python
from agno.agent import AgentKnowledge
from agno.vectordb.pgvector import PgVector
from agno.embedder.google import GeminiEmbedder

# Embed a sentence
embeddings = GeminiEmbedder().get_embedding("The quick brown fox jumps over the lazy dog.")

# Print the embeddings and their dimensions
print(f"Embeddings: {embeddings[:5]}")
print(f"Dimensions: {len(embeddings)}")

# Use the embedder in a knowledge base
knowledge_base = AgentKnowledge(
    vector_db=PgVector(
        db_url="postgresql+psycopg://ai:ai@localhost:5532/ai",
        table_name="gemini_embeddings",
        embedder=GeminiEmbedder(),
    ),
    num_documents=2,
)
```