feat(vertex-ai): function calling: convert markdown 'samples' to python #12811
function_calling_application.py
@@ -0,0 +1,210 @@
# Copyright 2024 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os

PROJECT_ID = os.getenv("GOOGLE_CLOUD_PROJECT")


def create_app() -> object:

Review comment: Consolidates the code samples for the https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#how-works doc section into a single, testable script. Each sub-step of that section presents a discrete code snippet (block) of the final application, demonstrating one function calling concept at a time. Compiling them here gives us a complete, maintainable implementation for DevRel purposes and reduces the number of separate files we would need if each snippet lived in its own file.

    # [START generativeaionvertexai_gemini_function_calling_app_setup]
    from typing import List

    import vertexai

    from vertexai.generative_models import (
        Content,
        FunctionDeclaration,
        GenerationConfig,
        GenerativeModel,
        Part,
        Tool,
        ToolConfig,
    )

    # Initialize Vertex AI
    # TODO(developer): Update and un-comment below lines
    # PROJECT_ID = 'your-project-id'
    vertexai.init(project=PROJECT_ID, location="us-central1")

    # Initialize the Gemini model
    model = GenerativeModel(model_name="gemini-1.5-flash-002")
    # [END generativeaionvertexai_gemini_function_calling_app_setup]

    # [START generativeaionvertexai_gemini_function_calling_app_declare_1]
    function_name = "get_current_weather"
    get_current_weather_func = FunctionDeclaration(
        name=function_name,
        description="Get the current weather in a given location",
        # Function parameters are specified in JSON schema format
        parameters={
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city name of the location for which to get the weather.",
                }
            },
        },
    )
    # [END generativeaionvertexai_gemini_function_calling_app_declare_1]
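The `parameters` value above is plain JSON Schema, so it can be sanity-checked locally with only the standard library before being handed to `FunctionDeclaration`. A small sketch, not part of the sample itself:

```python
import json

# The same parameters schema used by get_current_weather above.
params = {
    "type": "object",
    "properties": {
        "location": {
            "type": "string",
            "description": "The city name of the location for which to get the weather.",
        }
    },
}

# The schema round-trips cleanly through JSON, confirming it is serializable as-is.
assert json.loads(json.dumps(params)) == params
# Every property declares a type, which the model relies on when filling arguments.
assert all("type" in spec for spec in params["properties"].values())
```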
    # [START generativeaionvertexai_gemini_function_calling_app_declare_2]
    extract_sale_records_func = FunctionDeclaration(
        name="extract_sale_records",
        description="Extract sale records from a document.",
        parameters={
            "type": "object",
            "properties": {
                "records": {
                    "type": "array",
                    "description": "A list of sale records",
                    "items": {
                        "description": "Data for a sale record",
                        "type": "object",
                        "properties": {
                            "id": {"type": "integer", "description": "The unique id of the sale."},
                            "date": {
                                "type": "string",
                                "description": "Date of the sale, in the format of MMDDYY, e.g., 031023",
                            },
                            "total_amount": {"type": "number", "description": "The total amount of the sale."},
                            "customer_name": {
                                "type": "string",
                                "description": "The name of the customer, including first name and last name.",
                            },
                            "customer_contact": {
                                "type": "string",
                                "description": "The phone number of the customer, e.g., 650-123-4567.",
                            },
                        },
                        "required": ["id", "date", "total_amount"],
                    },
                },
            },
            "required": ["records"],
        },
    )
    # [END generativeaionvertexai_gemini_function_calling_app_declare_2]
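For the nested schema above, a cheap local check of the `required` contract looks like this (a sketch with made-up sample data; real validation would use a JSON Schema library):

```python
# A hypothetical record shaped like one item of the "records" array declared above.
record = {
    "id": 1,
    "date": "031023",
    "total_amount": 123.02,
    "customer_name": "Jane Doe",         # optional field
    "customer_contact": "650-123-4567",  # optional field
}

# The schema marks these three fields as required for every record.
required_fields = ["id", "date", "total_amount"]
assert all(field in record for field in required_fields)
```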
    # [START generativeaionvertexai_gemini_function_calling_app_declare_3]
    # Define a function. It could be a local function, or you could import the
    # requests library and call an external API.
    def multiply_numbers(numbers: List[int]) -> int:
        """
        Calculates the product of all numbers in an array.

        Args:
            numbers: An array of numbers to be multiplied.

        Returns:
            The product of all the numbers. If the array is empty, returns 1.
        """
        if not numbers:  # Handle empty array
            return 1

        product = 1
        for num in numbers:
            product *= num

        return product

    multiply_number_func = FunctionDeclaration.from_func(multiply_numbers)

    '''
    multiply_number_func contains the following schema:

    name: "multiply_numbers"
    description: "Calculates the product of all numbers in an array."
    parameters {
      type_: OBJECT
      properties {
        key: "numbers"
        value {
          description: "An array of numbers to be multiplied."
          title: "Numbers"
        }
      }
      required: "numbers"
      description: "Calculates the product of all numbers in an array."
      title: "multiply_numbers"
    }
    '''
    # [END generativeaionvertexai_gemini_function_calling_app_declare_3]
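As the reviewer notes later in this PR, deriving a declaration from a function only inspects its signature; the body never executes until the model's chosen call is actually invoked. A plain-Python sketch of that distinction, independent of the Vertex AI SDK (the schema dict here is illustrative, not the SDK's actual output format):

```python
import inspect

calls = []  # records every actual invocation

def multiply_numbers_demo(numbers):
    """Calculates the product of all numbers in an array."""
    calls.append(numbers)
    product = 1
    for num in numbers:
        product *= num
    return product

# Building a declaration-like schema only inspects the signature and docstring...
schema = {
    "name": multiply_numbers_demo.__name__,
    "description": inspect.getdoc(multiply_numbers_demo),
    "parameters": list(inspect.signature(multiply_numbers_demo).parameters),
}
assert schema["parameters"] == ["numbers"]
assert calls == []  # ...the function body has not run yet

# ...only an explicit invocation (what a chat round-trip triggers) executes it.
assert multiply_numbers_demo([1, 2, 3]) == 6
assert calls == [[1, 2, 3]]
```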
    # [START generativeaionvertexai_gemini_function_calling_app_prompt]
    # Define the user's prompt in a Content object that we can reuse in model calls
    user_prompt_content = Content(
        role="user",
        parts=[
            Part.from_text("What is the weather like in Boston?"),
        ],
    )
    # [END generativeaionvertexai_gemini_function_calling_app_prompt]

    # [START generativeaionvertexai_gemini_function_calling_app_submit]
    # Define a tool that includes some of the functions that we declared earlier
    tool = Tool(
        function_declarations=[
            get_current_weather_func,
            extract_sale_records_func,
            multiply_number_func,
        ],
    )

    # Send the prompt and instruct the model to generate content using the Tool object
    response = model.generate_content(
        user_prompt_content,
        generation_config=GenerationConfig(temperature=0),
        tools=[tool],
        tool_config=ToolConfig(
            function_calling_config=ToolConfig.FunctionCallingConfig(
                # ANY mode forces the model to predict only function calls
                mode=ToolConfig.FunctionCallingConfig.Mode.ANY,
                # Allowed function calls to predict when the mode is ANY. If empty,
                # any of the provided function calls will be predicted.
                allowed_function_names=["get_current_weather"],
            )
        ),
    )
    # [END generativeaionvertexai_gemini_function_calling_app_submit]

    # [START generativeaionvertexai_gemini_function_calling_app_invoke]
    # Check which function the model responded with, then make an API call to an external system
    if response.candidates[0].function_calls[0].name == "get_current_weather":
        # Extract the arguments to use in your API call
        location = response.candidates[0].function_calls[0].args["location"]  # noqa: F841

        # Here you can use your preferred method to make an API request to fetch
        # the current weather, for example:
        # api_response = requests.post(weather_api_url, data={"location": location})

        # In this example, we use synthetic data to simulate a response payload
        # from an external API
        api_response = """{ "location": "Boston, MA", "temperature": 38, "description": "Partly Cloudy",
        "icon": "partly-cloudy", "humidity": 65, "wind": { "speed": 10, "direction": "NW" } }"""
    # [END generativeaionvertexai_gemini_function_calling_app_invoke]
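When a real weather service is wired in, the synthetic payload above would arrive as an HTTP response body, and handling it is ordinary JSON parsing. A sketch using the sample's own synthetic payload:

```python
import json

# The synthetic payload from the sample, parsed the way a real API response would be.
api_response = """{ "location": "Boston, MA", "temperature": 38, "description": "Partly Cloudy",
"icon": "partly-cloudy", "humidity": 65, "wind": { "speed": 10, "direction": "NW" } }"""

weather = json.loads(api_response)
assert weather["location"] == "Boston, MA"
assert weather["wind"]["speed"] == 10
```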
    # [START generativeaionvertexai_gemini_function_calling_app_generate]
    response = model.generate_content(
        [
            user_prompt_content,  # User prompt
            response.candidates[0].content,  # Function call response
            Content(
                parts=[
                    Part.from_function_response(
                        name="get_current_weather",
                        response={
                            "content": api_response,  # Return the API response to Gemini
                        },
                    )
                ],
            ),
        ],
        tools=[tool],
    )
    # Get the model summary response
    summary = response.text
    # [END generativeaionvertexai_gemini_function_calling_app_generate]

    return {"weather_response": summary, "tool": tool}


if __name__ == "__main__":
    create_app()
Test file
@@ -16,11 +16,16 @@
from google.api_core.exceptions import ResourceExhausted

import pytest

from vertexai.generative_models import GenerativeModel

import advanced_example
import basic_example
import chat_example
import chat_function_calling_basic
import chat_function_calling_config
import function_calling_application
import parallel_function_calling_example
@@ -52,12 +57,14 @@ def test_function_calling_advanced_function_selection() -> None:
    )


@pytest.mark.skip(reason="Blocked on b/... ")

Review comment: Both samples that use the OpenAI client are no longer working (see the log). Link to the doc section with the OpenAI samples: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/function-calling#python-openai

Review comment: Created a general issue to initiate the discussion about the current CI process.

Review comment: The error …

Review comment (reply): The issue you've created is a good first step to track this problem. Consider adding more details about your environment (Python version, OpenAI library version, proxy configuration if applicable) to help diagnose the root cause.
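One lightweight way to capture the environment details suggested above (a sketch assuming a POSIX shell and pip-installed packages; the package names are the usual ones and may need adjusting):

```shell
# Record interpreter and library versions for the bug report.
python3 --version
pip show openai google-cloud-aiplatform 2>/dev/null | grep -E '^(Name|Version):' || true
# Proxy settings, if any, that could affect API connectivity.
env | grep -i proxy || true
```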
@backoff.on_exception(backoff.expo, ResourceExhausted, max_time=10)
def test_function_calling_basic() -> None:
    response = chat_function_calling_basic.generate_text()
    assert "get_current_weather" in response.choices[0].message.tool_calls[0].id


@pytest.mark.skip(reason="Blocked on b/... ")

Review comment: I will need help with creating a real ticket in Buganizer.

@backoff.on_exception(backoff.expo, ResourceExhausted, max_time=10)
def test_function_calling_config() -> None:
    response = chat_function_calling_config.generate_text()

@@ -85,3 +92,27 @@ def test_function_calling_chat() -> None:
def test_parallel_function_calling() -> None:
    response = parallel_function_calling_example.parallel_function_calling_example()
    assert response is not None
def test_function_calling_app() -> None:
    result = function_calling_application.create_app()
    assert result["weather_response"] is not None

    tool = result["tool"]
    model = GenerativeModel(model_name="gemini-1.5-pro-002", tools=[tool])
    chat_session = model.start_chat()

    response = chat_session.send_message("What will be 1 multiplied by 2?")
    assert response is not None

    extract_sales_prompt = """

Review comment: I am adding a test for each toolbox function created by this sample for this reason: the toolbox is lazily evaluated, and it is not enough to simply create a function declaration to check that it works.

    I need to parse a series of sale transactions written down in a text editor and extract
    full sales records for each transaction.
    1 / 031023 / $123,02
    2 / 031123 / $12,99
    3 / 031123 / $12,99
    4 / 031223 / $15,99
    5 / 031223 / $2,20
    """
    response = chat_session.send_message(extract_sales_prompt)
    assert response
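The extraction the test asks the model to perform can be prototyped in plain Python, which is handy for checking the fixture lines themselves. A sketch; treating the comma as a decimal separator is an assumption about the fixture's format:

```python
raw = """1 / 031023 / $123,02
2 / 031123 / $12,99
3 / 031123 / $12,99
4 / 031223 / $15,99
5 / 031223 / $2,20"""

records = []
for line in raw.splitlines():
    # Each fixture line is "id / date / $amount".
    sale_id, date, amount = (field.strip() for field in line.split("/"))
    records.append({
        "id": int(sale_id),
        "date": date,
        # The fixture uses a decimal comma; normalize to a float.
        "total_amount": float(amount.lstrip("$").replace(",", ".")),
    })

assert len(records) == 5
assert records[0] == {"id": 1, "date": "031023", "total_amount": 123.02}
```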
Review comment: Gemini Flash is no longer working with this code sample (see the log). The actual response from the model is now:
…

Review comment: Created a general issue to initiate the discussion about the current CI process.

Review comment (reply): Thanks for reporting this, Valeriy-Burlaka. The Gemini Flash model appears to have been deprecated. The change to gemini-1.5-pro-002 in the pull request reflects the current, supported model. The issue you created is the appropriate place to track the broader CI implications.