Spring AI Function Calling Example

In Spring AI, function calling is the ability of the model to request one or more function calls to be executed on its behalf by the chatbot application.

[Figure: Spring AI Function Calling Flow Diagram]

When interacting with LLMs, we often want the LLM to have access to external systems, such as a remote web API that retrieves real-time information or performs an action, or a service that performs some computation. Function calling lets the model request one or more function calls to be made on its behalf, so it can properly answer a user's prompt with real-time data.

1. How does Function Calling Work?

At a high level, LLMs are auto-completion programs on steroids. They are pretty good at generating text from the historical knowledge they were trained on, but they cannot fetch real-time information from remote APIs or perform calculations on our behalf.

So, it is our responsibility to write a function that can be invoked to fetch the real-time information needed to answer the user's query.

  • When passing the user prompt to the LLM, we must provide information about this function along with the other metadata.
  • The LLM sends back a function execution request when it needs that information to answer the user's query.
  • The chatbot app invokes the service method or function that finds the required real-time information.
  • The chatbot app sends the JSON response back to the LLM.
  • The LLM interprets the JSON response and replies with text that includes the real-time information.
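The steps above can be sketched as a simple request/response loop in plain Java. This is a framework-free illustration only; the class and variable names are hypothetical, and Spring AI performs all of these steps for us internally.

```java
import java.util.Map;
import java.util.function.Function;

// A minimal, framework-free sketch of the function calling loop.
public class FunctionCallingLoopSketch {

  public static void main(String[] args) {
    // 1. The app registers the functions it is willing to execute for the model.
    Map<String, Function<String, String>> functions =
        Map.of("getStockPrice", stock -> "{\"price\": 100.00}");

    // 2. Instead of text, the model replies with a function execution request.
    String requestedFunction = "getStockPrice";
    String requestedArgument = "Microsoft";

    // 3. The app invokes the matching function and captures the JSON result.
    String json = functions.get(requestedFunction).apply(requestedArgument);

    // 4. The JSON goes back to the model, which folds it into its final answer.
    String finalAnswer = "The latest price for " + requestedArgument
        + " is " + json.replaceAll("[^0-9.]", "") + ".";
    System.out.println(finalAnswer);
  }
}
```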

2. Function Calling with Spring Boot and Spring AI

To enable the function calling feature, we have to perform two steps:

  • Define the function as a Spring bean
  • Specify the function name in the chat options when communicating with the LLM

2.1. Defining a Function

In Spring AI, we define a function as a bean of type Function. This bean is invoked when the LLM needs the real-time information. In this example, when the function is invoked, it executes the StockPriceService::getStockPrice method.

import com.howtodoinjava.ai.demo.StockPriceService.Stock;
import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration(proxyBeanMethods = false)
public class Functions {

  @Bean
  @Description("Get price by stock name")
  public Function<Stock, Double> priceByStockNameFunction(StockPriceService stockPriceService) {
    return stockPriceService::getStockPrice;
  }
}

The stockPriceService.getStockPrice() method is where we perform the API call or calculation. This decouples the API call logic from the LLM function calling logic.

The bean description is very important. It helps the LLM understand what the function does, and thus decide whether the function should be called in the context of the conversation.
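To see why the description matters, it helps to picture the contract that actually reaches the model. The following is an illustrative, hand-rolled approximation of that metadata (the exact wire format is an implementation detail of Spring AI and the provider); Spring AI derives it from the bean name, the @Description text, and the input type.

```java
// Illustrative only: an approximation of the function contract the LLM sees.
// Spring AI builds this from the bean name, @Description, and the Stock record.
public class FunctionMetadataSketch {

  public static void main(String[] args) {
    String name = "priceByStockNameFunction";
    String description = "Get price by stock name";          // from @Description
    String parameters = "{\"type\":\"object\",\"properties\":"
        + "{\"name\":{\"type\":\"string\"}}}";               // derived from Stock record

    // This contract, not the Java code itself, is what the LLM reasons about.
    String contract = "{\"name\":\"" + name + "\","
        + "\"description\":\"" + description + "\","
        + "\"parameters\":" + parameters + "}";
    System.out.println(contract);
  }
}
```

A vague description here means the model cannot tell when the function is relevant, so it may never request the call.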

We have used a Map of stock names and prices for demo purposes. In a real-life application, this would be an HTTP call to an external service.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.stereotype.Service;

@Service
public class StockPriceService {

  private static final Map<Stock, Double> data = new ConcurrentHashMap<>();

  static {
    data.put(new Stock("Google"), 101.00);
    data.put(new Stock("Microsoft"), 100.00);
    //...
  }

  Double getStockPrice(Stock stock) {

    // === Call an external service here ===

    return data.keySet().stream()
      .filter(s -> s.name().equalsIgnoreCase(stock.name()))
      .map(s -> data.get(s))
      .findFirst()
      .orElse(-1.0);
  }

  public record Stock(String name) {
  }
}
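Note that the lookup is case-insensitive, so "microsoft" matches the key created as new Stock("Microsoft"). A quick standalone check of the same stream pipeline, outside Spring:

```java
import java.util.Map;

// Standalone check of the case-insensitive lookup used in getStockPrice().
public class LookupSketch {

  record Stock(String name) {}

  public static void main(String[] args) {
    Map<Stock, Double> data = Map.of(
        new Stock("Google"), 101.00,
        new Stock("Microsoft"), 100.00);

    Double price = data.keySet().stream()
        .filter(s -> s.name().equalsIgnoreCase("microsoft")) // case-insensitive match
        .map(data::get)
        .findFirst()
        .orElse(-1.0);

    System.out.println(price);
  }
}
```

This matters because the stock name arrives from the LLM's JSON arguments, and its capitalization may not match our map keys exactly.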

2.2. Passing Function Contract in Chat Options

Now that the function has been defined, we need to send this contract information to the LLM. The ChatClient.ChatClientRequestSpec class provides the functions() method, which can be used to pass the function bean name along with the user prompt.

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class ChatService {

  private final ChatClient chatClient;

  ChatService(ChatClient.Builder chatClientBuilder) {
    this.chatClient = chatClientBuilder.build();
  }

  String getPriceByStockName(String stockName) {
    var userPromptTemplate = "Get the latest price for {stockName}.";
    return chatClient.prompt()
      .user(userSpec -> userSpec
        .text(userPromptTemplate)
        .param("stockName", stockName)
      )
      .functions("priceByStockNameFunction")    // Function contract
      .call()
      .content();
  }
}

If we are using the ChatModel API directly, we can pass the function information using the OpenAiChatOptions builder.

var userPromptTemplate = "Get the latest price for {stockName}.";

PromptTemplate promptTemplate = new PromptTemplate(userPromptTemplate);
Message message = promptTemplate.createMessage(Map.of("stockName", stockName));

ChatResponse response = chatModel.call(
  new Prompt(List.of(message), OpenAiChatOptions.builder()
    .withFunction("priceByStockNameFunction").build()));

return response.getResult().getOutput().getContent();

2.3. Controller

Finally, we expose the ChatService in the chatbot app through a REST endpoint so end users can call this service to interact with the LLM.

@RestController
public class ChatController {

  private final ChatService chatService;

  ChatController(ChatService chatService) {
    this.chatService = chatService;
  }

  @GetMapping("/chat/function")
  String chat(@RequestParam String stockName) {
    return chatService.getPriceByStockName(stockName);
  }
}

3. Demo

For this demo, we are using an OpenAI GPT model to respond to user prompts, so let's start by adding its dependency. To create a project from scratch, read the Getting Started with Spring AI guide.

<dependency>
  <groupId>org.springframework.ai</groupId>
  <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
</dependency>
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-web</artifactId>
</dependency>

Also, do not forget to specify the OpenAI API key in an environment variable or the properties file.

spring.ai.openai.api-key=${OPENAI_API_KEY}

Next, start the Spring Boot application as a web application and invoke the stock price API:

curl --location 'http://localhost:8080/chat/function?stockName=Microsoft'

This will return a response similar to:

The latest price for Microsoft is $100.00.

4. Summary

In this short tutorial, we learned to call external APIs using function calling with Spring AI and Spring Boot. The important thing to understand is that LLMs cannot invoke external APIs themselves; it is our responsibility to write these functions.

When the user sends a prompt, we must include the function information (with an appropriate function description) to help the LLM decide whether a function call is needed in the current context. Spring AI abstracts this whole workflow as a framework feature and only requires us to define the function with a good description.

Happy Learning !!
