PolarSPARC

Pragmatic Bytes on Spring AI


Bhaskar S 06/21/2025


Overview

Large enterprises typically use the Java stack in combination with the Spring Boot framework. Spring AI is a newer addition to the Spring framework family that provides a portable API across a range of LLM model providers (such as Anthropic, Google, Ollama, OpenAI, etc.) for developing generative AI applications in Java.

Spring AI offers a set of core abstractions, all of which are exercised in the demonstrations that follow: the ChatClient API for prompting chat models, structured output conversion for mapping model responses to Java types, the VectorStore API for embedding-based storage and retrieval, Advisors for augmenting prompts (for example, for Retrieval Augmented Generation), and Tool Calling support.


Installation and Setup

The installation and setup will be on a Ubuntu 24.04 LTS based Linux desktop. Ensure that Docker is installed and setup on the desktop (see instructions).

Also, ensure that the Ollama platform is installed and setup on the desktop (see instructions).

Further, ensure that Java 17 or above is installed and setup. In addition, ensure that Apache Maven is installed and setup.

We will use Spring AI with Spring Boot for all the demonstrations in this article.

To pull the docker image for pgvector, execute the following command in a terminal window:


$ docker pull pgvector/pgvector:pg17


Assuming that the IP address of the Linux desktop is 192.168.1.25, start the Ollama platform by executing the following command in a terminal window:


$ docker run --rm --name ollama -p 192.168.1.25:11434:11434 -v $HOME/.ollama:/root/.ollama ollama/ollama:0.9.0


If the Linux desktop has an Nvidia GPU with a decent amount of VRAM (at least 16 GB) and has been enabled for use with Docker (see instructions), then execute the following command instead to start Ollama:


$ docker run --rm --name ollama --gpus=all -p 192.168.1.25:11434:11434 -v $HOME/.ollama:/root/.ollama ollama/ollama:0.9.0


For the LLM model, we will be using the recently released IBM Granite 3.3 2B model.

Open a new terminal window and execute the following docker command to download the LLM model:


$ docker exec -it ollama ollama run granite3.3:2b



!!! ATTENTION !!!

The pgvector vector store in Spring AI CANNOT index embeddings with more than 2000 dimensions, while the IBM Granite 3.3 2B model generates 2048-dimension embedding vectors. This is why a separate embedding model is used for vector storage !!!

For the LLM embedding model, we will be using the latest nomic-embed-text:v1.5 model.

Open a new terminal window and execute the following docker command to download the LLM embedding model:


$ docker exec -it ollama ollama pull nomic-embed-text:v1.5


To start the PgVector server, execute the following command in a new terminal window:


$ docker run --rm --name pgvector -e POSTGRES_DB=ps_vector -e POSTGRES_USER=pgusr -e POSTGRES_PASSWORD=pgusr\$123 -p 5432:5432 -v /tmp/pgvector:/var/lib/postgresql/data pgvector/pgvector:pg17


To setup the root Java directory structure for the demonstrations in this article, execute the following commands:


$ cd $HOME

$ mkdir -p $HOME/java/SpringAI

$ cd $HOME/java/SpringAI


The following is the listing for the parent Maven project file pom.xml that will be located at $HOME/java/SpringAI:


pom.xml
<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <name>SpringAI</name>
    <description>Pragmatic Bytes on SpringAI</description>
    <groupId>com.polarsparc</groupId>
    <artifactId>SpringAI</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>

    <parent>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>3.5.0</version>
      <relativePath/>
    </parent>

    <properties>
      <java.version>17</java.version>
      <spring-ai.version>1.0.0</spring-ai.version>
    </properties>

    <dependencies>
      <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
      </dependency>
      <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-ollama</artifactId>
      </dependency>
      <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
      </dependency>
    </dependencies>

    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>org.springframework.ai</groupId>
          <artifactId>spring-ai-bom</artifactId>
          <version>${spring-ai.version}</version>
          <type>pom</type>
          <scope>import</scope>
        </dependency>
      </dependencies>
    </dependencyManagement>

    <build>
      <plugins>
        <plugin>
          <groupId>org.springframework.boot</groupId>
          <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
      </plugins>
    </build>

</project>

This completes all the system installation and setup for the Spring AI hands-on demonstration.


Hands-on with Spring AI

In the First application, we will have the AI application send a user prompt request to the Granite 3.3 model running on the Ollama platform.

First Application


To setup the Java directory structure for the First application, execute the following commands:


$ cd $HOME/java/SpringAI

$ mkdir -p First/{src,target}

$ mkdir -p First/src/{main/java/com/polarsparc/springai,main/resources}

$ mkdir -p First/src/main/java/com/polarsparc/springai/{config,service}

$ cd $HOME/java/SpringAI/First


The following is the listing for the Maven project file pom.xml that will be used:


pom.xml
<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>com.polarsparc</groupId>
    <artifactId>SpringAI</artifactId>
    <version>1.0</version>
  </parent>

  <name>First</name>
  <description>First SpringAI Application</description>
  <artifactId>First</artifactId>
  <version>1.0</version>

</project>

The following is the listing for the logger properties file simplelogger.properties located in the directory src/main/resources:


simplelogger.properties
#
### SLF4J Simple Logger properties
#
org.slf4j.simpleLogger.defaultLogLevel=info
org.slf4j.simpleLogger.showDateTime=true
org.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd HH:mm:ss:SSS
org.slf4j.simpleLogger.showThreadName=true

The following is the listing for the Spring Boot application properties file application.properties located in the directory src/main/resources:


application.properties
spring.main.banner-mode=off
spring.application.name=First-SpringAI
spring.main.web-application-type=none
spring.ai.ollama.base-url=http://192.168.1.25:11434
spring.ai.ollama.chat.model=granite3.3:2b
spring.ai.ollama.chat.options.temperature=0.2

The following is the Spring Boot configuration class that defines the Ollama specific ChatClient container bean:


Listing.1
/*
 * Name:   OllamaChatConfig
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.config;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OllamaChatConfig {
  @Bean
  public ChatClient getOllamaChatClient(ChatClient.Builder chatClientBuilder) {
    return chatClientBuilder.build();
  }
}
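
The ChatClient.Builder can also be primed with defaults that are applied to every request made through the built client. The following is a minimal sketch (an assumed variation of the bean above, not used in the demonstrations) that registers a default system instruction via defaultSystem():


  @Bean
  public ChatClient getOllamaChatClient(ChatClient.Builder chatClientBuilder) {
    // Assumed variation: prepend a default system instruction to every prompt
    return chatClientBuilder
            .defaultSystem("You are a concise assistant; keep responses under 100 words")
            .build();
  }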

The following is the Spring Boot service class that exposes a method to interact with the LLM model running on the Ollama platform:


Listing.2
/*
 * Name:   OllamaChatService
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class OllamaChatService {
  private static final Logger LOGGER = LoggerFactory.getLogger(OllamaChatService.class);

  private final ChatClient chatClient;

  public OllamaChatService(ChatClient chatClient) {
    this.chatClient = chatClient;
  }

  public void getChatResponse(String prompt) {
    String response = null;

    LOGGER.info("User provided prompt for chat:\n {}", prompt);

    if (chatClient != null) {
      response = chatClient.prompt()
              .user(prompt)
              .call()
              .content();

      LOGGER.info("Response from chat:\n {}", response);
    } else {
      LOGGER.warn("Chat client NOT initialized !!!");
    }
  }
}
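
For longer generations, the same fluent API also supports streaming the response incrementally instead of blocking on call(). The following is a minimal hedged sketch (an assumption, not part of the demonstration code) of a streaming variant of the method above, assuming reactor-core is on the classpath; stream().content() returns a reactive Flux<String> of response chunks:


  import reactor.core.publisher.Flux;

  public void getStreamingChatResponse(String prompt) {
    // Stream the model response chunk by chunk instead of waiting for the full reply
    Flux<String> responseStream = chatClient.prompt()
            .user(prompt)
            .stream()
            .content();

    // Blocking here is only for a command-line demonstration; a web application would return the Flux
    responseStream.doOnNext(chunk -> LOGGER.info("Chunk: {}", chunk))
            .blockLast();
  }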

The following is the main Spring Boot application to test the SpringAI chat functionality:


Listing.3
/*
 * Name:   FirstSpringAiApplication
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai;

import com.polarsparc.springai.service.OllamaChatService;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication(scanBasePackages = "com.polarsparc")
public class FirstSpringAiApplication implements CommandLineRunner {
  private final OllamaChatService chatService;

  public FirstSpringAiApplication(OllamaChatService chatService) {
    this.chatService = chatService;
  }

  public static void main(String[] args) {
    SpringApplication.run(FirstSpringAiApplication.class, args);
  }

  @Override
  public void run(String... args) throws Exception {
    String prompt = "describe the spring boot framework in les than 50 words";
    chatService.getChatResponse(prompt);
  }
}

To execute the code from Listing.3, open a terminal window and run the following commands:


$ cd $HOME/java/SpringAI/First

$ mvn spring-boot:run


The following would be the typical output:


Output.1

[INFO] --- spring-boot:3.5.0:run (default-cli) @ First ---
[INFO] Attaching agents: []
2025-06-20T20:58:30.627-04:00  INFO 147139 --- [First-SpringAI] [           main] c.p.springai.FirstSpringAiApplication    : Starting FirstSpringAiApplication using Java 23.0.1 with PID 147139 (/home/polarsparc/java/SpringAI/First/target/classes started by polarsparc in /home/polarsparc/java/SpringAI/First)
2025-06-20T20:58:30.629-04:00  INFO 147139 --- [First-SpringAI] [           main] c.p.springai.FirstSpringAiApplication    : No active profile set, falling back to 1 default profile: "default"
2025-06-20T20:58:31.147-04:00  INFO 147139 --- [First-SpringAI] [           main] c.p.springai.FirstSpringAiApplication    : Started FirstSpringAiApplication in 0.723 seconds (process running for 0.87)
2025-06-20T20:58:31.148-04:00  INFO 147139 --- [First-SpringAI] [           main] c.p.springai.service.OllamaChatService   : User provided prompt for chat:
 describe the spring boot framework in less than 50 words
2025-06-20T20:58:33.026-04:00  INFO 147139 --- [First-SpringAI] [           main] c.p.springai.service.OllamaChatService   : Response from chat:
 Spring Boot is an open-source Java framework for building microservices and applications. It simplifies development with conventions, auto-configuration, and embedded servers, enabling rapid application development and easy deployment on various platforms.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  4.063 s
[INFO] Finished at: 2025-06-20T20:58:33-04:00
[INFO] ------------------------------------------------------------------------

In the Second application, we will have the AI application display a structured output response.

Second Application


To setup the Java directory structure for the Second application, execute the following commands:


$ cd $HOME/java/SpringAI

$ mkdir -p Second/{src,target}

$ mkdir -p Second/src/{main/java/com/polarsparc/springai,main/resources}

$ mkdir -p Second/src/main/java/com/polarsparc/springai/{config,model,service}

$ cd $HOME/java/SpringAI/Second


The following is the listing for the Maven project file pom.xml that will be used:


pom.xml
<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>com.polarsparc</groupId>
    <artifactId>SpringAI</artifactId>
    <version>1.0</version>
  </parent>

  <name>Second</name>
  <description>Second SpringAI Application</description>
  <artifactId>Second</artifactId>
  <version>1.0</version>

</project>

The following is the listing for the logger properties file simplelogger.properties located in the directory src/main/resources:


simplelogger.properties
#
### SLF4J Simple Logger properties
#
org.slf4j.simpleLogger.defaultLogLevel=info
org.slf4j.simpleLogger.showDateTime=true
org.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd HH:mm:ss:SSS
org.slf4j.simpleLogger.showThreadName=true

The following is the listing for the Spring Boot application properties file application.properties located in the directory src/main/resources:


application.properties
spring.main.banner-mode=off
spring.application.name=Second-SpringAI
spring.main.web-application-type=none
spring.ai.ollama.base-url=http://192.168.1.25:11434
spring.ai.ollama.chat.model=granite3.3:2b
spring.ai.ollama.chat.options.temperature=0.2

The following is the Spring Boot configuration class that defines the Ollama specific ChatClient container bean:


Listing.4
/*
 * Name:   OllamaChatConfig
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.config;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OllamaChatConfig {
  @Bean
  public ChatClient getOllamaChatClient(ChatClient.Builder chatClientBuilder) {
    return chatClientBuilder.build();
  }
}

The following is the structured output model class that the Ollama response will be mapped to:


Listing.5
/*
 * Name:   CapitalCity
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.model;

public record CapitalCity(String country, String city, float areaInSqft, long population) {}

The following is the Spring Boot service class that exposes a method to interact with the LLM model running on the Ollama platform:


Listing.6
/*
 * Name:   OllamaChatService
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.service;

import com.polarsparc.springai.model.CapitalCity;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class OllamaChatService {
  private static final Logger LOGGER = LoggerFactory.getLogger(OllamaChatService.class);

  private final ChatClient chatClient;

  public OllamaChatService(ChatClient chatClient) {
    this.chatClient = chatClient;
  }

  public void getCapitalCity(String prompt) {
    CapitalCity capitalCity = null;

    LOGGER.info("User provided prompt:\n {}", prompt);

    if (chatClient != null) {
      capitalCity = chatClient.prompt()
              .user(prompt)
              .call()
              .entity(CapitalCity.class);

      LOGGER.info("CapitalCity response from chat:\n {}", capitalCity);
    } else {
      LOGGER.warn("Chat client NOT initialized !!!");
    }
  }
}
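
Under the covers, the entity() call appends format instructions (derived from the CapitalCity record) to the prompt and converts the model's JSON reply back into the record. The same call can also map the response to a collection of records; the following is a hedged sketch (not part of this article's code) that uses the ParameterizedTypeReference overload of entity():


  import org.springframework.core.ParameterizedTypeReference;

  import java.util.List;

  public void getCapitalCities(String prompt) {
    // Map the model response to a list of CapitalCity records instead of a single record
    List<CapitalCity> capitalCities = chatClient.prompt()
            .user(prompt)
            .call()
            .entity(new ParameterizedTypeReference<List<CapitalCity>>() {});

    LOGGER.info("CapitalCity list response from chat:\n {}", capitalCities);
  }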

The following is the main Spring Boot application to test the SpringAI structured output chat functionality:


Listing.7
/*
 * Name:   SecondSpringAiApplication
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai;

import com.polarsparc.springai.service.OllamaChatService;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication(scanBasePackages = "com.polarsparc")
public class SecondSpringAiApplication implements CommandLineRunner {
  private final OllamaChatService chatService;

  public SecondSpringAiApplication(OllamaChatService chatService) {
    this.chatService = chatService;
  }

  public static void main(String[] args) {
    SpringApplication.run(SecondSpringAiApplication.class, args);
  }

  @Override
  public void run(String... args) throws Exception {
    String prompt = "what is the capital of the country India and fetch its details";
    chatService.getCapitalCity(prompt);
  }
}

To execute the code from Listing.7, open a terminal window and run the following commands:


$ cd $HOME/java/SpringAI/Second

$ mvn spring-boot:run


The following would be the typical output:


Output.2

[INFO] --- spring-boot:3.5.0:run (default-cli) @ Second ---
[INFO] Attaching agents: []
2025-06-20T21:24:57.709-04:00  INFO 156689 --- [Second-SpringAI] [           main] c.p.springai.SecondSpringAiApplication   : Starting SecondSpringAiApplication using Java 23.0.1 with PID 156689 (/home/polarsparc/java/SpringAI/Second/target/classes started by polarsparc in /home/polarsparc/java/SpringAI/Second)
2025-06-20T21:24:57.711-04:00  INFO 156689 --- [Second-SpringAI] [           main] c.p.springai.SecondSpringAiApplication   : No active profile set, falling back to 1 default profile: "default"
2025-06-20T21:24:58.220-04:00  INFO 156689 --- [Second-SpringAI] [           main] c.p.springai.SecondSpringAiApplication   : Started SecondSpringAiApplication in 0.705 seconds (process running for 0.846)
2025-06-20T21:24:58.221-04:00  INFO 156689 --- [Second-SpringAI] [           main] c.p.springai.service.OllamaChatService   : User provided prompt:
 what is the capital of the country India and fetch its details
2025-06-20T21:25:00.217-04:00  INFO 156689 --- [Second-SpringAI] [           main] c.p.springai.service.OllamaChatService   : CapitalCity response from chat:
 CapitalCity[country=India, city=New Delhi, areaInSqft=344750.0, population=302909388]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  4.142 s
[INFO] Finished at: 2025-06-20T21:25:00-04:00
[INFO] ------------------------------------------------------------------------

In the Third application, we will have the AI application leverage the vector database to store the Nvidia 4th Quarter 2024 financial report and query the vector database to analyze it.

Third Application


To setup the Java directory structure for the Third application, execute the following commands:


$ cd $HOME/java/SpringAI

$ mkdir -p Third/{src,target}

$ mkdir -p Third/src/{main/java/com/polarsparc/springai,main/resources}

$ mkdir -p Third/src/main/java/com/polarsparc/springai/{config,service}

$ cd $HOME/java/SpringAI/Third


The following is the listing for the Maven project file pom.xml that will be used:


pom.xml
<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <parent>
      <groupId>com.polarsparc</groupId>
      <artifactId>SpringAI</artifactId>
      <version>1.0</version>
    </parent>

    <name>Third</name>
    <description>Third SpringAI Application</description>
    <artifactId>Third</artifactId>
    <version>1.0</version>

    <dependencies>
      <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-vector-store-pgvector</artifactId>
      </dependency>
      <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-pdf-document-reader</artifactId>
      </dependency>
      <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-advisors-vector-store</artifactId>
      </dependency>
    </dependencies>

</project>

The following is the listing for the logger properties file simplelogger.properties located in the directory src/main/resources:


simplelogger.properties
#
### SLF4J Simple Logger properties
#
org.slf4j.simpleLogger.defaultLogLevel=info
org.slf4j.simpleLogger.showDateTime=true
org.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd HH:mm:ss:SSS
org.slf4j.simpleLogger.showThreadName=true

The following is the listing for the Spring Boot application properties file application.properties located in the directory src/main/resources:


application.properties
spring.main.banner-mode=off
spring.application.name=Third-SpringAI
spring.main.web-application-type=none
spring.ai.ollama.base-url=http://192.168.1.25:11434
spring.ai.ollama.chat.model=granite3.3:2b
spring.ai.ollama.chat.options.temperature=0.2
spring.ai.ollama.embedding.options.model=nomic-embed-text:v1.5
spring.datasource.url=jdbc:postgresql://192.168.1.25:5432/ps_vector
spring.datasource.username=pgusr
spring.datasource.password=pgusr$123
spring.ai.vectorstore.pgvector.initialize-schema=true
spring.ai.vectorstore.pgvector.index-type=HNSW
spring.ai.vectorstore.pgvector.distance-type=COSINE_DISTANCE
spring.ai.vectorstore.pgvector.dimensions=768
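
Note that the spring.ai.vectorstore.pgvector.dimensions property must match the vector size produced by the configured embedding model (768 for nomic-embed-text); a mismatch will surface as an error when documents are written to the vector store. If in doubt, the dimension can be checked at runtime; the following is a minimal hedged sketch (not part of the demonstration code) that embeds a test string via the auto-configured EmbeddingModel bean and prints the vector length:


  import org.springframework.ai.embedding.EmbeddingModel;
  import org.springframework.boot.CommandLineRunner;
  import org.springframework.context.annotation.Bean;

  // Hypothetical check: log the embedding dimension produced by the configured embedding model
  @Bean
  public CommandLineRunner embeddingDimensionCheck(EmbeddingModel embeddingModel) {
    return args -> {
      float[] vector = embeddingModel.embed("dimension check");
      System.out.println("Embedding dimension: " + vector.length);
    };
  }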

Ensure that the PDF file NVIDIA-4Q-2024.pdf is downloaded and stored in the directory src/main/resources.

The following is the Spring Boot configuration class that defines the Ollama specific ChatClient container bean:


Listing.8
/*
 * Name:   OllamaChatConfig
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.config;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OllamaChatConfig {
  @Bean
  public ChatClient getOllamaChatClient(ChatClient.Builder chatClientBuilder) {
    return chatClientBuilder.build();
  }
}

The following is the Spring Boot service class that exposes a method to store a PDF file into the vector database:


Listing.9
/*
 * Name:   PgVectorService
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.document.Document;
import org.springframework.ai.reader.ExtractedTextFormatter;
import org.springframework.ai.reader.pdf.PagePdfDocumentReader;
import org.springframework.ai.reader.pdf.config.PdfDocumentReaderConfig;
import org.springframework.ai.transformer.splitter.TokenTextSplitter;
import org.springframework.ai.vectorstore.VectorStore;
import org.springframework.core.io.ClassPathResource;
import org.springframework.stereotype.Service;

import java.util.List;

@Service
public class PgVectorService {
  private static final Logger LOGGER = LoggerFactory.getLogger(PgVectorService.class);

  private final VectorStore vectorStore;

  public PgVectorService(VectorStore vectorStore) {
    this.vectorStore = vectorStore;
  }

  public void storePDF(String file) {
    ClassPathResource pdfResource = new ClassPathResource(file);

    LOGGER.info("PDF to store: {}", pdfResource.getFilename());

    if (vectorStore != null) {
      PagePdfDocumentReader pdfDocumentReader = new PagePdfDocumentReader(pdfResource,
              PdfDocumentReaderConfig.builder()
                      .withPageExtractedTextFormatter(ExtractedTextFormatter.builder()
                              .build())
                      .withPagesPerDocument(1)
                      .build());

      TokenTextSplitter tokenTextSplitter = new TokenTextSplitter();

      List<Document> splitDocuments = tokenTextSplitter.split(pdfDocumentReader.read());
      for (Document splitDocument: splitDocuments) {
        splitDocument.getMetadata().put("filename", pdfResource.getFilename());
      }

      vectorStore.write(splitDocuments);

      LOGGER.info("Successfully stored the PDF: {}", pdfResource.getFilename());
    } else {
      LOGGER.warn("Vector store NOT initialized !!!");
    }
  }
}
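
Once the documents are written, the vector store can also be queried directly for the nearest chunks, without involving the chat model. The following is a minimal hedged sketch (an assumption, not part of the demonstration code) of such a similarity search using the SearchRequest builder, reusing the imports from the listing above:


  import org.springframework.ai.vectorstore.SearchRequest;

  public void searchSimilar(String query) {
    // Retrieve the 4 chunks most similar to the query text
    List<Document> results = vectorStore.similaritySearch(
            SearchRequest.builder()
                    .query(query)
                    .topK(4)
                    .build());

    for (Document document : results) {
      LOGGER.info("Matched chunk from {}:\n {}", document.getMetadata().get("filename"), document.getText());
    }
  }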

The following is the Spring Boot service class that exposes a method to interact with the LLM model running on the Ollama platform, which uses the content from the vector database:


Listing.10
/*
 * Name:   OllamaChatService
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.advisor.vectorstore.QuestionAnswerAdvisor;
import org.springframework.ai.vectorstore.VectorStore;
import org.springframework.stereotype.Service;

@Service
public class OllamaChatService {
  private static final Logger LOGGER = LoggerFactory.getLogger(OllamaChatService.class);

  private final ChatClient chatClient;
  private final VectorStore vectorStore;

  public OllamaChatService(ChatClient chatClient, VectorStore vectorStore) {
    this.chatClient = chatClient;
    this.vectorStore = vectorStore;
  }

  public void queryVectorStore(String prompt) {
    String response = null;

    if (chatClient != null && vectorStore != null) {
      LOGGER.info("User provided prompt:\n {}", prompt);

      response = chatClient.prompt()
              .user(prompt)
              .advisors(new QuestionAnswerAdvisor(vectorStore))
              .call()
              .content();

      LOGGER.info("Response from vector store:\n {}", response);
    } else {
      LOGGER.warn("Chat client AND/OR Vector store NOT initialized !!!");
    }
  }
}
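
The QuestionAnswerAdvisor above runs with default retrieval settings. It can also be constructed with a custom SearchRequest to control how many chunks are retrieved and the minimum similarity; the following is a hedged sketch (an assumed variation, not this article's code) using the advisor's builder:


      // Assumed variation of the advisors(...) call in queryVectorStore above,
      // assuming org.springframework.ai.vectorstore.SearchRequest is imported
      QuestionAnswerAdvisor advisor = QuestionAnswerAdvisor.builder(vectorStore)
              .searchRequest(SearchRequest.builder()
                      .topK(5)
                      .similarityThreshold(0.5)
                      .build())
              .build();

      response = chatClient.prompt()
              .user(prompt)
              .advisors(advisor)
              .call()
              .content();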

The following is the main Spring Boot application to test the SpringAI functionality of using the vector database:


Listing.11
/*
 * Name:   ThirdSpringAiApplication
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai;

import com.polarsparc.springai.service.OllamaChatService;
import com.polarsparc.springai.service.PgVectorService;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication(scanBasePackages = "com.polarsparc")
public class ThirdSpringAiApplication implements CommandLineRunner {
  private final OllamaChatService chatService;
  private final PgVectorService vectorService;

  public ThirdSpringAiApplication(OllamaChatService chatService, PgVectorService vectorService) {
    this.chatService = chatService;
    this.vectorService = vectorService;
  }

  public static void main(String[] args) {
    SpringApplication.run(ThirdSpringAiApplication.class, args);
  }

  @Override
  public void run(String... args) throws Exception {
    vectorService.storePDF("NVIDIA-4Q-2024.pdf");

    String prompt = "what was the total revenue and total expenses for nvidia in q4 2024";
    chatService.queryVectorStore(prompt);
  }
}

To execute the code from Listing.11, open a terminal window and run the following commands:


$ cd $HOME/java/SpringAI/Third

$ mvn spring-boot:run


The following would be the typical output:


Output.3

[INFO] --- spring-boot:3.5.0:run (default-cli) @ Third ---
[INFO] Attaching agents: []
2025-06-20T22:00:50.545-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.ThirdSpringAiApplication    : Starting ThirdSpringAiApplication using Java 23.0.1 with PID 169678 (/home/polarsparc/java/SpringAI/Third/target/classes started by polarsparc in /home/polarsparc/java/SpringAI/Third)
2025-06-20T22:00:50.550-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.ThirdSpringAiApplication    : No active profile set, falling back to 1 default profile: "default"
2025-06-20T22:00:51.309-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.v.pgvector.PgVectorStore           : Using the vector table name: vector_store. Is empty: false
2025-06-20T22:00:51.311-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.v.pgvector.PgVectorStore           : Initializing PGVectorStore schema for table: vector_store in schema: public
2025-06-20T22:00:51.311-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.v.pgvector.PgVectorStore           : vectorTableValidationsEnabled false
2025-06-20T22:00:51.313-04:00  INFO 169678 --- [Third-SpringAI] [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Starting...
2025-06-20T22:00:51.400-04:00  INFO 169678 --- [Third-SpringAI] [           main] com.zaxxer.hikari.pool.HikariPool        : HikariPool-1 - Added connection org.postgresql.jdbc.PgConnection@56be10fa
2025-06-20T22:00:51.401-04:00  INFO 169678 --- [Third-SpringAI] [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Start completed.
2025-06-20T22:00:51.476-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.ThirdSpringAiApplication    : Started ThirdSpringAiApplication in 1.129 seconds (process running for 1.271)
2025-06-20T22:00:51.477-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.service.PgVectorService     : PDF to store: NVIDIA-4Q-2024.pdf
2025-06-20T22:00:51.579-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 1
2025-06-20T22:00:51.641-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 2
2025-06-20T22:00:51.659-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 3
2025-06-20T22:00:51.680-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 4
2025-06-20T22:00:51.695-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 5
2025-06-20T22:00:51.706-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 6
2025-06-20T22:00:51.725-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 7
2025-06-20T22:00:51.744-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 8
2025-06-20T22:00:51.762-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 9
2025-06-20T22:00:51.783-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing PDF page: 10
2025-06-20T22:00:51.786-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.ai.reader.pdf.PagePdfDocumentReader  : Processing 10 pages
2025-06-20T22:00:51.800-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:51.807-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:51.815-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:51.825-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:51.827-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:51.830-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:51.834-04:00  INFO 169678 --- [Third-SpringAI] [           main] o.s.a.transformer.splitter.TextSplitter  : Splitting up document into 2 chunks.
2025-06-20T22:00:53.025-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.service.PgVectorService     : Successfully stored the PDF: NVIDIA-4Q-2024.pdf
2025-06-20T22:00:53.025-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.service.OllamaChatService   : User provided prompt:
 what was the total revenue and total expenses for nvidia in q4 2024
2025-06-20T22:00:56.794-04:00  INFO 169678 --- [Third-SpringAI] [           main] c.p.springai.service.OllamaChatService   : Response from vector store:
 Based on the provided financial results for NVIDIA's Q4 2024, here are the total revenue and total expenses:

- Total Revenue (Q4 2024): $22.1 billion
- Total Expenses (Q4 2024): $11.329 billion

Please note that the context does not provide a breakdown of these figures into GAAP and Non-GAAP expenses, so I couldn't provide those details separately.
2025-06-20T22:00:56.796-04:00  INFO 169678 --- [Third-SpringAI] [ionShutdownHook] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2025-06-20T22:00:56.798-04:00  INFO 169678 --- [Third-SpringAI] [ionShutdownHook] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  7.518 s
[INFO] Finished at: 2025-06-20T22:00:57-04:00
[INFO] ------------------------------------------------------------------------

In the Fourth application, we will have the AI application leverage external tools for processing.

Fourth Application


To setup the Java directory structure for the Fourth application, execute the following commands:


$ cd $HOME/java/SpringAI

$ mkdir -p Fourth/{src,target}

$ mkdir -p Fourth/src/{main/java/com/polarsparc/springai,main/resources}

$ mkdir -p Fourth/src/main/java/com/polarsparc/springai/{config,service,util}

$ cd $HOME/java/SpringAI/Fourth


The following is the listing for the Maven project file pom.xml that will be used:


pom.xml
<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>com.polarsparc</groupId>
    <artifactId>SpringAI</artifactId>
    <version>1.0</version>
  </parent>

  <name>Fourth</name>
  <description>Fourth SpringAI Application</description>
  <artifactId>Fourth</artifactId>
  <version>1.0</version>

</project>

The following is the listing for the logger properties file simplelogger.properties located in the directory src/main/resources:


simplelogger.properties
#
### SLF4J Simple Logger properties
#
org.slf4j.simpleLogger.defaultLogLevel=info
org.slf4j.simpleLogger.showDateTime=true
org.slf4j.simpleLogger.dateTimeFormat=yyyy-MM-dd HH:mm:ss:SSS
org.slf4j.simpleLogger.showThreadName=true

The following is the listing for the Spring Boot application properties file application.properties located in the directory src/main/resources:


application.properties
spring.main.banner-mode=off
spring.application.name=Fourth-SpringAI
spring.main.web-application-type=none
spring.ai.ollama.base-url=http://192.168.1.25:11434
spring.ai.ollama.chat.model=granite3.3:2b
spring.ai.ollama.chat.options.temperature=0.2

The following is the utility class that provides access to a simple interest calculator tool which can be used by the LLM model:


Listing.12
/*
 * Name:   OllamaCalculatorTools
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.util;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.tool.annotation.Tool;

public class OllamaCalculatorTools {
  private static final Logger LOGGER = LoggerFactory.getLogger(OllamaCalculatorTools.class);

  @Tool(description = "Tool to compute simple interest rate for a year")
  public float simple_interest(float principal, float rate) {
    LOGGER.info("Input - principal: {}, rate: {}", principal, rate);

    float interest = (principal * rate) / 100.00f;

    LOGGER.info("Simple interest for principal: {}, rate: {} is: {}", principal, rate, interest);

    return interest;
  }
}
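
Parameter names alone may not give the model enough guidance about the tool arguments. Spring AI also provides a @ToolParam annotation for describing individual parameters; the following is a hedged sketch (an assumed variation of the method above, not the article's code) of the same tool with annotated parameters:


  import org.springframework.ai.tool.annotation.ToolParam;

  @Tool(description = "Tool to compute the simple interest for one year, given a principal and an annual rate")
  public float simple_interest(@ToolParam(description = "the principal amount") float principal,
                               @ToolParam(description = "the annual interest rate in percent") float rate) {
    return (principal * rate) / 100.00f;
  }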

The following is the Spring Boot configuration class that defines the Ollama specific ChatClient and calculator tool container beans:


Listing.13
/*
 * Name:   OllamaChatConfig
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.config;

import com.polarsparc.springai.util.OllamaCalculatorTools;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OllamaChatConfig {
  @Bean
  public ChatClient getOllamaChatClient(ChatClient.Builder chatClientBuilder) {
    return chatClientBuilder.build();
  }

  @Bean
  public OllamaCalculatorTools getOllamaCalculatorTools() {
    return new OllamaCalculatorTools();
  }
}

The following is the Spring Boot service class that exposes a method to interact with the LLM model running on the Ollama platform which will use the calculator tool:


Listing.14
/*
 * Name:   OllamaChatService
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai.service;

import com.polarsparc.springai.util.OllamaCalculatorTools;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class OllamaChatService {
  private static final Logger LOGGER = LoggerFactory.getLogger(OllamaChatService.class);

  private final ChatClient chatClient;
  private final OllamaCalculatorTools calculatorTools;

  public OllamaChatService(ChatClient chatClient, OllamaCalculatorTools calculatorTools) {
    this.chatClient = chatClient;
    this.calculatorTools = calculatorTools;
  }

  public void getSimpleInterest(String prompt) {
    String response = null;

    LOGGER.info("User provided prompt using tools:\n {}", prompt);

    if (chatClient != null) {
      response = chatClient.prompt()
              .user(prompt)
              .tools(calculatorTools)
              .call()
              .content();

      LOGGER.info("Response from chat using tools:\n {}", response);
    } else {
      LOGGER.warn("Chat client NOT initialized !!!");
    }
  }
}
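
Rather than passing the tool object on every request via tools(), the calculator tool could instead be registered once as a default on the ChatClient builder in the configuration class. The following is a hedged sketch (an assumed variation of Listing.13, not the article's code) using defaultTools(); with this in place, the tools(calculatorTools) call in the service method above would no longer be needed:


  @Bean
  public ChatClient getOllamaChatClient(ChatClient.Builder chatClientBuilder,
                                        OllamaCalculatorTools calculatorTools) {
    // Assumed variation: every prompt sent through this client may invoke the calculator tool
    return chatClientBuilder
            .defaultTools(calculatorTools)
            .build();
  }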

The following is the main Spring Boot application to test the SpringAI tool usage functionality:


Listing.15
/*
 * Name:   FourthSpringAiApplication
 * Author: Bhaskar S
 * Date:   06/19/2025
 * Blog:   https://polarsparc.github.io
 */

package com.polarsparc.springai;

import com.polarsparc.springai.service.OllamaChatService;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication(scanBasePackages = "com.polarsparc")
public class FourthSpringAiApplication implements CommandLineRunner {
  private final OllamaChatService chatService;

  public FourthSpringAiApplication(OllamaChatService chatService) {
    this.chatService = chatService;
  }

  public static void main(String[] args) {
    SpringApplication.run(FourthSpringAiApplication.class, args);
  }

  @Override
  public void run(String... args) throws Exception {
    String prompt = "compute the simple interest for a principal of 1000 at the rate of 4.75";
    chatService.getSimpleInterest(prompt);
  }
}

To execute the code from Listing.15, open a terminal window and run the following commands:


$ cd $HOME/java/SpringAI/Fourth

$ mvn spring-boot:run


The following would be the typical output:


Output.4

[INFO] --- spring-boot:3.5.0:run (default-cli) @ Fourth ---
[INFO] Attaching agents: []
2025-06-20T22:21:01.813-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.FourthSpringAiApplication   : Starting FourthSpringAiApplication using Java 23.0.1 with PID 178448 (/home/polarsparc/java/SpringAI/Fourth/target/classes started by polarsparc in /home/polarsparc/java/SpringAI/Fourth)
2025-06-20T22:21:01.814-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.FourthSpringAiApplication   : No active profile set, falling back to 1 default profile: "default"
2025-06-20T22:21:02.319-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.FourthSpringAiApplication   : Started FourthSpringAiApplication in 0.699 seconds (process running for 0.84)
2025-06-20T22:21:02.321-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.service.OllamaChatService   : User provided prompt using tools:
 compute the simple interest for a principal of 1000 at the rate of 4.75
2025-06-20T22:21:04.138-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.util.OllamaCalculatorTools  : Input - principal: 1000.0, rate: 4.75
2025-06-20T22:21:04.138-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.util.OllamaCalculatorTools  : Simple interest for principal: 1000.0, rate: 4.75 is: 47.5
2025-06-20T22:21:04.448-04:00  INFO 178448 --- [Fourth-SpringAI] [           main] c.p.springai.service.OllamaChatService   : Response from chat using tools:
 The simple interest for a principal of 1000 at the rate of 4.75% per annum is 47.5.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  4.234 s
[INFO] Finished at: 2025-06-20T22:21:04-04:00
[INFO] ------------------------------------------------------------------------

With this, we conclude the various demonstrations on using the Spring AI framework !!!


References

Spring AI Documentation



© PolarSPARC