Connecting to Spring AI to Implement Streaming Conversations

Written by
Caleb Hayes
Updated on: June 29th, 2025
Recommendation

If you are a Java backend developer, you should not miss this article. It will show you how to use Spring AI to implement streaming conversations.

Core content:
1. Introduce Spring AI's dependency management to simplify using and managing models
2. Configure the API and the Spring Boot project so the conversation application starts with one click
3. Write an AI application that implements streaming conversations based on the DeepSeek model

Earlier we covered using Python to connect to DeepSeek for a conversation, but since most of our backend work is Java development, this demonstration is implemented with Spring AI.
Step 1: Introduce the dependencies
Be sure to import Spring AI's dependency management (the BOM) so that you can easily use other models and clients later!
<dependencyManagement>
    <dependencies>
        <!-- Spring AI's management dependencies -->
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
Then add Spring AI's core starter; this time we use the OpenAI starter (DeepSeek exposes an OpenAI-compatible API):
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
    <version>${spring-ai.version}</version>
</dependency>
This demonstration is based on a plain Spring Boot project with no extra framework on top, and it really is convenient to start with one click.
<properties>
    <java.version>17</java.version>
    <spring-ai.version>1.0.0-M5</spring-ai.version>
</properties>
Note: the Spring Boot version must be 3.x or later.
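For reference, here is a minimal sketch of the corresponding parent declaration in pom.xml, assuming the project builds on spring-boot-starter-parent (the exact 3.x version shown is only an example; use whichever 3.x release your project targets):

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <!-- example only: any Spring Boot 3.x release should work -->
    <version>3.3.0</version>
    <relativePath/>
</parent>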
Step 2: Configure the API
server:
  port: 8080
spring:
  application:
    name: ai-demo
  ai:
    openai:
      base-url: https://api.deepseek.com
      api-key: personal key
      chat:
        options:
          model: deepseek-chat
          temperature: 0.7
The model used this time is still DeepSeek, so we are supporting a domestic (Chinese) model!
API interface documentation: https://api-docs.deepseek.com/zh-cn/
Step 3: Write the AI application
Since we use the ChatClient that ships with Spring AI, an error will be reported if the client is neither configured as a bean nor initialized through its builder in a constructor.
Simple version
ChatConfig
@Component
public class ChatConfig {

    /**
     * Default form
     * @param model
     * @return
     */
    @Bean
    public ChatClient chatClient(OpenAiChatModel model) {
        return ChatClient.builder(model).build();
    }
}
ChatController
@RequiredArgsConstructor
@RestController
@RequestMapping("/ai")
@Slf4j
public class ChatController {

    private final ChatClient chatClient;
    /**
     * Chat Dialogue - Blocking
     * @param message
     * @return
     */
    @RequestMapping("/chat")
    public String chat(@RequestParam("message") String message) {
        return chatClient.prompt()
                .user(message)
                .call()
                .content();
    }
}
Use Apifox to test the interface
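For example, with the service running on port 8080 as configured above, a GET request along these lines should return the complete reply in a single response (the message text is just a placeholder):

GET http://localhost:8080/ai/chat?message=Hello, please introduce yourself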
    /**
     * Chat Conversation - Streaming
     *
     * @param message
     * @return
     */
    @RequestMapping(value = "/stream", produces = "text/html;charset=utf-8")
    public Flux<String> chatStream(@RequestParam("message") String message) {
        log.info("Streaming test...");
        return chatClient.prompt()
                .user(message)
                .stream()
                .content();
    }
Since @RequestMapping accepts GET requests by default, the streaming interface can be tested directly in the browser.
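For example, opening a URL like the following in the browser (again, the message text is only a placeholder) should show the reply arriving piece by piece as the tokens stream in:

http://localhost:8080/ai/stream?message=Tell me a short story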
Enhanced version
Add a default system prompt. The code is as follows, and it takes effect after a restart!
ChatConfig
    /**
     * Add a default system prompt
     * @param model
     * @return
     */
    @Bean
    public ChatClient chatClient(OpenAiChatModel model) {
        return ChatClient
                .builder(model)
                .defaultSystem("Your name is Xiao Ming, and you are a student. Please answer the questions in a student's tone.")
                .build();
    }
Dialogue Effect
Advanced version
Add logging and session memory. The code is as follows.
ChatConfig
    /**
     * Session logging and memory
     * @param model
     * @param chatMemory
     * @return
     */
    @Bean
    public ChatClient chatClient(OpenAiChatModel model, ChatMemory chatMemory) {
        return ChatClient
                .builder(model)
                .defaultSystem("Your name is Xiao Ming, and you are a student. Please answer the questions in a student's tone.")
                .defaultAdvisors(
                        new SimpleLoggerAdvisor(),
                        new MessageChatMemoryAdvisor(chatMemory)
                )
                .build();
    }
    /**
     * Session memory based on an in-memory cache
     * @return
     */
    @Bean
    public ChatMemory chatMemory() {
        return new InMemoryChatMemory();
    }
ChatController
    /**
     * Session memory
     * @param message
     * @param chatId
     * @return
     */
    // CHAT_MEMORY_CONVERSATION_ID_KEY is a constant statically imported from Spring AI's AbstractChatMemoryAdvisor
    @RequestMapping(value = "/memoryChat", produces = "text/html;charset=utf-8")
    public Flux<String> memoryChat(@RequestParam("message") String message, String chatId) {
        log.info("Streaming test...");
        return chatClient.prompt()
                .user(message)
                .advisors(a -> a.param(CHAT_MEMORY_CONVERSATION_ID_KEY, chatId))
                .stream()
                .content();
    }
Apifox interface test
The chatId field is added this time to give each conversation a unique identifier; the conversation memory is stored under that identifier. The memory implementation used here keeps everything in memory only; if you need memory that persists permanently, you can try a vector database instead.
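For example, two requests that share the same chatId should let the second one use the first as context (the message texts and the chatId value 1001 below are placeholders of my own):

GET http://localhost:8080/ai/memoryChat?message=Please remember that my favorite color is blue&chatId=1001
GET http://localhost:8080/ai/memoryChat?message=What is my favorite color?&chatId=1001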
Test again
At this point, the basic features of connecting Java to Spring AI have been implemented.