- Spring AI 1.0.0-SNAPSHOT
- LM Studio 0.3.16
- qwen3-4b
Reference: Unable to use spring ai with LMStudio using spring-ai openai module · Issue #2441 · spring-projects/spring-ai · GitHub
LM Studio
- Download and install LM Studio.
- Download the qwen3-4b model. For the qwen3 series, testing showed that models smaller than 4b do not perform well. You can substitute another model as needed; the guiding principle: bigger is better.
- Load the model and start the server (a quick connectivity check is sketched after this list).
- Open the log view to make it easier to observe requests.
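Before wiring up Spring, it can help to confirm that the server is reachable. A minimal sketch, assuming LM Studio is listening on localhost at its default port 1234 (adjust the host and port to your setup); it calls the OpenAI-compatible /v1/models endpoint that LM Studio exposes:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LmStudioPing {
    public static void main(String[] args) throws Exception {
        // Assumption: LM Studio's server runs on localhost:1234 (its default); change if needed.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:1234/v1/models"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // A 200 response listing the loaded models means the server is up.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}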
Application integration
Dependencies
Testing was done with Spring Boot 3.4.5 and JDK 21; in practice, Spring Boot 3 or later and JDK 17 or later are sufficient.
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>1.0.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
<dependencies>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-openai</artifactId>
    </dependency>
</dependencies>
Interaction with LM Studio goes through the openai dependency (LM Studio exposes an OpenAI-compatible API).
Code
Write your own tools for the AI to call. A few simple examples follow.
ToolMcp provides a tool that returns the current date and time.
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.ai.tool.annotation.Tool;
import org.springframework.ai.tool.method.MethodToolCallbackProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Component;

import java.text.SimpleDateFormat;
import java.util.Date;

@Component
public class ToolMcp {

    private static final SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    @Bean
    ToolCallbackProvider toolCallbackProvider() {
        return MethodToolCallbackProvider.builder().toolObjects(this).build();
    }

    @Tool(description = "query the current date and time")
    public String queryTime() {
        System.out.println("Querying the current time");
        return DATE_FORMAT.format(new Date());
    }
}
The @Tool annotation registers an ordinary method as a tool, and the @Bean ToolCallbackProvider registers the current bean as a provider, which will be used later in the controller.
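If you do not need a ToolCallbackProvider bean, a tool object can also be handed straight to the ChatClient builder. A minimal sketch, not part of the controller below; it assumes chatModel is the auto-configured OpenAiChatModel (any ChatModel works here):

// chatModel: the auto-configured OpenAiChatModel injected by Spring
ChatClient chatClient = ChatClient.builder(chatModel)
        .defaultTools(new ToolMcp())  // defaultTools(Object...) picks up the @Tool-annotated methods
        .build();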
HelloMcp provides greeting tools.
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.ai.tool.annotation.Tool;
import org.springframework.ai.tool.annotation.ToolParam;
import org.springframework.ai.tool.method.MethodToolCallbackProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Component;

@Component
public class HelloMcp {

    @Bean
    ToolCallbackProvider helloToolCallbackProvider() {
        return MethodToolCallbackProvider.builder().toolObjects(this).build();
    }

    @Tool(description = "say hello")
    public String hello() {
        return "hello, devops";
    }

    @Tool(description = "say hello to someone", returnDirect = true)
    public String helloTo(@ToolParam(description = "name of the guy you want to say hello to") String name) {
        System.out.println("Hello, " + name);
        return "Hello, " + name;
    }
}
AIController provides the externally accessible endpoint.
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.advisor.MessageChatMemoryAdvisor;
import org.springframework.ai.chat.memory.ChatMemory;
import org.springframework.ai.model.NoopApiKey;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.http.client.JdkClientHttpRequestFactory;
import org.springframework.http.client.reactive.JdkClientHttpConnector;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestClient;
import org.springframework.web.reactive.function.client.WebClient;

import java.net.http.HttpClient;
import java.time.Duration;
import java.util.List;

@RestController
public class AIController {

    @Autowired
    private OpenAiChatModel chatModel;

    @Autowired
    private List<ToolCallbackProvider> providerList;

    @Autowired
    private ChatMemory chatMemory;

    // @GetMapping(value = "/ai/chat", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    @GetMapping(value = "/ai/chat")
    public String generate(@RequestParam(value = "message", defaultValue = "") String message) {
        OpenAiApi openAiApi = OpenAiApi.builder()
                .baseUrl("http://<LM Studio host IP>:1234")
                .apiKey(new NoopApiKey())
                .webClientBuilder(WebClient.builder()
                        // Force HTTP/1.1 for streaming
                        .clientConnector(new JdkClientHttpConnector(HttpClient.newBuilder()
                                .version(HttpClient.Version.HTTP_1_1)
                                .connectTimeout(Duration.ofSeconds(200))
                                .build())))
                .restClientBuilder(RestClient.builder()
                        // Force HTTP/1.1 for non-streaming
                        .requestFactory(new JdkClientHttpRequestFactory(HttpClient.newBuilder()
                                .version(HttpClient.Version.HTTP_1_1)
                                .connectTimeout(Duration.ofSeconds(200))
                                .build())))
                .build();
        OpenAiChatModel aiChatModel = new OpenAiChatModel.Builder(chatModel.clone())
                .openAiApi(openAiApi)
                .build();
        ChatClient chatClient = ChatClient.builder(aiChatModel)
                .defaultSystem("Ask the user when information is missing; do not make decisions on your own. " +
                        "Call the queryTime tool when the current time is needed. Always use the time format yyyy-MM-dd HH:mm:ss")
                .defaultToolCallbacks(providerList.toArray(new ToolCallbackProvider[0]))
                .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build())
                .build();
        String conversationId = "007";
        ChatClient.CallResponseSpec call = chatClient.prompt()
                .user(message)
                .advisors(a -> a.param(ChatMemory.CONVERSATION_ID, conversationId))
                .call();
        return call.content();
        // return chatClient.prompt()
        //         .user(message)
        //         .advisors(a -> a.param(ChatMemory.CONVERSATION_ID, conversationId))
        //         .stream()
        //         .content();
    }
}
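Once the application is running, the endpoint can be exercised from a browser or any HTTP client, for example (assuming Spring Boot's default port 8080):

http://localhost:8080/ai/chat?message=What time is it right now

With qwen3-4b loaded, the model is expected to call the queryTime tool and answer with the current time; the request and the tool call also show up in the LM Studio log.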
Add the following parameters to application.yml:
spring:
  ai:
    openai:
      api-key: sk-anykey
      base-url: http://<LM Studio host IP>:1234
      chat:
        options:
          model: qwen3-4b
Because LM Studio is being used, the api-key can be any value. base-url is the IP and port where LM Studio serves requests. model must match the name of the model loaded in LM Studio. LM Studio does not have to pre-load the model; it can load a model automatically based on the requested model name (see LM Studio's settings for details).
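These properties drive the auto-configured OpenAiChatModel that is injected into the controller above. A minimal sketch of using it directly, without the hand-built OpenAiApi (this skips the HTTP/1.1 workaround, so streaming against LM Studio may still fail, as described in the issue referenced at the top):

// chatModel: the auto-configured OpenAiChatModel built from the application.yml properties
ChatClient client = ChatClient.create(chatModel);
String answer = client.prompt()
        .user("hello")
        .call()
        .content();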
More detailed content will be added later.