
How to use ERNIE's function calling from LangChain

ERNIE models that support functions can be combined with LangChain to quickly build applications with custom plugins.
The official Qianfan platform documentation gives an example like this:
"functions": [
    {
        "name": "get_current_weather",
        "description": "Get the weather for the given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "Province and city name, e.g. Hebei Province, Shijiazhuang"
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"]
                }
            },
            "required": ["location"]
        }
    }
]
LangChain offers several ways to define a Tool; the common ones are the @tool decorator and subclassing BaseTool.
The convert_to_openai_tool function in the langchain_core.utils.function_calling package converts a Tool into the JSON function-description format, which matches the official format requirements — very convenient.
from langchain_core.tools import tool
from langchain_core.utils.function_calling import convert_to_openai_tool
from langchain_community.chat_models.baidu_qianfan_endpoint import QianfanChatEndpoint


@tool
def get_current_weather(city: str) -> str:
    """Get the current weather for the given city."""
    return f"The weather in {city} is sunny"


# Convert the Tool into the OpenAI-style function description JSON.
tool_json = convert_to_openai_tool(get_current_weather)

# Bind the function description to the ERNIE chat model.
chat_model = QianfanChatEndpoint(model="ERNIE-3.5-8K", streaming=True).bind(
    functions=[tool_json["function"]]
)

for chunk in chat_model.stream("What's the weather like in Beijing?"):
    print(chunk)
This only lets the model decide whether a function call is needed; to actually invoke the Tool, you still have to add the follow-up logic yourself. The model returns:
content="" additional_kwargs={
    "finish_reason": "function_call",
    "request_id": "as-4dyf6zy4",
    "object": "chat.completion",
    "search_info": []
} response_metadata={
    "token_usage": {
        "prompt_tokens": 63,
        "completion_tokens": 33,
        "total_tokens": 96
    },
    "model_name": "ERNIE-3.5-8K",
    "finish_reason": "function_call",
    "id": "as-4dyf6zy4",
    "object": "chat.completion",
    "created": 1712320443,
    "result": "",
    "is_truncated": False,
    "need_clear_history": False,
    "function_call": {
        "name": "get_current_weather",
        "thoughts": "The user wants to know the weather in Beijing. I can use the get_current_weather tool to get Beijing's weather information.",
        "arguments": "{\"city\": \"Beijing\"}"
    },
    "usage": {
        "prompt_tokens": 63,
        "completion_tokens": 33,
        "total_tokens": 96
    }
}
The thoughts field the model returns is quite nice. Unfortunately, if you use LangChain's Message objects as input and output, the framework drops it while wrapping the response, so there is no way to get thoughts from the output.
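The follow-up logic the post leaves out can be sketched with plain Python: read function_call from the returned additional_kwargs, parse the JSON arguments string, and look up a matching local function. The TOOLS registry and dispatch helper below are hypothetical names, and response_kwargs is a trimmed copy of the sample output above:

```python
import json


def get_current_weather(city: str) -> str:
    return f"The weather in {city} is sunny"


# Hypothetical registry mapping function names to local implementations.
TOOLS = {"get_current_weather": get_current_weather}

# Trimmed copy of the additional_kwargs from the sample response above.
response_kwargs = {
    "finish_reason": "function_call",
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"city": "Beijing"}',
    },
}


def dispatch(additional_kwargs):
    """Run the requested tool if the model asked for a function call."""
    call = additional_kwargs.get("function_call")
    if not call:
        return None  # plain text answer, nothing to dispatch
    args = json.loads(call["arguments"])  # arguments arrive as a JSON string
    return TOOLS[call["name"]](**args)


result = dispatch(response_kwargs)
print(result)  # The weather in Beijing is sunny
```

In a real application you would then wrap this result in a FunctionMessage and send it back to the model so it can compose the final answer.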