moe-mizrak / laravel-openrouter
A Laravel package for OpenRouter (a unified interface for LLMs)
Requires
- php: ^8.1
- ext-json: *
- caseyamcl/guzzle_retry_middleware: ^2.9
- guzzlehttp/guzzle: ^7.8
- spatie/data-transfer-object: ^3.9.1
Requires (Dev)
- fakerphp/faker: ^1.12
- mockery/mockery: ^1.0
- orchestra/testbench: ^6.0
- phpunit/phpunit: ^9.0
This package is not auto-updated.
Last update: 2024-09-24 09:28:39 UTC
README
This Laravel package provides an easy-to-use interface for integrating OpenRouter into your Laravel applications. OpenRouter is a unified interface for Large Language Models (LLMs) that allows you to interact with a variety of AI models through a single API.
🤖 Requirements
- PHP 8.1 or higher
🏁 Get Started
You can install the package via composer:
```
composer require moe-mizrak/laravel-openrouter
```
You can publish the config file with:
```
php artisan vendor:publish --tag=laravel-openrouter
```
This is the contents of the published config file:
```php
return [
    'api_endpoint' => env('OPENROUTER_API_ENDPOINT', 'https://openrouter.ai/api/v1/'),
    'api_key'      => env('OPENROUTER_API_KEY'),
];
```
⚙️ Configuration
After publishing the package configuration file, add the following environment variables to your .env file:
```
OPENROUTER_API_ENDPOINT=https://openrouter.ai/api/v1/
OPENROUTER_API_KEY=your_api_key
```
- OPENROUTER_API_ENDPOINT: The endpoint URL of the OpenRouter API (default: https://openrouter.ai/api/v1/).
- OPENROUTER_API_KEY: Your API key for accessing the OpenRouter API. You can obtain this key from the OpenRouter dashboard.
🎨 Usage
This package provides two ways to interact with the OpenRouter API:
- Using the `LaravelOpenRouter` facade
- Instantiating the `OpenRouterRequest` class directly

Both methods use the `ChatData` DTO class to structure the data sent to the API.
Understanding ChatData DTO
The `ChatData` class encapsulates the data required to send a chat request to the OpenRouter API. Here is an explanation of the key properties:
- messages (array|null): An array of `MessageData` objects representing the chat messages. This field is XOR-gated with the `prompt` field.
- prompt (string|null): A prompt string for the chat request. This field is XOR-gated with the `messages` field.
- model (string|null): The name of the model to use for the chat request. If not specified, the user's default model will be used. This field is XOR-gated with the `models` field.
- response_format (ResponseFormatData|null): An instance of the `ResponseFormatData` class representing the desired response format.
- stop (array|string|null): A value specifying the stop sequence(s) for chat generation.
- stream (bool|null): A boolean indicating whether streaming is enabled.
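All of the sample requests later in this README populate the `messages` field; as described above, `prompt` is its XOR-gated alternative. A minimal sketch (the model name is taken from the examples below, the prompt text is illustrative):

```php
// Minimal sketch: using the XOR-gated `prompt` field instead of `messages`.
// Exactly one of `prompt` / `messages` may be set, never both.
$chatData = new ChatData([
    'prompt' => 'Summarize the plot of Hamlet in two sentences.',
    'model'  => 'mistralai/mistral-7b-instruct:free',
]);
```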
LLM Parameters
These properties control various aspects of the generated response (see the Parameters docs for more details):
- max_tokens (int|null): The maximum number of tokens that can be generated in the completion. Defaults to 1024.
- temperature (float|null): A value between 0 and 2 controlling the randomness of the output.
- top_p (float|null): A value between 0 and 1 for nucleus sampling, an alternative to temperature sampling.
- top_k (int|null): A value between 1 and infinity for top-k sampling (not available for OpenAI models).
- frequency_penalty (float|null): A value between -2 and 2 penalizing new tokens based on their existing frequency.
- presence_penalty (float|null): A value between -2 and 2 penalizing new tokens based on whether they already appear in the text.
- repetition_penalty (float|null): A value between 0 and 2 penalizing repeated tokens.
- seed (int|null): A value for deterministic sampling (OpenAI models only, in beta).
Function Calling
Natively supported by OpenAI models only. For other models, a YAML-formatted string describing the tools is appended to the end of the prompt.
- tool_choice (string|array|null): A value specifying the tool choice for function calling (OpenAI models only).
- tools (array|null): An array of `ToolCallData` objects for function calling.
Additional Optional Parameters
- logit_bias (array|null): An array for modifying the likelihood of specified tokens appearing in the completion.
OpenRouter-Only Parameters
- transforms (array|null): An array configuring prompt transforms.
- models (array|null): An array of models to try automatically if the primary model is unavailable. This field is XOR-gated with the `model` field.
- route (string|null): A value specifying the route type (e.g. `RouteType::FALLBACK`).
- provider (ProviderPreferencesData|null): An instance of the `ProviderPreferencesData` DTO for configuring provider preferences.
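Because `models` is XOR-gated with `model`, a fallback request sets only `models` (optionally together with `route`). A short sketch, with the second model name purely illustrative:

```php
// Sketch: let OpenRouter fall back to the next model in the list
// if the primary one is unavailable. `models` replaces `model` here.
$chatData = new ChatData([
    'messages' => [
        new MessageData([
            'role' => RoleType::USER,
            'content' => 'Hello!',
        ]),
    ],
    'models' => ['mistralai/mistral-7b-instruct:free', 'some/other-model'],
    'route'  => RouteType::FALLBACK,
]);
```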
Creating a ChatData Instance
Here is a sample ChatData instance:
```php
$chatData = new ChatData([
    'messages' => [
        new MessageData([
            'role' => RoleType::USER,
            'content' => [
                new TextContentData([
                    'type' => TextContentData::ALLOWED_TYPE,
                    'text' => 'This is a sample text content.',
                ]),
                new ImageContentPartData([
                    'type' => ImageContentPartData::ALLOWED_TYPE,
                    'image_url' => new ImageUrlData([
                        'url' => 'https://example.com/image.jpg',
                        'detail' => 'Sample image',
                    ]),
                ]),
            ],
        ]),
    ],
    'response_format' => new ResponseFormatData([
        'type' => 'json_object',
    ]),
    'stop' => ['stop_token'],
    'stream' => true,
    'max_tokens' => 1024,
    'temperature' => 0.7,
    'top_p' => 0.9,
    'top_k' => 50,
    'frequency_penalty' => 0.5,
    'presence_penalty' => 0.2,
    'repetition_penalty' => 1.2,
    'seed' => 42,
    'tool_choice' => 'auto',
    'tools' => [
        // ToolCallData instances
    ],
    'logit_bias' => [
        '50256' => -100,
    ],
    'transforms' => ['middle-out'],
    'models' => ['model1', 'model2'],
    'route' => RouteType::FALLBACK,
    'provider' => new ProviderPreferencesData([
        'allow_fallbacks' => true,
        'require_parameters' => true,
        'data_collection' => DataCollectionType::ALLOW,
    ]),
]);
```
Using the Facade
The `LaravelOpenRouter` facade offers a convenient way to make OpenRouter API requests.
Chat Request
To send a chat request, create a `ChatData` instance and pass it to the `chatRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$chatResponse = LaravelOpenRouter::chatRequest($chatData);
```
Streaming chat requests are also supported; the `chatStreamRequest` function can be used as follows:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

/*
 * Calls chatStreamRequest ($promise is of type PromiseInterface)
 */
$promise = LaravelOpenRouter::chatStreamRequest($chatData);

// Waits until the promise completes if possible.
$stream = $promise->wait(); // $stream is of type GuzzleHttp\Psr7\Stream

/*
 * 1) You can retrieve the whole raw response at once (choose 1) or 2) depending on your case):
 */
$rawResponseAll = $stream->getContents(); // Instead of chunking the streamed response as in 2) below - while (! $stream->eof()) - waits and gets the raw response all together.
$response = LaravelOpenRouter::filterStreamingResponse($rawResponseAll); // Optionally use filterStreamingResponse to filter the raw streamed response and map it into an array of ResponseData DTOs, same as the chatRequest response format.

// 2) Or retrieve the streamed raw response as it becomes available:
while (! $stream->eof()) {
    $rawResponse = $stream->read(1024); // Read size can be set as desired; for better performance 4096 bytes (4 kB) can be used.
    /*
     * Optionally use filterStreamingResponse to filter the raw streamed response and map it into an array of ResponseData DTOs, same as the chatRequest response format.
     */
    $response = LaravelOpenRouter::filterStreamingResponse($rawResponse);
}
```
You do not need to set 'stream' = true in `ChatData`, since `chatStreamRequest` does this for you.
This is a sample of the expected raw response (`$rawResponse`, the raw chunks coming from the OpenRouter stream):
```
: OPENROUTER PROCESSING

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"Title"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":": Quant"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"um Echo"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":": A Sym"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"phony of Code"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"\n\nIn"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" the heart of"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" the bustling"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" city of Ne"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"o-Tok"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"yo, a"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" brilliant young research"},"finish_reason":null}]}

data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.com

...

: OPENROUTER PROCESSING

data: {"id":"gen-C6Xym94jZcvJv2vVpxYSyw2tV1fR","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718887189,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}],"usage":{"prompt_tokens":23,"completion_tokens":100,"total_tokens":123}}

data: [DONE]
```
The last data: chunk contains the usage information for the stream. When the stream ends, the OpenRouter server responds with data: [DONE]\n.
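Building on the `while (! $stream->eof())` loop shown earlier, the `data: [DONE]\n` terminator can also be used to stop reading explicitly. A sketch (the buffer handling is illustrative, not part of the package):

```php
// Sketch: accumulate streamed chunks and stop once OpenRouter signals the end.
$buffer = '';
while (! $stream->eof()) {
    $buffer .= $stream->read(4096);
    if (str_contains($buffer, 'data: [DONE]')) {
        break; // the server sends "data: [DONE]\n" when the stream is finished
    }
}
$responses = LaravelOpenRouter::filterStreamingResponse($buffer);
```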
This is a sample response after filterStreamingResponse is applied:
```
[
    ResponseData(
        id: "gen-QcWgjEtiEDNHgomV2jjoQpCZlkRZ",
        model: "mistralai/mistral-7b-instruct:free",
        object: "chat.completion.chunk",
        created: 1718888436,
        choices: [
            [
                "index" => 0,
                "delta" => [
                    "role" => "assistant",
                    "content" => "Title"
                ],
                "finish_reason" => null
            ]
        ],
        usage: null
    ),
    ResponseData(
        id: "gen-QcWgjEtiEDNHgomV2jjoQpCZlkRZ",
        model: "mistralai/mistral-7b-instruct:free",
        object: "chat.completion.chunk",
        created: 1718888436,
        choices: [
            [
                "index" => 0,
                "delta" => [
                    "role" => "assistant",
                    "content" => "Quant"
                ],
                "finish_reason" => null
            ]
        ],
        usage: null
    ),
    ...
    ResponseData(
        id: "gen-QcWgjEtiEDNHgomV2jjoQpCZlkRZ",
        model: "mistralai/mistral-7b-instruct:free",
        object: "chat.completion.chunk",
        created: 1718888436,
        choices: [
            [
                "index" => 0,
                "delta" => [
                    "role" => "assistant",
                    "content" => ""
                ],
                "finish_reason" => null
            ]
        ],
        usage: UsageData(
            prompt_tokens: 23,
            completion_tokens: 100,
            total_tokens: 123
        )
    ),
]
```
If you want to maintain conversation continuity (i.e. the chat history is remembered and taken into account in new chat requests), you need to send the previous messages along with the new one:
```php
$model = 'mistralai/mistral-7b-instruct:free';

$firstMessage = new MessageData([
    'role' => RoleType::USER,
    'content' => 'My name is Moe, the AI necromancer.',
]);

$chatData = new ChatData([
    'messages' => [
        $firstMessage,
    ],
    'model' => $model,
]);

// This is the chat which you want the LLM to remember
$oldResponse = LaravelOpenRouter::chatRequest($chatData);

/*
 * You can skip the part above and just create your historical message below
 * (maybe you retrieve historical messages from a DB etc.)
 */

// Here adding the historical response to the new message
$historicalMessage = new MessageData([
    'role' => RoleType::ASSISTANT, // set as assistant since it is a historical message retrieved previously
    'content' => Arr::get($oldResponse->choices[0], 'message.content'), // Historical response content retrieved from the previous chat request
]);

// This is your new message
$newMessage = new MessageData([
    'role' => RoleType::USER,
    'content' => 'Who am I?',
]);

$chatData = new ChatData([
    'messages' => [
        $historicalMessage,
        $newMessage,
    ],
    'model' => $model,
]);

$response = LaravelOpenRouter::chatRequest($chatData);
```
Expected response:
```php
$content = Arr::get($response->choices[0], 'message.content');
// content = "You are Moe, a fictional character and AI Necromancer, as per the context of the conversation we've established. In reality, you are the user interacting with me, an assistant designed to help answer questions and engage in friendly conversation."
```
Cost Request
To retrieve the cost of a generation, first make a chat request and obtain the generationId. Then pass the generationId to the costRequest method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$chatResponse = LaravelOpenRouter::chatRequest($chatData);

$generationId = $chatResponse->id; // generation id which will be passed to costRequest

$costResponse = LaravelOpenRouter::costRequest($generationId);
```
Limit Request
To retrieve rate limits and remaining credits for your API key:
```php
$limitResponse = LaravelOpenRouter::limitRequest();
```
Using OpenRouterRequest Class
You can also inject the `OpenRouterRequest` class into your class's constructor and use its methods directly.
```php
public function __construct(protected OpenRouterRequest $openRouterRequest) {}
```
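Putting the injected request class to work, a hypothetical service class might look like this (the class name, method name, and prompt are illustrative; the `chatRequest` call mirrors the facade examples above):

```php
// Hypothetical service class using the injected OpenRouterRequest instance.
class StoryTeller
{
    public function __construct(protected OpenRouterRequest $openRouterRequest) {}

    public function tell(string $prompt): ?string
    {
        $chatData = new ChatData([
            'messages' => [
                new MessageData([
                    'role' => RoleType::USER,
                    'content' => $prompt,
                ]),
            ],
            'model' => 'mistralai/mistral-7b-instruct:free',
        ]);

        $response = $this->openRouterRequest->chatRequest($chatData);

        // Same response shape as the facade examples above.
        return Arr::get($response->choices[0], 'message.content');
    }
}
```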
Chat Request
Similarly, to send a chat request, create a `ChatData` instance and pass it to the `chatRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$response = $this->openRouterRequest->chatRequest($chatData);
```
Cost Request
Similarly, to retrieve the cost of a generation, make a chat request to obtain the generationId, then pass the generationId to the costRequest method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.';
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$chatResponse = $this->openRouterRequest->chatRequest($chatData);

$generationId = $chatResponse->id; // generation id which will be passed to costRequest

$costResponse = $this->openRouterRequest->costRequest($generationId);
```
Limit Request
Similarly, to retrieve rate limits and remaining credits for your API key:
```php
$limitResponse = $this->openRouterRequest->limitRequest();
```
💫 Contribution
Contributions are welcome! If you would like to improve this package, simply create a pull request with your changes. Your efforts help enhance its functionality and documentation.
📜 License
Laravel OpenRouter is open-source software licensed under the MIT license.