helgesverre / mistral
Laravel client for the Mistral.ai API
v1.3.1
2024-04-20 14:53 UTC
Requires
- php: ^8.2
- saloonphp/laravel-plugin: ^v3.5
- spatie/laravel-data: ^3|^4
- spatie/laravel-package-tools: ^1.16
Requires (Dev)
- larastan/larastan: ^2.0.1
- laravel/pint: ^1.0
- nunomaduro/collision: ^7|^8
- orchestra/testbench: ^8.0|^9.0
- pestphp/pest: ^2.20
- pestphp/pest-plugin-arch: ^2.0
- pestphp/pest-plugin-laravel: ^2.0
- phpstan/extension-installer: ^1.1
- phpstan/phpstan-deprecation-rules: ^1.0
- phpstan/phpstan-phpunit: ^1.0
README
Laravel Client for Mistral.AI
The Mistral.ai Laravel client lets Laravel applications interact with the Mistral.ai API, providing straightforward access to features such as chat completions and text embeddings.
You can get your API key at console.mistral.ai.
Installation
You can install the package via composer:
composer require helgesverre/mistral
You can publish the config file with:
php artisan vendor:publish --tag="mistral-config"
This is the contents of the published config file:
return [
    'api_key' => env('MISTRAL_API_KEY'),
    'base_url' => env('MISTRAL_BASE_URL', 'https://api.mistral.ai'),
    'timeout' => env('MISTRAL_TIMEOUT', 30),
];
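Then set the corresponding environment variables in your .env file. A minimal sketch matching the env() keys above; the key value is a placeholder, and the base URL and timeout only need to be set if you want to override the defaults:

MISTRAL_API_KEY=your-api-key-here
MISTRAL_BASE_URL=https://api.mistral.ai
MISTRAL_TIMEOUT=30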
Usage
Client Instantiation
Create an instance of the Mistral client to start interacting with the API. This instance will be your primary interface for sending requests to Mistral.AI.
use HelgeSverre\Mistral\Enums\Model;
use HelgeSverre\Mistral\Mistral;

// Instantiate the client
$mistral = new Mistral(apiKey: config('mistral.api_key'));

// Or use the Facade (Laravel)
Mistral::chat();
Mistral::simpleChat();
Mistral::embedding();
Mistral::models();
Resources
Models
Resource
List available models
// Models
$response = $mistral->models()->list();

/** @var \HelgeSverre\Mistral\Dto\Embedding\EmbeddingResponse $dto */
$dto = $response->dto();
Embedding
Resource
Create Embedding
$response = $mistral->embedding()->create([
    "A string here",
    "Another one here",
]);

/** @var EmbeddingResponse $dto */
$dto = $response->dto();
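If you prefer working with the raw payload instead of the DTO, the response can be queried with the same json() accessor used in the chat examples below. A minimal sketch, assuming the raw body follows the Mistral embeddings API shape; the 'data.0.embedding' and 'usage' paths are assumptions, not accessors documented by this package:

// Raw JSON access, same pattern as $response->json(...) in the chat examples
$firstVector = $response->json('data.0.embedding'); // array of floats (assumed response shape)
$usage = $response->json('usage');                  // token usage block (assumed response shape)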
Chat
Resource
Create Chat Completion
$response = $mistral->chat()->create(
    messages: [
        [
            "role" => "user",
            "content" => "Write hello world in BASH",
        ],
    ],
    model: Model::medium->value,
    temperature: 0.4,
    maxTokens: 100,
    safeMode: false
);

/** @var ChatCompletionResponse $dto */
$dto = $response->dto();
Create Chat Completion with Function Calling
$response = $this->mistral->chat()->create(
    messages: [
        [
            'role' => Role::user->value,
            'content' => 'What is the weather in Bergen, Norway?',
        ],
    ],
    model: Model::large->value,
    maxTokens: 1000,
    tools: [
        [
            'type' => 'function',
            'function' => [
                'name' => 'searchWeather',
                'description' => 'Get the weather for a location',
                'parameters' => [
                    'type' => 'object',
                    'required' => ['location'],
                    'properties' => [
                        'location' => [
                            'type' => 'string',
                            'description' => 'The location to get the weather for.',
                        ],
                    ],
                ],
            ],
        ],
        [
            'type' => 'function',
            'function' => [
                'name' => 'sendWeatherNotification',
                'description' => 'Send notification about weather to a user',
                'parameters' => [
                    'type' => 'object',
                    'required' => ['userId', 'message'],
                    'properties' => [
                        'userId' => [
                            'type' => 'string',
                            'description' => 'the id of the user',
                        ],
                        'message' => [
                            'type' => 'string',
                            'description' => 'the message to send the user',
                        ],
                    ],
                ],
            ],
        ],
    ],
    toolChoice: 'any',
);

// Tool calls are returned in the response
$response->json('choices.0.message.tool_calls');
$response->json('choices.0.message.tool_calls.0.id');
$response->json('choices.0.message.tool_calls.0.type');
$response->json('choices.0.message.tool_calls.0.function');
$response->json('choices.0.message.tool_calls.0.function.name');
$response->json('choices.0.message.tool_calls.0.function.arguments');

// Or using the dto
/** @var ChatCompletionResponse $dto */
$dto = $response->dto();

$dto->choices; // array of ChatCompletionChoice

foreach ($dto->choices as $choice) {
    $choice->message; // ChatCompletionMessage

    foreach ($choice->message->toolCalls as $toolCall) {
        $toolCall->id; // null
        $toolCall->type; // function
        $toolCall->function; // FunctionCall
        $toolCall->function->name; // 'searchWeather'
        $toolCall->function->arguments; // '{"location":"Bergen, Norway"}'
        $toolCall->function->args(); // ['location' => 'Bergen, Norway']
    }
}
Create Streamed Chat Completion
// Returns a generator, which you can iterate over to get the streamed chunks
$stream = $this->mistral->chat()->createStreamed(
    messages: [
        [
            'role' => 'user',
            'content' => 'Make a markdown list of 10 common fruits',
        ],
    ],
    model: Model::small->value,
);

foreach ($stream as $chunk) {
    /** @var StreamedChatCompletionResponse $chunk */
    echo $chunk->id;      // 'cmpl-0339459d35cb441b9f111b94216cff97'
    echo $chunk->model;   // 'mistral-small'
    echo $chunk->object;  // 'chat.completion.chunk'
    echo $chunk->created; // DateTime

    foreach ($chunk->choices as $choice) {
        $choice->index;          // 0
        $choice->delta->role;    // 'assistant'
        $choice->delta->content; // 'Fruit list...'
        $choice->finishReason;   // 'length'
    }
}
SimpleChat
Resource
For convenience, the client also provides a simple chat completion method that returns a simpler, more compact, flat DTO, which is useful for rapid prototyping.
Create Simple Chat Completion
$response = $mistral->simpleChat()->create(
    messages: [
        [
            "role" => "user",
            "content" => "Hello world!",
        ],
    ],
    model: Model::medium->value,
    temperature: 0.4,
    maxTokens: 1500,
    safeMode: false
);

/** @var ChatCompletionResponse $response */
Create Streamed Simple Chat Completion
// Returns a generator, which you can iterate over to get the streamed chunks
$response = $this->mistral->simpleChat()->stream(
    messages: [
        [
            'role' => "user",
            'content' => 'Say the word "banana"',
        ],
    ],
    maxTokens: 100,
);

foreach ($response as $chunk) {
    /** @var SimpleStreamChunk $chunk */
    $chunk->id;           // 'cmpl-716e95d336db4e51a04cbcf2b84d1a76'
    $chunk->model;        // 'mistral-medium'
    $chunk->object;       // 'chat.completion.chunk'
    $chunk->created;      // '2024-01-03 12:00:00'
    $chunk->role;         // 'assistant'
    $chunk->content;      // 'the text \n'
    $chunk->finishReason; // 'length'
}
List of DTOs
Below is a list of all the DTOs available in this package.
- Chat
- Embedding
- Models
- SimpleChat
- Other
List of Available Mistral Models
The following models are available in the Mistral API. You can reference them using this package's Model enum, or use the string values directly.
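For example, both of these refer to the same model. A minimal sketch; the enum cases are the ones used in the examples above, and the plain string mirrors the model names seen in the streamed responses:

use HelgeSverre\Mistral\Enums\Model;

// Via the enum (cases used elsewhere in this README: medium, small, large)
$model = Model::medium->value;

// Or as a plain string, matching the model names shown in the streaming examples
$model = 'mistral-medium';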
Testing
cp .env.example .env
composer test
composer analyse src
License
The MIT License (MIT). Please see the License File for more information.
Disclaimer
Mistral and the Mistral logo are trademarks of Mistral.ai. This package is not affiliated with, endorsed by, or sponsored by Mistral.ai. All trademarks and registered trademarks are the property of their respective owners.
For more information, see Mistral.AI.