This package can generate responses to questions using Ollama AI. |
|
This is a PHP library for Ollama. Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine. It acts as a bridge between the complexities of LLM technology and the desire for an accessible and customizable AI experience.

<p align="left"> <a href="https://packagist.org/packages/ardagnsrn/ollama-php"><img alt="Total Downloads" src="https://img.shields.io/packagist/dt/ardagnsrn/ollama-php"></a> <a href="https://packagist.org/packages/ardagnsrn/ollama-php"><img alt="Latest Version" src="https://img.shields.io/packagist/v/ardagnsrn/ollama-php"></a> <a href="https://packagist.org/packages/ardagnsrn/ollama-php"><img alt="License" src="https://img.shields.io/github/license/ardagnsrn/ollama-php"></a> </p>
Whether you use this project, have learned something from it, or just like it, please consider supporting it by buying me a coffee, so I can dedicate more time to open-source projects like this :)
<a href="https://www.buymeacoffee.com/ardagnsrn" target="_blank"><img src="https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png" alt="Buy Me A Coffee" style="height: auto !important;width: auto !important;" ></a>
> Requires PHP 8.1+
> Requires Ollama
> You can find the official Ollama documentation here.
First, install Ollama PHP via the Composer package manager:

```bash
composer require ardagnsrn/ollama-php
```
Then, you can create a new Ollama client instance:

```php
// with default base URL
$client = \ArdaGnsrn\Ollama\Ollama::client();

// or with a custom base URL
$client = \ArdaGnsrn\Ollama\Ollama::client('http://localhost:11434');
```
## Completions Resource

### create

Generate a response for a given prompt with a provided model.

```php
$completions = $client->completions()->create([
    'model' => 'llama3.1',
    'prompt' => 'Once upon a time',
]);

$completions->response; // '...in a land far, far away...'

$completions->toArray(); // ['model' => 'llama3.1', 'response' => '...in a land far, far away...', ...]
```
### createStreamed

Generate a response for a given prompt with a provided model and stream the response.

```php
$completions = $client->completions()->createStreamed([
    'model' => 'llama3.1',
    'prompt' => 'Once upon a time',
]);

foreach ($completions as $completion) {
    echo $completion->response;
}
// 1. Iteration: '...in'
// 2. Iteration: ' a'
// 3. Iteration: ' land'
// 4. Iteration: ' far,'
// ...
```
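If you want to show tokens as they arrive but also keep the final text, you can accumulate the streamed chunks. A minimal plain-PHP sketch; `collectStream` is our own helper, not part of the library:

```php
<?php

// Concatenate streamed text chunks into one string, echoing each
// chunk as it arrives when $echo is true.
function collectStream(iterable $chunks, bool $echo = false): string
{
    $full = '';
    foreach ($chunks as $chunk) {
        if ($echo) {
            echo $chunk;
        }
        $full .= $chunk;
    }
    return $full;
}

// With the library (assumes a running Ollama server):
// $stream = $client->completions()->createStreamed([/* ... */]);
// $chunks = (function () use ($stream) {
//     foreach ($stream as $c) {
//         yield $c->response;
//     }
// })();
// $text = collectStream($chunks, echo: true);
```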
## Chat Resource

### create

Generate the next message in a chat conversation with a provided model.

```php
$response = $client->chat()->create([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a llama.'],
        ['role' => 'user', 'content' => 'Hello!'],
        ['role' => 'assistant', 'content' => 'Hi! How can I help you today?'],
        ['role' => 'user', 'content' => 'I need help with my taxes.'],
    ],
]);

$response->message->content; // 'Ah, taxes... *chew chew* Hmm, not really sure how to help with that.'

$response->toArray(); // ['model' => 'llama3.1', 'message' => ['role' => 'assistant', 'content' => 'Ah, taxes...'], ...]
```
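To keep a multi-turn conversation going, append each reply to the message history before the next `create()` call. A small sketch; the helper is ours, and the message shape follows the example above:

```php
<?php

// Append an assistant reply to the running message history so the next
// chat()->create() call sees the whole conversation.
function appendReply(array $messages, string $content): array
{
    $messages[] = ['role' => 'assistant', 'content' => $content];
    return $messages;
}

// $messages = appendReply($messages, $response->message->content);
// $messages[] = ['role' => 'user', 'content' => 'Actually, never mind.'];
// $next = $client->chat()->create(['model' => 'llama3.1', 'messages' => $messages]);
```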
You can also use the `tools` parameter to provide custom functions to the chat. Note that the `tools` parameter cannot be used with the `createStreamed` method.
```php
$response = $client->chat()->create([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the weather today in Paris?'],
    ],
    'tools' => [
        [
            'type' => 'function',
            'function' => [
                'name' => 'get_current_weather',
                'description' => 'Get the current weather',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'location' => [
                            'type' => 'string',
                            'description' => 'The location to get the weather for, e.g. San Francisco, CA',
                        ],
                        'format' => [
                            'type' => 'string',
                            'description' => 'The temperature unit to use, e.g. celsius or fahrenheit',
                            'enum' => ['celsius', 'fahrenheit'],
                        ],
                    ],
                    'required' => ['location', 'format'],
                ],
            ],
        ],
    ],
]);

$toolCall = $response->message->toolCalls[0];

$toolCall->function->name; // 'get_current_weather'
$toolCall->function->arguments; // ['location' => 'Paris', 'format' => 'celsius']

$response->toArray(); // ['model' => 'llama3.1', 'message' => ['role' => 'assistant', 'toolCalls' => [...]], ...]
```
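The library only returns the tool call; executing it is up to your application. One way to dispatch it locally, sketched in plain PHP (the helper and the stub implementation are illustrative, not part of the library):

```php
<?php

// Map tool names the model may call to local PHP callables, then run
// the one requested by a tool call.
function runToolCall(string $name, array $arguments, array $toolMap): string
{
    if (!isset($toolMap[$name])) {
        throw new InvalidArgumentException("Unknown tool: {$name}");
    }
    return (string) $toolMap[$name]($arguments);
}

$toolMap = [
    // Stub implementation for the get_current_weather function above.
    'get_current_weather' => fn (array $args) => "22 {$args['format']} in {$args['location']}",
];

// $result = runToolCall($toolCall->function->name, $toolCall->function->arguments, $toolMap);
// Send $result back to the model as a ['role' => 'tool', 'content' => $result] message.
```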
### createStreamed

Generate the next message in a chat conversation with a provided model and stream the response.

```php
$responses = $client->chat()->createStreamed([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a llama.'],
        ['role' => 'user', 'content' => 'Hello!'],
        ['role' => 'assistant', 'content' => 'Hi! How can I help you today?'],
        ['role' => 'user', 'content' => 'I need help with my taxes.'],
    ],
]);

foreach ($responses as $response) {
    echo $response->message->content;
}
// 1. Iteration: 'Ah,'
// 2. Iteration: ' taxes'
// 3. Iteration: '... '
// 4. Iteration: ' *chew,'
// ...
```
## Models Resource

### list

List all available models.

```php
$response = $client->models()->list();

$response->toArray(); // ['models' => [['name' => 'llama3.1', ...], ['name' => 'llama3.1:80b', ...], ...]]
```
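If you only need the model names, you can pick them out of the array shape shown above. A small sketch (`modelNames` is our own helper, not part of the library):

```php
<?php

// Extract model names from the payload shape returned by
// $client->models()->list()->toArray().
function modelNames(array $payload): array
{
    return array_map(fn (array $model) => $model['name'], $payload['models'] ?? []);
}

// $names = modelNames($client->models()->list()->toArray());
```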
### show

Show details of a specific model.

```php
$response = $client->models()->show('llama3.1');

$response->toArray(); // ['modelfile' => '...', 'parameters' => '...', 'template' => '...']
```
### create

Create a new model.

```php
$response = $client->models()->create([
    'name' => 'mario',
    'modelfile' => "FROM llama3.1\nSYSTEM You are mario from Super Mario Bros."
]);

$response->status; // 'success'
```
### createStreamed

Create a new model and stream the response.

```php
$responses = $client->models()->createStreamed([
    'name' => 'mario',
    'modelfile' => "FROM llama3.1\nSYSTEM You are mario from Super Mario Bros."
]);

foreach ($responses as $response) {
    echo $response->status;
}
```
### copy

Copy an existing model.

```php
$client->models()->copy('llama3.1', 'llama3.2'); // bool
```
### delete

Delete a model.

```php
$client->models()->delete('mario'); // bool
```
### pull

Pull a model from the Ollama server.

```php
$response = $client->models()->pull('llama3.1');

$response->toArray(); // ['status' => 'downloading digestname', 'digest' => 'digestname', 'total' => 2142590208, 'completed' => 241970]
```
### pullStreamed

Pull a model from the Ollama server and stream the response.

```php
$responses = $client->models()->pullStreamed('llama3.1');

foreach ($responses as $response) {
    echo $response->status;
}
```
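Each streamed pull status carries `total` and `completed` byte counts (as in the `pull` example above), which you can turn into a progress display. A sketch; `formatProgress` is our own helper, not part of the library:

```php
<?php

// Turn the 'completed' and 'total' byte counts from a pull status line
// into a human-readable percentage string.
function formatProgress(int $completed, int $total): string
{
    if ($total <= 0) {
        return '0.0%';
    }
    return sprintf('%.1f%%', 100 * $completed / $total);
}

// foreach ($client->models()->pullStreamed('llama3.1') as $response) {
//     if (isset($response->completed, $response->total)) {
//         echo formatProgress($response->completed, $response->total), "\r";
//     }
// }
```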
### push

Push a model to the Ollama server.

```php
$response = $client->models()->push('llama3.1');

$response->toArray(); // ['status' => 'uploading digestname', 'digest' => 'digestname', 'total' => 2142590208]
```
### pushStreamed

Push a model to the Ollama server and stream the response.

```php
$responses = $client->models()->pushStreamed('llama3.1');

foreach ($responses as $response) {
    echo $response->status;
}
```
### runningList

List all running models.

```php
$response = $client->models()->runningList();

$response->toArray(); // ['models' => [['name' => 'llama3.1', ...], ['name' => 'llama3.1:80b', ...], ...]]
```
## Blobs Resource

### exists

Check if a blob exists.

```php
$client->blobs()->exists('blobname'); // bool
```

### create

Create a new blob.

```php
$client->blobs()->create('blobname'); // bool
```
## Embed Resource

### create

Generate an embedding for a given text with a provided model.

```php
$response = $client->embed()->create([
    'model' => 'llama3.1',
    'input' => [
        "Why is the sky blue?",
    ],
]);

$response->toArray(); // ['model' => 'llama3.1', 'embedding' => [0.1, 0.2, ...], ...]
```
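Embedding vectors are typically compared with cosine similarity. A plain-PHP sketch, not part of the library; both vectors must have the same length:

```php
<?php

// Cosine similarity between two equal-length embedding vectors:
// 1.0 means identical direction, 0.0 means orthogonal.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value * $value;
        $normB += $b[$i] * $b[$i];
    }
    return $dot / (sqrt($normA) * sqrt($normB));
}

// $similarity = cosineSimilarity($embeddingA, $embeddingB);
```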
## Testing

```bash
composer test
```

## Changelog

Please see CHANGELOG for more information on what has changed recently.

## Contributing

Please see CONTRIBUTING for details.

## License

The MIT License (MIT). Please see License File for more information.