PHP Classes

How to Use a PHP AI Generator to Ask Questions in Natural Language and Get Responses Using the Package Ollama PHP: Generate responses to questions using Ollama AI

Last Updated: 2024-08-14
Ratings: Not yet rated by the users
Unique User Downloads: Total: 16, This week: 16
Download Rankings: All time: 11,329, This week: 8 (Up)
Version: ollama-php 1.0
License: Custom (specified...)
PHP version: 7
Categories: Algorithms, Web services, Artificial ..., P...
Description

This package can generate responses to questions using Ollama AI.

It can send HTTP requests to call the Ollama API and ask questions in natural language using a given artificial intelligence model.

The package can also retrieve the responses that the Ollama API returns to those questions.

It can also perform several operations with artificial intelligence models, such as creating, listing, pulling, pushing, copying, and deleting a model, as well as getting a model's details.
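For example, asking a question in natural language can be as short as the following sketch; the client and completions calls are documented in detail in the Documentation section below:

// A minimal sketch: ask a question in natural language and read the answer.
$client = \ArdaGnsrn\Ollama\Ollama::client();

$completion = $client->completions()->create([
    'model'  => 'llama3.1',
    'prompt' => 'In one sentence, what is a large language model?',
]);

echo $completion->response;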

Author

Name: Arda Günsüren
Classes: 2 packages
Country: Turkey

Documentation

Ollama PHP Library

This is a PHP library for Ollama. Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine. It acts as a bridge between the complexities of LLM technology and the desire for an accessible and customizable AI experience.

Buy me a coffee

Whether you use this project, have learned something from it, or just like it, please consider supporting it by buying me a coffee, so I can dedicate more time to open-source projects like this :)

You can support the project at https://www.buymeacoffee.com/ardagnsrn.

Table of Contents

- Get Started
- Usage
  - Completions Resource
  - Chat Resource
  - Models Resource
  - Blobs Resource
  - Embed Resource
- Testing
- Changelog
- Contributing
- Credits
- License

Get Started

> Requires PHP 8.1+
> Requires Ollama

> You can find the official Ollama documentation here.

First, install Ollama PHP via the Composer package manager:

composer require ardagnsrn/ollama-php

Then, you can create a new Ollama client instance:

// with default base URL
$client = \ArdaGnsrn\Ollama\Ollama::client();

// or with custom base URL
$client = \ArdaGnsrn\Ollama\Ollama::client('http://localhost:11434');

Usage

Completions Resource

create

Generate a response for a given prompt with a provided model.

$completions = $client->completions()->create([
    'model' => 'llama3.1',
    'prompt' => 'Once upon a time',
]);

$completions->response; // '...in a land far, far away...'

$completions->toArray(); // ['model' => 'llama3.1', 'response' => '...in a land far, far away...', ...]

createStreamed

Generate a response for a given prompt with a provided model and stream the response.

$completions = $client->completions()->createStreamed([
    'model' => 'llama3.1',
    'prompt' => 'Once upon a time',
]);


foreach ($completions as $completion) {
    echo $completion->response;
}
// 1. Iteration: '...in'
// 2. Iteration: ' a'
// 3. Iteration: ' land'
// 4. Iteration: ' far,'
// ...
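If you also need the full text at the end, you can accumulate the chunks while streaming; a small sketch based on the iteration shown above:

$fullText = '';

foreach ($completions as $completion) {
    echo $completion->response;         // print each chunk as it arrives
    $fullText .= $completion->response; // and accumulate the complete text
}

// $fullText now contains the whole generated response.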

Chat Resource

create

Generate the next message in a chat conversation with a provided model.

$response = $client->chat()->create([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a llama.'],
        ['role' => 'user', 'content' => 'Hello!'],
        ['role' => 'assistant', 'content' => 'Hi! How can I help you today?'],
        ['role' => 'user', 'content' => 'I need help with my taxes.'],
    ],
]);

$response->message->content; // 'Ah, taxes... chew chew Hmm, not really sure how to help with that.'

$response->toArray(); // ['model' => 'llama3.1', 'message' => ['role' => 'assistant', 'content' => 'Ah, taxes...'], ...]

You can also use the tools parameter to provide custom functions to the chat. Note that the tools parameter cannot be used with the createStreamed method.

$response = $client->chat()->create([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the weather today in Paris?'],
    ],
    'tools' => [
        [
            'type' => 'function',
            'function' => [
                'name' => 'get_current_weather',
                'description' => 'Get the current weather',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'location' => [
                            'type' => 'string',
                            'description' => 'The location to get the weather for, e.g. San Francisco, CA',
                        ],
                        'format' => [
                            'type' => 'string',
                            'description' => 'The format to return the weather in, e.g. celsius or fahrenheit',
                            'enum' => ['celsius', 'fahrenheit']
                        ],
                    ],
                    'required' => ['location', 'format'],
                ],
            ],
        ]
    ]
]);

$toolCall = $response->message->toolCalls[0];

$toolCall->function->name; // 'get_current_weather'
$toolCall->function->arguments; // ['location' => 'Paris', 'format' => 'celsius']

$response->toArray(); // ['model' => 'llama3.1', 'message' => ['role' => 'assistant', 'toolCalls' => [...]], ...]
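The example stops at reading the tool call. To complete the round trip, you can run your own function and send its result back as a message with the 'tool' role, which the Ollama chat API accepts; get_current_weather() below is your own implementation, not part of this package:

// get_current_weather() is your own function, not provided by this package.
$weather = get_current_weather(
    $toolCall->function->arguments['location'],
    $toolCall->function->arguments['format']
);

// Return the result as a 'tool' role message so the model can phrase the final answer.
$followUp = $client->chat()->create([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the weather today in Paris?'],
        ['role' => 'tool', 'content' => json_encode($weather)],
    ],
]);

echo $followUp->message->content;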

createStreamed

Generate the next message in a chat conversation with a provided model and stream the response.

$responses = $client->chat()->createStreamed([
    'model' => 'llama3.1',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a llama.'],
        ['role' => 'user', 'content' => 'Hello!'],
        ['role' => 'assistant', 'content' => 'Hi! How can I help you today?'],
        ['role' => 'user', 'content' => 'I need help with my taxes.'],
    ],
]);


foreach ($responses as $response) {
    echo $response->message->content;
}
// 1. Iteration: 'Ah,'
// 2. Iteration: ' taxes'
// 3. Iteration: '... '
// 4. Iteration: ' *chew,'
// ...

Models Resource

list

List all available models.

$response = $client->models()->list();

$response->toArray(); // ['models' => [['name' => 'llama3.1', ...], ['name' => 'llama3.1:80b', ...], ...]]
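Assuming the listed models are also exposed as objects mirroring the toArray() output above (this property layout is an assumption), you could iterate over them like this:

foreach ($response->models as $model) {
    echo $model->name . PHP_EOL; // e.g. 'llama3.1', 'llama3.1:80b', ...
}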

show

Show details of a specific model.

$response = $client->models()->show('llama3.1');

$response->toArray(); // ['modelfile' => '...', 'parameters' => '...', 'template' => '...']

create

Create a new model.

$response = $client->models()->create([
    'name' => 'mario',
    'modelfile' => "FROM llama3.1\nSYSTEM You are mario from Super Mario Bros."
]);

$response->status; // 'success'

createStreamed

Create a new model and stream the response.

$responses = $client->models()->createStreamed([
    'name' => 'mario',
    'modelfile' => "FROM llama3.1\nSYSTEM You are mario from Super Mario Bros."
]);

foreach ($responses as $response) {
    echo $response->status;
}

copy

Copy an existing model.

$client->models()->copy('llama3.1', 'llama3.2'); // bool

delete

Delete a model.

$client->models()->delete('mario'); // bool

pull

Pull a model from the Ollama server.

$response = $client->models()->pull('llama3.1'); 
$response->toArray(); // ['status' => 'downloading digestname', 'digest' => 'digestname', 'total' => 2142590208, 'completed' => 241970]

pullStreamed

Pull a model from the Ollama server and stream the response.

$responses = $client->models()->pullStreamed('llama3.1'); 

foreach ($responses as $response) {
    echo $response->status; 
}
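The pull response reports 'total' and 'completed' byte counts (see the pull example above), so a streamed pull can drive a rough progress indicator. A sketch, assuming those fields are exposed as properties on each streamed response:

foreach ($responses as $response) {
    if (!empty($response->total) && isset($response->completed)) {
        // Convert the byte counts into a percentage for display.
        $percent = (int) round($response->completed / $response->total * 100);
        echo sprintf("%s: %d%%\n", $response->status, $percent);
    } else {
        echo $response->status . PHP_EOL;
    }
}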

push

Push a model to the Ollama server.

$response = $client->models()->push('llama3.1');
$response->toArray(); // ['status' => 'uploading digestname', 'digest' => 'digestname', 'total' => 2142590208]

pushStreamed

Push a model to the Ollama server and stream the response.

$responses = $client->models()->pushStreamed('llama3.1');

foreach ($responses as $response) {
    echo $response->status; 
}

runningList

List all running models.

$response = $client->models()->runningList();

$response->toArray(); // ['models' => [['name' => 'llama3.1', ...], ['name' => 'llama3.1:80b', ...], ...]]

Blobs Resource

exists

Check if a blob exists.

$client->blobs()->exists('blobname'); // bool

create

Create a new blob.

$client->blobs()->create('blobname'); // bool
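In the Ollama API, blobs (such as model layers) are addressed by the SHA-256 digest of the file, so the blob name typically has the form sha256:<hex digest>. A sketch of deriving that name in PHP before calling the methods above (the file path is illustrative):

// Compute the digest-style blob name for a local file (path is illustrative).
$digest = 'sha256:' . hash_file('sha256', '/path/to/model-layer.gguf');

if (! $client->blobs()->exists($digest)) {
    $client->blobs()->create($digest); // bool
}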

Embed Resource

create

Generate an embedding for a given text with a provided model.

$response = $client->embed()->create([
    'model' => 'llama3.1',
    'input' => [
        "Why is the sky blue?",
    ]
]);

$response->toArray(); // ['model' => 'llama3.1', 'embedding' => [0.1, 0.2, ...], ...]
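A common use of embeddings is measuring semantic similarity between two texts. The sketch below compares two embedding vectors with cosine similarity; the exact array keys in toArray() may differ between versions, so adjust the extraction to match your response shape:

// Cosine similarity between two embedding vectors (plain arrays of floats).
function cosineSimilarity(array $a, array $b): float
{
    $dot = $normA = $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value * $value;
        $normB += $b[$i] * $b[$i];
    }
    return $dot / (sqrt($normA) * sqrt($normB));
}

$first  = $client->embed()->create(['model' => 'llama3.1', 'input' => ['Why is the sky blue?']]);
$second = $client->embed()->create(['model' => 'llama3.1', 'input' => ['What makes the sky look blue?']]);

// Adjust the keys below to match the toArray() output of your version.
$vectorA = $first->toArray()['embedding'] ?? [];
$vectorB = $second->toArray()['embedding'] ?? [];

echo cosineSimilarity($vectorA, $vectorB);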

Testing

composer test

Changelog

Please see CHANGELOG for more information on what has changed recently.

Contributing

Please see CONTRIBUTING for details.

Credits

License

The MIT License (MIT). Please see License File for more information.


Files (39)

.editorconfig        Auxiliary data
CHANGELOG.md         Auxiliary data
composer.json        Auxiliary data
LICENSE.md           License text
phpunit.xml.dist     Auxiliary data
README.md            Documentation

src/                 Class sources
    Ollama.php
    OllamaClient.php
    Contracts/
        BlobsContract.php
        ChatContract.php
        CompletionsContract.php
        EmbedContract.php
        ModelsContract.php
        ResponseContract.php
        StreamResponseContract.php
    Resources/
        Blobs.php
        Chat.php
        Completions.php
        Embed.php
        Models.php
    Responses/
        StreamResponse.php
        Chat/
            ChatMessageResponse.php
            ChatMessageToolCallFunctionResponse.php
            ChatMessageToolCallResponse.php
            ChatResponse.php
        Completions/
            CompletionResponse.php
        Embed/
            EmbedResponse.php
        Models/
            CreateModelResponse.php
            ListModelsModelDetailsResponse.php
            ListModelsModelResponse.php
            ListModelsResponse.php
            ListRunningModelsModelResponse.php
            ListRunningModelsResponse.php
            PullModelResponse.php
            PushModelResponse.php
            ShowModelResponse.php

tests/               Test classes
    OllamaTest.php
    Resources/
        ChatTest.php
        CompletionsTest.php