# doc: openAIChat #63

Status: Closed. Wants to merge 54 commits.

## Commits
- 8da8794 Adding image support to ollamaChat (ccreutzi, Jul 25, 2024)
- 056c557 Add vision support to the doc files (ccreutzi, Jul 25, 2024)
- b9922f0 Merge branch 'main' into ollama-images (ccreutzi, Jul 25, 2024)
- ed2aa22 Preinstall bakllava (ccreutzi, Jul 25, 2024)
- d24fa00 Text only edits to error message catalog (doc review) (MiriamScharnke, Jul 25, 2024)
- 1701188 Add doc warning about Ollama image input for text-only models (ccreutzi, Jul 26, 2024)
- bb4ba1a Update doc/Ollama.md (ccreutzi, Jul 26, 2024)
- 7aaadb4 WARNING -> TIP (ccreutzi, Jul 26, 2024)
- 7bed93d Update +llms/+utils/errorMessageCatalog.m (MiriamScharnke, Jul 26, 2024)
- 83eff3a Update +llms/+utils/errorMessageCatalog.m (MiriamScharnke, Jul 26, 2024)
- 1384590 Moved mlx into sub-directory, added Markdown export (ccreutzi, Jul 29, 2024)
- c31e76e Correct mlx path in texampleTests.m (ccreutzi, Jul 29, 2024)
- 570aaa1 Inserted text should have MATLAB® (ccreutzi, Jul 29, 2024)
- 50b9b0f Also ignore data directory from moved mlx files (ccreutzi, Jul 29, 2024)
- 581a4a2 Add ® in generated files (ccreutzi, Jul 29, 2024)
- 99c0902 Merge branch 'generate-md-from-mlx' of github.com:matlab-deep-learnin… (ccreutzi, Jul 29, 2024)
- a7b0627 Create functions directory in the doc folder and add openAIChat docum… (MiriamScharnke, Jul 29, 2024)
- c3bec04 Merge pull request #59 from matlab-deep-learning/ollama-images (ccreutzi, Jul 29, 2024)
- 19c70ee Update doc/functions/openAIChat.md (MiriamScharnke, Jul 29, 2024)
- 56a65e7 Update doc/functions/openAIChat.md (MiriamScharnke, Jul 29, 2024)
- b0f15e2 Update doc/functions/openAIChat.md (MiriamScharnke, Jul 29, 2024)
- ada0f7c Merge pull request #62 from matlab-deep-learning/generate-md-from-mlx (ccreutzi, Jul 30, 2024)
- 1ed0f7c Apply suggestions from code review (MiriamScharnke, Jul 30, 2024)
- 595316f Update doc/functions/openAIChat.md (MiriamScharnke, Jul 30, 2024)
- ada0ff8 Update openAIChat documentation. (MiriamScharnke, Jul 30, 2024)
- 8426bb7 Merge branch 'temp-change' into doc-openaichat (MiriamScharnke, Jul 30, 2024)
- ddcf6f1 Merge pull request #60 from matlab-deep-learning/error-message-text-o… (MiriamScharnke, Jul 31, 2024)
- 72d171e Fix OpenAI trademarks (MiriamScharnke, Aug 1, 2024)
- 223347b Delete doc/functions/openAIChat.md (MiriamScharnke, Aug 1, 2024)
- ae07cd9 Merge pull request #66 from matlab-deep-learning/fix-openai-trademark (MiriamScharnke, Aug 1, 2024)
- b272344 Technical feedback and cosmetic changes for openAIChat reference page (MiriamScharnke, Aug 1, 2024)
- ae46982 Cosmetic changes for openAIChat reference page (MiriamScharnke, Aug 1, 2024)
- a7c469a Fix trademark on openAIChat reference page (MiriamScharnke, Aug 1, 2024)
- 57b22e1 Allow (long) char vectors for StopSequences (ccreutzi, Aug 1, 2024)
- 4563e70 Cosmetic fixes (MiriamScharnke, Aug 1, 2024)
- ea8874b Merge pull request #67 from matlab-deep-learning/allow-char-StopSeque… (ccreutzi, Aug 2, 2024)
- 172a4da Replace bakllava by moondream (ccreutzi, Aug 2, 2024)
- 5fa9534 Create CODEOWNERS (ccreutzi, Aug 2, 2024)
- e72cc99 Merge pull request #68 from matlab-deep-learning/replace-bakllava (ccreutzi, Aug 2, 2024)
- 062538c Merge pull request #69 from matlab-deep-learning/codeowners (ccreutzi, Aug 2, 2024)
- 2565f27 Trace/replay `llms.internal.sendRequest` (ccreutzi, Aug 5, 2024)
- 53c1df5 Avoid bogus access to `json.choices.delta.content` (ccreutzi, Aug 5, 2024)
- fce8bb8 Merge pull request #71 from matlab-deep-learning/avoid-content-error (ccreutzi, Aug 5, 2024)
- 2cbda4d Merge pull request #70 from matlab-deep-learning/test-doubles (ccreutzi, Aug 6, 2024)
- a1b48e1 Cosmetic fixes (MiriamScharnke, Aug 6, 2024)
- bee4e39 Update doc/functions/openAIChat.md (MiriamScharnke, Jul 30, 2024)
- e8f20cd Technical feedback and cosmetic changes for openAIChat reference page (MiriamScharnke, Aug 1, 2024)
- 356bcd4 Cosmetic changes for openAIChat reference page (MiriamScharnke, Aug 1, 2024)
- 4b68fc7 Fix trademark on openAIChat reference page (MiriamScharnke, Aug 1, 2024)
- 07bef3c Cosmetic fixes (MiriamScharnke, Aug 1, 2024)
- 963a6d9 Cosmetic fixes (MiriamScharnke, Aug 6, 2024)
- 874536b Cosmetic fixes (MiriamScharnke, Aug 6, 2024)
- 3da3ae6 Merge branch 'doc-openaichat' of github.com:matlab-deep-learning/llms… (MiriamScharnke, Aug 6, 2024)
- c357e75 Merge remote-tracking branch 'origin' into doc-openaichat (MiriamScharnke, Aug 6, 2024)
## Changed file: doc/functions/openAIChat.md (253 additions, 0 deletions)

# openAIChat

Connect to OpenAI® Chat Completion API

# Creation
## Syntax

`chat = openAIChat`


`chat = openAIChat(systemPrompt)`


`chat = openAIChat(___,APIKey=key)`


`chat = openAIChat(___,Name=Value)`


## Description

Connect to the OpenAI Chat Completion API to generate text using large language models developed by OpenAI.


To connect to the OpenAI API, you need a valid API key. For information on how to obtain an API key, see [https://platform.openai.com/docs/quickstart](https://platform.openai.com/docs/quickstart).


`chat = openAIChat` creates an `openAIChat` object. Connecting to the OpenAI API requires a valid API key. Either set the environment variable `OPENAI_API_KEY` or specify the `APIKey` name\-value argument.


`chat = openAIChat(systemPrompt)` creates an `openAIChat` object with the specified system prompt.


`chat = openAIChat(___,APIKey=key)` uses the specified API key.


`chat = openAIChat(___,Name=Value)` specifies additional options using one or more name\-value arguments.


`chat = openAIChat(___,PropertyName=PropertyValue)` specifies properties that are settable at construction using one or more name\-value arguments.

## Input Arguments

### `systemPrompt` – System prompt

character vector | string scalar


Specify the system prompt and set the `SystemPrompt` property. The system prompt is a natural\-language description that provides the framework in which a large language model generates its responses. The system prompt can include instructions about tone, communication style, language, and so on.


**Example**: "You are a helpful assistant who provides answers to user queries in iambic pentameter."

## Name\-Value Arguments

### `APIKey` – OpenAI API key

character vector | string scalar


OpenAI API key to access OpenAI APIs such as ChatGPT.


Instead of using the `APIKey` name\-value argument, you can also set the environment variable `OPENAI_API_KEY`. For more information, see [OpenAI API](../OpenAI.md).
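For example, you can read the key from an environment variable of your own choosing and pass it explicitly (a sketch; `MY_OPENAI_KEY` is a hypothetical variable name):

```matlab
% Pass the key explicitly instead of relying on the OPENAI_API_KEY
% environment variable. "MY_OPENAI_KEY" is a hypothetical name.
key = getenv("MY_OPENAI_KEY");
chat = openAIChat("You are a helpful assistant.",APIKey=key);
```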

---

### `Tools` – OpenAI functions to use during output generation

`openAIFunction` object | array of `openAIFunction` objects
`openAIFunction` object | array of `openAIFunction` objects


Custom functions used by the model to collect or generate additional data.

For an example, see [Analyze Scientific Papers Using ChatGPT Function Calls](../../examples/AnalyzeScientificPapersUsingFunctionCalls.md).
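As a minimal sketch, assuming the two\-argument `openAIFunction(name,description)` form used elsewhere in this repository (the function name here is illustrative):

```matlab
% Sketch: register a custom function that the model may request calls to
% during generation. "getCurrentTime" is an illustrative name.
f = openAIFunction("getCurrentTime","Return the current local time as text");
chat = openAIChat("You are a helpful assistant.",Tools=f);
```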

## Properties Settable at Construction
Optionally specify these properties at construction using name-value arguments. Specify `PropertyName1=PropertyValue1,...,PropertyNameN=PropertyValueN`, where `PropertyName` is the property name and `PropertyValue` is the corresponding value.

### `ModelName` – Model name

`"gpt-4o-mini"` (default) | `"gpt-4"` | `"gpt-3.5-turbo"` | `"dall-e-2"` | ...


Name of the OpenAI model to use for text or image generation.


For a list of currently supported models, see [OpenAI API](../OpenAI.md).

---

### `Temperature` – Temperature

`1` (default) | numeric scalar between `0` and `2`


Temperature value for controlling the randomness of the output. Higher temperature increases the randomness of the output. Setting the temperature to `0` results in fully deterministic output.
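For instance, to favor reproducible answers, lower the temperature at construction (a sketch; responses from the service may still vary slightly):

```matlab
% Temperature = 0 requests (near-)deterministic output.
chat = openAIChat("You are a concise assistant.",Temperature=0);
```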

---

### `TopP` – Top probability mass

`1` (default) | numeric scalar between `0` and `1`


Top probability mass for controlling the diversity of the generated output using top-p sampling. Higher top probability mass corresponds to higher diversity.

---

### `StopSequences` – Stop sequences

`""` (default) | string array with between `1` and `4` elements


Sequences that stop generation of tokens.


**Example:** `["The end.","And that is all she wrote."]`

---

### `PresencePenalty` – Presence penalty

`0` (default) | numeric scalar between `-2` and `2`


Penalty value for using a token that has already been used at least once in the generated output. Higher values reduce the repetition of tokens. Negative values increase the repetition of tokens.


The presence penalty is independent of the number of incidents of a token, so long as it has been used at least once. To increase the penalty for every additional time a token is generated, use the `FrequencyPenalty` name\-value argument.

---

### `FrequencyPenalty` – Frequency penalty

`0` (default) | numeric scalar between `-2` and `2`


Penalty value for repeatedly using the same token in the generated output. Higher values reduce the repetition of tokens. Negative values increase the repetition of tokens.


The frequency penalty increases with every instance of a token in the generated output. To use a constant penalty for a repeated token, independent of the number of instances that token is generated, use the `PresencePenalty` name\-value argument.

---

### `TimeOut` – Connection timeout in seconds

`10` (default) | nonnegative numeric scalar

After construction, this property is read-only.

If the OpenAI server does not respond within the timeout, then the function throws an error.

---

### `StreamFun` – Custom streaming function

function handle


Specify a custom streaming function to process the generated output token by token as it is being generated, rather than having to wait for the end of the generation. For example, you can use this function to print the output as it is generated.

For an example, see [Process Generated Text in Real Time by Using ChatGPT™ in Streaming Mode](../../examples/ProcessGeneratedTextinRealTimebyUsingChatGPTinStreamingMode.md).


**Example:** `@(token) fprintf("%s",token)`

---

### `ResponseFormat` – Response format

`"text"` (default) | `"json"`


After construction, this property is read\-only.


Format of generated output.


If the response format is `"text"`, then the generated output is a string.


If the response format is `"json"`, then the generated output is a string containing JSON encoded data.


To configure the format of the generated JSON output, describe the format using natural language and provide it to the model either in the system prompt or as a user message. The prompt or message describing the format must contain the word `"json"` or `"JSON"`.


For an example, see [Analyze Sentiment in Text Using ChatGPT in JSON Mode](../../examples/AnalyzeSentimentinTextUsingChatGPTinJSONMode.md).


The JSON response format is not supported for these models:

- `ModelName="gpt-4"`
- `ModelName="gpt-4-0613"`
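As a sketch of JSON mode (the field names in the prompt are illustrative; note that the prompt mentions "JSON", as required):

```matlab
% Ask for JSON output and decode it into a MATLAB struct.
chat = openAIChat("You are an assistant. Reply in JSON with fields ""city"" and ""country"".", ...
    ResponseFormat="json");
txt = generate(chat,"Where is the Eiffel Tower?");
data = jsondecode(txt);   % txt is a string containing JSON-encoded data
```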


## Other Properties

### `SystemPrompt` – System prompt

character vector | string scalar


This property is read\-only.


The system prompt is a natural\-language description that provides the framework in which a large language model generates its responses. The system prompt can include instructions about tone, communication style, language, and so on.

Specify the `SystemPrompt` property at construction using the `systemPrompt` input argument.


**Example**: "You are a helpful assistant who provides answers to user queries in iambic pentameter."

---

### `FunctionNames` – Names of OpenAI functions to use during output generation

string array


This property is read\-only.


Names of the custom functions specified in the `Tools` name\-value argument.


# Object Functions

`generate` – Generate text

# Examples
## Create OpenAI Chat
```matlab
modelName = "gpt-4o-mini";
chat = openAIChat("You are a helpful assistant awaiting further instructions.",ModelName=modelName);
```

## Generate and Stream Text
```matlab
sf = @(x) fprintf("%s",x);
chat = openAIChat(StreamFun=sf);
generate(chat,"Why is a raven like a writing desk?");
```
# See Also
- [Create Simple Chat Bot](../../examples/CreateSimpleChatBot.md)
- [Process Generated Text in Real Time Using ChatGPT in Streaming Mode](../../examples/ProcessGeneratedTextinRealTimebyUsingChatGPTinStreamingMode.md)
- [Analyze Scientific Papers Using Function Calls](../../examples/AnalyzeScientificPapersUsingFunctionCalls.md)
- [Analyze Sentiment in Text Using ChatGPT in JSON Mode](../../examples/AnalyzeSentimentinTextUsingChatGPTinJSONMode.md)

*Copyright 2024 The MathWorks, Inc.*