Pretrained Foundational Models in Generative AI
You can use the following pretrained foundational models in OCI Generative AI:
Chat Models (New)
Ask questions and get conversational responses through an AI chat interface.
Model | Available in These Regions | Key Features |
---|---|---|
cohere.command-r-16k | | |
cohere.command-r-plus | | |
meta.llama-3.1-70b-instruct | | |
meta.llama-3.1-405b-instruct | | |
meta.llama-3-70b-instruct (deprecated) | | |
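For example, a minimal chat request through the OCI Python SDK might look like the following sketch. The compartment OCID, service endpoint, model ID, prompt, and parameter values are placeholders, not values from this documentation; replace them with values from your own tenancy and region.

```python
import oci

# Load credentials and region from the default OCI config file (~/.oci/config).
config = oci.config.from_file()

# Placeholder values -- substitute your own compartment OCID and regional endpoint.
compartment_id = "ocid1.compartment.oc1..exampleuniqueID"
endpoint = "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"

client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config, service_endpoint=endpoint
)

chat_details = oci.generative_ai_inference.models.ChatDetails(
    compartment_id=compartment_id,
    # On-demand serving mode; model_id can also be the model OCID from the Console.
    serving_mode=oci.generative_ai_inference.models.OnDemandServingMode(
        model_id="cohere.command-r-plus"
    ),
    chat_request=oci.generative_ai_inference.models.CohereChatRequest(
        message="What serving modes does OCI Generative AI support?",
        max_tokens=400,
        temperature=0.3,
    ),
)

response = client.chat(chat_details)
print(response.data.chat_response.text)
```

For the meta.llama chat models, the corresponding request type in the SDK is GenericChatRequest, which takes a list of role-tagged messages rather than a single message string.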
Embedding Models
Convert text to vector embeddings to use in applications for semantic searches, text classification, or text clustering.
Model | Available in These Regions | Key Features |
---|---|---|
cohere.embed-english-v3.0 | | |
cohere.embed-multilingual-v3.0 | | |
cohere.embed-english-light-v3.0 | | |
cohere.embed-multilingual-light-v3.0 | | |
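As an illustration, the sketch below sends two strings to one of the embedding models with the OCI Python SDK and prints the size of the returned vectors. The compartment OCID, endpoint, and input strings are placeholders.

```python
import oci

config = oci.config.from_file()
compartment_id = "ocid1.compartment.oc1..exampleuniqueID"  # placeholder
endpoint = "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"

client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config, service_endpoint=endpoint
)

embed_details = oci.generative_ai_inference.models.EmbedTextDetails(
    compartment_id=compartment_id,
    serving_mode=oci.generative_ai_inference.models.OnDemandServingMode(
        model_id="cohere.embed-english-v3.0"
    ),
    inputs=[
        "How do I reset my password?",
        "Troubleshooting sign-in failures",
    ],
    # Tells the model how the text will be used, for example indexed documents
    # (SEARCH_DOCUMENT) versus search queries (SEARCH_QUERY).
    input_type="SEARCH_DOCUMENT",
)

response = client.embed_text(embed_details)
# One embedding vector is returned per input string.
print(len(response.data.embeddings), len(response.data.embeddings[0]))
```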
Generation Models (Deprecated)
Give instructions to generate text or extract information from text.
Important
All OCI Generative AI foundational pretrained models supported for the on-demand serving mode that use the text generation and summarization APIs (including the playground) are now retired. If you host a summarization or a generation model such as cohere.command on a dedicated AI cluster (dedicated serving mode), you can continue to use that model until it's retired. See Retiring the Models for retirement dates and definitions. We recommend that you use the chat models instead.
Model | Available in These Regions | Key Features |
---|---|---|
cohere.command (deprecated) | | |
cohere.command-light (deprecated) | | |
meta.llama-2-70b-chat (deprecated) | | |
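Because the on-demand generation endpoint is retired, the only way to keep calling these models is through a dedicated AI cluster endpoint, roughly as in the sketch below. The endpoint OCID, compartment OCID, and prompt are placeholders, and the call works only while you still host the model on a dedicated cluster; for new work, use the chat models instead.

```python
import oci

config = oci.config.from_file()
compartment_id = "ocid1.compartment.oc1..exampleuniqueID"  # placeholder
endpoint = "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"

client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config, service_endpoint=endpoint
)

generate_details = oci.generative_ai_inference.models.GenerateTextDetails(
    compartment_id=compartment_id,
    # Dedicated serving mode targets a hosted model replica by its endpoint OCID.
    serving_mode=oci.generative_ai_inference.models.DedicatedServingMode(
        endpoint_id="ocid1.generativeaiendpoint.oc1..exampleuniqueID"
    ),
    inference_request=oci.generative_ai_inference.models.CohereLlmInferenceRequest(
        prompt="Extract the product names mentioned in the following text: ...",
        max_tokens=300,
        temperature=0.2,
    ),
)

response = client.generate_text(generate_details)
print(response.data)  # the generated text is carried in the inference response
```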
The Summarization Model (Deprecated)
Summarize text with your instructed format, length, and tone.
Important
The cohere.command model supported for the on-demand serving mode is now retired, and this model is deprecated for the dedicated serving mode. If you're hosting cohere.command on a dedicated AI cluster (dedicated serving mode) for summarization, you can continue to use this hosted model replica with the summarization API and in the playground until the cohere.command model retires for the dedicated serving mode. See Retiring the Models for retirement dates and definitions. We recommend that you use the chat models instead, which offer the same summarization capabilities, including control over summary length and style.
Model | Available in These Regions | Key Features |
---|---|---|
cohere.command (deprecated) | | |
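Since the chat models are the recommended replacement for this summarization model, a summary request can be expressed as an ordinary chat call in which the prompt states the desired length and tone. The sketch below shows one possible phrasing; the model ID, endpoint, compartment OCID, and text to summarize are placeholders.

```python
import oci

config = oci.config.from_file()
compartment_id = "ocid1.compartment.oc1..exampleuniqueID"  # placeholder
endpoint = "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"

client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config, service_endpoint=endpoint
)

document = "..."  # the text you want summarized

chat_details = oci.generative_ai_inference.models.ChatDetails(
    compartment_id=compartment_id,
    serving_mode=oci.generative_ai_inference.models.OnDemandServingMode(
        model_id="cohere.command-r-16k"
    ),
    chat_request=oci.generative_ai_inference.models.CohereChatRequest(
        # Summary length and style are controlled by the instructions in the prompt.
        message=(
            "Summarize the following text in three bullet points, "
            "using a neutral, formal tone:\n\n" + document
        ),
        max_tokens=300,
        temperature=0.2,
    ),
)

response = client.chat(chat_details)
print(response.data.chat_response.text)
```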