Getting Started with OCI OpenAI-Compatible SDKs

The OCI OpenAI package is a library developed and maintained by the OCI Generative AI team. This package streamlines integration between the OpenAI-compatible Python and Java SDKs and the OCI Generative AI service. With built-in authentication and authorization, you can securely connect to the Generative AI service using OCI credentials.

Key Benefits

  • Simplifies integration between OpenAI-compatible SDKs (Python and Java) and the OCI Generative AI service by adding an OCI authentication layer on top of the OpenAI SDKs.
  • Works with a subset of OpenAI-compatible SDKs, simplifying coding for developers familiar with the OpenAI API.
  • Easily connects to a subset of OCI Generative AI models with fewer on-boarding tasks.
  • Provides a seamless way to port code from an OpenAI-compatible endpoint to an OCI Generative AI endpoint. If you're using OpenAI-style API keys, you can use OCI API keys for full compatibility.

Supported OpenAI SDKs

The OCI OpenAI package supports OpenAI-compatible Python and Java SDKs for the Chat Completions API.

Important

The OCI OpenAI package adds its own authentication layer on top of the OpenAI SDK. Only the Chat Completions API is compatible. Using other OpenAI APIs results in errors (for example, a 404).

Installation

Python

Install the OCI OpenAI Python package with the following command:

pip install oci-openai
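
After installing, existing OpenAI Chat Completions code should carry over with only the client construction changing. The following is a minimal sketch: the client class name `OciOpenAI`, its constructor parameters, and the model ID shown are assumptions, not confirmed by this page — check the package documentation for the exact interface.

```python
# Sketch of calling OCI Generative AI through the OpenAI-compatible
# Chat Completions API via the oci-openai package.

def build_chat_request(model: str, user_text: str) -> dict:
    """Assemble a Chat Completions request body in the OpenAI message format."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_text},
        ],
    }

def ask(region: str, compartment_id: str, model: str, user_text: str) -> str:
    """Send a chat request to OCI Generative AI.

    Requires OCI credentials (for example, an ~/.oci/config profile) and
    network access to a supported region.
    """
    # Assumed client class and parameters -- verify against the package docs.
    from oci_openai import OciOpenAI

    client = OciOpenAI(
        region=region,                  # for example, a supported region such as Chicago
        compartment_id=compartment_id,  # placeholder compartment OCID
    )
    # Only chat.completions is supported; other OpenAI APIs return errors.
    response = client.chat.completions.create(**build_chat_request(model, user_text))
    return response.choices[0].message.content
```

Because the client mirrors the OpenAI SDK, the `chat.completions.create` call and response shape are the same as in standard OpenAI Python code; only authentication and endpoint setup differ.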

Java

Add the OCI OpenAI Java SDK as a Maven dependency from Maven Central. Include the following in your pom.xml (version 0.1.22 as of the latest release):

<dependency>
    <groupId>com.oracle.genai</groupId>
    <artifactId>oci-openai-java-sdk</artifactId>
    <version>0.1.22</version>
</dependency>

Repository: Maven Central

Supported Models

The OCI OpenAI package supports only the following models hosted on OCI Generative AI. For each model, check whether it’s offered in on-demand or dedicated mode. For supported regions for each model, see Supported Regions.

Models for Chat Completions API
Important

External Calls to xAI Grok Models

The xAI Grok models are hosted in an OCI data center, in a tenancy provisioned for xAI. Although these models can be accessed through the OCI Generative AI service, they are managed by xAI.

Supported Regions

The OCI OpenAI package is supported in the following regions:

  • Germany Central (Frankfurt)
  • India South (Hyderabad)
  • Japan Central (Osaka)
  • US East (Ashburn)
  • US Midwest (Chicago)
  • US West (Phoenix)

To confirm whether a specific model is available in one of these regions and compatible with a required mode (on-demand or dedicated), follow these steps:

  1. Open the Generative AI Models by Region page.
  2. For the model that you want to use (for example, Meta Llama, xAI Grok, or OpenAI gpt-oss), note the available regions listed on the page.
  3. Select a region that's available for the model both in the preceding list and on the Models by Region page.
  4. Verify whether the model is available in the mode that you need (on-demand or dedicated).

    For access to models in the dedicated mode, both public and private endpoints are supported.