LlamaIndex Integration

As a provider of large language models (LLMs), the OCI Generative AI service integrates with LlamaIndex. The integration is supported for Python.

Using the OCI Generative AI service, you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models based on your own data on dedicated AI clusters.
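For example, a minimal sketch of calling an OCI Generative AI chat model through LlamaIndex might look like the following. It assumes the llama-index-llms-oci-genai package is installed and that API key authentication is configured in your OCI config file; the OCIGenAI class name, the model ID, the service endpoint, and the compartment OCID shown here are illustrative placeholders, not prescribed values.

```python
# Minimal sketch: prompting an OCI Generative AI model through LlamaIndex.
# Assumes API key authentication is set up in ~/.oci/config.
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="cohere.command-r-16k",  # placeholder pretrained model ID
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..<unique_id>",  # placeholder OCID
)

# Single-turn completion against the hosted model
response = llm.complete("Summarize what dedicated AI clusters are used for.")
print(response)
```

The same llm object can be passed anywhere LlamaIndex expects an LLM, for example when building a query engine over an index.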

Tip

When you chat or create text embeddings in the playground, OCI Generative AI creates code samples for LlamaIndex that include your prompts and embeddings. You can then use these code samples in your applications. See Chat in the Console and Creating text embeddings in the Console.
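A text embedding call follows the same pattern. The sketch below assumes the llama-index-embeddings-oci-genai package is installed; the OCIGenAIEmbeddings class name, the embedding model ID, the endpoint, and the compartment OCID are illustrative placeholders.

```python
# Minimal sketch: creating text embeddings with OCI Generative AI via LlamaIndex.
from llama_index.embeddings.oci_genai import OCIGenAIEmbeddings

embed_model = OCIGenAIEmbeddings(
    model_name="cohere.embed-english-v3.0",  # placeholder embedding model ID
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..<unique_id>",  # placeholder OCID
)

vector = embed_model.get_text_embedding("LlamaIndex integration with OCI Generative AI")
print(len(vector))  # dimensionality of the returned embedding
```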