# Integrations

## LlamaIndex

LlamaIndex is a data framework for building LLM-powered applications over your data. It provides tools for data ingestion, indexing, and querying, making it ideal for RAG (Retrieval-Augmented Generation) applications. Since MARA Cloud is OpenAI-compatible, you can use LlamaIndex's OpenAI integration with a custom base URL.

### Prerequisites

- A MARA Cloud API key
- Python 3.9 or later

### Setup

Install LlamaIndex together with the OpenAI-like LLM integration:

```bash
pip install llama-index llama-index-llms-openai-like
```

### Configuration

Point LlamaIndex at MARA Cloud's OpenAI-compatible endpoint using the OpenAILike class, which is designed for third-party APIs that follow the OpenAI wire format:

```python
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    api_base="https://api.cloud.mara.com/v1",
    api_key="your-mara-api-key",
    model="MiniMax-M2.5",
    is_chat_model=True,  # route requests to the chat completions endpoint
)

response = llm.complete("Explain what RAG is in two sentences.")
print(response)
```

### Using with a query engine

Once configured, set the model as LlamaIndex's default LLM and use it with the framework's indexing and querying pipeline:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings

# Make MARA Cloud the default LLM for all LlamaIndex components
Settings.llm = llm

# Load documents from a local directory, build a vector index, and query it
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

response = query_engine.query("What are the key points in this document?")
print(response)
```
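Note that building a VectorStoreIndex also requires an embedding model, and LlamaIndex defaults to OpenAI's embedding API, which would not go through MARA Cloud. If your setup should avoid OpenAI entirely, one option is a local Hugging Face embedding model, configured via Settings. This is a sketch, not the only choice; the model name is just an example, and it assumes `pip install llama-index-embeddings-huggingface`:

```python
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Use a local embedding model so indexing does not call the OpenAI API.
# BAAI/bge-small-en-v1.5 is one small example; any sentence-embedding
# model from Hugging Face works here.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
```

Set this before calling VectorStoreIndex.from_documents so that both indexing and query-time retrieval use the same embedding model.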

### Learn more