Streamline AI development by using Portkey AI Gateway with Novita AI for fast, secure, and reliable performance.
Portkey AI Gateway transforms how developers work with AI models such as Novita AI, providing a unified interface that routes requests to multiple language models quickly, securely, and reliably. This integration simplifies AI development and improves application performance. This guide walks you through setting up Portkey AI Gateway and then integrating the Novita AI API with Portkey.
Setting up Portkey AI Gateway takes just three steps: installing the gateway, sending your first request, and configuring routing and guardrails.
Begin by installing the Portkey AI Python library:
pip install -qU portkey-ai
Next, execute the following Python code to send your first request:
from portkey_ai import Portkey

# OpenAI compatible client
client = Portkey(
    provider="openai",      # or 'anthropic', 'bedrock', 'groq', etc.
    Authorization="sk-***"  # the provider API key
)

# Make a request through your AI Gateway
client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the weather like?"}],
    model="gpt-4o-mini"
)
You can monitor all your local logs in one place using the Gateway Console at http://localhost:8787/public/.
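For requests to appear in the local console, they must pass through the locally running gateway. Below is a minimal sketch, assuming the gateway is running on its default port (8787); the base URL shown is that default and should be adjusted if you run the gateway elsewhere:

from portkey_ai import Portkey

# Point the client at a locally running Portkey AI Gateway
# (assumes the default local port 8787).
client = Portkey(
    base_url="http://localhost:8787/v1",
    provider="openai",
    Authorization="sk-***"  # the provider API key
)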
Portkey AI Gateway enables you to configure routing rules, add reliability features, and enforce guardrails. Below is an example configuration:
config = { "retry": {"attempts": 5}, "output_guardrails": [{ "default.contains": {"operator": "none", "words": ["Apple"]}, "deny": True }]}# Attach the config to the clientclient = client.with_options(config=config)client.chat.completions.create( model="gpt-4o-mini", messages=[{"role": "user", "content": "Reply randomly with Apple or Bat"}])# In this example, the guardrail denies all replies containing "Apple", so the response would always be "Bat". The retry configuration would attempt the request up to 5 times before giving up.
To integrate Novita AI with Portkey, retrieve your API key from Novita AI and add it to Portkey to generate a virtual key.
Node.JS SDK
import Portkey from 'portkey-ai'

const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY",  // Replace with your Portkey API key
    virtualKey: "VIRTUAL_KEY"   // Replace with your virtual key for Novita AI
})
Python SDK
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    virtual_key="VIRTUAL_KEY"   # Replace with your virtual key for Novita AI
)
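With the client configured, requests are routed to Novita AI through Portkey using the same OpenAI-compatible interface shown earlier. A minimal sketch follows; the model name is an assumption, so substitute any chat model enabled on your Novita AI account:

# Minimal sketch: send a chat completion to Novita AI through Portkey.
# The model name below is an assumption -- use any model available on your Novita AI account.
completion = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="meta-llama/llama-3.1-8b-instruct"
)

print(completion.choices[0].message.content)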