Usage
Once you have a zylAI API key, you can start using SecretLLM with any OpenAI-compatible library.
Getting Started with SecretLLM
Getting started with SecretLLM is straightforward:
1. Select a zylAI node URL.
2. Query the /v1/models endpoint to check the available models.
3. Select an available model and use it with the zylAI node's /v1/chat/completions endpoint.
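As a sketch of step 2, assuming the /v1/models endpoint returns the standard OpenAI list shape (an object with a `data` array of model entries), a small helper can pull out the model ids:

```javascript
// Extract model ids from an OpenAI-style /v1/models response.
// Assumes the standard OpenAI list shape: { object: "list", data: [{ id, ... }] }.
function listModelIds(modelsResponse) {
  return (modelsResponse.data || []).map((model) => model.id);
}

// Example response in the OpenAI list format (entries are illustrative):
const sample = {
  object: 'list',
  data: [
    { id: 'meta-llama/Llama-3.1-8B-Instruct', object: 'model' },
  ],
};

console.log(listModelIds(sample)); // [ 'meta-llama/Llama-3.1-8B-Instruct' ]
```

With the OpenAI library configured as shown below, the same data comes from `client.models.list()`.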
Since SecretLLM is OpenAI-compatible, you can use any OpenAI library. Here's an example that queries the Llama-3.1-8B-Instruct model:
Node.js
zylai/secretllm_nodejs/index.js
const OpenAI = require('openai');
require('dotenv').config();

// Initialize the OpenAI client
// baseURL is the zylAI node url: https://docs.zyllion.com/network#zylai-nodes
// apiKey is your zylAI node api key: https://docs.zyllion.com/build/secretLLM/access
const client = new OpenAI({
  baseURL: 'https://zylai-a779.zyllion.network/v1',
  apiKey: process.env.ZYLAI_API_KEY || 'YOUR_API_KEY_HERE'
});

async function generateText() {
  try {
    const response = await client.chat.completions.create({
      model: 'meta-llama/Llama-3.1-8B-Instruct',
      messages: [
        { role: 'system', content: 'You are a fitness coach.' },
        { role: 'user', content: 'What is better for you, salad or pizza?' }
      ],
      stream: false
    });

    // Every SecretLLM response includes a cryptographic signature for verification
    console.log(`Signature: ${response.signature}`);
    console.log(`Response: ${response.choices[0].message.content}`);
  } catch (error) {
    console.error('Error:', error);
  }
}

generateText();
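The response handling above can be factored into a small helper that works against a plain object, which makes it easy to see which fields the example reads. This is a sketch assuming the response shape shown in the example: a standard OpenAI `choices` array plus SecretLLM's `signature` field (the `extractReply` name and mock values are illustrative, not part of the API):

```javascript
// Pull the assistant message and the SecretLLM signature out of a
// chat completion response. Assumes the shape used in the example above.
function extractReply(response) {
  return {
    signature: response.signature,
    content: response.choices[0].message.content,
  };
}

// Mock response with only the fields the example reads (values are illustrative):
const mockResponse = {
  signature: 'abc123',
  choices: [{ message: { role: 'assistant', content: 'Salad, most days.' } }],
};

const { signature, content } = extractReply(mockResponse);
console.log(`Signature: ${signature}`); // Signature: abc123
console.log(`Response: ${content}`);    // Response: Salad, most days.
```

In the example above you would call `extractReply(response)` inside the `try` block instead of logging the fields directly.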