SecretLLM Quickstart

SecretLLM allows you to run AI models within a Trusted Execution Environment (TEE). You can use SecretLLM to build new private AI applications or migrate existing ones to run in a secure SecretLLM environment where your data remains private.

Getting Started

In this quickstart, we will be making a private AI journaling application with nextjs. Let's get started by cloning our examples repo.

gh repo clone ZyllionNetwork/blind-module-examples
cd blind-module-examples/zylai/secretllm_nextjs

Authentication

Creating an API key requires either a Keplr or MetaMask wallet. Once you have a key, copy the provided .env.example into a new .env file:

cp .env.example .env

Then replace the placeholder values with the API key you created on the access page.
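The resulting .env should look something like the following. The variable names match what the route handler reads; the URL is a placeholder, not a real endpoint:

```
ZYLAI_API_URL=https://your-secretllm-endpoint.example.com
ZYLAI_API_KEY=your-api-key-here
```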

The two files most relevant here are page.tsx and api/chat/route.ts. page.tsx hosts the home page content, which we will refactor, and calls the api/chat endpoint. The chat endpoint is very simple and adheres to the OpenAI chat completion API spec.

You can use the code as-is and test text input to see how the AI responds, or customize which models you want to use.

zylai/secretllm_nextjs/app/api/chat/route.ts

import { NextResponse } from 'next/server';

export async function POST(req: Request) {
  try {
    const body = await req.json();

    const response = await fetch(
      `${process.env.ZYLAI_API_URL}/v1/chat/completions`,
      {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.ZYLAI_API_KEY}`,
        },
        body: JSON.stringify({
          model: 'meta-llama/Llama-3.1-8B-Instruct',
          messages: body.messages,
          temperature: 0.2,
        }),
      }
    );

    const data = await response.json();
    return NextResponse.json(data);
  } catch (error) {
    console.error('Chat error:', error);
    return NextResponse.json(
      { error: 'Failed to process chat request' },
      { status: 500 }
    );
  }
}
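On the client side, the route's JSON response follows the OpenAI chat completion shape, so the assistant's reply lives at choices[0].message.content. A minimal sketch of a helper page.tsx could use to read it (the function name and types are our own, not from the repo):

```typescript
// OpenAI-style chat completion response (only the fields we read).
type ChatCompletion = {
  choices?: { message?: { role?: string; content?: string } }[];
};

// Safely extract the assistant's reply; returns '' if the shape is unexpected.
function extractReply(data: ChatCompletion): string {
  return data.choices?.[0]?.message?.content ?? '';
}

// Example usage after calling the route:
// const res = await fetch('/api/chat', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify({ messages: [{ role: 'user', content: 'Hello' }] }),
// });
// const reply = extractReply(await res.json());
```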

Customization of UI

With your new API key, you should be able to interact with SecretLLM through the current UI. To build the new journal interface, we will simply use useState and then call the SecretLLM API to retrieve the data.

You may use the content from this page on your page.tsx and then customize the application as you wish.
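For the journal state itself, one option is to keep an array of entries in useState and append the AI's reflection to each new entry. A sketch of the pure update logic (the types and names are ours for illustration, not the repo's):

```typescript
// A journal entry pairs the user's text with the AI's reflection.
type JournalEntry = { text: string; reflection: string };

// Pure helper: returns a new array with the entry appended, so it can be
// passed straight to a useState setter,
// e.g. setEntries(prev => addEntry(prev, text, reflection)).
function addEntry(
  entries: JournalEntry[],
  text: string,
  reflection: string
): JournalEntry[] {
  return [...entries, { text, reflection }];
}
```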

The current prompt for the AI journal response is:

zylai/secretllm_nextjs/app/tutorial_page.tsx
