Getting Started

Let's start by setting up a basic AI agent using PingAI's Ping Agent Kit: an agent we can chat with that can perform actions on the Solana blockchain.

A modern IDE is recommended for this guide. We will use Cursor in this example; its integrated AI features can help you with any errors you encounter.

This guide requires Node.js version `23.x.x`. Install it using [nvm](https://github.com/nvm-sh/nvm):

```bash
node --version  # Check current version
nvm install 23  # Install Node.js 23 if needed
nvm use 23      # Switch to Node.js 23
```

Open an empty folder in VS Code or Cursor and install the Ping Agent Kit along with its peer dependencies from the terminal:

```bash
pnpm add ping-agent-kit
pnpm add @langchain/core @langchain/openai @langchain/langgraph dotenv bs58
```

Your IDE should set up the `package.json` file for you. If not, this is how it should look:

```json
{
  "dependencies": {
    "@langchain/core": "^0.3.33",
    "@langchain/langgraph": "^0.2.41",
    "@langchain/openai": "^0.3.17",
    "bs58": "^6.0.0",
    "dotenv": "^16.4.7",
    "ping-agent-kit": "^1.4.3"
  }
}
```

Create a `.env` file in the root of the project and add the following:

```
OPENAI_API_KEY=your_openai_api_key
RPC_URL=https://api.devnet.solana.com
SOLANA_PRIVATE_KEY=your_private_key
```

Note that the script encodes the private key to base58 before passing it into the PingAgentKit constructor, so you can put the raw byte array (e.g. `[34,2,34,...]`) directly into the `.env` file.
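For reference, this is the same conversion the script below performs:

```typescript
import bs58 from "bs58";

// SOLANA_PRIVATE_KEY holds the keypair as a JSON byte array, e.g. [34,2,34,...]
const privateKeyArray = JSON.parse(process.env.SOLANA_PRIVATE_KEY!);
const privateKeyBase58 = bs58.encode(new Uint8Array(privateKeyArray));
```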

You can create a key using the following command:

```bash
ping-keygen grind --starts-with ai:1
```

Copy the contents of the generated keypair file into your `.env` file as the value of `SOLANA_PRIVATE_KEY`.

`OPENAI_API_KEY` is your key for the OpenAI API; you can find it on the OpenAI platform.

Leave `RPC_URL` pointed at devnet for now.

Create a new file called `agent.ts` with the following content:

```typescript

import { PingAgentKit, createPingTools } from "ping-agent-kit";
import { HumanMessage } from "@langchain/core/messages";
import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
import * as dotenv from "dotenv";
import bs58 from "bs58";
import * as readline from "readline";

dotenv.config();

async function initializeAgent() {
  const llm = new ChatOpenAI({
    modelName: "gpt-4-turbo-preview",
    temperature: 0.7,
  });

  // Convert array string to actual array, then to Uint8Array, then to base58
  const privateKeyArray = JSON.parse(process.env.SOLANA_PRIVATE_KEY!);
  const privateKeyUint8 = new Uint8Array(privateKeyArray);
  const privateKeyBase58 = bs58.encode(privateKeyUint8);

  const pingKit = new PingAgentKit(privateKeyBase58, process.env.RPC_URL!, {
    OPENAI_API_KEY: process.env.OPENAI_API_KEY!,
  });

  const tools = createPingTools(pingKit);
  const memory = new MemorySaver();

  return createReactAgent({
    llm,
    tools,
    checkpointSaver: memory,
  });
}

async function runInteractiveChat() {
  const agent = await initializeAgent();
  const config = { configurable: { thread_id: "Ping Agent Kit!" } };
  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
  });

  // Prompt for input, stream the agent's response, then repeat
  const askQuestion = () => {
    rl.question("You: ", async (input) => {
      if (input.toLowerCase() === "exit") {
        rl.close();
        return;
      }

      const stream = await agent.stream(
        {
          messages: [new HumanMessage(input)],
        },
        config,
      );

      process.stdout.write("Agent: ");
      for await (const chunk of stream) {
        if ("agent" in chunk) {
          process.stdout.write(chunk.agent.messages[0].content);
        } else if ("tools" in chunk) {
          process.stdout.write(chunk.tools.messages[0].content);
        }
      }
      console.log("\n--------------------------------------------");

      askQuestion(); // Continue the conversation
    });
  };

  // Clear console and start the chat with a small delay
  setTimeout(() => {
    console.clear(); // Clear any initialization messages
    console.log("Chat with Ping Agent (type 'exit' to quit)");
    console.log("--------------------------------------------");
    askQuestion();
  }, 100);
}

runInteractiveChat().catch(console.error);
```

You can run this script using the following command:

```bash
npx tsx agent.ts
```

This will start a simple chat with the agent.
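`npx` will download `tsx` on first use if it is not already installed. If you prefer to pin it to the project, you can optionally add it as a dev dependency:

```bash
pnpm add -D tsx
```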

### Test basic functionality

You can now ask the agent to show your Solana balance and to request some devnet SOL:

```
Please show me my wallet address and request some devnet sol
```

If the devnet faucet is empty, you can use a web faucet instead and paste in your Solana address.
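Alternatively, you can request an airdrop yourself with a small helper script. This is a sketch, not part of the kit; it assumes `@solana/web3.js` is available in your project (`pnpm add @solana/web3.js` if it is not):

```typescript
// airdrop.ts - request 1 SOL from the devnet faucet for the wallet in .env
import { Connection, Keypair, LAMPORTS_PER_SOL } from "@solana/web3.js";
import * as dotenv from "dotenv";

dotenv.config();

async function main() {
  const connection = new Connection(process.env.RPC_URL!, "confirmed");

  // Rebuild the keypair from the JSON byte array stored in SOLANA_PRIVATE_KEY
  const secretKey = new Uint8Array(JSON.parse(process.env.SOLANA_PRIVATE_KEY!));
  const keypair = Keypair.fromSecretKey(secretKey);

  // Request the airdrop and wait for confirmation
  const signature = await connection.requestAirdrop(keypair.publicKey, LAMPORTS_PER_SOL);
  await connection.confirmTransaction(signature, "confirmed");
  console.log(`Airdropped 1 SOL to ${keypair.publicKey.toBase58()}`);
}

main().catch(console.error);
```

Run it with `npx tsx airdrop.ts`. Devnet airdrops are rate-limited, so this can also fail when the faucet is constrained.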

Next ask the agent:

```
Please create me a NFT collection called trains with symbol TRN using this uri: https://scarlet-fancy-minnow-617.mypinata.cloud/ipfs/bafkreif43sp62yuy3sznrvqesk23tfnhpdck4npqowdwrhrzhsrgf5ao2e
```

After the collection is created, mint an NFT:

```
Please mint me an NFT into that collection using the name: Train1 and using this URI: https://scarlet-fancy-minnow-617.mypinata.cloud/ipfs/bafkreif43sp62yuy3sznrvqesk23tfnhpdck4npqowdwrhrzhsrgf5ao2e
```

This will mint you an NFT with the name Train1 and an image of a train.

You can also use different metadata for your NFT, which you can upload using Pinata or any other storage provider. You should end up with something like this devnet train NFT.
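If you want to supply your own metadata, a minimal file following the Metaplex JSON standard looks roughly like this (the values below are placeholders, not from this guide):

```json
{
  "name": "Train1",
  "symbol": "TRN",
  "description": "An example train NFT minted with the Ping Agent Kit",
  "image": "https://your-gateway.mypinata.cloud/ipfs/your-image-cid",
  "attributes": [
    { "trait_type": "Color", "value": "Red" }
  ]
}
```

Upload the file (and the image it references) to your storage provider and use the resulting URI in your mint prompt.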

### Where to go from here?

- You can now, for example, import the private key into your browser extension wallet to see the NFT.

- You can ask the agent to show you all your NFTs. You will notice that this action returns an error. This is because the default action for requesting assets uses the Helius Asset API, so you need to add a Helius API key to your `.env` file and pass it into the agent:

```typescript
const pingKit = new PingAgentKit(privateKeyBase58, process.env.RPC_URL!, {
  OPENAI_API_KEY: process.env.OPENAI_API_KEY!,
  HELIUS_API_KEY: process.env.HELIUS_API_KEY!,
});
```
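Then add the corresponding key to your `.env` file:

```
HELIUS_API_KEY=your_helius_api_key
```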
