A Practical Look at OpenAI Chat Integration on Oyster Serverless

What is Oyster Serverless?

Oyster Serverless is a lightweight execution environment developed by Marlin Protocol that allows developers to run JavaScript functions without managing servers. These functions run inside a sandboxed runtime and are deployed on-chain, which makes them immutable once published.
From a developer’s perspective, it behaves like a minimal serverless platform with strict limits and a very clear execution model.

Understanding the OpenAI Chat Integration Template

The OpenAI Chat Integration template is essentially a thin proxy between a client and the OpenAI API. It does not try to abstract complexity away. Instead, it keeps the flow explicit and predictable.
The function receives a prompt, forwards it to OpenAI, and returns the generated response. That is the entire lifecycle.

addEventListener('fetch', event => {
  event.respondWith(handle(event.request));
});

async function handle(request) {
  try {
    const { prompt, apiKey } = await request.json();
    if (typeof prompt !== 'string' || typeof apiKey !== 'string') {
      return new Response('Invalid input', { status: 400 });
    }
    const response = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`
      },
      body: JSON.stringify({
        model: 'gpt-3.5-turbo',
        messages: [{ role: 'user', content: prompt }]
      })
    });
    if (!response.ok) {
      // Pass the upstream error through without transformation
      return new Response(await response.text(), { status: response.status });
    }
    const data = await response.json();
    const message = data.choices?.[0]?.message?.content || '';
    return new Response(JSON.stringify({ response: message }), {
      headers: { 'Content-Type': 'application/json' }
    });
  } catch {
    return new Response('Error processing request', { status: 500 });
  }
}
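Calling the deployed function from a client mirrors the same explicit flow. The sketch below is illustrative: the helper name and the placeholder endpoint URL are assumptions, not part of the template.

```javascript
// Hypothetical helper: build the request options the function expects.
function buildChatRequest(prompt, apiKey) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt, apiKey })
  };
}

// Usage (network call, shown for illustration only):
// const res = await fetch('https://<your-deployment>/', buildChatRequest('Hello', OPENAI_KEY));
// const { response } = await res.json();
```

Keeping the request construction in a small pure function makes it easy to test without touching the network.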

Developer Notes

  • The template is stateless; every request is independent
  • Input validation is minimal but necessary
  • Errors from OpenAI are passed through without heavy transformation

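If stricter checks are ever needed, the validation can stay just as explicit. A minimal sketch, assuming we also want to cap prompt length; the helper name and the length limit are illustrative, not part of the template:

```javascript
// Hypothetical helper: returns an error string, or null if the body is valid.
// MAX_PROMPT_LENGTH is an assumed limit chosen for illustration.
const MAX_PROMPT_LENGTH = 4000;

function validateBody(body) {
  if (typeof body !== 'object' || body === null) return 'Body must be a JSON object';
  if (typeof body.prompt !== 'string') return 'prompt must be a string';
  if (typeof body.apiKey !== 'string') return 'apiKey must be a string';
  if (body.prompt.length > MAX_PROMPT_LENGTH) return 'prompt too long';
  return null;
}
```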
The simplicity here is intentional. It reduces surface area and makes the function easy to reason about after deployment.

Extending the Template

Most developers extend this template in small but meaningful ways:

  • Adding a system role for consistent outputs
  • Modifying prompts before forwarding
  • Structuring responses for frontend consumption

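All three extensions can be expressed as small pure functions that run before and after the fetch call. A sketch, where the system prompt text and the output shape are assumptions for illustration:

```javascript
// Hypothetical extension: prepend a system message for consistent outputs.
function buildMessages(prompt) {
  return [
    { role: 'system', content: 'You are a concise, helpful assistant.' },
    { role: 'user', content: prompt }
  ];
}

// Hypothetical extension: structure the OpenAI response for frontend consumption.
function shapeResponse(data) {
  return {
    message: data.choices?.[0]?.message?.content || '',
    model: data.model || 'unknown'
  };
}
```

Because both helpers are pure, they can be unit-tested independently of the deployed function.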
The base template acts as a stable starting point rather than a complete solution.

Final Take

The OpenAI Chat Integration template is a good example of how Oyster Serverless is meant to be used. Keep the logic small, keep the flow explicit, and avoid unnecessary abstraction.
It is less about building complex backends and more about composing simple, reliable functions.