The Shopify Storefront AI Agent template uses Anthropic's Claude by default, but you can switch to OpenRouter by pointing the Anthropic SDK at OpenRouter's Anthropic-compatible endpoint, or by switching to the OpenAI SDK format.

Here’s a comprehensive one-shot prompt for Claude Code:

Implement OpenRouter integration for the Shopify Storefront AI Agent template. I want to switch from the default direct-Anthropic setup to OpenRouter while maintaining the Anthropic SDK structure.

## Context
This is a Shopify AI Agent template using React Router with the Storefront MCP. The current implementation uses Anthropic's Claude SDK for the AI agent chat functionality.

## Implementation Requirements

### 1. Environment Configuration
Create or update `.env` with these variables:

```env
ANTHROPIC_API_KEY="" # Leave empty to disable direct Anthropic
ANTHROPIC_AUTH_TOKEN=sk-or-v1-your-openrouter-api-key
ANTHROPIC_BASE_URL=https://openrouter.ai/api
MODEL_NAME=anthropic/claude-sonnet-4-6 # Can be any OpenRouter model: openai/gpt-4o, google/gemini-2.5-pro, etc.
```
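Optionally, the app can fail fast at startup if these variables are missing. This is an illustrative sketch only, not part of the template; `requireEnv` is a hypothetical helper name, and the values below mirror the example `.env` above:

```typescript
// Hypothetical startup check; variable names match the .env example above.
function requireEnv(name: string, env: Record<string, string | undefined>): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: resolve an OpenRouter-style configuration object.
// In the real app you would pass process.env instead of this sample object.
const env = {
  ANTHROPIC_AUTH_TOKEN: "sk-or-v1-example",
  ANTHROPIC_BASE_URL: "https://openrouter.ai/api",
  MODEL_NAME: "anthropic/claude-sonnet-4-6",
};

const config = {
  apiKey: requireEnv("ANTHROPIC_AUTH_TOKEN", env),
  baseURL: requireEnv("ANTHROPIC_BASE_URL", env),
  model: requireEnv("MODEL_NAME", env),
};

console.log(config.model); // anthropic/claude-sonnet-4-6
```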


### 2. Code Modifications
Find the chat route file (likely `app/routes/chat.ts` or `app/routes/api.chat.ts`) and update the Anthropic client initialization:

Current code likely looks like:
```typescript
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});
```

Change to:

```typescript
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_AUTH_TOKEN || process.env.ANTHROPIC_API_KEY,
  baseURL: process.env.ANTHROPIC_BASE_URL || undefined,
});

const model = process.env.MODEL_NAME || 'claude-sonnet-4-6';
```

Then update the `messages.create` call to use the model variable, passing the optional OpenRouter attribution headers as per-request options (the Anthropic SDK accepts request options as a second argument):

```typescript
const response = await anthropic.messages.create(
  {
    model,  // read from the environment instead of a hardcoded 'claude-...'
    messages: messageHistory,
    tools: mcpTools,
    max_tokens: 4096,
  },
  {
    headers: {
      'HTTP-Referer': process.env.APP_URL || 'http://localhost:3000',
      'X-Title': 'Shopify Storefront Agent',
    },
  },
);
```
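The attribution headers can also be factored into a small helper so the referer falls back consistently. This is a sketch, not part of the template; `openRouterHeaders` is an illustrative name:

```typescript
// Illustrative helper: builds the optional OpenRouter attribution headers.
// 'HTTP-Referer' and 'X-Title' are used by OpenRouter for app analytics.
function openRouterHeaders(appUrl?: string): Record<string, string> {
  return {
    "HTTP-Referer": appUrl || "http://localhost:3000",
    "X-Title": "Shopify Storefront Agent",
  };
}

console.log(openRouterHeaders(undefined)["HTTP-Referer"]); // http://localhost:3000
```

In the route you would call it as `openRouterHeaders(process.env.APP_URL)`.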

### 3. Type Safety (if using TypeScript)

If TypeScript complains about `baseURL` or the headers, configure them through the client constructor instead — `baseURL` and `defaultHeaders` are both supported `ClientOptions` in recent versions of the Anthropic SDK, so an `as any` cast should not be necessary:

```typescript
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_AUTH_TOKEN!,
  baseURL: process.env.ANTHROPIC_BASE_URL!,
  defaultHeaders: {
    'HTTP-Referer': process.env.APP_URL || 'http://localhost:3000',
    'X-Title': 'Shopify Storefront Agent',
  },
});
```

### 4. Documentation Update

Create a comment block at the top of the modified file explaining:

- This uses OpenRouter instead of direct Anthropic
- Any model from openrouter.ai/models can be used by changing `MODEL_NAME`
- The format is `provider/model-name` (e.g. `anthropic/claude-sonnet-4-6`, `openai/gpt-4o`, `google/gemini-2.5-pro`)
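As an optional aside, the `provider/model-name` format lends itself to a simple sanity check before the client is constructed. This is an illustrative sketch; `isOpenRouterModelName` is a hypothetical helper, and the regex is a loose approximation of the format rather than OpenRouter's official grammar:

```typescript
// Illustrative check that MODEL_NAME follows the provider/model-name format,
// e.g. "anthropic/claude-sonnet-4-6" or "google/gemini-2.5-pro".
function isOpenRouterModelName(name: string): boolean {
  return /^[a-z0-9-]+\/[A-Za-z0-9._-]+$/.test(name);
}

console.log(isOpenRouterModelName("anthropic/claude-sonnet-4-6")); // true
console.log(isOpenRouterModelName("claude-sonnet-4-6")); // false (no provider prefix)
```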

## Steps to Execute

1. @-mention the `.env` file and update it with the new variables
2. @-mention the chat route file (search for the Anthropic client initialization) and apply the modifications
3. Check whether there is a `.env.example` and update it as well
4. Run `npm run typecheck` or `tsc --noEmit` to verify there are no TypeScript errors
5. Create a summary of the changes made

## Verification

Ensure the implementation:

- Uses `ANTHROPIC_AUTH_TOKEN` for the API key
- Sets `baseURL` to `https://openrouter.ai/api`
- Reads the model from the `MODEL_NAME` environment variable
- Maintains all existing MCP tool functionality
- Includes the proper headers for OpenRouter analytics
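The checklist above can also be condensed into a quick self-check script. This is a sketch over a plain config object with the sample values from the `.env` section, not part of the template:

```typescript
// Sketch: assert the resolved configuration matches the verification checklist.
// Values are the sample env values from earlier; in practice, read process.env.
const resolved = {
  apiKey: "sk-or-v1-example",            // from ANTHROPIC_AUTH_TOKEN
  baseURL: "https://openrouter.ai/api",  // OpenRouter endpoint
  model: "anthropic/claude-sonnet-4-6",  // from MODEL_NAME
};

const checks: Array<[string, boolean]> = [
  ["API key comes from ANTHROPIC_AUTH_TOKEN", resolved.apiKey.startsWith("sk-or-")],
  ["baseURL points at OpenRouter", resolved.baseURL === "https://openrouter.ai/api"],
  ["model uses provider/model-name format", resolved.model.includes("/")],
];

for (const [label, ok] of checks) {
  console.log(`${ok ? "PASS" : "FAIL"}: ${label}`);
}
```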

Do this now and report back the files modified and any additional steps needed.


**How to use this:**
1. Open Claude Code in your Shopify agent project directory
2. Paste the above prompt
3. Claude will:
   - Search for the chat route file using the Anthropic SDK
   - Modify the `.env` configuration  
   - Update the client initialization code
   - Run type checking to verify
   - Report what was changed

**Alternative shorter version** if you want to be more hands-on:

```markdown
@.env @app/routes Update Shopify Storefront AI Agent to use OpenRouter instead of direct Anthropic. Change ANTHROPIC_API_KEY to ANTHROPIC_AUTH_TOKEN, add ANTHROPIC_BASE_URL=https://openrouter.ai/api, update the Anthropic client in the chat route to use baseURL parameter, and make the model name read from MODEL_NAME env var (format: provider/model-name). Keep everything else the same.