Overview / TL;DR
This post shows how to add an AI chatbot to Shopify that can:
- Answer FAQs and product questions
- Suggest products (personalized recommendations)
- Look up order status (via secure server + Shopify Admin API)
- Escalate to human support
You’ll get:
- Frontend widget (vanilla JavaScript) to embed in the storefront.
- Secure backend (Node/Express) that proxies requests to OpenAI and the Shopify Admin API.
- Webhook & signature verification tips and example code.
- Security, privacy, and deployment notes.
1. Architecture (high level)
- Shopify storefront (client) — small JS widget embedded in theme. Collects user messages and sends to your backend.
- Your backend (server) — verifies requests, reads store data via Shopify Admin API (if needed), forwards messages to OpenAI (or other LLM), returns responses.
- Optional: Database — store conversation logs, user context, order lookups.
- Human fallback — bot provides option to contact human team.
Security rules:
- Never call OpenAI or the Shopify Admin API directly from the browser; always go through your server so API keys stay out of client code.
- Verify incoming webhooks (Shopify HMAC).
- Rate-limit and sanitize inputs.
2. Storefront widget (vanilla JS)
Drop this into your theme (e.g., in theme.liquid just before the closing </body> tag). It renders a small chat panel and talks to /api/chat on your backend.
<!-- chatbot-widget.html -->
<style>
/* minimal styles */
#ai-chatbot {
position: fixed;
right: 20px;
bottom: 24px;
width: 320px;
max-height: 70vh;
box-shadow: 0 8px 30px rgba(0,0,0,0.15);
border-radius: 12px;
overflow: hidden;
font-family: Arial, sans-serif;
z-index: 9999;
}
#ai-chatbot .header { padding:10px; background:#0b6ef6; color:white; }
#ai-chatbot .messages { padding:12px; height:340px; overflow:auto; background:#fff; }
#ai-chatbot .input { display:flex; padding:8px; background:#f7f7f7; }
#ai-chatbot input { flex:1; padding:8px; border-radius:6px; border:1px solid #ddd; }
#ai-chatbot button { margin-left:8px; padding:8px 12px; border-radius:6px; border:none; background:#0b6ef6; color:white; }
.msg { margin-bottom:8px; }
.msg.bot { color:#111; background:#f0f4ff; padding:8px; border-radius:8px; display:inline-block; }
.msg.user { color:#fff; background:#0b6ef6; padding:8px; border-radius:8px; display:inline-block; float:right; }
</style>
<div id="ai-chatbot" aria-live="polite" role="dialog" aria-label="Shop assistant">
<div class="header">Shop Assistant</div>
<div class="messages" id="chatMessages"></div>
<div class="input">
<input id="chatInput" placeholder="Ask about a product, order, or size..." />
<button id="chatSend">Send</button>
</div>
</div>
<script>
(function(){
const messagesEl = document.getElementById('chatMessages');
const inputEl = document.getElementById('chatInput');
const btn = document.getElementById('chatSend');
function appendMsg(text, klass='bot'){
const div = document.createElement('div');
div.className = 'msg ' + klass;
div.innerText = text;
messagesEl.appendChild(div);
messagesEl.scrollTop = messagesEl.scrollHeight;
}
// Optional: attach visitor context (cart, product handle) sent to backend
function buildContext() {
// Example: include current product handle if present (Shopify Liquid variable)
// You can render shop/product variables into window._shopify_chat_context in theme.
return window._shopify_chat_context || {};
}
async function sendMessage(message) {
appendMsg(message, 'user');
appendMsg('Thinking...', 'bot');
try {
const resp = await fetch('/api/chat', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
message,
context: buildContext()
})
});
const data = await resp.json();
// remove last "Thinking..." message
const last = messagesEl.querySelector('.msg.bot:last-child');
if (last && last.innerText === 'Thinking...') last.remove();
appendMsg(data.reply || 'Sorry, I could not answer that right now.', 'bot');
} catch (err) {
  console.error(err);
  // remove the "Thinking..." placeholder before showing the error
  const last = messagesEl.querySelector('.msg.bot:last-child');
  if (last && last.innerText === 'Thinking...') last.remove();
  appendMsg('Error communicating with server.', 'bot');
}
}
btn.addEventListener('click', () => {
const text = inputEl.value.trim();
if (!text) return;
sendMessage(text);
inputEl.value = '';
});
inputEl.addEventListener('keydown', (e) => {
if (e.key === 'Enter') btn.click();
});
})();
</script>
Notes
- The widget calls /api/chat on your server. Implement that endpoint next.
- Optionally pre-fill window._shopify_chat_context via Liquid in the theme (customer email, product handle, cart totals), but be cautious about exposing PII; a minimal example follows.
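For instance, a small snippet you could add to theme.liquid (a sketch only: product.handle, customer.email, and cart.item_count are standard Liquid objects, and window._shopify_chat_context is simply the name the widget above reads; include only fields you are comfortable sending to your own server):
<!-- context-snippet.liquid -->
<script>
  // Context the chat widget sends along with each message (see buildContext above).
  window._shopify_chat_context = {
    {% if product %}product_handle: {{ product.handle | json }},{% endif %}
    {% if customer %}customer_email: {{ customer.email | json }},{% endif %}
    cart_item_count: {{ cart.item_count | json }}
  };
</script>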
3. Server: Node/Express backend (proxy to OpenAI + Shopify)
This is a minimal example showing:
- Accept messages from the widget
- Optionally fetch product or order info from Shopify Admin API
- Call OpenAI Chat API (or other LLM) securely
- Return a reply
Install dependencies:
npm init -y
npm i express node-fetch@2 dotenv body-parser
(crypto is a built-in Node module, so it doesn't need to be installed. node-fetch is pinned to v2 because v3 is ESM-only and this example uses require(); on Node 18+ you can drop it and use the global fetch. You can also use the official openai SDK instead; this example calls the API with fetch and an API key from the environment.)
// server.js
require('dotenv').config();
const express = require('express');
const fetch = require('node-fetch');
const bodyParser = require('body-parser');
const crypto = require('crypto');
const app = express();
// Parse JSON for the chat API only; the webhook route below needs the raw body for HMAC verification.
app.use('/api', bodyParser.json());
// Load environment variables
const OPENAI_API_KEY = process.env.OPENAI_API_KEY; // your OpenAI key
const SHOPIFY_ADMIN_TOKEN = process.env.SHOPIFY_ADMIN_TOKEN; // if needed
const SHOPIFY_STORE = process.env.SHOPIFY_STORE; // example: myshop.myshopify.com
// Basic rate-limit & simple abuse protection (illustrative)
const rateMap = new Map();
function rateCheck(ip) {
const now = Date.now();
const entry = rateMap.get(ip) || { ts: now, count: 0 };
if (now - entry.ts > 60*1000) { entry.ts = now; entry.count = 0; }
entry.count++;
rateMap.set(ip, entry);
return entry.count < 40; // e.g., 40 req/min allowed
}
// Endpoint for chat widget
app.post('/api/chat', async (req, res) => {
try {
const ip = req.ip || req.socket.remoteAddress;
if (!rateCheck(ip)) return res.status(429).json({ error: 'Too many requests' });
const { message, context } = req.body;
if (!message || typeof message !== 'string') {
return res.status(400).json({ error: 'Invalid message' });
}
// OPTIONAL: enrich prompt with Shopify data (product info / order) based on context
let shopContext = '';
if (context && context.product_handle) {
// fetch product info from Shopify Admin API
const prodResp = await fetch(`https://${SHOPIFY_STORE}/admin/api/2024-07/products.json?handle=${encodeURIComponent(context.product_handle)}`, {
headers: { 'X-Shopify-Access-Token': SHOPIFY_ADMIN_TOKEN, 'Content-Type': 'application/json' }
});
if (prodResp.ok) {
const prodJson = await prodResp.json();
// add short product summary to prompt
if (prodJson.products && prodJson.products.length > 0) {
const p = prodJson.products[0];
shopContext = `Product: ${p.title}. Short description: ${(p.body_html || '').replace(/<[^>]+>/g, '').slice(0, 200)}. `;
}
}
}
// Build messages for OpenAI ChatCompletion
const system = `You are a helpful shopping assistant for ${SHOPIFY_STORE}. Use context from Shopify when provided. Keep responses concise, show product links when relevant.`;
const messages = [
{ role: 'system', content: system },
{ role: 'user', content: shopContext + '\nUser: ' + message }
];
// Call OpenAI (Chat Completions)
const openaiResp = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
headers: { 'Authorization': `Bearer ${OPENAI_API_KEY}`, 'Content-Type': 'application/json' },
body: JSON.stringify({
model: 'gpt-4o-mini', // replace with your model
messages,
max_tokens: 350,
temperature: 0.2
})
});
if (!openaiResp.ok) {
const errText = await openaiResp.text();
console.error('OpenAI error', errText);
return res.status(500).json({ error: 'AI service error' });
}
const result = await openaiResp.json();
const reply = result.choices?.[0]?.message?.content?.trim() || 'Sorry, I could not generate a response.';
// Optionally store conversation in DB here
return res.json({ reply });
} catch (err) {
console.error(err);
res.status(500).json({ error: 'Server error' });
}
});
// Example webhook endpoint to verify Shopify webhooks (for order lookup or updates)
app.post('/webhooks/shopify', express.raw({ type: '*/*' }), (req, res) => {
const hmac = req.get('X-Shopify-Hmac-Sha256') || '';
const secret = process.env.SHOPIFY_WEBHOOK_SECRET || '';
const digest = crypto.createHmac('sha256', secret).update(req.body).digest('base64');
// timingSafeEqual throws if the buffers differ in length, so check that first.
const valid = hmac.length === digest.length &&
  crypto.timingSafeEqual(Buffer.from(digest), Buffer.from(hmac));
if (!valid) {
  console.warn('Invalid Shopify webhook signature');
  return res.status(401).send('Invalid signature');
}
const payload = JSON.parse(req.body.toString('utf8'));
// handle webhook...
res.status(200).send('OK');
});
const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`Server listening on ${port}`));
Important env vars
OPENAI_API_KEY=sk-...
SHOPIFY_ADMIN_TOKEN=shpat_...
SHOPIFY_STORE=your-store.myshopify.com
SHOPIFY_WEBHOOK_SECRET=...
Notes
- Use a current Shopify Admin API version and grant your app the read_products and read_orders access scopes as needed.
- Replace gpt-4o-mini with whichever model you use, and adapt the request shape to your environment/SDK.
4. Example: Add order lookup command in chatbot
When a customer asks “Where’s my order?”, the bot should verify the user before revealing order status.
Flow:
- User types “Where is my order #1001?”
- Widget sends the message plus (optionally) customer_email from Liquid if the customer is logged in.
- Server parses the message; if it detects an order-lookup intent, it:
  - Confirms with the user: “Please confirm the email ending with @example.com or provide your order number.”
  - Calls the Shopify Admin API: GET /admin/api/2024-07/orders/{order_id}.json (or searches by order number and email).
  - Returns safe order status details (no full PII).
Pseudo-code snippet (intent detection + order lookup)
// in /api/chat, after receiving message:
if (message.match(/where.*order|track.*order|order status/i)) {
// if context contains customer email and order number, proceed
// else ask for a confirming detail (last 4 digits of phone or email domain)
// fetch order via Shopify Admin API (only after confirmation)
}
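A more concrete sketch of that branch, assuming the server.js setup above (fetch, SHOPIFY_STORE, SHOPIFY_ADMIN_TOKEN, and the parsed message/context). The regex, the customer_email context field, and the "scan the most recent 250 orders" shortcut are illustrative; a production version would use a narrower query or the GraphQL Admin API.
// In /api/chat, after parsing { message, context } (illustrative sketch):
const orderIntent = /where.*order|track.*order|order status/i.test(message);
const orderNumberMatch = message.match(/#?(\d{3,})/); // crude order-number extraction

if (orderIntent) {
  // Ask for verification unless the theme supplied a logged-in customer email.
  if (!context || !context.customer_email) {
    return res.json({ reply: 'I can check that. What email address was the order placed with?' });
  }
  if (!orderNumberMatch) {
    return res.json({ reply: 'Sure - what is your order number?' });
  }
  // Fetch recent orders and match on order number + email (simplification for the sketch).
  const orderResp = await fetch(
    `https://${SHOPIFY_STORE}/admin/api/2024-07/orders.json?status=any&limit=250`,
    { headers: { 'X-Shopify-Access-Token': SHOPIFY_ADMIN_TOKEN } }
  );
  const orders = orderResp.ok ? (await orderResp.json()).orders : [];
  const order = orders.find(o =>
    String(o.order_number) === orderNumberMatch[1] &&
    (o.email || '').toLowerCase() === context.customer_email.toLowerCase()
  );
  if (!order) {
    return res.json({ reply: 'I could not find an order matching that number and email.' });
  }
  // Return only safe, high-level status details - no addresses or payment info.
  return res.json({
    reply: `Order ${order.name} is ${order.fulfillment_status || 'not yet fulfilled'} (payment: ${order.financial_status}).`
  });
}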
5. Optional: PHP backend example (lighter)
If you prefer PHP, here’s a minimal example to forward chat to OpenAI:
<?php
// endpoint: /api/chat.php
require 'vendor/autoload.php'; // if using composer
$openaiKey = getenv('OPENAI_API_KEY');
$input = json_decode(file_get_contents('php://input'), true);
$message = $input['message'] ?? '';
if (!$message) {
http_response_code(400);
echo json_encode(['error'=>'No message']);
exit;
}
$system = "You are a succinct shopping assistant for mystore.";
$payload = [
'model' => 'gpt-4o-mini',
'messages' => [
['role'=>'system','content'=>$system],
['role'=>'user','content'=>$message]
],
'max_tokens' => 300,
];
$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt($ch, CURLOPT_HTTPHEADER, [
"Authorization: Bearer $openaiKey",
"Content-Type: application/json"
]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($payload));
$resp = curl_exec($ch);
if ($resp === false) {
    http_response_code(502);
    echo json_encode(['error' => 'AI service error: ' . curl_error($ch)]);
    curl_close($ch);
    exit;
}
curl_close($ch);
$data = json_decode($resp, true);
$reply = $data['choices'][0]['message']['content'] ?? 'Sorry, I could not generate a response.';
header('Content-Type: application/json');
echo json_encode(['reply' => $reply]);
6. Best practices & deployment tips
Data privacy
- Store minimal PII. If storing, encrypt at rest.
- Add a visible privacy note in the chat: “This chat stores conversation for support purposes.”
- Comply with GDPR/CCPA if you operate in relevant regions.
Authentication
- Use OAuth or app access tokens for Shopify Admin API.
- Verify Shopify webhooks via HMAC signature (example included).
- Rate-limit requests to OpenAI and Shopify.
Performance
- Cache common responses (FAQs) on the server to reduce LLM calls (see the sketch after this list).
- Use a smaller, cheaper model for basic Q&A and upgrade only when necessary.
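A minimal in-memory sketch of that cache (the file name faq-cache.js and the one-hour TTL are illustrative; with multiple server instances you would use Redis or similar). In /api/chat you would check getCachedReply(message) before calling OpenAI and call cacheReply(message, reply) after a successful response.
// faq-cache.js - minimal in-memory cache for common questions (illustrative).
const FAQ_TTL_MS = 60 * 60 * 1000; // entries expire after one hour
const faqCache = new Map();

// Normalize messages so "What's your return policy?" and "return policy" hit the same key.
function normalize(message) {
  return message.toLowerCase().replace(/[^a-z0-9 ]/g, '').trim();
}

function getCachedReply(message) {
  const key = normalize(message);
  const entry = faqCache.get(key);
  if (!entry) return null;
  if (Date.now() - entry.ts > FAQ_TTL_MS) {
    faqCache.delete(key);
    return null;
  }
  return entry.reply;
}

function cacheReply(message, reply) {
  faqCache.set(normalize(message), { reply, ts: Date.now() });
}

module.exports = { getCachedReply, cacheReply };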
User experience
- Provide suggested quick replies (e.g., “Track order”, “Return policy”, “Product size chart”); see the widget snippet after this list.
- Show product thumbnails and direct “Add to cart” buttons when making suggestions (requires your server to supply product URL/handle).
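A small sketch of quick-reply buttons, written to sit inside the widget IIFE from section 2 so it can call sendMessage (the labels and styles are just examples):
// Quick-reply buttons (place inside the widget IIFE so sendMessage is in scope).
const quickReplies = ['Track order', 'Return policy', 'Product size chart']; // example labels
const quickBar = document.createElement('div');
quickBar.style.cssText = 'padding:8px; background:#f7f7f7; display:flex; gap:6px; flex-wrap:wrap;';
quickReplies.forEach((label) => {
  const b = document.createElement('button');
  b.textContent = label;
  b.style.cssText = 'border:1px solid #ddd; border-radius:12px; padding:4px 10px; background:#fff; cursor:pointer;';
  b.addEventListener('click', () => sendMessage(label));
  quickBar.appendChild(b);
});
// Insert the bar between the message list and the input row.
const widgetEl = document.getElementById('ai-chatbot');
widgetEl.insertBefore(quickBar, widgetEl.querySelector('.input'));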
7. Example prompt engineering (short, safe)
Use controlled prompts so the chatbot stays on-topic:
System prompt:
"You are a helpful shopping assistant for {store_name}. Keep answers short (1-3 sentences). When user asks about products, try to include product title and a short link. If the user asks for order details, ask for verification (email domain or last 4 digits of phone) before fetching order data. Never request full credit card, passwords, or other full PII."
8. Monitoring & analytics
- Log chat interactions (anonymized) to measure top intents.
- Track conversions that originated from chatbot suggestions (e.g., by adding UTM parameters to links the bot recommends).
- Monitor OpenAI errors and implement fallbacks.
9. Example FAQs to seed the bot (useful for fine-tuning or prompt context)
- Return policy
- Shipping times
- Size guide
- How to apply discount codes
- Store hours and contact info
Seed these into the system message or a retrieval system to improve accuracy and reduce LLM calls.
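A minimal sketch of the system-message approach, against the server.js above (the FAQ strings are placeholders for your real policies; a larger FAQ set belongs in a retrieval step instead of being inlined):
// In server.js: seed store FAQs into the system prompt (placeholder content).
const STORE_FAQS = [
  'Returns: items can be returned within 30 days in original condition.',
  'Shipping: orders ship within 2 business days; delivery takes 3-7 days.',
  'Discounts: discount codes are entered at checkout and cannot be combined.'
].join('\n');

// Replaces the `system` constant inside /api/chat.
const system = `You are a helpful shopping assistant for ${SHOPIFY_STORE}. ` +
  `Answer from these store FAQs when relevant:\n${STORE_FAQS}\n` +
  `If the answer is not covered, say so and offer to connect the customer with support.`;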
10. Final checklist before launch
- Use server-side proxy: no secrets in client code
- Implement Shopify webhook signature verification
- Limit LLM usage and add caching
- Provide a human fallback option
- Add privacy notice & logging policy
- Test flows: FAQ, product recommendation, order lookup