Published a new article and waiting for Google to eventually find it? Skip the wait. This guide shows you how to knock on Google’s door directly using the Indexing API — and get indexed in hours, not days.
Why You Need This
By default, when you publish a new page, Google might take 3–7 days to discover and index it. That’s a lot of dead time for content that should already be ranking.
Google offers a free API — the Indexing API — that lets you proactively tell Google: “Hey, I’ve got new content. Come take a look.”
- Result: Indexed in hours, not days
- 200 free URL submissions per day — more than enough for most sites
How It Works (30-Second Overview)
You publish an article
↓
A script scans recently published pages
↓
Calls Google Indexing API to submit URLs
↓
Pings Google/Bing sitemap endpoints
↓
Google crawls your pages within hours
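Under the hood, every submission in that flow is one authenticated POST to the Indexing API endpoint. A minimal sketch of the request shape (illustrative only; the script in Step 3 uses the googleapis client, which assembles and signs this for you):

```typescript
// Sketch of the raw notification the Indexing API expects.
// The endpoint and body fields come from Google's Indexing API;
// the helper function itself is just for illustration.
const ENDPOINT = 'https://indexing.googleapis.com/v3/urlNotifications:publish';

function buildNotification(url: string) {
  return {
    endpoint: ENDPOINT,
    // 'URL_UPDATED' covers both new and changed pages;
    // 'URL_DELETED' is the other supported type, for removed pages.
    body: JSON.stringify({ url, type: 'URL_UPDATED' }),
  };
}

const req = buildNotification('https://your-site.com/blog/your-new-post');
console.log(req.body);
// → {"url":"https://your-site.com/blog/your-new-post","type":"URL_UPDATED"}
```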
What you need:
- A Google Cloud project (free)
- A Service Account (free)
- Node.js
- Google Search Console access for your site
Step 1: Create a Google Service Account (2 min)
1.1 Open Google Cloud Console and create a project (or use an existing one)
1.2 Left menu → IAM & Admin → Service Accounts → Create Service Account
- Name it anything, e.g. my-site-indexing
- Skip the role selection, just finish
1.3 Click into the Service Account → Keys → Add Key → Create new key → Choose JSON → Download
You’ll get a .json file like this:
{
  "type": "service_account",
  "project_id": "your-project",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...",
  "client_email": "[email protected]"
}
Save that client_email — you’ll need it next.
1.4 Enable the Indexing API: Go to the Indexing API page and click Enable
Step 2: Grant Search Console Owner Access (1 min)
This is the tricky part — Google requires the Indexing API caller to be a site Owner.
But Search Console’s UI won’t let you add a Service Account as Owner directly (since Service Accounts don’t have a login interface).
The workaround: Use Google OAuth Playground to add it via API.
2.1 Open OAuth Playground
2.2 Find Site Verification API v1 → Check https://www.googleapis.com/auth/siteverification
2.3 Click Authorize APIs → Sign in with your Google account (the verified site owner)
2.4 Click Exchange authorization code for tokens
2.5 In Step 3, fill in:
- HTTP Method: PUT
- Request URI: https://www.googleapis.com/siteVerification/v1/webResource/https%3A%2F%2Fyour-site.com%2F (replace your-site.com with your actual domain)
- Content-Type: application/json
- Request Body:

{
  "owners": ["[email protected]"],
  "site": {
    "type": "SITE",
    "identifier": "https://your-site.com/"
  }
}
2.6 Click Send the request → A 200 response means you’re all set
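If you would rather script this than click through the Playground, the same PUT can be built from code. A sketch that constructs the exact request shown above (you still need an OAuth access token with the siteverification scope to actually send it; the helper name is hypothetical):

```typescript
// Build the Site Verification API request that adds a Service Account
// as a site owner. Actually sending it (e.g. via fetch with a Bearer
// token) is intentionally left out of this sketch.
function buildOwnerRequest(site: string, serviceAccountEmail: string) {
  return {
    method: 'PUT' as const,
    url: `https://www.googleapis.com/siteVerification/v1/webResource/${encodeURIComponent(site)}`,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      owners: [serviceAccountEmail],
      site: { type: 'SITE', identifier: site },
    }),
  };
}

const ownerReq = buildOwnerRequest(
  'https://your-site.com/',
  '[email protected]',
);
console.log(ownerReq.url);
```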
Step 3: Write the Submission Script (2 min)
Install the dependency:
npm install googleapis
Create submit-indexing.ts (or .js):
import { google } from 'googleapis';

// URLs to submit
const URLS = [
  'https://your-site.com/',
  'https://your-site.com/blog/your-new-post',
];

async function main() {
  // Service Account credentials are passed in via an environment variable
  const credentials = JSON.parse(process.env.GOOGLE_SERVICE_ACCOUNT_KEY || '{}');
  const auth = new google.auth.GoogleAuth({
    credentials,
    scopes: ['https://www.googleapis.com/auth/indexing'],
  });
  const indexing = google.indexing({ version: 'v3', auth });

  for (const url of URLS) {
    try {
      await indexing.urlNotifications.publish({
        requestBody: { url, type: 'URL_UPDATED' },
      });
      console.log(`✓ Submitted: ${url}`);
    } catch (error) {
      console.warn(`✗ Failed: ${url}`, (error as Error).message);
    }
  }
}

main();
Run it:
GOOGLE_SERVICE_ACCOUNT_KEY="$(cat your-service-account.json)" npx tsx submit-indexing.ts
You should see ✓ Submitted for each URL.
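Two failures come up constantly here, and both have a specific fix. A small helper (hypothetical, not part of googleapis) you could call from the catch block to turn an HTTP status into an actionable hint:

```typescript
// Map common Indexing API HTTP statuses to actionable hints.
// 403 and 429 are the usual suspects for this setup.
function explainIndexingError(status: number): string {
  switch (status) {
    case 403:
      return 'Permission denied: the Service Account is not a Search Console owner (redo Step 2).';
    case 429:
      return 'Quota exhausted: the 200-submissions-per-day allowance is used up.';
    default:
      return `Unexpected HTTP ${status}; check the response body.`;
  }
}

console.log(explainIndexingError(403));
```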
Step 4: Add Sitemap Ping (30 sec)
Even simpler — one file:
// ping-sitemap.ts
const SITEMAP = 'https://your-site.com/sitemap.xml';

const endpoints = [
  `https://www.google.com/ping?sitemap=${encodeURIComponent(SITEMAP)}`,
  `https://www.bing.com/ping?sitemap=${encodeURIComponent(SITEMAP)}`,
];

for (const url of endpoints) {
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(10000) });
    console.log(`✓ Pinged ${new URL(url).hostname}: ${res.status}`);
  } catch (e) {
    console.warn(`✗ Failed: ${new URL(url).hostname}`);
  }
}
⚠️ Note: Google officially deprecated its sitemap ping endpoint (it now returns 404), but Bing's still works fine. With the Indexing API doing the heavy lifting, the sitemap ping is just a bonus.
Step 5: Automate It (Optional but Recommended)
If you deploy via GitHub Actions, add two steps after your deploy:
- name: Submit new pages to Google
  continue-on-error: true
  run: npx tsx submit-indexing.ts
  env:
    GOOGLE_SERVICE_ACCOUNT_KEY: ${{ secrets.GOOGLE_SERVICE_ACCOUNT_KEY }}

- name: Ping Sitemap
  continue-on-error: true
  run: npx tsx ping-sitemap.ts
Store the Service Account JSON in GitHub Secrets (Settings → Secrets → GOOGLE_SERVICE_ACCOUNT_KEY).
continue-on-error: true is important — a failed submission shouldn’t break your deployment.
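One optional refinement: if your repo mixes content with code, you can trigger these steps only when content files change, so routine deploys don't spend quota. A sketch (the path is an assumption; adjust it to your content directory):

```yaml
on:
  push:
    paths:
      - 'src/content/**'  # assumed content directory; adjust to yours
```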
Advanced: Auto-Discover New Articles
The basic script requires manually listing URLs. If your site uses Markdown content (Astro, Next.js, Hugo, etc.), you can auto-scan for recently published articles:
import { readdirSync, readFileSync } from 'fs';

function findRecentArticles(contentDir: string, hoursBack = 48) {
  const urls: string[] = [];
  const cutoff = Date.now() - hoursBack * 3600 * 1000;

  for (const file of readdirSync(contentDir)) {
    if (!file.endsWith('.md')) continue;
    const content = readFileSync(`${contentDir}/${file}`, 'utf-8');
    // Look for a frontmatter line like: pubDate: 2024-05-01
    const match = content.match(/pubDate:\s*(.+)/);
    if (!match) continue;

    const pubDate = new Date(match[1].trim());
    if (pubDate.getTime() > cutoff) {
      const slug = file.replace('.md', '');
      urls.push(`https://your-site.com/blog/${slug}`);
    }
  }
  return urls;
}
Now every time you publish, the pipeline discovers and submits automatically — zero manual work.
Summary
| Step | Time | What |
|---|---|---|
| Create Service Account | 2 min | Google Cloud Console |
| Add Owner access | 1 min | OAuth Playground API call |
| Write submission script | 2 min | Copy-paste the code above |
| Sitemap Ping | 30 sec | A single fetch call |
| Wire into CI/CD | 30 sec | Two steps in your workflow |
Cost: $0. Result: Indexing speed goes from days to hours.
The only limit is the default quota of 200 URL submissions per day. Unless you run a news site publishing hundreds of articles daily, that's more than enough.
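And if a backfill or site migration ever could push you past that cap, a tiny guard keeps a batch within quota (a sketch; the constant matches the default per-project limit):

```typescript
// Trim a URL batch to the Indexing API's default daily quota (200/day),
// deferring the overflow to the next run.
const DAILY_QUOTA = 200;

function capToQuota(urls: string[], quota = DAILY_QUOTA): string[] {
  if (urls.length > quota) {
    console.warn(`Deferring ${urls.length - quota} URLs; submit them tomorrow.`);
  }
  return urls.slice(0, quota);
}

const batch = capToQuota(
  Array.from({ length: 250 }, (_, i) => `https://your-site.com/p/${i}`),
);
console.log(batch.length); // → 200
```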