SEO Guide

Everything you need to know about search engine optimization for your Nexting-hosted pages. What we handle automatically, what you need to configure, and best practices.

What Nexting Handles Automatically

Every page served through the proxy includes these SEO elements out of the box. You don't need to configure any of this.

  • Canonical URLs: Every page includes a canonical link pointing to your domain (e.g., yourdomain.com/blog/article), telling search engines this is the authoritative version.
  • Meta Tags: Title, description, and keywords meta tags are automatically generated from your page content and settings.
  • Open Graph & Twitter Cards: Social sharing metadata is included so your pages look great when shared on social media platforms.
  • Structured Data (JSON-LD): Article schema markup is injected into every page, helping search engines understand your content structure, publish dates, and authorship.
  • Sitemap Generation: An XML sitemap is auto-generated at your path prefix (e.g., yourdomain.com/resources/sitemap.xml) containing all published pages with last-modified dates.
  • Robots.txt: A robots.txt file is generated that allows search engine and AI crawlers to access your published content.
  • AI Crawler Optimization: When AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) visit your pages, they receive clean semantic HTML optimized for AI comprehension.
  • Search Engine Notifications: When you publish a page, we automatically notify Google and Bing via IndexNow and sitemap ping so they discover your new content faster.
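Put together, the injected head elements look roughly like the fragment below. This is a simplified illustration only; the exact tags, attribute ordering, and schema fields Nexting emits may differ, and the URLs, titles, and dates are placeholders:

```html
<!-- Canonical URL pointing at your domain, not nexting.ai -->
<link rel="canonical" href="https://yourdomain.com/blog/article" />

<!-- Basic meta tags generated from page content and settings -->
<meta name="description" content="A short summary of the article." />

<!-- Open Graph / Twitter Card metadata for social sharing -->
<meta property="og:title" content="Article Title" />
<meta property="og:url" content="https://yourdomain.com/blog/article" />
<meta name="twitter:card" content="summary_large_image" />

<!-- Article structured data (JSON-LD) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Article Title",
  "datePublished": "2025-01-15",
  "author": { "@type": "Person", "name": "Author Name" }
}
</script>
```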

What You Need to Do

While Nexting handles on-page SEO automatically, there are a few things only you can do as the domain owner.

1. Set Up Google Search Console

Google Search Console is essential for monitoring how Google indexes your pages. Without it, you're flying blind.

  1. Go to Google Search Console
  2. Add your domain property (e.g., yourdomain.com)
  3. Verify ownership via DNS TXT record (recommended) or HTML meta tag
  4. Submit your Nexting sitemap (see next step)

DNS Verification

If your domain is on Cloudflare, Vercel, or similar platforms, DNS TXT verification is the easiest method. It verifies the entire domain at once, including all subpaths.

2. Submit Your Sitemap

After verifying your domain, submit the Nexting-generated sitemap so Google knows about your pages.

In Google Search Console, go to Sitemaps and add:

```text
https://yourdomain.com/resources/sitemap.xml
```

Replace /resources with your actual path prefix. If your prefix is /, the sitemap is at the root.
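The mapping from prefix to sitemap location can be sketched in a few lines of Python. The `sitemap_url` helper below is purely illustrative (it is not part of any Nexting SDK), but it shows how the root-prefix case collapses:

```python
def sitemap_url(domain: str, prefix: str = "/") -> str:
    """Build the expected sitemap URL for a given domain and path prefix."""
    path = prefix.strip("/")
    base = f"https://{domain}/"
    return base + (f"{path}/sitemap.xml" if path else "sitemap.xml")

print(sitemap_url("yourdomain.com", "/resources"))
# https://yourdomain.com/resources/sitemap.xml
print(sitemap_url("yourdomain.com", "/"))
# https://yourdomain.com/sitemap.xml
```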

Multiple Sitemaps

If your site already has its own sitemap, you can submit both. Google supports multiple sitemaps per property. You can also create a sitemap index file that references both.

3. Check Your Robots.txt

Make sure your main site's robots.txt doesn't block the path prefix where Nexting pages live.

Good:

```text
User-agent: *
Allow: /resources/
Allow: /blog/
```

Bad:

```text
User-agent: *
Disallow: /resources/
Disallow: /blog/
```

If your robots.txt blocks the path prefix, search engines will not index any Nexting pages, even if the sitemap is submitted correctly.
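You can check this programmatically with Python's standard library. A minimal sketch using `urllib.robotparser`, with the rules inlined for demonstration (in practice you would point it at your live robots.txt; the domain and prefix are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules. Here they come from an inline string; against a live
# site you would call rp.set_url("https://yourdomain.com/robots.txt"); rp.read()
rp = RobotFileParser()
rp.parse("""
User-agent: *
Disallow: /resources/
""".splitlines())

# A disallowed prefix means compliant crawlers will not fetch those pages.
print(rp.can_fetch("*", "https://yourdomain.com/resources/my-page"))  # False
print(rp.can_fetch("*", "https://yourdomain.com/blog/my-page"))       # True
```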

4. Set Up Bing Webmaster Tools (Optional)

Many AI search engines (including ChatGPT and Perplexity) rely on Bing's index. Setting up Bing Webmaster Tools can improve AI discoverability.

  1. Go to Bing Webmaster Tools
  2. Import your site directly from Google Search Console, or verify your domain manually
  3. Submit the same sitemap URL

SEO Health Checklist

Use this checklist to verify everything is set up correctly after deploying your pages.

  • Pages return 200 status: Visit your page URL directly — it should load without redirects or errors.
  • Canonical URL points to your domain: View page source and search for <link rel="canonical">. It should show your domain, not nexting.ai.
  • Sitemap is accessible: Visit yourdomain.com/[prefix]/sitemap.xml in your browser. You should see an XML file listing your pages.
  • Robots.txt allows crawling: Visit yourdomain.com/robots.txt and ensure your path prefix is not disallowed.
  • Sitemap submitted to Google: In Google Search Console → Sitemaps, check that the status shows "Success".
  • Pages appear in Google index: Search site:yourdomain.com/[prefix] on Google. Results may take days to weeks to appear.
  • Structured data is valid: Use Google's Rich Results Test (search.google.com/test/rich-results) with your page URL.
  • Meta description is set: View page source and look for <meta name="description">. Fill in descriptions in the Nexting dashboard for best results.
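Several of the source-viewing checks above can be automated. A hedged sketch using only the standard library's `html.parser`: the `SEOTagParser` class is illustrative, and the inline sample stands in for HTML you would fetch from your own page:

```python
from html.parser import HTMLParser

class SEOTagParser(HTMLParser):
    """Collect the canonical URL and meta description from an HTML document."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        if tag == "meta" and a.get("name") == "description":
            self.description = a.get("content")

sample = """<html><head>
<link rel="canonical" href="https://yourdomain.com/blog/article">
<meta name="description" content="What this article covers.">
</head><body></body></html>"""

p = SEOTagParser()
p.feed(sample)
assert "nexting.ai" not in (p.canonical or "")  # canonical must point at YOUR domain
print(p.canonical)    # https://yourdomain.com/blog/article
print(p.description)  # What this article covers.
```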

AI Search Visibility

Beyond traditional search engines, your pages are optimized for AI-powered search platforms like ChatGPT, Perplexity, Google AI Overviews, and Claude.

How AI Search Finds Your Content

AI search engines discover your content through a combination of:

  • Web crawling — AI bots regularly crawl indexed websites
  • Search engine indexes — Many AI platforms use Google or Bing results as a data source
  • Structured data — JSON-LD helps AI understand your content's meaning and context
  • LLMS.txt — A machine-readable summary of your site, specifically designed for LLMs

What Nexting Does for AI Crawlers

When an AI crawler visits your pages, Nexting serves a special optimized version:

  • Clean semantic HTML without CSS frameworks or JavaScript
  • Structured with <article>, <header>, and <section> tags
  • Clear headings, dates, and metadata for easy extraction
  • Robots.txt explicitly allows all major AI crawlers
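Serving a crawler-specific variant typically hinges on user-agent detection. The sketch below shows how such a check might look; the bot list and the `is_ai_crawler` helper are illustrative assumptions, not Nexting's actual implementation:

```python
# Known AI crawler user-agent substrings (illustrative subset).
AI_CRAWLERS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the request's User-Agent matches a known AI crawler."""
    return any(bot.lower() in user_agent.lower() for bot in AI_CRAWLERS)

print(is_ai_crawler("Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)"))  # True
print(is_ai_crawler("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))  # False
```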

LLMS.txt (Optional)

You can enable LLMS.txt in your project settings. This generates two machine-readable files:

  • /llms.txt — Page titles and descriptions
  • /llms-full.txt — Full page content in plain text

These files help LLMs quickly understand what your site offers without crawling every page.
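For reference, llms.txt files generally follow the emerging llmstxt.org convention: a Markdown document with a title, a short summary, and a list of links. A sketch of what the generated file might contain (the page names and URLs are placeholders, and Nexting's exact output format may differ):

```text
# Your Site Name

> A one-line summary of what this site offers.

## Pages

- [SEO Guide](https://yourdomain.com/resources/seo-guide): How to configure
  search visibility for hosted pages.
- [Getting Started](https://yourdomain.com/resources/getting-started): Setup
  and first deployment.
```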

Timeline Expectations

After your pages are indexed by Google, it typically takes 2-8 weeks for AI search platforms to discover and start citing your content. Building backlinks from other websites can accelerate this process.

Tips & Best Practices

Do

  • Write unique, valuable content that answers real questions — AI search engines prioritize authoritative, helpful content.
  • Fill in meta descriptions for every page — they appear in search results and help both users and AI understand your content.
  • Use descriptive page paths like /guide/seo-basics instead of /page-1 — URLs signal relevance to search engines.
  • Include relevant keywords naturally in your content — don't stuff them, but do use the terms your audience searches for.
  • Keep content up to date — search engines prefer fresh, recently-modified content. Use the dashboard to update pages regularly.
  • Add your sitemap to Google Search Console and Bing Webmaster Tools for faster discovery.
  • Build backlinks by sharing your content on relevant forums, social media, and industry directories.

Don't

  • Don't block Nexting page paths in your robots.txt — this prevents search engines from indexing your content.
  • Don't set up 301 redirects on the proxy paths — rewrites (200 status) are required, not redirects.
  • Don't duplicate the same content across multiple pages — search engines penalize duplicate content.
  • Don't leave meta descriptions empty — while not critical, missing descriptions mean Google generates its own snippet, which may not be ideal.
  • Don't expect instant results — SEO is a long-term strategy. It typically takes weeks to months to see significant organic traffic.
  • Don't use cloaking or hidden text — search engines will penalize your entire domain, not just the affected pages.

Common Questions

How long until my pages appear on Google?
After submitting your sitemap, Google typically discovers new pages within 1-3 days. Full indexing can take 1-4 weeks. Check progress in Google Search Console under "Pages".

Will Nexting pages affect my existing site's SEO?
No. Nexting pages are served on their own paths and don't interfere with your existing content. The canonical URLs ensure search engines attribute the content correctly to your domain, which can actually improve your overall domain authority.

Do I need to set up Google Search Console for every project?
If all your Nexting projects use the same domain, one Search Console property is enough. If you use different domains, you'll need a separate property for each.

Why are my pages showing as "Discovered — currently not indexed"?
This is normal for new pages. Google has discovered the URL but hasn't crawled it yet. It usually resolves within a few days. Make sure your content is unique and valuable to improve crawl priority.

Can AI search engines find my pages if Google hasn't indexed them?
Some AI crawlers (like GPTBot and PerplexityBot) crawl independently, but most also rely on search engine indexes. Getting indexed by Google and Bing significantly improves AI search visibility.

How do I check if an AI search engine knows about my content?
Ask the AI directly! Try searching for your brand or topic in ChatGPT, Perplexity, or Google AI Overviews. If your content is well-indexed and authoritative, these platforms will cite it in their responses.

Auto-Generated SEO Files

Nexting generates these files automatically for each project. They're served through the same proxy, so they appear on your domain.

  • sitemap.xml (yourdomain.com/[prefix]/sitemap.xml): Lists all published pages for search engines
  • robots.txt (yourdomain.com/[prefix]/robots.txt): Crawler access rules and sitemap reference
  • llms.txt (yourdomain.com/[prefix]/llms.txt): Machine-readable site summary for AI models
  • llms-full.txt (yourdomain.com/[prefix]/llms-full.txt): Full content export for AI models

Sitemap Index

If your site already has a sitemap, consider creating a sitemap index file at the root that references both your existing sitemap and the Nexting-generated one. This way, search engines find all your content from a single entry point.
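A sitemap index is a small XML document in the sitemaps.org namespace. A minimal sketch that generates one with Python's standard library; the two sitemap URLs are placeholders for your existing sitemap and the Nexting-generated one:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index document referencing each sitemap URL."""
    root = ET.Element("sitemapindex", xmlns=NS)
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")

xml_doc = build_sitemap_index([
    "https://yourdomain.com/sitemap.xml",            # your existing sitemap
    "https://yourdomain.com/resources/sitemap.xml",  # Nexting-generated sitemap
])
print(xml_doc)
```

Serve the result at a root path such as /sitemap-index.xml and submit that single URL to Search Console instead of the individual sitemaps.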
