Why AI Discoverability Matters: Optimizing Your Website for This Generation of Search

The way people discover content has fundamentally shifted. When ChatGPT, Claude, and Perplexity answer questions about your expertise, does your website show up? Here's how to make your content AI-accessible.

Andrea Griffiths · 10 min read
Tags: AI SEO · Developer Advocacy · Web Development · Structured Data

The Search Landscape Has Changed

When was the last time you opened Google to find an answer? Honestly, most developers I know have stopped doing that. They ask ChatGPT. They ask Claude. They ask Perplexity. The way people discover content online has fundamentally shifted, and if your website isn’t optimized for AI agents, you’re basically invisible.

I’m not exaggerating. In 2024-2025, AI-powered search went mainstream. ChatGPT added web search. Claude started fetching and analyzing websites in real-time. Perplexity built an entire search engine around AI-generated answers with citations. GitHub Copilot evolved from code completion to a conversational agent with web search capabilities. Google threw AI overviews into search results. The question isn’t whether AI search will matter anymore. It’s whether your content will be found when someone asks.

I’ve been thinking about it a LOT. When a developer asks Claude “Who are the leading voices in developer advocacy?” or a recruiter asks ChatGPT “Find me experts in AI-assisted development,” your website either shows up as a cited source or it doesn’t exist. There’s no middle ground.

Being invisible to AI is becoming invisible, period.

How AI Agents Actually Discover Content

Traditional search engines like Google use crawlers that index your HTML, analyze your links, and rank based on hundreds of signals. AI agents? They work completely differently. Here’s what they actually need:

  1. Content they can quickly understand without parsing nested <div> tags and CSS classes
  2. Structured information about who you are, what you actually do, and why anyone should care
  3. Credibility signals that are consistent across your metadata
  4. Clear citations with proper titles, descriptions, and URLs they can point to

The thing is, most websites are built for human eyes. Your beautiful hero section with animated gradients? Stunning. To an AI agent, it’s incomprehensible markup. Your carefully crafted “About Me” story split across five components? Good luck extracting that coherently. The information is there, but it’s not AI-accessible.

The Wake-Up Call: My Own AI Audit

I realized this problem when I asked Claude to summarize my own developer advocacy work. I have a portfolio website. I have blog posts everywhere. I have conference talks listed across the web. But when I asked Claude the basic questions someone hiring a speaker would ask, it struggled. It could find my GitHub profile and a few scattered articles, but it couldn’t answer:

  • “What are Andrea’s main projects?”
  • “What topics does she speak about?”
  • “How can I contact her for a speaking engagement?”

The information existed. It was literally on my website. But it wasn’t AI-accessible. That’s when I decided to rebuild my online presence with AI discoverability as a first-class citizen, not an afterthought.

What I Actually Built

Here’s exactly what I implemented to make my portfolio discoverable by ChatGPT, Claude, Perplexity, and every other AI agent crawling the web:

1. Created Markdown Endpoints

The biggest breakthrough was realizing something simple: AI models are trained on text. They speak Markdown natively. So I built dedicated endpoints that serve my content in clean, hierarchical Markdown instead of HTML.

Two key endpoints:

  • /index.md - My complete professional profile, featured projects, talks, and writings in Markdown
  • /writings.md - All my articles with descriptions, tags, and reading times

These endpoints pull from the same data sources as my HTML pages but serve the content in a format AI agents can parse instantly. When ChatGPT or Claude crawls my site, they get structured, semantic content instead of HTML soup.

Technically, it’s straightforward:

  • Simple API routes that fetch from GitHub Blog API, Dev.to API, Sessionize
  • Format everything as clean, hierarchical Markdown
  • Cache responses for 1 hour to reduce server load
  • Return proper Content-Type: text/markdown headers

The payoff? AI agents can understand my entire career in milliseconds.
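To make that concrete, here's a minimal sketch of the idea (the types and function names are illustrative, not my actual code): take structured profile data and render it as clean, hierarchical Markdown that an endpoint can return with the right headers.

```typescript
// Illustrative sketch: turn structured profile data into the kind of
// clean, hierarchical Markdown an AI agent can parse instantly.
interface Talk {
  title: string;
  event: string;
  year: number;
}

interface Profile {
  name: string;
  role: string;
  talks: Talk[];
}

function renderProfileMarkdown(p: Profile): string {
  const lines: string[] = [
    `# ${p.name}`,
    ``,
    `**Role:** ${p.role}`,
    ``,
    `## Talks`,
  ];
  for (const t of p.talks) {
    // One bullet per talk: title, event, year
    lines.push(`- ${t.title} (${t.event}, ${t.year})`);
  }
  return lines.join("\n");
}

// In an Astro/Next-style API route, you'd return this with the proper
// Content-Type and the 1-hour cache mentioned above, roughly:
//   return new Response(renderProfileMarkdown(profile), {
//     headers: {
//       "Content-Type": "text/markdown",
//       "Cache-Control": "public, max-age=3600",
//     },
//   });
```

The exact framework wiring will differ, but the core move is always the same: same data, Markdown out, `text/markdown` header on.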

2. Explicitly Welcomed AI Crawlers (Not Everyone Does This)

Here’s something that surprised me: most sites accidentally block AI crawlers. Overly strict robots.txt rules. Rate limiting that catches bots too. You’re essentially shooting yourself in the foot.

I updated my robots.txt to explicitly welcome every major AI crawler and point them directly to my Markdown endpoints:

# AI Bots and Crawlers - Full access including Markdown endpoints
User-agent: GPTBot
Allow: /
Allow: /index.md
Allow: /writings.md

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: CCBot
Allow: /

User-agent: cohere-ai
Allow: /

That CCBot from Common Crawl? Don’t sleep on it. Its dataset powers a ton of AI training runs and research projects. Explicitly allowing it means your content gets into those datasets.

3. Added Rich Structured Data (So AI Agents Know What They’re Reading)

AI agents love structured data. It’s like giving them a cheat sheet. I implemented Schema.org JSON-LD markup across the site:

  • Person Schema: My name, job title, organization, what I know about (skills), social profiles
  • Organization Schema: Brand consistency across platforms
  • WebSite Schema: Site metadata and description
  • Event Schema: Speaking engagements with dates, locations, whether it’s in-person or hybrid
  • Article Schema: Blog posts with authors, publish dates, clear descriptions

This structured data serves double duty: it helps traditional search engines display rich snippets, but more importantly, it gives AI agents verifiable, machine-readable facts about my work. No guessing. No parsing ambiguity.
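For reference, the Person schema boils down to a small JSON-LD block embedded in a `<script type="application/ld+json">` tag. The `sameAs` URLs below are placeholders, not my actual profiles:

```json
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Andrea Griffiths",
  "jobTitle": "Senior Developer Advocate",
  "worksFor": {
    "@type": "Organization",
    "name": "GitHub"
  },
  "knowsAbout": ["Developer Advocacy", "AI-assisted development", "Developer Tools"],
  "sameAs": [
    "https://github.com/your-handle",
    "https://www.linkedin.com/in/your-handle"
  ]
}
```

The Event, Article, and WebSite schemas follow the same pattern: pick the matching Schema.org type and fill in the properties you can verify.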

4. Optimized for Speed (Because AI Crawlers Have Limited Patience)

Slow site? Incomplete indexing. AI crawlers will give up if your site takes forever to load. I implemented:

  • Image optimization: WebP format with responsive srcset cuts file size by 92-97%
  • Font optimization: Self-hosted fonts with font-display: swap so text renders immediately
  • Lazy loading: Below-the-fold images and iframes only load when needed
  • Mobile-first JavaScript: Desktop-only components don’t load on mobile, thanks to Astro’s client:media directive
  • Smart caching: 1-hour cache on Markdown endpoints

Result? My homepage loads in under 1 second even on 3G. That’s not a vanity metric. That’s faster crawling and better indexing.
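If you want to see what the image side of this looks like, it's markup along these lines (file names and sizes are illustrative):

```html
<!-- WebP with a responsive srcset plus native lazy loading -->
<img
  src="/images/talk-480.webp"
  srcset="/images/talk-480.webp 480w, /images/talk-1024.webp 1024w"
  sizes="(max-width: 600px) 480px, 1024px"
  loading="lazy"
  alt="Speaking at a conference"
/>
```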

5. Built a Dual-Format Strategy

Here’s what most people miss: You don’t replace your beautiful website. You augment it.

Humans still see the beautiful, interactive portfolio with animations and responsive design. The experience is great. But AI agents can request /index.md and get a clean, comprehensive profile in milliseconds.

It’s like having both a visual resume and a plain-text ATS-friendly resume. Same information. Different audience. Each optimized for what that audience actually needs.

You can have both. You should have both.
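One way to sketch the routing decision (a hypothetical helper, not my exact implementation): serve Markdown when the request explicitly asks for it or comes from a known AI crawler, and HTML otherwise.

```typescript
// Known AI crawler user-agents, matching the robots.txt entries above.
const AI_AGENTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "CCBot"];

// Decide which format a request should get: Markdown for AI agents
// (or anyone sending Accept: text/markdown), HTML for everyone else.
function wantsMarkdown(userAgent: string, accept: string): boolean {
  return (
    accept.includes("text/markdown") ||
    AI_AGENTS.some((bot) => userAgent.includes(bot))
  );
}
```

In practice I keep it even simpler, since the Markdown lives at dedicated `/index.md` and `/writings.md` URLs, but the same check is handy if you ever want one route to serve both audiences.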

The Results: Before and After

How do you know if this is actually working? I tested by asking various AI assistants about my work.

Before optimization, when I asked Claude about my expertise:

  • “I can see Andrea Griffiths has a GitHub profile…”
  • “There’s a blog post from 2024…”
  • “Let me search for more information…”

Vague. Incomplete. Uncertain.

After optimization, same question:

  • “Andrea Griffiths is a Senior Developer Advocate at GitHub specializing in AI-assisted development workflows, developer tools, and team leadership. Her notable projects include Team X-Ray, a VS Code extension for revealing team expertise, and the From Pair to Peer AI Leadership Framework. She’s spoken at GitHub Universe, Netlify Compose, and JFrog SwampUP on AI security and hybrid human-agent workflows…”

Night and day. AI assistants now cite my work accurately, include relevant project links, and answer detailed questions about my expertise without hesitation.

Why This Matters for Your Career (Seriously)

If you’re a developer, designer, writer, or any professional with an online presence, AI discoverability isn’t optional. It’s essential. Here’s why:

1. AI Is How People Research You Now

Recruiters. Conference organizers. Potential clients. Collaborators. They’re increasingly using AI assistants to research people. When they ask “Find me an expert in Kubernetes security” or “Who should I invite to speak about TypeScript best practices?”, your name either comes up or it doesn’t.

You’re not getting a second chance to make that first impression.

2. Citations Are the New Backlinks

In the SEO era, backlinks were currency. In the AI era, citations are everything. Every time an AI assistant cites your website as a source, it’s validating your expertise, driving qualified traffic, and building your reputation in AI training datasets. That’s leverage.

3. AI Agents Are Tireless Advocates

Unlike humans who visit your site once and move on, AI agents constantly recrawl and update their understanding. Once you’re in their index with good structured data, they become persistent advocates for your work—citing you in thousands of conversations you’ll never see. That’s free marketing powered by automation.

4. You’re Future-Proofing Your Online Presence

AI search is accelerating. OpenAI’s SearchGPT. Google’s AI Overviews. Perplexity’s answer engine. These aren’t experiments. They’re the future. Optimizing now means you’re ahead of the curve when AI search becomes the default for your industry.

Getting Started: Your Checklist

Ready to make your website AI-discoverable? Here’s where to start, broken into realistic effort levels:

Quick Wins:

  • Update your robots.txt to explicitly allow AI bots (GPTBot, ClaudeBot, CCBot, PerplexityBot)
  • Add basic structured data (Person schema with your name, job title, and social links)
  • Create a simple /about.md endpoint with your bio in Markdown
  • Test by asking ChatGPT or Claude “What do you know about [your name]?” and see what comes back

Medium Effort:

  • Implement comprehensive JSON-LD structured data across your site
  • Create Markdown endpoints for your main content (projects, blog posts, talks)
  • Optimize images with modern formats (WebP, AVIF) and lazy loading
  • Self-host fonts and add proper caching headers
  • Generate or verify your sitemap is complete
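If you're writing a sitemap by hand rather than generating one, a minimal version looks like this (swap in your own domain and pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/index.md</loc></url>
  <url><loc>https://example.com/writings.md</loc></url>
</urlset>
```

Listing the Markdown endpoints alongside your HTML pages gives crawlers one more explicit pointer to the AI-friendly versions.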

Advanced:

  • Monitor AI citations by periodically testing different AI assistants
  • Keep structured data updated when you publish new content
  • Experiment with different Markdown formats to see what AI agents parse best
  • Track referral traffic from AI search engines in your analytics
  • Join communities discussing AI SEO so you stay ahead of best practices

The Dual-Format Future

The web is evolving to serve two audiences: humans who want beauty and interactivity, and AI agents who want structure and semantics. The websites that thrive will master both.

Your beautiful portfolio with smooth animations and stunning design? Keep it. That’s for the human who wants to feel your personality and style.

But also create clean, structured pathways for AI agents to understand your expertise, cite your work, and recommend you to millions of people asking questions you’ll never hear.

Because in 2026 and beyond, being found doesn’t just mean ranking on Google’s first page. It means being the answer when someone asks an AI, “Who’s an expert in this?”

Make sure that expert is you.

Resources & Further Reading


Want to see how I actually built this? Check out my website at andreagriffiths.dev or dive into the Markdown endpoint that makes it all work.

About the Author: Andrea Griffiths is a Senior Developer Advocate at GitHub, where she helps engineering teams adopt and scale developer technologies. She’s passionate about making technical concepts accessible—to both humans and AI agents. Connect with her on LinkedIn, GitHub, or Twitter/X.