Agency Technical SEO in 2026: AMA with Tory Gray & Patrick Hathaway
Published March 9, 2026
Last week, Women in Tech SEO hosted an AMA with two super-cool guests: Tory Gray, Founder of Gray Dot Co, and Patrick Hathaway, Co-founder and CEO of Sitebulb. The questions came from working agency SEOs “in the trenches”, so to speak, and covered everything from JavaScript rendering to Cloudflare's newest features to how to talk to clients about AI search.
Reading through the thread, one thing stood out for me—and it’s the same thing that keeps coming up in our webinars, guest articles, and training courses. Across a pretty wide range of questions, both Tory and Patrick kept arriving at the same place: the fundamentals still work, but understanding why they work has expanded. You're now thinking about how two generations of crawlers consume your site, not just one. That's double trouble!
You can read the full AMA here on the WTS website, or if you’d like a Jojo summary of what we’ve learned, stay right here and keep reading.
Contents:
Rendering is still the biggest skill gap
International SEO: Where rendering and AI create new problems
How to prioritise when dev resource is limited
Internal linking still matters
Structured data: Experiment, but don't go all in
Bot blocking, Cloudflare, and who actually owns infrastructure decisions
SEO vs GEO: How to position your services
Wrapping up
TL;DR key takeaways
Rendering is still the biggest skill gap

One of the questions came from someone who'd just moved agency-side after five years in-house, whose technical SEO was largely self-taught. What's the most important thing to get up to speed on?
Patrick's answer was straightforward: JavaScript rendering. Not because Sitebulb offers a free training course on it, but because it underpins so many other problems, like indexing, crawl budget, and content visibility.
Plus, it's the area where self-taught technical SEOs tend to have the biggest gaps. You can have solid technical instincts but still be flying blind when a client's site relies heavily on client-side rendering. This skills gap was highlighted by our 2024 JavaScript SEO Report, and then AI search came along and the problem only got more interesting.
Tory's answer was similar-ish, though she framed it more broadly. She hates being forced to pick one thing (her words), and pointed out that the right answer is genuinely contextual (i.e. it depends): enterprise vs startup, ecommerce vs SaaS, Jamstack vs vanilla WordPress. BUT put a gun to her head and she'd choose rendering too, especially since understanding how bots consume the web is no longer just a Google question.
“We're seeing a return to having to figure out how new bots consume the web, in addition to old bots. There's a TON to explore and learn, and it should be an interesting adventure as it inevitably evolves!”
Most LLM crawlers can't render JavaScript. Patrick flagged this as an area of rapid change, especially when it comes to RAG retrieval (e.g. live fetching during a user's query session vs ongoing bot crawls for training data updates), so it's worth watching.
For now, the practical implication is that getting solid on rendering pays dividends in both directions: traditional search and AI search.
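If you want a quick way to gauge how dependent a client's site is on rendering, here's a minimal sketch (my illustration, not a method from the AMA), assuming Python with the requests and Playwright packages and a placeholder URL. It compares the raw HTML a non-rendering bot receives against the rendered DOM:

```python
# Minimal sketch: compare raw HTML (what a non-rendering bot sees)
# against the rendered DOM (roughly what a rendering crawler ends up with).
# Assumes `requests` and `playwright` are installed (pip install requests playwright).
import requests
from playwright.sync_api import sync_playwright

URL = "https://example.com/some-page"  # hypothetical URL

# 1. Raw response: this is all a non-rendering crawler (most LLM bots) gets.
raw_html = requests.get(URL, timeout=10).text

# 2. Rendered DOM: what a rendering crawler like Googlebot works with.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

# A large size gap is a quick signal that key content is injected client-side.
print(f"Raw HTML:      {len(raw_html):>10,} bytes")
print(f"Rendered HTML: {len(rendered_html):>10,} bytes")
```

A big gap between the two is your cue to dig into exactly what's injected client-side, and whether any of it is content that needs to be indexed.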
International SEO: Where rendering and AI create new problems

For those with an interest in international SEO, there was a question about what, if anything, has changed from a technical standpoint.
Patrick's view is that not a huge amount has changed: hreflang is still the core of it, and it's still fiddly and often broken.
What has changed is the prevalence of JavaScript-heavy internationalisation frameworks. A lot of sites now handle location-switching client-side, which means Googlebot doesn't always see the hreflang tags at all. Crawling with JS rendering enabled and cross-checking what Google actually sees against what the CMS thinks it's serving has become a more important part of the audit process as a result.
Tory adds the AI dimension: LLMs don't know, use, or respect hreflang. At all. ChatGPT has a strong western and English-language bias baked in, and the fallout from that for international visibility is still playing out.
For now, the practical implication is that international SEO in an AI search world largely comes down to whether your content is accessible, comprehensible, and present in the markets you're targeting. Tory suggests monitoring for issues (like hallucinated pages returning 404s) and fixing them quickly.
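One check worth folding into international audits along those lines: whether hreflang annotations exist in the raw HTML at all. A rough sketch (assuming requests and BeautifulSoup, with a hypothetical URL):

```python
# Minimal sketch: check whether hreflang annotations are present in the *raw*
# HTML response, i.e. without executing JavaScript. If they only appear after
# rendering, non-rendering bots won't see them at all.
# Assumes `requests` and `beautifulsoup4` are installed.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/en/page"  # hypothetical URL

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
hreflangs = soup.find_all("link", rel="alternate", hreflang=True)

if not hreflangs:
    print("No hreflang tags in the raw HTML - they may be injected client-side.")
for link in hreflangs:
    print(f"{link['hreflang']:>8}  ->  {link.get('href')}")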
How to prioritise when dev resource is limited
Another question came from someone at a small agency with a site that needs a technical rebuild but doesn't have the dev resource or the budget to make it happen any time soon. Where do you focus?

Patrick's approach is pragmatic: start with what you can fix without dev involvement. Get your crawl data clean enough to actually understand what's happening and what matters most. Then identify anything that's genuinely critical – the kind of issue that's actively holding the site back – and use that evidence to make the case for dev resources.
Tory adds a useful evaluation framework on top of that. For each potential item, she weighs four things: timeliness, risk, level of effort, and return (there's a rough scoring sketch after this list):
Timeliness and risk: Does NOT doing this create a risk? How much? Will it be too late if you wait?
Level of effort: How much work is it? Low-lift, urgent fixes often still get done even on constrained timelines.
Return: What's the potential upside of doing it now? Is it financially worth it given the effort (“is the juice worth the squeeze”)?
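Tory describes the factors rather than a formula, but if it helps make the trade-offs explicit with a client, here's a hypothetical scoring sketch (my illustration, not her actual method; the weights and example items are made up):

```python
# Illustrative sketch: rank backlog items by risk and return per unit of effort.
# The scoring heuristic is an assumption, not Tory's formula.
from dataclasses import dataclass

@dataclass
class Fix:
    name: str
    risk: int    # 1-5: cost of NOT doing it / how time-sensitive it is
    effort: int  # 1-5: how much dev/SEO work it takes
    upside: int  # 1-5: potential return from doing it now

    @property
    def priority(self) -> float:
        # Simple heuristic: juice (risk avoided + upside) over squeeze (effort).
        return (self.risk + self.upside) / self.effort

backlog = [
    Fix("Fix noindex on category pages", risk=5, effort=1, upside=5),
    Fix("Full site rebuild", risk=3, effort=5, upside=4),
    Fix("Tidy image alt text", risk=1, effort=2, upside=1),
]

for fix in sorted(backlog, key=lambda f: f.priority, reverse=True):
    print(f"{fix.priority:>5.1f}  {fix.name}")
```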
This same logic applies to PageSpeed scores, which came up in a related question about whether a 90+ score is worth chasing.

Going against his usual form, Patrick’s answer was short: no, don't chase a 90+ score. The gains from improving a score from 80 to 90 are marginal in almost every context. If the site is genuinely slow, fix real Core Web Vitals failures (LCP and CLS especially, because they have the clearest correlation with user experience), and leave it there unless you have a specific reason to go further.
Tory put it plainly: speed is a "good enough" problem relative to your users and your competition, not an optimisation one.
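If you want field data rather than a lab score to back that up, the Chrome UX Report API exposes p75 values for the Core Web Vitals. A minimal sketch, assuming you have a Google API key and that the response shape matches the current CrUX documentation:

```python
# Minimal sketch: pull p75 field data for LCP and CLS from the Chrome UX
# Report API, to check real-user Core Web Vitals rather than a lab score.
# Assumes a valid Google API key and `requests` installed.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(ENDPOINT, json={
    "origin": "https://example.com",  # hypothetical origin
    "metrics": ["largest_contentful_paint", "cumulative_layout_shift"],
}).json()

for name, data in resp["record"]["metrics"].items():
    print(f"{name}: p75 = {data['percentiles']['p75']}")
```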
For SMBs specifically, Patrick's prioritisation hierarchy is worth keeping handy:
Crawlability
Indexability
Rendering
On-page fundamentals
Structured data
He also flagged a pattern worth recognising: a lot of SEOs spend disproportionate time on 301s, 404s, and image alt text while the core technical problems that actually determine whether pages appear in search results don't get addressed. A page can't rank if it isn't indexed!
Internal linking still matters
This question came from a member who'd been seeing internal linking on AI optimisation checklists and couldn't work out how to explain to clients why it would matter to an LLM. Is it because it influences training data? Because it affects traditional rankings that AI then draws on? Both?

Both, basically. And then some.
Patrick's take is that you can skip trying to explain how LLMs work to clients because they probably won't fully understand it and it's not the most useful frame anyway. What we know is that AI platforms use search to ground their answers. You rank in search through traditional SEO, and internal linking is a core part of that: URL discovery, link equity distribution, helping Google understand what your site is actually about.
Tory adds two more reasons:
Live fetch testing: Tory referenced research by Jori Ford presented at Tech SEO Connect, which found that when AI bots fetch a page, they follow links roughly three pages deep (there's a sketch below for mapping that radius). Contextual internal linking helps those bots answer sub-questions, not just the primary query.
Training data: Well-linked pages are better represented.
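To get a feel for what actually sits within that three-hop radius on a client site, here's a rough sketch (my illustration, not Jori's methodology), assuming requests and BeautifulSoup, a hypothetical start URL, and same-host links only:

```python
# Illustrative sketch: list which URLs sit within three link-hops of a given
# page - the rough depth the live-fetch research found bots follow.
# Assumes `requests` and `beautifulsoup4` are installed.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://example.com/"  # hypothetical start page
MAX_DEPTH = 3

seen, frontier = {START: 0}, [START]
while frontier:
    url = frontier.pop(0)
    depth = seen[url]
    if depth >= MAX_DEPTH:
        continue
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(START).netloc and link not in seen:
            seen[link] = depth + 1
            frontier.append(link)

for url, depth in sorted(seen.items(), key=lambda kv: kv[1]):
    print(depth, url)
```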
“AI isn't the be-all-end-all!! Users? Humans? Using your site? Also SEO. And SEO is a baseline into AI.”
Tory's broader point here is worth taking seriously. AI search traffic is still tiny relative to traditional search, even at enterprise scale. If you're not a major brand, it's even smaller. The fundamentals run the show, and AI is just one of the acts.
Structured data: Experiment, but don't go all in
Schema came up twice in the AMA (surprise surprise), from two different angles: whether it matters beyond rich result eligibility, and whether ontologies beyond schema.org are worth exploring for AI.

Patrick's take is that, for most sites, rich result eligibility is still the primary driver and the main thing to talk to clients about. But for sites with lots of deep, rich data, building out your own knowledge graph through structured markup makes genuine sense.
Tory's position is more expansive, though she's careful to frame it as experimentation rather than a proven tactic. Her case for trying it:
Most LLM tools say they use structured data. There’s at least a possibility that they're actively experimenting with it.
Martha van Berkel from Schema App has argued compellingly that schema can function like a semi-API: structured content that's explicitly organised and easy to ingest. Bots like things that are cheap and easy to consume.
Jori's live fetch research suggested schema may help with maths, reasoning, and deduction during live fetch tests.
And the simplest argument: inline schema is text. Defined, organised, explicit text. Even setting aside anything more sophisticated, it gives LLMs clearly structured context about a page that unstructured prose often doesn't provide as cleanly.
Remember: the schema must be inline, in the response HTML. AI bots that don't render JavaScript never execute the scripts that would inject it, which means JavaScript-injected schema is invisible to them.
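A quick way to verify that (a sketch assuming requests and BeautifulSoup, with a hypothetical URL): fetch the raw HTML without rendering and look for JSON-LD blocks. If schema only shows up in a rendered crawl, it's being injected client-side:

```python
# Minimal sketch: confirm JSON-LD is inline in the raw HTML response,
# not injected by JavaScript (non-rendering AI bots only see the former).
# Assumes `requests` and `beautifulsoup4` are installed.
import json
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/product"  # hypothetical URL

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
blocks = soup.find_all("script", type="application/ld+json")

if not blocks:
    print("No inline JSON-LD found - if schema appears in a rendered crawl,")
    print("it's injected client-side and invisible to non-rendering bots.")
for block in blocks:
    try:
        data = json.loads(block.string or "")
    except json.JSONDecodeError:
        print("Found a JSON-LD block that doesn't parse - worth fixing.")
        continue
    items = data if isinstance(data, list) else [data]
    types = [item.get("@type", "unknown") for item in items if isinstance(item, dict)]
    print("Inline schema type(s):", types)
```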
“Don't go nuts - it's not a proven tactic, so don't over extend yourself and get yourself in hot water with your client/boss. But if you have resources to explore its use using a test-based approach methodology - then heck yes, do it. The era of AI is the TIME to experiment!”
Bot blocking, Cloudflare, and who actually owns infrastructure decisions
This question came from the awesome Emina Demiri, and it's one of those topics that a lot of agency SEOs are dealing with quietly: a dev blocked AI training bots without telling anyone, without doing a cost/benefit analysis, without checking the logs. How do you get ahead of that kind of thing?

Patrick's observation is that bot blocking has become genuinely pervasive. It's now the first thing Sitebulb covers in onboarding emails for our Cloud customers: add this IP address to your allow-list. His advice for agency SEOs: make bot-blocking checks a regular part of client check-ins, not something you discover six months in when a crawl fails.
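Here's a minimal sketch of that kind of check using Python's built-in robotparser, with a non-exhaustive list of common AI crawler user agents. Note it only reflects robots.txt rules; server-side or CDN-level blocks (the kind a dev can flip on in Cloudflare) will only surface in logs or failed crawls:

```python
# Minimal sketch for a regular client check-in: which common AI crawlers
# does robots.txt currently block? Non-exhaustive user-agent list; this
# checks robots.txt rules only, not server-side or CDN-level blocking.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # hypothetical client site
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "CCBot", "Google-Extended"]

rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

for bot in AI_BOTS:
    status = "allowed" if rp.can_fetch(bot, f"{SITE}/") else "BLOCKED"
    print(f"{bot:<16} {status}")
```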
Tory's take on Cloudflare's newer AI-related features is that most clients don't want to block AI crawlers wholesale; what they want is to reduce costs, reduce risk, and maximise return. A blanket block is a hammer when you need a scalpel. The newer features are worth cautious experimentation, but she's sceptical of Cloudflare's motivations.
On the "turn pages into markdown" feature specifically, she raises a legitimate concern about cloaking potential.
Now the ownership question: Who owns technical SEO delivery when a dev is involved?

This usually depends on the site. For enterprise sites with complex infrastructure, it's the devs, with SEOs on the product, QA, and strategy side. For a simpler WordPress site, SEOs can own more of it.
The key is having a named technical contact when you onboard a client and keeping communication lines clear before something like this happens.
SEO vs GEO: How to position your services
Last one, and it's more positioning than technical: do you differentiate "traditional SEO" from "AI optimisation" or GEO in your offering?

Patrick's preference (and mine, by the way) is simple: "SEO & AI Search." It's clean, honest about what the work involves, and doesn't hitch itself to an acronym that might not be around in three years.
Tory frames it as SEO+. GEO is SEO with multichannel marketing layered on and some additional technical complexity. It's not a separate discipline, and treating it as one can create problems when you're setting client expectations.
Those expectations matter, because clients are being sold something flashier on LinkedIn. Tina Reis flagged exactly this tension in her follow-up: it's one thing to know the sensible framing, another to hold that line when a prospect comes in having been told by someone else that GEO is a completely different service.
Tory's advice on expectations is worth passing on to clients directly: AI traffic is tiny relative to search traffic, even for enterprise. If they think SEO takes a long time to show results, AI visibility is a longer game still (in terms of growing the customer funnel). Getting that on the table early is better than managing disappointment later.
Wrapping up
The questions in this AMA ranged pretty widely, but the answers kept returning to the same underlying point: in the AI era, the work is still the work. Rendering, crawlability, internal linking, structured data—none of it has been replaced.
What's changed for agency SEOs is needing to understand why each of these things matters for LLM bots as well as traditional search bots. And cutting through all the GEO-bro noise on LinkedIn.
If you want to go deeper on any of the rendering fundamentals, the free JavaScript SEO training course we put together with WTS and Gray Dot Co is a solid place to start.
TL;DR key takeaways
💡 Rendering is still the foundational skill gap for agency technical SEOs, and it now matters for AI crawlers as much as for Googlebot.
💡 When dev resource is scarce, fix what you can without dev first, then use a risk/effort/return framework to decide what's worth fighting for.
💡 Internal linking matters for traditional rankings, for live-fetch depth (roughly three pages), for training data, and for actual humans using your site.
💡 Schema is worth experimenting with cautiously, but it must be inline in the HTML response, not JavaScript-injected, or AI bots won't see it.
💡 Bot blocking and infrastructure decisions need SEO and business stakeholders in the loop; make it a regular client check-in, not a post-mortem.
💡 "SEO & AI Search" is a cleaner framing than GEO, and AI traffic is still tiny relative to search; set client expectations accordingly.
Jojo is Marketing Manager at Sitebulb. She has 15 years' experience in content and SEO, with 10 of those agency-side. Jojo works closely with the SEO community, collaborating on webinars, articles, and training content that helps to upskill SEOs.
When Jojo isn’t wrestling with content, you can find her trudging through fields with her King Charles Cavalier.