SEO in the Era of AI Search: Talks From brightonSEO San Diego
Speakers
Joshua Boughton
Dan Taylor
October’s webinar was a repeat of TWO fantastic talks from brightonSEO San Diego featuring Dan Taylor & Joshua Boughton.
First up is Joshua Boughton, Senior SEO Manager at BloomHouse Marketing. With over a decade of experience driving search visibility and ROI for brands, he shows us how to align modern AI Search techniques with timeless SEO fundamentals so that strategies are sustainable and scalable.
Then we have Dan Taylor, Head of Technical SEO at SALT.agency, walking us through how he uses platforms like Google Analytics and Cloudflare to uncover technical issues that go beyond standard audits. He also covers how structured data can support visibility in AI-driven search experiences, and how to implement it in a way that aligns with both search intent and site performance.
Webinar recording
You can find Dan Taylor's presentation slides here.
Subscribe to the Sitebulb YouTube channel! You can also sign up here to be the first to find out about upcoming webinars.
Webinar transcript
Patrick Hathaway:
Hi everyone. We are back with a special webinar today. We were lucky enough to head out to San Diego for the US version of brightonSEO last month, where we were exhibiting, and we're pleased to be able to revisit two of the brilliant talks from that event.
Now, since I was diligently manning our stand, I wasn't actually able to watch any of the live presentations at the event, but I've got it on good authority that these were both very well received. So I'm looking forward to them as much as you today.
So my name is Patrick. I'm CEO and co-founder of Sitebulb. And if you logged on today expecting to see Jojo, I'm afraid she's unwell and tucked up in bed recovering, so you're just going to have to put up with me. Fortunately, it will be our guests doing most of the heavy lifting today. So please say hi to Joshua Boughton and Dan Taylor. I'll introduce them properly in a minute. Of course, feel free to introduce yourselves in the chat. Joshua is not yet with us, so we will probably run with Dan's presentation first and then do Joshua's after, but I will come onto that in a minute.
So if you don't know Sitebulb already, it is a website auditing and crawling tool. Now, if you do tech SEO, website crawling or auditing at any level, Sitebulb can probably help. We are the only crawler on the market to offer both desktop-based software and a web-based cloud product. So that means our tool can help every SEO from solo freelancers all the way up to enterprise brands. So please do check out the website if you want to learn more. We have a free trial of our desktop product. And if you want to have a chat about Sitebulb Cloud, you can book in a call with me.
Cool. So final bits of housekeeping before we can get going. The webinar is being recorded and we'll aim to email the recording tomorrow with the slides and everything in it. So please don't worry if you're not able to make it for the whole thing. And we do have a slightly different format today, because we are fitting in two awesome talks. These will pretty much run back to back, and then if we have any time left over at the end, we can fit in some Q&A.
I think Joshua's just here, so I'll be able to add them to this stage in a second. But, yes, please make sure if you do have questions that you put them in the Q&A tab on the right-hand side of the screen, not in the chat itself. Everyone does this wrong every single time. There's a little Q&A thing, so put the questions in there and then we can add them to the stage later. So, yeah, we may have some time for Q&A, but we may not. We'll see how we go. Cool, right. Joshua is just here, so we're going to move him to the stage as well. Hello, Joshua.
Joshua Boughton:
Hey, how are you doing? Good morning or evening.
Patrick Hathaway:
How's it going?
Joshua Boughton:
Hey guys.
Patrick Hathaway:
Afternoon for us. We're just at 4:00 PM over here. So, yes, we do have Joshua Boughton here, Senior SEO Manager at BloomHouse Marketing. With over a decade of experience driving search visibility and ROI for brands, he'll be showing us how to align modern AI search techniques with timeless SEO fundamentals, so that strategies are sustainable and scalable.
Please welcome Joshua. And then after him, we also have Dan Taylor, Head of Technical SEO at SALT.agency, a specialist tech SEO agency. Dan is a well-known face on the speaker circuit, and he'll be walking us through how he uses platforms like Google Analytics and Cloudflare to uncover technical issues that go beyond standard audits.
He'll also cover how structured data can support visibility in AI-driven search experiences and how to implement it in a way that aligns with both search and site performance. So welcome Dan and Joshua. Thank you for giving up your time today and welcome everybody as well.
Right. So now we are all introduced, Joshua and I will take a back seat and I'll hand over to Dan and we'll see you all shortly. Dan, will you be able to then share your screen and we'll see you in 20 minutes or so?
Dan Taylor:
Yeah, absolutely. Thank you for the instruction, Patrick, and good to see you again Joshua and thank you everyone for joining, so let- ...
I won't go too deep into my experience and everything else, but I'm at SALT.agency at the moment, where I've been for nearly 10 years. As Patrick just said, I'm Head of Technical SEO. Prior to that, I was head of research and development, and I've led accounts over the years.
So I'm repeating the talk I gave over at brightonSEO San Diego, heavily focusing on technical SEO for LLMs. What I wanted to do with this talk was strip things back and actually have a look at the standard technical SEO things we've practised over the last however many years. And then, if we want to try and apply these to how LLMs work and how LLMs present data, what can we tie back? Which best practices from existing technical SEO can we apply to AI search, and where do the two separate?
So the initial premise, as we all know from why we're on this kind of webinar, is that search engines aren't the only crawlers, and clients actually seem to care about this AI stuff. They seem to want to talk about it. C-level seems to want to ask questions. It doesn't feel like it's going away like voice search did after 2020, when it didn't turn out to be 40% of all search. It's becoming a bigger thing. And even though everyone's saying you don't need GEO, you don't need this, SEO is the foundation. There are nuances to it, and we need to adapt our processes, our understanding and our best practices: which best practices apply to both, and what are the differences between the two?
For this, like I said, it's about creating a technical optimization framework which is applicable to the modern web. The internet hasn't really changed shape; what's happening is that the things on the internet are changing shape, and how users interact with them is changing. So what we want to do here is create a more modern framework moving forward.
Now, some definitions, because I want to be clear about how I'm going to say things. When I refer to LLMs, I'm talking primarily about things like Gemini, ChatGPT, Perplexity, Claude, et cetera. And when I'm talking traditional search, I'm talking your traditional ten blue links, plus AI Mode and AI Overviews predominantly, because AI Mode and AI Overviews sit on the same infrastructure as traditional search does. So the theory of technical SEO for one works for the other, because it's the same infrastructure in the backend.
And as with anything in SEO, your mileage may vary. The data I've pulled is across a specific set of sites. There'll always be anomalies in my data. There'll always be other data sets that contradict the rule. So everything here is more thought-provoking than definitive: take a look at the data and test the theory with your own sites as well.
So let's dive into the first one. It wouldn't be a technical SEO talk if I didn't immediately mention JavaScript, which I actually heard described at the conference over in San Diego as being the devil a couple of times by some non-technical SEOs, which I found quite amusing. So the standard truth is that most LLMs don't render JavaScript content. This isn't something new. They use a very basic crawler. They pretty much do a page fetch with very simple HTTP requests. They look at the raw HTML; they don't execute scripts or wait for dynamic content to load. It's a very linear process, which you can actually replicate with cURL or basic HTTP client requests, for those who've been in technical for a long time. It's more like a text-based scraper than anything, and not actually a browser doing rendering.
Now, since I did this talk, obviously, we've had Perplexity Comet and ChatGPT Atlas released. I have additional theories on whether or not the data from those browsers will be passed back, in a similar way to what Google says it doesn't do with Chrome, so that they can have render caches and get that data into the training data more. But from what we know from documentation, their crawl user agents at the moment are still these basic text-based fetchers.
From doing that testing, Gemini, AI Mode and AI Overviews in theory fall into the render-JS category, purely because they're using Google's current infrastructure to a point. AI Overviews and AI Mode a hundred percent; Gemini uses an adjacent infrastructure, but they all technically have JavaScript rendering capabilities because they use the web rendering service and follow the same protocols that Google does. From testing, Perplexity specifically doesn't render. And ChatGPT falls into inconclusive, because through testing, sometimes it did, sometimes it didn't, and I'm basically sitting on the fence there about being committal to it.
But essentially, the practices we've followed over the years with our JavaScript SEO seem to hold true for what we want to get value out of when it comes to LLMs. So we want to be investing in server-side rendering. We want to start looking at things like static pre-rendering. We want to look at hydration. And we can even do compressed plain-text views for specific user agents: find the live retrieval user agents for ChatGPT, for Claude, for Perplexity, and actually present them with a cached plain-text version of the site. Give them something lightweight to digest, because they don't necessarily want to see all the bells and whistles of a page layout and whatnot that Google looks at for page experience at the moment. They just want the data, they want the text. So strip that back, give them a plain-text version, and that might be a technical win from that side.
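For illustration, here is a minimal sketch of that approach: serving a pre-rendered, lightweight HTML snapshot to known LLM retrieval user agents instead of the full JavaScript bundle. The user-agent substrings are publicly documented crawler names, but the Express setup, the `snapshots` directory and the fallback behaviour are assumptions for the example, not something Dan prescribes.

```typescript
// Sketch: serve a pre-rendered plain HTML snapshot to LLM crawlers/fetchers.
import express, { Request, Response, NextFunction } from "express";
import { promises as fs } from "fs";
import path from "path";

// Illustrative list of LLM-related user agents (check each vendor's docs).
const LLM_AGENTS = ["GPTBot", "ChatGPT-User", "PerplexityBot", "ClaudeBot", "CCBot"];

const app = express();

app.use(async (req: Request, res: Response, next: NextFunction) => {
  const ua = req.get("user-agent") ?? "";
  const isLlm = LLM_AGENTS.some((agent) => ua.includes(agent));
  if (!isLlm) return next(); // normal visitors get the standard site

  // Look for a pre-rendered, text-first snapshot of this URL (built offline).
  const file = req.path === "/" ? "index.html" : `${req.path}.html`;
  const snapshotPath = path.join("snapshots", file);
  try {
    const html = await fs.readFile(snapshotPath, "utf-8");
    res.type("html").send(html); // lightweight version, no client-side JS needed
  } catch {
    next(); // no snapshot available, fall back to the normal response
  }
});

app.listen(3000);
```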
Moving on to another very controversial topic when it comes to LLMs: schema. Now, the two main questions to ask here really are: can LLMs understand schema, and does schema impact visibility? On the first one, around five or six months ago Andrej Karpathy, who is a co-founder of OpenAI and ex-senior director of AI at Tesla, released this nearly four-hour-long video on YouTube, which you can find. We have that link below, and I believe a copy of these slides will be sent out after this anyway, so you don't need to desperately try to screenshot it right now. Around four minutes in, the key point is they basically turn around and say, "Right, when we take this data, we take URLs, we filter them, then we extract the text." They strip it of all JavaScript, they strip it of CSS, they strip it of JSON; all they want is the actual text itself, not any kind of decoration or fluff around it. Then it goes into processing.
So at that stage, that markup doesn't, in theory, get passed through to ChatGPT or the LLMs from the training data, because this is supposedly how ChatGPT was built; it has that infrastructure for processing. So schema doesn't influence the processing part of that, or the understanding of the content. The exact quote, which I won't dwell on too much, is essentially that they see all the markup, lists, CSS and computed code for those webpages; they don't want it, they just want the raw content, which they strip out as plain text.
But what is important to understand is that LLMs can understand structured data. You can use structured data and JSON for prompting. You can ask them to produce it. They just don't often receive it, as I've just covered, because pages are reduced to plain text or markdown by the time they get through to processing.
Looking at AI Overviews, we did a study of this. I pulled 107,000 cited webpages and simply wanted to ask, "Is schema potentially impacting this?" And lo and behold, 82% of those websites are using organisation schema. The majority being cited are using very standard schema types you would expect from any website on the internet: anything with a simple WordPress optimization plugin, perhaps, or out-of-the-box schema functionality from most website builders and platforms. There are very few actually doing anything out of the ordinary.
Then you might want to say, "Okay, well, how about combinations of schema?" So I looked at schema triplets. Again, you'd expect these to be on any standard website: article, author, person, organisation, URL, homepage. These aren't uncommon; they're hygiene schema types. So essentially this is saying: the LLM isn't receiving the schema, but most webpages being cited still need to rank somewhere, and I believe recent studies have shown that at least a third of websites being cited have some search traffic.
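For reference, the kind of "hygiene" schema triplet described here (an Article linked to a Person author and an Organization publisher) typically looks like this in JSON-LD; the names and URLs below are placeholders, not from the study.

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "url": "https://www.example.com/blog/example-article/",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://www.example.com/authors/jane-doe/"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com/"
  }
}
```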
It's still important from an SEO perspective to have that ranking, have that value and have that consideration. But LLMs and AI tools don't need that schema to be able to extract and understand the content on the page. So from that perspective it's relatively hygiene. Google still uses it in the same way it does for search, but it isn't a requirement for being cited. It's still important for organic and it's still important from a hygiene perspective.
Moving on to something a bit lighter, still looking at the technical side for LLMs: robots.txt files. Now, I wanted to include this in my talk in San Diego mainly because some people are using robots.txt files for more harm than good, and having had a few conversations with the Web Archive and people like Common Crawl, this became more apparent. Because if you're disallowing LLM crawlers, you could still be included in training data.
So if you want to be included, there are certain things you need to do. And if you don't want to be included, well, people are blocking GPTBot, they're blocking ClaudeBot, PerplexityBot, et cetera, but then they're still in the training data and they're still being included in the LLMs. That's because the training data comes from very different sources, one of which is Common Crawl.
Now, a lot of websites have CCBot blocked, which is Common Crawl's bot. The Web Archive, for example, uses Common Crawl for some of its data collection, and LLMs will download data from Common Crawl in, I believe the term used was quettabytes, many times over. There's a lot of blooming zeros on that number, but basically huge data files. Basically, archives of the internet.
And you can ask to be excluded from it. You can block Common Crawl from accessing, but if you're blocking Common Crawl, there's a chance you're not being included in the training data. If you're blocking just the user agents for LLMs and not Common Crawl, there's a chance you're still going to be included, because you're not blocking the training data. There are two sides to this coin, and most robots.txt files have that discrepancy: blocking one but not the other, one way or the other.
Common Crawl is something I'd advise everybody to have a look at, just because of how it works and what it's doing, and to make sure you're in that data set, requesting them to crawl, et cetera. They're non-profit and they want that data and they want the inclusion levels. And if you want to be included in AI more, blocking CCBot isn't something I'd recommend doing in your robots.txt file. The documentation for that can be found on the URL shown here.
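To make the distinction concrete, here is an illustrative robots.txt fragment (not a recommendation for any particular site). Disallowing the LLM user agents below does not, on its own, keep a site out of training data sourced from Common Crawl unless CCBot is also disallowed; conversely, blocking only CCBot still allows live retrieval by the LLM fetchers.

```
# Illustrative only. Blocks some LLM crawlers:
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Common Crawl's crawler, whose archives feed many training datasets.
# Omitting this block means the site can still reach training data via Common Crawl.
User-agent: CCBot
Disallow: /
```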
The next thing I wanted to look at with LLMs and technicality is site speed. Because obviously, we talk about site speed being important for SEO, through things like Lighthouse. We want sites to be fast, because fast is a good user experience. But is there a correlation between good site speed and inclusion within LLMs and AI?
So essentially, I took roughly the same sample of websites, had a look at pages on them, and the average score was around 55.76, so most websites being cited in LLMs aren't lightning fast. There were a handful towards the top, but as you can see, the dispersion to the right of the average is relatively lower than to the left of the average. So there isn't really a correlation with good page speed. I used the PageSpeed Insights score just because it's the easiest thing to do at scale when you don't have access to 100,000 Search Console properties, but you don't have to have that lightning speed.
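As a rough sketch of how a score-at-scale pull like this could be reproduced, the snippet below calls the public PageSpeed Insights v5 API for a list of URLs and averages the performance scores. The URL list and the environment variable for the API key are placeholders; this is an assumption about the general method, not Dan's actual script.

```typescript
// Sketch: fetch PageSpeed Insights performance scores for a list of URLs.
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";
const API_KEY = process.env.PSI_API_KEY ?? ""; // placeholder

async function psiScore(url: string): Promise<number | null> {
  const apiUrl = `${PSI_ENDPOINT}?url=${encodeURIComponent(url)}&strategy=mobile&key=${API_KEY}`;
  const res = await fetch(apiUrl);
  if (!res.ok) return null;
  const data = await res.json();
  const score = data?.lighthouseResult?.categories?.performance?.score;
  // The API returns a 0-1 score; convert to the familiar 0-100 scale.
  return typeof score === "number" ? Math.round(score * 100) : null;
}

async function averageScore(urls: string[]): Promise<number> {
  const scores: number[] = [];
  for (const url of urls) {
    const s = await psiScore(url);
    if (s !== null) scores.push(s);
  }
  return scores.length ? scores.reduce((sum, s) => sum + s, 0) / scores.length : 0;
}
```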
Similarly, when we look at Core Web Vitals, yeah, there is a correlation with having a lower LCP in seconds, and a low CLS score as well; that's Largest Contentful Paint and Cumulative Layout Shift. So having good Core Web Vitals does correlate with being present in LLMs and AI search. However, I think that is correlation rather than causation: like I said, you need to at least have a good ranking and be in good stead, in theory, to be present, to be indexed and to then be included.
There's a summary of what I've just said there in the slides when you download them. The next element was site structure: long URL versus short URL, keyword in URL, et cetera. The first thing I wanted to look at was URL length, and a lot of this comes from spending far too long on LinkedIn and seeing all the wonderful posts, which are never backed by data, about how to rank in LLMs. [inaudible 00:18:26], usually with an AIO service at the back of it.
One of the ones I saw being published was around URL length. So what I did is take the same sample, and I wanted to look at the average URL length in characters for the classic blue links rankings. And as you can see here, the average number of characters for these URLs was around 45, 46. Around that there were a couple which were really long at 175, and some which were next to nothing. So then I took the same queries and wanted to look at the URLs within LLM search.
I'm going to flick between these two slides for a little bit just so you can actually see: that's blue links, AI search, blue links, AI search. The actual numbers for what was being cited in AI Mode, the min/max and even the mean and median values, are near enough exactly the same. Yes, the average is slightly different, but URL length doesn't make any difference to citation, and it's pretty much the same practices you would follow for traditional SEO. The same technical basics carry over.
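For anyone who wants to run the same comparison on their own data, a minimal sketch of the min/max/mean/median calculation over two URL lists might look like this; the sample URLs are placeholders.

```typescript
// Sketch: min, max, mean and median character lengths for a list of URLs,
// e.g. one list for blue-link results and one for AI citations.
function urlLengthStats(urls: string[]) {
  const lengths = urls.map((u) => u.length).sort((a, b) => a - b);
  const mean = lengths.reduce((sum, n) => sum + n, 0) / lengths.length;
  const mid = Math.floor(lengths.length / 2);
  const median =
    lengths.length % 2 === 0 ? (lengths[mid - 1] + lengths[mid]) / 2 : lengths[mid];
  return { min: lengths[0], max: lengths[lengths.length - 1], mean, median };
}

console.log(
  urlLengthStats([
    "https://example.com/short/",
    "https://example.com/a-much-longer-url-with-keywords-in-it/",
  ])
);
```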
The next element I wanted to look at was page structure. Again, people have said that if you technically lay out a page in a certain way, with certain structures, you stand a greater chance. And again, this comes from hours of being on LinkedIn and seeing the templates where you can comment "AI template" and be sent the PDF and whatnot. I wanted to see if there's any empirical evidence behind this.
So what I did for this: in AI Mode and AI Overviews, occasionally it will cite the URL with the #:~:text string, and it will have that jump link to the highlighted text. So I took that data set, pulled out all the URLs with that, and wanted to see: surely, if there's a structure and a preference, then on a standard viewport of, I think it was 1920 by 1080, the depth at which that highlighted area appears in terms of pixels will be relatively stable, because AI supposedly has a preference for a certain structure.
So I pulled the data. It was only 2,138 of these websites, but as you can see, the pixel depth of where these citations and text summaries were being pulled from varies massively. There's no rhyme or reason or set structure to these. And I even broke these down by different sectors: you can be pulled from the top of a page in the first sentence, or you can be pulled from the bottom of a page and cited as a jump link.
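As a starting point for building that kind of data set yourself, the sketch below pulls the highlighted passage out of cited URLs that carry a text fragment (#:~:text=...). It handles only the simple single-text form of the fragment syntax, and the sample URLs are placeholders; measuring pixel depth would additionally require rendering each page in a headless browser.

```typescript
// Sketch: extract the highlighted text from URLs cited with a text fragment.
function extractTextFragment(citedUrl: string): string | null {
  const marker = "#:~:text=";
  const idx = citedUrl.indexOf(marker);
  if (idx === -1) return null;
  // Fragments can include prefix/suffix ranges; keep only the first part here.
  const fragment = citedUrl.slice(idx + marker.length).split(",")[0];
  return decodeURIComponent(fragment);
}

const cited = [
  "https://example.com/guide/#:~:text=structured%20data%20helps",
  "https://example.com/pricing/",
];
for (const url of cited) {
  console.log(url, "=>", extractTextFragment(url));
}
```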
I think a very small percentage of these, something like 3% or 4%, was actually being pulled from FAQs, like specific FAQs on the page. There was very little correlation between any kind of page structure methodology and a better return rate, as you can see from the data. So to wrap things up before we hand over to Josh: the foundations of technical SEO are relatively the same.
The shape of the internet hasn't changed yet. What's changed is how people are accessing it; there are new crawlers, there are new technologies in town, but the internet is still built the same way it was five years ago. We need to acknowledge that we've optimised specifically, mainly, for Google, let's be honest, and sometimes Bing in the past, but now we actually need to understand and accommodate the nuances of the different crawler types for what is essentially a new platform, AI, to gain visibility there.
And JavaScript is, once again, a key pain point, but we'll continue having discussions on that probably for as long as SEO is a thing, which, given SEO first died in 1997, probably hasn't got long left by that logic. But thank you very much. Back to you, Patrick.
Patrick Hathaway:
Hey, Dan, thanks for that. That was really good. Completely agree about the fundamentals being the same and especially at that point about JavaScript. Just we're never going to get away from it, right?
Dan Taylor:
Unfortunately not. But for as long as JavaScript exists, I have [inaudible 00:22:57]. It's a two-sided coin.
Patrick Hathaway:
Right. So hang on a sec, let me just see if I can get this right this time. So we'll bring Joshua back on and he'll be able to share his screen. Yeah. So Dan, thank you for that. If you could hang around please, then if we have some Q&A at the end, which so oftentimes in these sorts of sessions we don't have as much Q&A, so that's absolutely fine. If we don't, we'll get to finish early, but if people do have questions, please put them in. So we'll say bye to you, Dan, and then I'll hand over to you, Joshua, in a sec.
Dan Taylor:
Thank you.
Patrick Hathaway:
Cheers, Dan. Okay. So Joshua, I think you're currently on mute, if you could put your audio on, then the present button is at the bottom, which will allow you to choose your screen and pick your slides from there as well.
Joshua Boughton:
All right, there we go. Can you hear me now?
Patrick Hathaway:
We can hear you, absolutely.
Joshua Boughton:
It's a lot better. Okay, let me share my screen here.
Patrick Hathaway:
And then I'll go backstage and then this time I'll try not to remove the audio. Apologies for that before, that was my bad. Okay, great stuff. Right. I'll go backstage. I will leave you to it. Thanks very much again for this Joshua, and I'll see you shortly.
Joshua Boughton:
Sounds like a plan. All right, so yeah guys, my name is Joshua. I'm working at BloomHouse Marketing and I've been in SEO for about 12 years. At BloomHouse, we predominantly focus on behavioural health, the behavioural health industry, so that's your rehab centres and mental health facilities and things like that. I come with a wealth of knowledge, previously working with MP Digital and various other agencies, working with anyone from mom-and-pops all the way up. So I've seen the gamut of the SEO world.
So basically, my talk today is just going to be focused on the fundamentals, because as mentioned previously, the fundamentals have not changed, so they're extremely important. Sometimes it's good to just recalibrate, refocus and return to those original fundamentals.
All right. So, knowing your true north. Every SEO strategy needs a true north, a clear destination. It isn't chasing algorithm loopholes or vanity metrics; it's connecting users with the information that they're actually looking for. And finding your true north should be easy. You just want to identify what makes your business money, what is going to drive revenue, how you bring in your customers, and what your goals for the year are, whether that's financial or revenue, and bring that back into your SEO strategy. How does that shape your SEO strategy?
We've had clients that have worked in different fields. For instance, I previously worked with some plastic surgeon clients, and one thing that they knew was that by targeting Botox and related keywords, they would be able to bring in their clients and then upsell them other services.
In the behavioural health space, the keywords that are a lot more profitable for the clients would be things like detoxes. So do research on keywords around that, use that as the cornerstone of your strategy, and set your goals through those metrics. It's not about the traffic or anything; it's about how all the elements of SEO come in to help you reach that true north.
All right, Google's playbook hasn't really changed. It's clarity, relevance and trust. That's been the name of the game for Google. Looking at the playbook, it comes down to those three pillars. Clarity means your site is crystal clear to search engines, from your code to your content. They should be able to crawl your pages instantly and grasp what each page is talking about.
Using things like schema and proper URL structure will absolutely help. Make sure you're locked in on your technical SEO; it will help the search engines identify that content. Then that's where EEAT comes into play: experience, expertise, authoritativeness and trustworthiness. We know this acronym; we live and die by it. Search engines want to feel confident in recommending you. They want to know that your website is a trustworthy website; that's credibility. Looking at citations and scores, that shows expertise.
The core principle really boils down to clarity, relevance and trust. These are the timeless SEO fundamentals that you should live and die by. And if you take a look at AI, it's pulling rapidly from all these different sources, similar to search engines, and it has to know that it can trust your content. A lot of that trust comes from the weight it puts on search engines like Google and Bing. So if you can earn Google's trust, you can earn AI's trust.
AI's mission: understanding and serving intent. So how does AI fit in? Essentially, AI's mission is the same as Google's, as I mentioned before: understand what the user wants. It goes back to your true north. What is your user looking for? What does your user journey look like, and how do you meet them where they are, not where you want them to be?
So whether it's Google, Bing or ChatGPT, focus on your user intent above all. Google always says it prioritises content that matches the user intent, shows depth and offers clear, trusted answers. AI models use natural language processing to interpret nuanced queries in plain English, so they're a lot better at picking up those long-form, conversational keywords within your content. That's why it's really good to understand your users and speak directly to them, so that when they're speaking to search engines and AI models, your content is speaking to them, and then those AI models will speak to your users.
All right, so, different players, same game. Google, Bing, ChatGPT, Bard, they're all hungry for your clear, quality content. It's easy to get distracted by the new players. Today, we have Google's traditional SERPs, Bing's chat-integrated search, voice agents, AI chatbots, ChatGPT, Perplexity, Claude. There's a long list and there are definitely more coming, but here's the thing: they're all fundamentally playing the same game, serve the searcher. So whether a user types a query into Google or asks ChatGPT a question, the answer still comes from the content that they get from the web.
At BloomHouse we always say AI can't guess. It still needs to scan the top-performing content to come up with generated responses. In other words, the better your site content is, the more all these platforms will favour it. Do your targeting for Google Maps, use hreflang tags for international and global audiences, and let Bing results or search data feed AI snippets. The tactics are for different venues, but they share a common purpose: to make it easy for AI to understand what your content is referencing.
The audience might shift platforms, but they're all searchers seeking answers. And if you build your site for the searcher, not for the search engine, you'll do well on all platforms. So again, like I said, it always starts with that true north understanding. That comes down to building personas, understanding them, and building off those pillars and the data that you know about your users, really defining who it is you're trying to speak to before you go and add all of the SEO tactics in the world to your strategy.
Shiny objects versus strategy. By now, you've all heard the buzzwords: GEO, LLMO, "AI rewrites the rules". There's a whole bunch of new tools and acronyms, new opportunities, but let's be real: AI doesn't change the core of a good SEO strategy. We've seen this before. I remember back in the day there was voice search, or AMP, and everyone was like, "Oh my god, you've got to have voice search, you've got to have AMP and all this." And everyone went and AMP'd their whole website. Where's AMP now? No one's talking about AMP.
I'm not saying that's going to happen to AI, but chasing those shiny objects and SEO tactics can lead you down a rabbit hole. And people are saying things like SEO is obsolete, which is kind of ridiculous. The idea that AI makes SEO irrelevant is the most inaccurate and irresponsible framing I've heard in my career. Optimising for search at this point is, clearly, also optimising for AI.
Look at ChatGPT: they launched their search engine and pretty much said that they're heavily dependent on Google's SERP results. So what does that tell you? Focus on your strategy; don't get caught up in the hype and the shiny objects. Traditional SEO will continue to win. That's not to say you can't use these tactics; as SEOs, we should be looking at AI. I have a friend, his name is Simon, who always says, "We're the doctors and SEO is just our scalpel." So it's just another tool in your toolbox to use to optimise your website and present your content and your website to the users.
All right, fundamentals first: why fundamentals still win. Are your fundamentals solid? Sometimes you have to take a gut check. In the era of AI, that question matters more than ever. Even the most advanced AI algorithms depend on fundamental signals to understand and rank your content. So, your H1s: are those optimised? Are you using schema? Is your content relevant? Your page structure, meta descriptions, title tags; believe it or not, all of that comes into play. ChatGPT just launched its own search, like I mentioned before, and it's heavily dependent on Google Search.
AI tools use the fundamentals to determine what your content really means and how relevant it is when crawling, indexing and ranking pages. Solid on-page optimization, technical and user experience signals still play a big role in helping AI understand the value of your site. But you can't escape the basics: if your site isn't crawlable, if your content structure is a mess, or your page load speed is slower than a dial-up connection in 1999, then no amount of AI trickery will help. You'll just become obsolete.
So without a crawlable site, a clear structure, and content that matches your user intent, as we saw with the data presented in the previous talk, no tool or tactic or trick can tip the scales for you. You really have to hone in on your SEO fundamentals. Those haven't changed.
All right. Modern tactics meet classic principles: it's a hybrid approach. I'm not saying ignore the new tactics, far from it. We should embrace them, whether that's diving into a Reddit strategy to help build off-site links. Reddit is awesome for keywords. Google actually partnered with Reddit, and Reddit shows up in something like 30% of certain searches and SERPs; I'd have to get the exact number, but you guys have seen it.
When you search a lot of longer-tail queries, Reddit pops up first. And the LLMs depend heavily on brand signals, and Reddit is a huge brand signal. So implement things like that and think of a hybrid approach; it's old-school wisdom meets new-school tactics. The key is layering innovation on top of a strong base, not replacing the base. Knowing how to layer efficiently is the secret. Step one, master the core principles: things like keyword research, on-page, technical health, link building, UX. Get these down.
Step two, strategically integrate innovations that enhance those core areas. For example, we've created in-house sheets that allow us to automate things like blog topics after doing a round of keyword gap analysis, or automate posting your blogs and all the features that come with that, using AI to leverage it.
As a result, you get a sustainable strategy that adapts to change. You're not choosing between SEO the old way or the AI way; you're doing both in harmony. That hybrid mindset lets you scale and evolve without losing sight of what actually works.
Ranking in an era of AI: visibility beyond the blue links. You want to optimise for visibility, not just a rank position. It goes back to what I said previously about using every platform. Neil Patel says all the time to search everywhere; we're in a "search everywhere" environment. And that is so true, because AI depends a lot on brand signals. So if you only focus on getting those links on Google, you're going to miss out on a lot of opportunity.
Today, over 60% of Google searches end without the searcher even having to click, because they get their answers on the first page. If you think about it, when you look at a SERP there are featured snippets, summaries, knowledge panels. Searchers are going to find what they're looking for, and they might not even click into your website. So you've got to redefine what winning in SEO means to you and your strategy. It's not just "did I rank for this" or "am I ranking for that", but "am I visible where the answers are being delivered?"
Classic blue links are literally getting pushed down by these new elements, so ranking first doesn't generate as much traffic as it used to. Instead, we need to optimise for those prime on-page features. That means structuring your content and earning rich snippets with concise answers, lists and tables. You appear in People Also Ask by looking at the PAAs and including them in your content as FAQs; that will help you, based on your keywords. Do keyword research on Reddit to help with your PAAs, and get into local map packs for local queries. Those will definitely help in practice.
In practice, that might involve answering the question clearly in 40 to 50 words, simply using FAQs or PAAs, and adding schema markup. The bottom line: expand your definition of SEO success. It's about owning as much real estate in the SERP as you can, in the answer space. If 10 blue links were main street, then AI summaries and rich results are the shiny new skyline.
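As an illustration of that combination, a concise 40-to-50-word answer paired with FAQ markup, an FAQPage block in JSON-LD might look like this. The question and answer text are placeholders written for the behavioural health example used earlier, not content from the talk.

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How long does a medical detox usually take?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Most medical detox programmes last between five and ten days, depending on the substance involved and the person's health. A clinical team monitors withdrawal symptoms throughout and then recommends the next stage of treatment, such as residential or outpatient care."
      }
    }
  ]
}
```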
Timeless SEO non-negotiables. The non-negotiables for SEO success in 2025 might sound very familiar. In fact, they should: they're the same things that worked in 2015 and 2005. We've been optimising H1s and title tags and building content and FAQs for a while now. So the same non-negotiables that worked before should work for you right now.
If you've been practising good SEO, you've essentially been preparing for AI without even knowing it. Sites with fast mobile speed are always winning: users love it, Google loves it. Content that genuinely and usefully answers real questions, Google loves that. Find the data that supports the findings in your content and present it; the LLMs love data, and that's something they can only get if you present it to them. Building high-quality backlinks and getting brand mentions are always gold. As I mentioned, the AI LLMs, as well as Google, love backlinks, so they're more important than ever. You have to be building backlinks that work as brand signals.
All right. With so many things possible to do in SEO, it's crucial to focus on the tactics that truly move the needle. Here's a little hit list. Technical clean-up: start by making sure your site is a well-oiled machine. Ensure every page is crawlable and indexable, no mysterious disallows in your robots.txt, fix your 404s, and make sure it's mobile-friendly. Obviously, you want to get rid of HTTP if you're still serving that, and just clean up your technical issues.
Intent-driven content is the next thing you want to focus on. Build out a content strategy that follows the path of your true north. Figure out where you want to take your SEO and where you want to be visible with your content, and settle on that. Figure out where your users are looking for the solution that you provide, whether that's e-comm, a product, a service or B2B. Figure out the solution, identify the problem, and answer every question along that path that you can find and deliver it to your users.
On-page is my favourite part. Yes, the classics; they always matter. Craft clear, compelling title tags and meta descriptions to improve your click-through rate. Use logical H1s and H2s and implement FAQs. These are all going to help you rank, and incorporating schema into those FAQs will definitely help you get into those People Also Ask boxes.
Internal linking: remember it's not just people clicking through the site, it's bots too. So you want to help the bots out by incorporating internal linking. One thing we built internally was JavaScript that automatically takes the keyword research we've performed and identified for all of our pages, and incorporates internal linking throughout our website. It's something I would definitely recommend, because basically, these bots are just clicking around trying to figure out and digest everything they can about your website. So you've got to look at it like putting food on the plate for them to digest: without those internal links, without the proper structure, your site becomes indigestible to the bots.
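A stripped-down sketch of that kind of automation is shown below: given a map of target keywords to destination URLs from keyword research, it wraps the first occurrence of each keyword in a page's copy with an internal link. The keyword map is a placeholder and the markup handling is deliberately naive (a real script would skip existing links, headings and navigation); this is not BloomHouse's actual implementation.

```typescript
// Sketch: add internal links for mapped keywords in a block of HTML copy.
const keywordToUrl: Record<string, string> = {
  "medical detox": "/treatment/medical-detox/",
  "outpatient care": "/treatment/outpatient/",
};

function addInternalLinks(html: string): string {
  let output = html;
  for (const [keyword, url] of Object.entries(keywordToUrl)) {
    // Link only the first occurrence of each keyword, case-insensitively.
    const pattern = new RegExp(`\\b${keyword}\\b`, "i");
    if (pattern.test(output)) {
      output = output.replace(pattern, (match) => `<a href="${url}">${match}</a>`);
    }
  }
  return output;
}

console.log(
  addInternalLinks("<p>Our medical detox programme leads into outpatient care.</p>")
);
```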
For quality signals, focus on trust and authority. This means earning quality backlinks and brand mentions, which are still important, and getting positive reviews. One of the hurdles that we always have to overcome with our clients is reviews. Whether you're in-house or working with a list of clients, the importance of reviews is greater than ever, because that lets Google hear it directly from your users. What's the saying? Straight from the horse's mouth. That's your users saying how good and great you are, and that is important.
Make sure your NAPs are consistent across the internet. You don't want to have 50 million different numbers, because that's going to confuse things and make it look like you don't care about the information about your business across the internet. Build out author bios, making your authors an authority, so you're building authority behind your authors. Cite your sources, make sure the data is up to date, and keep that content fresh. These are the things that are going to build that authority and that trust with Google. These are the essentials that drive significant results. If you find yourself spending time on something that doesn't support your core areas, question it. The flashy new tactic isn't always the way; the fundamentals will always work.
Blueprint for success. So, strategy: specifically, how to create an SEO strategy. Step one, lay the foundation: perform a comprehensive site audit. That's your technical, your on-site, and any errors that you're finding on the website. Step two, implement all your core on-page fundamentals. Do the dishes, as I like to call it: start doing your dishes, clean the kitchen, and make sure your site is eligible to rank. If your on-site is not aligned, then you will probably never rank. There's just no way to rank without the on-site.
Step three, layer in advanced strategies. For example, once your content bases are down, introduce content clustering strategies, start creating tools that support your content and help your users, things like that. Make sure you're incorporating schema markup and optimising for international if that's what your site calls for. But, yeah, as long as you have a blueprint, you're building a strategy that's actually scalable: know where you want to start, know where you want to end, and then create a plan that takes you from point A to point B with the SEO tactics, whether AI or old-school, that will get you there.
All right. So sometimes pausing and resetting can be a power move. Everybody is moving a million miles a minute and the landscape is always changing. So sometimes it's good to hit pause, do some spring-cleaning, do a periodic reset: re-audit your website, re-audit your strategy. Is this working? Is this effective? Is this pushing the needle forward? Because over time, websites accumulate bloat, technical debt and outdated content. So you want to go back, take a pause and correct all that, because otherwise you're just building dirt on top of dirt, and that's never going to help you rank or gain the visibility you need in both LLMs and traditional search.
The last part is a saying that I like to say to my team all the time: you can't cook in a dirty kitchen. Going back to what I previously said, you want to take a step back, look at your strategy, look at your website and do your dishes. Periodically go through and do a content audit: remove what's not performing, improve what is performing, get rid of any technical errors and get rid of those pages. Don't be afraid to delete underperforming content. I see it so many times, people are like, "Oh, well, what if we just rewrite it and rewrite it? Send some links to it." I usually give content about three, max six months. If it's not moving after a few changes, then we need to reassess it, remove it, redirect it, and find a new path forward.
Because as I said previously, your website will accumulate a lot of bloat and you'll start eating up your crawl budget and you'll just find yourself with a lot of duplicate pages. But, yeah, all in all, I don't think that the landscape has changed as much as people will say. SEO is not going anywhere, but the fundamentals are the most important part because without the fundamentals of SEO, then you're not serving the search engines. If you're not serving the search engines, there's no way you can serve AI. Thank you.
Patrick Hathaway:
Hi, thanks for that, Joshua. I'm going to welcome Dan back as well. I genuinely love that the main takeaway from both talks is that we still need to focus on SEO fundamentals. That is just the confirmation bias I was looking for. So let's get to the questions.
Hang on a sec, we've got Joshua's video. Okay, well, we do have some questions and we have 10 minutes or so left, so I'm just going to fire through them. If the other person is answering, just put yourself on mute, because I think we're getting a little bit of background noise coming through. So I'm just going to go through the questions in order.
As a reminder, if you haven't got questions of your own, feel free to go and upvote other people's. But yeah, feel free to just answer any that you like the look of. I'm going to show this one on stage: apart from appearing, which is extremely important, of course, there is another matter, which is how to measure the impact of our content. GA4 or similar won't help you much to understand if you appear in the summaries and the user doesn't click. So I'm taking this to mean: how on earth are we figuring out when our brands and websites are actually appearing meaningfully in the AI results?
Joshua Boughton:
Yeah, I'll take that one. So we use Ahrefs, which has a tool that will show you your-
Patrick Hathaway:
Joshua, can I just interrupt just a sec? Your video is gone. I don't know if you're aware of that.
Joshua Boughton:
Sorry about that.
Patrick Hathaway:
Okay, thanks.
Joshua Boughton:
Yeah, we utilise Ahrefs in-house. They have an LLM tracking feature that you can utilise, and a lot of the SEO tools will have that. Then you can also look at your impressions in Search Console, and you will have to go into GA4, where a lot of the time it's going to show up as referral traffic. Look at your referral traffic, because a lot of that can be coming from AI as well. So look at the referral traffic, look at those pages, look at those keywords from that referral traffic, look at the results that you're getting from Ahrefs, match up the two, and that'll give you your answer.
Patrick Hathaway:
Awesome. Okay, let me go on to the next one. Okay, so: one thing that's becoming clear is my site now has not only SEO competitors but also GEO or LLM competitors. Do you have any practical advice on how to identify these GEO competitors and track them in a meaningful way?
Dan Taylor:
Yeah, I'm happy to run with this one. I think the easiest and best way to do it is to use something like Ahrefs Brand Radar, Semrush's tool, or even other, more linear tools like that, and take topics rather than keywords. I think we're evolving away from keywords now towards what we'd more classify as long tail, or things users may search. Whack those in, then pull the citation lists from those, and you'll see the brands that are appearing.
You can then very easily break down whether it's blog posts, home pages or commercial pages. And over time, if you're monitoring it, also look for the stability of the competitors appearing there. Then personally, what I do is take it back to a matrix: you have your direct competitors, your indirect competitors, your potential competitors and your replacement ones. Your direct ones are obviously the ones who offer a product or service directly parallel with your own. Your indirect ones are people who offer something similar, which could take the same place. Replacement is something which just replaces the need for your product or service altogether. And potential are people, products and services who operate in an adjacent field and could very easily cross over into yours, and you'll find a lot of those with AI generative citations.
Patrick Hathaway:
Yeah. Awesome. Joshua, did you have anything to add there as well? It kind of looked like you-
Joshua Boughton:
No, I think Dan hit that one right on the head.
Patrick Hathaway:
Okay. All right, let's go with another one. Okay, so some of these things, questions are all quite similar so far. What metrics or analytics scores can track visibility and engagement in LLM-based search environments? Before I hand this one over again, we've kind of covered this a little bit already.
Starting next week, so a week today, we have a new webinar series on SEO training in the age of AI. And in the second module of that, we are talking about tools in particular. So I will drop the link in the chat in a second, but if these are the sorts of things that people are interested in, please come along to that training as well. So I will just hand this to the guests: is there anything we haven't mentioned already that you're using in terms of tools? Please jump in.
Dan Taylor:
Not from my perspective. I'd say Joshua hit quite a lot of it on the head.
Patrick Hathaway:
Yeah, awesome. Right, we've already answered it then. Let's mark that one answered. Next question. So this one is for you, Dan, apparently: what's one of the things that you would recommend looking for on websites? What mistakes should I avoid in the text on my website?
Dan Taylor:
Probably what we've been wanting to avoid in SEO for 10 to 15 years, and that's writing purely for SEO. Long gone are the days where, if you want to rank for a certain keyword, you just churn out a specific blog post on it. You need to demonstrate EEAT. Even if you can't optimise for it directly, you need to demonstrate there's actually thought going into it.
And ultimately, there's a concept in the quality rater guidelines which isn't spoken about as much, and that's beneficial purpose. What is the beneficial purpose of this page? Has this content been written purely to rank for a specific set of keywords, or is the actual beneficial purpose to satisfy the intent behind why the user is using those phrases in the first place? If you're not answering that intent and you're just going after search volumes and rankings and that sort of stuff, I think the systems are intelligent enough now that you're going to slowly see yourself falling down, because Chrome is passing signals, and presumably Atlas and Comet are going to be passing similar signals as well. You're going to end up in one of those scenarios where you've got no idea why your traffic is dropping even though every SEO best practice box is ticked.
Patrick Hathaway:
Yeah. And I think one of the other things that seems to be very, very beneficial, both for SEO and LLMs, and you guys mentioned it earlier, is content refreshing. When there are meaningful updates to be made on a page, it makes sense to go and do that. Awesome. Right, let's move on. We've got a couple more minutes left, so maybe we can get one more question in. This comes from someone from Macmillan Cancer Support, so maybe it's a very specific question for them, but: what do you think about using more specific schema types, for example MedicalCondition for medical information webpages?
Dan Taylor:
Do you want to pick this up, Josh?
Joshua Boughton:
I mean, you're the technical expert, but I would just say it's kind of a common-sense rule. The more you can identify the content that's on your website, the better; if that schema is available, I would recommend utilising it and just making sure it's up to date. Google removes schema support and schema changes all the time. But anything that can help the bots crawl and identify important pieces of your content and tie those together, I would always recommend using. How do you feel about that, Dan?
Dan Taylor:
Yeah, a hundred percent echo what you just said there, Josh. I'd also add that Schema.org is an open community project, and for a long time people from Google, people from Bing, and people from Yahoo up until more recently, have been actively involved in proposing and validating schema types.
Now, while Google may not use a specific schema type for rich results in the SERPs, your rich snippets, et cetera, and while LLMs may not process that JSON and that microdata, it still goes into the knowledge graph and it still helps with contextual relevance in that way. So as long as you're implementing that schema knowing it's an indirect thing and it's not going to directly lead to the outcome, it's part of that pyramid and part of that growing process. A hundred percent, be as granular and pernickety as possible when you're doing this.
Patrick Hathaway:
Right. Awesome. Thank you so much, guys. I think that's pretty much all we've got time for. So thanks everyone for watching and for your fantastic questions, and a huge thanks, of course, goes to Josh and Dan for so generously giving up their time and expertise. Please follow the guys on LinkedIn and, as Dan just mentioned, DM them loads of questions if you've got more following this, or if there's anything that we didn't get to.
We will email out the recording tomorrow along with the slides, so if you missed it from the start, don't worry. And then next up, our next webinar series is only a week away. Next Wednesday is the first live session in a three-part technical SEO x AI training series in partnership with EPCO Rank and DemandSphere. So make sure you sign up for that one. I put the link in the chat earlier, but for now, thanks very much for attending and see you on the next one.
Jojo is Marketing Manager at Sitebulb. She has 15 years' experience in content and SEO, with 10 of those agency-side. Jojo works closely with the SEO community, collaborating on webinars, articles, and training content that helps to upskill SEOs.
When Jojo isn’t wrestling with content, you can find her trudging through fields with her King Charles Cavalier.
Related Articles
Plain-text Lifelines: How to Make Medical Content Visible in AI Search
Back to Basics: 3 SEO Pillars That Will Future-Proof Your Organic Growth
The Invisible Web: What LLMs Miss (and Expose) on Your Site