Webinar Recording: SEO for News & Publishing Brands
Published 2024-05-09
Of all the sectors out there, publishing is one of the most complex and rewarding when it comes to SEO. News websites are faced with some unique challenges (which is why it's so important for publishers to use the right SEO tools).
In this webinar, Sitebulb’s Patrick Hathaway was joined by an expert panel to discuss all things SEO for news publishers:
- Barry Adams, specialized (and legendary) SEO consultant
- Emina Demiri-Watson, Head of Digital Marketing at Vixen Digital
- Jessie Willms, SEO Editor at The Guardian US & Co-founder of WTF is SEO? newsletter
- Caitlin Hathaway, Audience Development Specialist at MVF
Jump to:
- Watch the webinar recording
- Read the video transcript
- Q&A from attendees
- Further reading on SEO for news sites
Find out about Sitebulb Cloud, a revolutionary new cloud crawler and Barry Adams' go-to technical SEO tool
Discover Sitebulb Cloud
Watch the publisher SEO webinar recording
Here's the webinar recording to watch at your leisure.
Don't forget to subscribe to our YouTube channel to get more videos like this!
Read the video transcript
This week the site reputation abuse part of Google's March update kicked in. So far this has been enforced through manual penalties on subfolders and subdomains of big publishing sites, specifically targeting coupon-related keywords; presumably that's the tip of the iceberg. I'd love to know if you guys have got any early feedback or thoughts on the update.
Barry Adams
Yeah, three of my clients have been hit too, because the content just perfectly ticked the boxes for what Google was looking for. I think unfairly in some cases, because there was editorial oversight - it was clearly indicated that there was editorial oversight - but they still got the manual penalty.
So far the penalties have been very targeted, as in only the subfolder or subdomain in question has been penalized and it's not in any way a sitewide downgrade, which means it's fairly well, I wouldn't say pain free, but it's contained.
But Google have also said - Danny Sullivan has said - that there will be an algorithmic component to this, which will roll out at a later stage, and algorithmic downgrades tend to be sitewide, not section specific. So for those publishers who may have escaped this particular round of penalties, don't count your blessings just yet.
This is just the start of the whole avalanche of penalties and downgrades that we're going to see on coupon content, betting content - I'm seeing that downgraded as well. But there are still a lot of other types of similar content, for example in the travel niche, that have been unaffected so far. It feels like it's a matter of time, a matter of dominoes falling, and I have mixed opinions on whether or not Google should be doing this, but that's probably more of a philosophical discussion. In practical terms, if you have a site section like this, which basically allows a third party to publish whatever the hell they want and you get a share of the revenue, you should probably consider getting rid of it, just to be on the safe side, because it might come back and bite you in the ass.
Emina Demiri-Watson
To kind of add to that, we've been really lucky. I woke up this morning and when I saw everything blowing up, I was quickly getting onto Google Search Console just in case... I thought that this was going to be different, and it ended up being coupons.
Is there any credence to the idea that maybe they're doing manual penalties specifically because they don't want to accidentally nuke these really big publishers?
Barry Adams
No, no, Google has no problem nuking big publishers, even accidentally. Have you been paying attention the last few years, Patrick?! There have been some major publishers who've gotten absolutely slapped by the Google hammer in algorithm updates. Google has no qualms about doing any of that. I think the manual penalties are more of a 'hey, we know this is happening' warning across the board.
You know, we already got a two-month grace period, which, to be entirely honest, is something no other industry gets. There's never been a case that I can remember where Google has said: we're going to be handing out penalties in two months' time unless you fix your shit. This is the first time, I think, that Google actually gave advance notice, and a lot of publishers said, yeah, we're going to call your bluff, and now Google's like, yeah, we're serious.
So this is like a second chance for Publishers to mend their ways. And if they then don't mend their ways then I think Google will come down with the algorithm. And that's going to be hurting a lot more.
I want to talk about technical SEO initially and its role when it comes to publishing. So, for instance, can you get away with having amazing content and not worrying about technical SEO, or is technical SEO absolutely crucial?
Jessie Willms
I mean, obviously Barry's on this call, so he has the most expertise in this particular vertical of SEO, but technical SEO is very much the foundation of all good sites, and you're really trying to build a house on a shaky foundation if you don't have your technical SEO lined up correctly. So I would say technical SEO is probably where you want to start. Then from there, once you've got that solid, go into continuing to build your brand and continuing to publish content that aligns with your audience's interests and your long-term strategic objectives. But I would say technical SEO is probably the starting line.
Barry Adams
You can rank in Google in general, and in news, with terrible technical SEO, but it would be like running a marathon with a leg cut off - you're not going to get anywhere fast. Yes, you need to get your site in order, and that's the interesting bit about news publishers: because of the speed at which news moves, and how fast Google needs to crawl and index your content, the emphasis on tech SEO for publishers is a little different than for, say, an e-commerce website or a grocery website. You have to focus more on fast crawling and fast indexing.
And just making sure that there are no obstacles you put in place to Google finding your content and extracting content from your basic HTML source code. Google says, for example, that they will render every single web page that they crawl and index, and we know that is a bald-faced lie, because we know they don't; they try to, but they probably don't, and even when they do, it tends to be a delayed process. So even in Google's technical guidelines for news publishers, they quite clearly state…
Emina Demiri-Watson
I echo what Barry was saying, because that was my first thought: you could have a breaking news article that nobody has picked up and it's the best report ever, but if you hide it behind heavy JavaScript, it's not going to be ranking when it needs to be ranking. And the other thing, to Jessie's point, I agree technical needs to be the foundation. But what happens a lot of the time, especially with publishers, is that it's not the foundation when you come in as an agency. You have these very big, clunky legacy websites that have all kinds of technical issues, because they have been outsourced to a developer, and not every developer will have that SEO mindset - and particularly not when it comes to publishers and knowing all of these little intricacies around technical SEO.
Jessie Willms
And working in newsrooms, I'll also add that I think often the challenge is you might be the only person doing SEO, and you're a content person. You don't have technical expertise, you don't have a background as a web developer, you don't speak the language of the team who is actually in charge of this, and you might not have access to that team, and those resources might be limited and focused on another business priority.
So if there's such a big focus on making everything as quick and easy as possible for search engines, where does structured data fit into everything? There tend to be two schools of thought with structured data: either mark up only what's specifically needed for Google, or just mark up everything. Where do you stand on that?
Barry Adams
I've changed on this, to be entirely honest. I used to think 'mark up everything', but now I'm not so sure. There's what I'm seeing - and I hope others are seeing the same, but if you're not, please do correct me.
Google used to reward having really long, detailed structured data snippets that just listed everything in the news article and everything related to it. But fairly recently, probably a year or so ago, I started seeing websites outperform in both Discover and in Top Stories that had very small NewsArticle snippets - just the required attributes and nothing else in there - which was a new one to me, because I always thought the more info you provide to Google, the better Google understands the content and the better Google can rank it.
Now I think Google has changed how they index content - they still use the structured data, but I think they now primarily try to understand the article from the HTML and just use the structured data as a sanity check. Whereas it used to be the other way around, where the structured data was the primary source of indexing and the HTML was sort of the backup option.
And one of the reasons I think this is: we used to see the structured data headline attribute always show up in Top Stories as the headline, and that is still primarily the case, but increasingly I see the actual title tag show up as the headline in Top Stories. And that's new; that didn't used to happen.
So on reflection, I would now say: if you have an opportunity to redo structured data for your website, keep it a small snippet - just the required and maybe a few recommended attributes, nothing too elaborate. That seems to work better nowadays. But at the same time, if you have a long, detailed snippet already, I'd just say leave it intact; it's probably not worth fixing at this stage. I think you get more mileage out of better on-page SEO rather than trying to maximize your structured data snippets.
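As a rough illustration of the 'keep it small' approach Barry describes, here's a minimal sketch of a pared-back NewsArticle snippet generated with a few lines of Python. The exact property set below is an assumption for illustration only - check Google's current NewsArticle documentation for what your site actually needs - and the URLs and names are hypothetical.

```python
import json
from datetime import datetime, timezone

def minimal_news_article_jsonld(headline, url, image_url, published, modified, author_name):
    """Build a pared-back NewsArticle JSON-LD payload.

    Only a handful of core properties are included, in the spirit of
    'required plus a couple of recommended attributes'. Check the exact
    set your site needs against Google's documentation.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "mainEntityOfPage": url,
        "headline": headline,
        "image": [image_url],
        "datePublished": published,
        "dateModified": modified,
        "author": [{"@type": "Person", "name": author_name}],
    }
    # Rendered into a <script type="application/ld+json"> tag in the page head.
    return json.dumps(data, indent=2)

if __name__ == "__main__":
    now = datetime.now(timezone.utc).isoformat()
    print(minimal_news_article_jsonld(
        headline="Example headline under 110 characters",
        url="https://www.example.com/news/example-story",   # hypothetical URL
        image_url="https://www.example.com/images/lead.jpg",
        published=now,
        modified=now,
        author_name="Example Reporter",
    ))
```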
One thing I'm fascinated to know is how much input or influence can you have as an SEO when it comes to the creation of new content when working with publishers?
Caitlin Hathaway
In my experience - having been at an agency before, and in-house at a little SaaS company, and then moving more into the news space - it surprised me how much SEO acumen the whole company had. They really cared about it and they really prioritized it, which was fantastic, meaning there was always a seat at the table. However, there's a bit of work that needs to be done where you need to prove why they should pursue an opportunity.
We need to make sure it's aligned with what each portfolio brand wants to offer. So, for instance, my approach is that the data never lies: when I want them to pursue an idea, I try to back it up with things like search volume, or whether there's a clear trend. Because with the keyword tools they're going to see a search volume of zero, but news-world things react really fast, and that doesn't track with those tools, so you have to keep up.
Jessie Willms
I'll just add on. I basically only do content stuff in the newsrooms that I've worked in, in the roles that I've had. So for me, it's a lot of looking at Google Trends, Google Trends with Glimpse, AlsoAsked and other keyword research tools in order to find questions that our readers are asking on topics that we care about, and then sending that over to editorial teams, desks, specific editors and reporters as sort of an outline and a structure of: here are things that I think we should include in an explainer, a timeline, or some sort of supplementary piece.
Often when I take these pitches to desks, it's as our secondary piece of content. So we have the main news file that the reporters are writing on the ground, and this is how we're rounding out our coverage. SEO in the newsrooms that I've worked in has always sat within a broader audience team, so what we're really trying to do when we look at Google Trends and other search tools is have an audience-first mindset. Rather than thinking exclusively 'here's what the story is, because I as an editor think this is what the story is', we're trying to figure out what the complement to that news file is that the audience is clearly asking for. So we look at the particular questions showing up in Google Trends' rising interest to see if there's anything we've missed in our coverage, if there are gaps, or if there are other opportunities or storylines for us to hit on. Because ultimately what we want is readers coming to us and reading our coverage of key news events that we really care about, and search is one of the ways we can find and bring in that audience.
So when it comes to content creation, this is what I do most of the day: trying to figure out what stories people are interested in and are using Google to find, and then how we can make sure that our coverage includes those key storylines and elements.
Do you have a plan for the day that can get completely capsized by a new story that just comes in?
Jessie Willms
I think that's it. That's the universal experience of every editor that I've ever met working in news. You have a plan for the day and then you go to the meeting and you realize we're doing something else. Um, and so I think the key skill if you're working in um news and journalism because it moves so fast is you just have to be adaptive and receptive and willing to change focus and priority, um hour by hour, day by day, um, because often the conversation is being driven by what's happening. And so you just have to be receptive and reactive to that.
But yeah definitely I have had days where I thought I'm just going to work on this content outline for a series that I know we're running in the summer and that's going to be my main priority and then you know, that's totally gone by 10 a.m.
So dealing with lots of different writers constantly producing content, how do you handle the quality control aspect?
Emina Demiri-Watson
I mean, as an agency we don't have as much input or as much power, really, as some of the in-house teams. Coming back to what you were saying around what our role is, I think a big piece of how to get involved, and how to manage writers, is the education piece that needs to happen, which will then avoid the extensive QA afterwards. I've read Barry's excellent article on why you need to optimize everything before you actually hit publish, and for me that's the basis of it all - and it goes for writers as well. It avoids that QA if you can educate your writers on why they're doing these things, and that's hard, because journalists - I was a journalist, I didn't care about SEO when I was doing that, you know, I just wanted to write my news story and get it out. I didn't want to be thinking about a title.
Caitlin Hathaway
Our team used to do all these manual checks, and we used to fix those issues ourselves. We're really quite small in comparison to the big publishers, and that proved to be such a time-consuming activity - it would take days at a time to go and audit and fix those issues. It's just impossible; there are so many higher-value activities you could be doing. So, to echo what you said, Emina: having guidelines, having workshops, and creating SEO champions within the writing and content teams themselves, so they can flag those issues and learn how to fix them themselves, is a lot more of a collaborative approach.
If you can bake that into their process, that is much more powerful. We've got automated sheets that now flag those issues, and what we do now is feed those back monthly to say: these are the areas, or the trends, that we're seeing; if you could keep on top of those, and go back and fix them if it's evergreen content, that would be really helpful. At least two days of our month are saved on that process alone.
How do you deal with this ever-changing landscape of algorithm updates - monitoring, adjusting, trying to keep everybody aware of what they should or should not be doing (as it changes constantly)?
Jessie Willms
In terms of long-term thinking, think audience first rather than Google first. Anything that you do because it's a tactic is eventually going to tire itself out and stop working. In the newsrooms that I've worked in, there isn't company-wide education about Google algorithm updates - I think that has a tendency to scare writers and reporters a bit. Obviously key stakeholders who do need to know about algorithm updates get that in internal meetings.
Then I would just say, the training that we do provide, or that I have done in previous roles, has always very much focused on the north star of what we want to create, rather than 'here's what we need to do because of the last algorithm update'. So I'd always just caution: is this long-term strategic thinking? I think that sometimes we're a little too focused on the minutiae of individual core updates, and if we just zoom out a little bit, focus on our core pillars of what we're good at, and have that as our north star - I think that would be my starting line, at least.
Emina Demiri-Watson
Yeah, I mean, I was going to say the same thing. I'm an SEO, but because I studied journalism I have a big passion for it, and every time there's emphasis on an algorithm update it just takes away a little bit from what journalism actually is - which is crazy for me to say, considering that I'm an SEO, but that's how my emotional side reacts to it. My rational side is the way I approach it with clients: really it's just keeping the line of communication very open. What you don't want to do is give them a heads-up that there's a big algorithm update coming and make everybody panic.
When you're crawling sites with millions of pages, do you feel the temptation to crawl the whole thing? But you can't possibly be doing this every day or even probably every week. So what do you do instead?
Barry Adams
Yeah, don't crawl the whole website. The largest website I've ever worked on had I think 300 million indexed URLs, um, roughly. You don't need to crawl all pages on the website. You do need to be smart about how you crawl. Um, I don't think just crawling from the home page and setting it to like 1 million URLs max is necessarily always the best approach.
Generally that is how I start, by the way; it's how I configure Sitebulb to crawl. Then once that initial crawl is done, if it's a big website I'll sometimes do focused crawls on a particular site section, for example, to get more of a sense of the depth of that section.
The key to understanding the technical setup and the sort of issues that are on your website is always to look for patterns. With a certain type of page - be it an article template or a category page - if one page has an issue, chances are every single implementation of that same template has that same issue.
Of course, you will want to check that, but that's the reason why you don't need to crawl the entire website. You need to crawl enough pages to get a sense of where the issues might lie. And then you can do more focused crawls on specific areas of the website, for example author pages. You can go to a particular section, like the sport section of a publisher - or, you know, one of my clients has a finance and investment section, which can be very different from the rest of the website, so I crawl that one as a separate project in Sitebulb.
Then you look for those patterns, you look for template-based issues, and you try to address those. There are very few scenarios where you will want to crawl the entire website. I sometimes get asked, how do we find all internal links that result in a redirect? The hard answer to that is you probably can't - if your website is that big, you can probably find some patterns, but there will always be some cases of old URLs linked from deeper pages that result in an internal redirect. And it's okay to leave those in place; you don't have to have a perfect website as long as you fix the low-hanging fruit.
If you have a link in your top navigation that results in a redirect, you want to fix that, because that would be on every single URL on your website. But if it's a deep article from 17 years ago that has a link that results in a redirect or even a 404, it's probably not worth digging back and finding and fixing those, because you're not going to get any payoff from the amount of work that would require. So it's based on templates, finding patterns and being a bit more focused and smart with your crawls. And I have to say again - I don't get paid to say this - I do love Sitebulb for that, especially the cloud version. You can just set and forget, do your crawl, not worry about credits or anything like that, and just get all the data that you need to do a proper site audit.
Since I started using Sitebulb Cloud, I'm a lot less anxious about starting a big crawl, because I used to think: if the crawl goes haywire and there's a crawl trap somewhere, I'm going to be wasting my entire monthly allocation of URLs on this one single crawl. Whereas now I'm like, yeah, I'll just start the crawl again and exclude certain URLs from it. So you have to be smart about the tools that you use, and not overspend on tools that promise you technical SEO golden mountains - you still have to do the hard work yourself. You have to do some sanity checking and not just look at the report and say, right, we're going to fix all of those things, because a lot of them might not be worth fixing or might not be issues in the first place.
Also, 404s don't consume crawl budget; the moment Google gets the 404 status code, it just moves on to the next URL. It also means, by the way, that if you have links on your 404 pages, Google never sees those, because Google never parses the content of your 404 pages. It just looks at the 404 status code - same with a 410 - and then skips that and goes to the next one.
We can't possibly have a webinar in 2024 where we don't mention AI at all. So are you integrating AI into processes and if so, how?
Jessie Willms
I can only speak for what I'm doing for the newsletter, uh, because I don't really use AI in my day-to-day job. But what I would say is I think the thing that's interesting for me is not AI for content creation, but AI for doing just the boring tasks and tedious work. I haven't really used this in my professional life yet, but I have used it a little bit for like some newsletter stuff on the side and I teach data journalism on the side and I love pivot tables so much and if ChatGPT can do some data analysis for me and I don't have to plug away at a spreadsheet and I don't have to do any of the data cleanup work myself, that is such a gift from AI.
Emina Demiri-Watson
I mean, for me it kind of goes in the same bucket as too much focus on algorithm changes with journalism - using AI to actually write news articles, for me that's complete blasphemy. It just goes against every pore in my body. But I echo what Jessie said: for data analysis, sometimes that's good; it can very quickly do a few things. I'd be very careful about using it for stuff like keyword research, for example, because those are not real volumes, and it also doesn't really take SERPs into consideration most of the time. I was actually using it today, looking at how I can turn some of the stuff that I've written about publisher SEO into checklists and resources that I can use with my clients when I'm training editorial teams.
Caitlin Hathaway
I definitely echo what's been said in terms of operations and data analysis - ChatGPT or any other AI tool, there are some really good ones out there that you should definitely explore. The biggest ick is when you see ChatGPT content, and oh my goodness, it's all over LinkedIn and Twitter at the moment. It's just so obvious from the words being used, to the point where I see some headlines and I see a word and I'm like, I feel like that might be ChatGPT - and then it's not! It's like "elevate" and "supercharge" and all these sorts of things. I'm trying to avoid using them in my actual day-to-day language; sometimes it slips out and I'm like, oh my goodness, ChatGPT influence. But I definitely recommend exploring it and trying to build solutions that work for you. For instance, there might be ways that you can automate your sheets or monitoring systems - I've consulted ChatGPT to help me build the formulas or the Apps Scripts that enabled me to automate certain manual tasks, which is really important. Loads of time savings achieved there.
With audience research, obviously it's very new to the game and there's not much that you can explore. But, for instance, with Microsoft Copilot you can get some interesting data where it's deciphered and analyzed new insights that you might not have covered, so it could be a new kind of area for you to look into. I've written up a guide on how to do that.
There's inventory and auditing as well, where I mentioned content themes and tagging a lot of URLs - but just make sure that you're manually reviewing everything and getting a human review, because of the number of times it's given the wildest category themes to some of the URLs it suggested.
End of webinar Q&A
What's the most common technical issue that news websites tend to face?
Barry Adams
I wrote an article about that not too long ago, so if you'll allow me: pagination is probably the one I see most often, where infinite scroll or 'load more' buttons are implemented with JavaScript. Google doesn't click, Google doesn't scroll, so Google never sees beyond the first page - and we do actually want Google to see beyond the first page of content. We want Google to see a substantial body of work associated with a topic on a section page or a tag page, so we need some level of crawlable pagination there. That's probably one of the more common technical issues that I see.
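A quick way to sanity-check that pagination point is to fetch a section page without executing any JavaScript and count how many article links actually appear in the raw HTML. A rough sketch, assuming the requests and beautifulsoup4 packages are installed and using a hypothetical section URL:

```python
# Rough check: does a section page expose article links in its raw HTML,
# or only behind a JavaScript "load more" / infinite scroll?
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

SECTION_URL = "https://www.example.com/sport/"  # hypothetical section page

resp = requests.get(SECTION_URL, timeout=10,
                    headers={"User-Agent": "pagination-check/0.1"})
soup = BeautifulSoup(resp.text, "html.parser")

# Collect same-section links from the static HTML only (no JS executed),
# which is roughly what a crawler sees on the first pass.
links = {urljoin(SECTION_URL, a["href"]) for a in soup.select("a[href]")}
article_links = {l for l in links if "/sport/" in l and l != SECTION_URL}

print(f"Article links visible without JavaScript: {len(article_links)}")
# If this is only the first batch of teasers and there is no crawlable
# link to page 2, 3, ... then deeper articles may sit behind a
# "load more" button that a crawler will never click.
```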
Also unoptimized HTML. Like I said before, this is probably the only context in which clean HTML source code actually matters, because Google needs to index news articles fast and therefore bases it initially on the HTML only. The things that tend to go wrong are the order of the head tags: you want your meta tags, your title and your Open Graph tags fairly high up in the head section, and you want to avoid using body HTML tags, like image and iframe, in the head. And the article HTML, from the H1 headline - and yes, it should be an H1 tag - down to the last paragraph, should be a relatively clean, uninterrupted block of HTML. Now, that sort of stuff is more about reducing the chances of errors rather than actually increasing traffic.
Ninety-nine times out of 100, Google doesn't have hiccups when it indexes HTML, even unoptimized HTML, but the one time Google does have a hiccup, it's usually one of those articles you absolutely want Google to index very quickly. So it's about reducing the probability of errors occurring. And other things, like I said earlier, like having 301-redirected URLs in your top navigation: people tend not to realize those exist, because they click on the link and it works, but if you go through one or more redirect hops, you do actually lose a little bit of link value, a little bit of link equity, through the redirect. So that's a fairly common issue as well.
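For the redirected-navigation-links point, a small script along these lines can flag header navigation links that respond with one or more redirect hops. Again, a sketch only, with a hypothetical homepage URL and a CSS selector you would adapt to your own templates:

```python
# Flag header-navigation links that go through one or more redirect hops.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

HOME_URL = "https://www.example.com/"   # hypothetical homepage
NAV_SELECTOR = "header nav a[href]"     # adapt to your own templates

html = requests.get(HOME_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
nav_urls = {urljoin(HOME_URL, a["href"]) for a in soup.select(NAV_SELECTOR)}

for url in sorted(nav_urls):
    # allow_redirects=True records every hop in resp.history
    resp = requests.get(url, timeout=10, allow_redirects=True)
    if resp.history:
        hops = " -> ".join([r.url for r in resp.history] + [resp.url])
        print(f"REDIRECT ({len(resp.history)} hop(s)): {hops}")
```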
Emina Demiri-Watson
I mean, those are what I see as well most often. One of the really interesting ones for me is the use of categories and tags and things like that. I think that's very much entrenched in content, in on-page SEO and in your strategy, but it's also a technical issue, because of the amount of tags that I have seen people just create willy-nilly when there is actually an already optimized tag they could be using - and then you have these very, very thin tag pages that basically do nothing. From a technical point of view, stuff around internal linking is an interesting one as well; we've seen that so many times.
How do you find and monitor trending topics? Do you have any tips for SEOs working with newsrooms?
Jessie Willms
I'll put the newsletter in the chat in a second, but I would say the time frame can be really useful and important, and your regional filtering. One thing I look at a couple of times a day is the W5 - who, what, where, when, why - with the time frame changed to one hour, and then you can get an indication of two things: one, what students are learning in school, because often it's these funny historical questions; but two, you do get questions like 'why is X happening?' or 'what time does this thing start?'. That can be a little signal of the things people are interested in learning. And if you work for a local outlet - the SEO editor at the Philadelphia Inquirer, I cannot remember her name off the top of my head, but I'll put the link to the newsletter where she said this - she does that in the specific region that her paper covers, and just does the W5, or puts some related queries into Google, and closes the time frame down to see what people are looking for in the morning.
The other tip I would say - sometimes it'll work and sometimes it'll just give you something interesting - is to look at key search terms. If you're interested, say, in what people are searching for off the back of TikTok, you can put those terms into Trends and sometimes see a little bit of what people are Googling versus just looking at TikTok as a platform. So let's say, for example, there was a recall: if you put in some of those specific search terms and then change the filtering and the location, you might get some tips on things that people are looking for right now, and that might be useful for breaking or emerging news.
Otherwise, look at Google Trends several times a day, and look not only at what is trending on Google but also at key topics that we care about, and see if there are new questions rising or being asked that we haven't covered.
On the part of the question about how you work with the newsroom: I would say when you're an SEO editor in a newsroom, part of your job is working in translation - taking what you see in terms of SEO trends and converting it into language that your editors and reporters will understand. So I wouldn't give someone a search volume metric, because I don't know that they would necessarily find that intuitive to understand as an editor, and I wouldn't really want to pitch a story saying, "Here's why we should do it, for exclusively Google reasons." I would always make sure that whatever you see on Google is good, but always translate it into a framework that the people you're talking to will understand. So even if that means rewriting questions that you're seeing on Google so that you know they're going to be more receptive, I think it's worth doing - even if you lose a little bit of precision from Google Trends in how you've rewritten it. Getting people's buy-in is really important as a long-term part of your strategy.
And then the other piece I'll just say is that Glimpse with Google Trends is one of my favorite tools. AlsoAsked I've been using recently, and it's kind of fun. And then the Google Trends team has an email that they send out every day - they have US, UK and one other geography that I can't remember, specific editions - and I find that really useful because it's curated. The value in that newsletter is that they are picking out things that are interesting to look at, and at least once per email I say, oh, that'd be a great story idea if we could cover this in a way that makes sense for our outlet. The other thing about that newsletter that's really great is that you sort of build your instincts: by reading that email every day, you get a sense of how people search and what questions they ask. So I would also say the Google Trends email - I'll find the link and put it in the chat - is very useful.
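If you want to script parts of that Google Trends routine (the W5 check over a one-hour window, rising related queries) rather than eyeballing the UI, the unofficial pytrends wrapper can pull similar data. This is a rough sketch only - pytrends is a third-party library, not an official Google API, so treat the method names and their continued availability as assumptions to verify:

```python
# Rough sketch of the "W5 in the last hour" check using the unofficial
# pytrends wrapper (pip install pytrends). Verify before relying on it.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)

# "Who, what, where, when, why" over the last hour, US only.
w5 = ["who", "what", "where", "when", "why"]
pytrends.build_payload(w5, timeframe="now 1-H", geo="US")

related = pytrends.related_queries()  # dict: keyword -> {"top": df, "rising": df}
for keyword, tables in related.items():
    rising = tables.get("rising")
    if rising is not None and not rising.empty:
        print(f"\nRising queries around '{keyword}':")
        print(rising.head(10).to_string(index=False))
```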
So what good and reliable sources are left for small and medium publishers? Even social media platforms have started adding AI answers, for example Meta AI...
Barry Adams
Yeah, this is a tough one. Audience loyalty: you have to become immune to the whims of the big tech giants, because their greed knows no bounds, and they will continue stealing from everybody and everything in pursuit of increased shareholder value. I mean, there has never been a higher demand for news than there is now, the advertising market has never been bigger, and yet publishers have never had less revenue. And the reason for that is that the big tech companies just gobble up all the fucking money.
The only way you can overcome that is to have a loyal audience that bypasses the tech giants, bypasses the gatekeepers, and goes directly to your website and subscribes - that pays you the money directly. And yes, that's hard, that is bloody hard to do. So you have to have something that is unique, that is valuable enough, that people are willing to do that and become a loyal audience. But that is the only way you can maintain consistent levels of traffic and revenue, because putting your eggs in any single tech giant's basket - be that Google or Apple or Facebook or TikTok or whoever - is just going to be a losing strategy in the long run. And I'm saying this as a Google guy, you know, as a pure SEO fellow: don't do just SEO. You have to build a loyal audience. And yeah, that's hard.
Jessie Willms
I just want to echo this. I live in Canada, and last year there was a big fight between our government and the platforms, so in Canada there is no news on Meta - there's no news on Instagram, there's no news on Facebook. What was once a really healthy and robust source of traffic to publisher websites is now zero, and I can't even look at American outlets on Instagram; it's just blocked. Plus there was the pivot to video from 2016, when publishers really heavily invested in video because we were getting signals from Facebook. But to Barry's point, I think we should learn from what the big platforms and tech giants have done previously, which is not particularly care about our business - and when I say our business, I mean journalism and the institution, the value that we provide - and just know that they don't actually care whether or not journalism survives the next 50 years. So build as many relationships as directly as you can with readers, on platforms that you own, operate and control. And also, maybe as an exercise, eliminate your Google traffic from your reporting for a week and see what you're left with - see what works for your actual subscribers, your real readers - and then figure out: okay, if Google went away tomorrow, how would we make up that revenue? Then use that as a starting point for thinking about alternative revenue streams or ways to engage and monetize your audience.
So how do you appear in Google Discover in this AI era?
Emina Demiri-Watson
Some of the advice that I've shared in the series of articles for Sitebulb covers Google Discover in particular. There are some intricacies with Google Discover that don't apply to traditional search. Some of it comes from the intrinsic stuff that we've spoken about, like the quickness of crawling and things like that. Another thing you can look at is images. For example, Discover's guidelines say that your images should be of a particular size. Now, I'm a little bit sceptical about that, because I have seen clients rank in Discover even though their images are below - I think it's 1,200 pixels wide or something like that. I've seen that they do rank even without that, but it's best practice, so you might want to consider optimizing your images as well.
Also, one of the things that we've seen - we've seen research, and Lily Ray was talking a lot about this - is headlines, and how catchy headlines really work for Discover in particular. We're talking about active verbs, but don't go clickbaity: it needs to stop the scroll but not bait the click.
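If you want to spot-check lead images against that roughly 1,200-pixel width guideline, a few lines with Pillow will do it. A sketch with hypothetical image paths, and with the threshold taken from the figure Emina mentions rather than from a definitive spec:

```python
# Spot-check lead image widths against the ~1,200px guideline mentioned above.
# Requires Pillow (pip install Pillow); the folder and file paths are hypothetical.
from PIL import Image
from pathlib import Path

MIN_WIDTH = 1200  # figure discussed in the webinar; confirm against Google's Discover docs

for path in Path("lead_images").glob("*.jpg"):
    with Image.open(path) as img:
        width, height = img.size
        status = "OK" if width >= MIN_WIDTH else "TOO NARROW"
        print(f"{path.name}: {width}x{height} -> {status}")
```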
To what extent is it useful or important to have an HTML sitemap for old content? The New York Times has a huge HTML sitemap. However, will this contribute to site bloat? Is there a cut-off point in time at which it's not worth linking to old stories?
Barry Adams
No, HTML sitemaps are good, they are useful. There are ways to structure them so that they point to section pages and tag pages, and from there the pagination takes over, so the deeper pagination passes the link value down to the lower pages. And even a big sitemap would be, what, 200 or 300 URLs? For Google that's a drop in the bucket. You need to start worrying about it when you get over 500,000 URLs; below that, don't worry about it.
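As a rough sketch of that structure - an HTML sitemap page that simply links out to section and tag pages and lets their own pagination carry crawlers deeper - something like this would do, with hypothetical section names and output path:

```python
# Minimal sketch: generate an HTML sitemap that links to section and tag pages,
# letting each section's own pagination carry crawlers deeper from there.
from html import escape

sections = {                      # hypothetical section / tag landing pages
    "Sport": "https://www.example.com/sport/",
    "Politics": "https://www.example.com/politics/",
    "Finance": "https://www.example.com/finance/",
    "Tag: Elections": "https://www.example.com/tags/elections/",
}

items = "\n".join(
    f'    <li><a href="{escape(url)}">{escape(name)}</a></li>'
    for name, url in sections.items()
)
page = f"<h1>Sitemap</h1>\n<ul>\n{items}\n</ul>\n"

with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(page)
```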