Our recommendation? Optimize the conversion rate of your website, before you spend on increasing your traffic to it.
Here’s a web design statistic to bear in mind: you have 50 milliseconds to make a good first impression. If your site’s too slow, unattractive, or unclear in its wording, visitors will bounce faster than you can say “leaky bucket”. Which is a shame, because you’ve put lots of effort into designing a beautiful product page and About Us page, and people just aren’t getting to see them.
As a digital web design and conversion agency in Melbourne, Australia, we’ve been helping our customers optimize their websites for over 10 years, but it wasn’t until mid-2019 that we decided to turn our attention to our own site.
As it turned out, we had a bit of a leaky bucket situation of our own: while our traffic was good and conversions were okay, there was definitely room for improvement.
In this article, I’m going to talk a little more about conversions: what they are, why they matter, and how they help your business. I’ll then share how I made lots of little tweaks that cumulatively led to my business attracting a higher tier of customers, more inquiries, plus over $780,000 worth of new sales opportunities within the first 26 weeks of making some of those changes. Let’s get into it!
Your conversion rate is a figure that represents the percentage of visitors who come to your site and take the desired action, e.g. subscribing to your newsletter, booking a demo, purchasing a product, and so on.
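In other words, it’s simply the number of conversions divided by the number of visitors, multiplied by 100. A minimal illustration in Python (the numbers are made up):

```python
# Conversion rate = (visitors who took the desired action / total visitors) * 100
# Illustrative numbers only.
visitors = 4_200          # total sessions in the period
newsletter_signups = 126  # micro-conversions
purchases = 63            # macro-conversions

micro_rate = newsletter_signups / visitors * 100
macro_rate = purchases / visitors * 100

print(f"Micro-conversion rate: {micro_rate:.1f}%")  # 3.0%
print(f"Macro-conversion rate: {macro_rate:.1f}%")  # 1.5%
```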
Conversions come in all shapes and sizes, depending on what your website does. If you sell a product, making a sale would be your primary goal (aka a macro-conversion). If you run, say, a tour company or media outlet, then subscribing or booking a consultation might be your primary goal.
If your visitor isn’t quite ready to make a purchase or book a consultation, they might take an intermediary step — like signing up to your free newsletter, or following you on social media. This is what’s known as a micro-conversion: a little step that leads towards (hopefully) a bigger one.
A conversion can apply to any number of actions — from making a purchase, to following on social media.
Macro-conversions are those we usually associate with sales: a phone call, an email, or a trip to the checkout. These happen when the customer has done their research and is ready to leap in with a purchase. If you picture the classic conversion funnel, they’re already at the bottom.
Micro-conversions, on the other hand, are small steps that lead toward a sale. They’re not the ultimate win, but they’re a step in the right direction.
Most sites and apps have multiple conversion goals, each with its own conversion rate.
The short answer? Both. Ideally, you want micro- and macro-conversions to be happening all the time so you have a continual flow of customers working their way through your sales funnel. If you have neither, then your website is behaving like a leaky bucket.
Here are two common issues that seem like good things, but ultimately lead to problems:
High web traffic (good thing) but no micro- or macro-conversions (bad thing — leaky bucket alert)
High web traffic (good thing) and plenty of micro-conversions (good thing), but no macro-conversions (bad thing)
A lot of businesses spend heaps of money making sure their employees work efficiently, but far less of the budget goes into what is actually one of their best marketing tools: the website.
Spending money on marketing will always be a good thing. Getting customers to your site means more eyes on your business — but when your website doesn’t convert visitors into sales, that’s when you’re wasting your marketing dollars. When it comes to conversion rate statistics, one of the biggest eye-openers I read was this: the average user’s attention span has dropped from 12 to a mere 7 seconds. That’s how long you’ve got to impress before they bail — so you’d better make sure your website is fast, clear, and attractive.
Our phone wasn’t ringing as much as we’d have liked, despite spending plenty of dollars on SEO and Adwords. We looked into our analytics and realized traffic wasn’t an issue: a decent number of people were visiting our site, but too few were taking action — i.e. inquiring. Here’s where some of our issues lay:
Our site wasn’t as fast as it could have been (anything with a load time of two seconds or over is considered slow; ours was hovering around 5-6 seconds, and that was having a negative impact on conversions).
Our CTA conversions were low (people weren’t clicking — or they were dropping off because the CTA wasn’t where it needed to be).
We were relying on guesswork for some of our design decisions — which meant we had no way of measuring what worked, and what didn’t.
In general, things were good but not great. Or in other words, there was room for improvement.
Improving your site’s conversions isn’t a one-size-fits all thing — which means what works for one person might not work for you. It’s a gradual journey of trying different things out and building up successes over time. We knew this having worked on hundreds of client websites over the years, so we went into our own redesign with this in mind. Here are some of the steps we took that had an impact.
First of all, we decided to fix our company website. This sounds like an obvious one, but how many times have you thought “I’ll do this really important thing”, then never gotten round to it? Or rushed ahead in excitement, made a few tweaks yourself, then let your efforts grind to a halt because other things took precedence?
This is an all-too-common problem when you run a business and things are just… okay. Often there’s no real drive to fix things and we fall back into doing what seems more pressing: selling, talking to customers, and running the business.
Improving your site’s conversions starts with a decision that involves you and everyone else in the company, and that’s what we did. We got the design and analytics experts involved. We invested time and money into the project, which made it feel substantial. We even made EDMs to announce the site launch (like the one below) to let everyone know what we’d been up to. In short, we made it feel like an event.
There are many different types of user: some are ready to buy, some are just doing some window shopping. Knowing what type of person visits your site will help you create something that caters to their needs.
We looked at our analytics data and discovered visitors to our site were a bit of both, but tended to be more ready to buy than not. This meant we needed to focus on getting macro-conversions — in other words, make our site geared towards sales — while not overlooking the visitors doing some initial research. For those users, we implemented a blog as a way to improve our SEO, educate leads, and build up our reputation.
User insight can also help you shape the feel of your site. We discovered that the marketing managers we were targeting at the time were predominantly women, and that certain images and colours resonated better with that specific demographic. We didn’t go for the obvious (pictures of the team or our offices), instead relying on data and the psychology of attraction to get into the minds of our users.
Sending visitors to a good site with bad load speeds erodes trust and sends them running. Multiple studies show that site speed matters when it comes to conversion rates. It’s one of the top SEO ranking factors, and a big factor when it comes to user experience: pages that load in under a second convert at around 2.5 times the rate of pages taking five seconds or more.
We built our website for speed. Moz has a great guide on page speed best practices, and from that list, we did the following things:
We optimized images.
We managed our own caching.
We compressed our files.
We improved page load times (Moz has another great article about how to speed up Time to First Byte). A good web page load time is considered to be anything under two seconds — which we achieved. (A quick way to spot-check load times yourself is sketched below.)
In addition, we also customized our own hosting to make our site faster.
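If you want a rough spot-check of server response time and page weight without a full auditing tool, a few lines of Python will do. This is a minimal sketch using the requests library; it measures server response rather than full browser rendering, so treat it as a rough proxy only:

```python
import requests  # pip install requests

def spot_check(url):
    """Rough speed check: server response time (a TTFB proxy) and HTML size."""
    response = requests.get(url, timeout=10)
    ttfb_ms = response.elapsed.total_seconds() * 1000  # time until response headers arrived
    size_kb = len(response.content) / 1024
    print(f"{url}: status {response.status_code}, ~{ttfb_ms:.0f} ms to first byte, {size_kb:.0f} KB of HTML")

spot_check("https://www.example.com/")  # swap in your own key pages
```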
As well as making our site faster, we introduced a lot more tracking. That allowed us to refine our content, our messaging, the structure of the site, and so on, which continually adds to our conversion rate.
We used Google Optimize to run A/B tests across a variety of things to understand how people interacted with our site. Here are some of the tweaks we made that had a positive impact:
Social proofing can be a really effective tool if used correctly, so we added some stats to our landing page copy.
Google Analytics showed us visitors were reaching certain pages and not knowing quite where to go next, so we added CTAs that used active language. So instead of saying, “If you’d like to find out more, let us know”, we said “Get a quote”, along with two options for getting in touch.
We spent an entire month testing four words on our homepage. The test actually failed (the words didn’t have a positive impact), but it allowed us to test our hypothesis. We did small tweaks and tests like this all over the site (a simple way to check whether results like these are statistically meaningful is sketched after this list).
We used heat mapping to see where visitors were clicking, and which words caught their eye. With this data, we knew where to place buttons and key messaging.
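If you’re running tests like the ones above without Google Optimize’s built-in reporting, a quick two-proportion z-test is enough to tell whether a difference in conversion rate is likely real. A minimal sketch (all numbers illustrative, not from our own tests):

```python
from math import sqrt

def ab_significance(conv_a, visitors_a, conv_b, visitors_b):
    """Two-proportion z-test comparing the conversion rates of variant A and variant B."""
    p_a, p_b = conv_a / visitors_a, conv_b / visitors_b
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Illustrative numbers: original CTA vs. "Get a quote" CTA
p_a, p_b, z = ab_significance(conv_a=40, visitors_a=2000, conv_b=62, visitors_b=2000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
# |z| > 1.96 roughly corresponds to 95% confidence that the difference is real.
```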
Understanding your visitor is always a good place to start, and there are two ways to go about this:
Quantitative research (numbers and data-based research)
Qualitative research (people-based research)
We did a mixture of both.
For the quantitative research, we used Google Analytics, Google Optimize, and Hotjar to get an in-depth, numbers-based look at how people were interacting with our site.
Heat-mapping software shows how people click and scroll through a page. Hot spots indicate places where people naturally gravitate.
We could see where people were coming into our site (which pages they landed on first), what channel brought them there, which features they were engaging with, how long they spent on each page, and where they abandoned the site.
For the qualitative research, we focused primarily on interviews.
We asked customers what they thought about certain CTAs (whether they worked or not, and why).
We made messaging changes and asked customers and suppliers whether they made sense.
We invited a psychologist into the office and asked them what they thought about our design.
We found out our design was good, but our CTAs weren’t quite hitting the mark. For example, one CTA only gave the reader the option to call. But, as one of our interviewees pointed out, not everyone likes using the phone — so we added an email address.
We were intentional but ad hoc about our asking process. This worked for us — but you might want to be a bit more formal about your approach (Moz has a great practical guide to conducting qualitative usability testing if you’re after a more in-depth look).
Combined, these minor tweaks had a mighty impact. There’s a big difference in how our site looks and how we rank. The bottom line: after the rebuild, we got more work, and the business did much better. Here are some of the gains we’ve seen over the past two years.
Our site speed increased: we managed to achieve a load time of around 500-600 ms.
Our dwell time increased by 73%, going from 1.5 to 2.5 minutes.
We received four times more inquiries by email and phone.
Our organic traffic increased despite us not channeling more funds into PPC ads.
We also realized our clients were bigger, paying on average more than twice as much per job: in mid-2018, our average cost-per-job was $8,000. Now, it’s $17,000.
Our client roster grew to include bigger, more recognizable household names — including two of Australia’s top universities, and a well-known manufacturing/production brand.
Within the first 26 weeks, we got over $770,000 worth of sales opportunities (if we’d accepted every job that came our way).
Our prospects began asking to work with us, rather than us having to persuade them to give us the business.
We started getting higher quality inquiries — warmer leads who had more intent to buy.
When it comes to website changes, it’s important to remember that what works for one person might not work for you.
We’ve used site speed boosters for our clients before and gotten really great results. At other times, we’ve tried it and it just broke the website. This is why it’s so important to measure as you go, use what works for your individual needs, and remember that “failures” are just as helpful as wins.
Below are some tips — some of which we did on our own site, others are things we’ve done for others.
Tip number 1: Get stronger hosting that allows you to consider things like CDNs. Hiring a developer should always be your top choice, but it’s not always possible to have that luxury. In this instance, we recommend considering CDNs, and depending on the build of your site, paying for tools like NitroPack which can help with caching and compression for faster site speeds.
Tip number 2: Focus your time. Identify top landing pages with Moz Pro and channel your efforts in these places as a priority. Use the 80/20 principle and put your attention on the 20% that gets you 80% of your success.
Tip number 3: Run A/B tests using Google Optimize to test various hypotheses and ideas (Moz has a really handy guide for running split tests using Google). Don’t be afraid of the results — failures can help confirm that what you’re currently doing is right. You can also access some in-depth data about your site’s performance in Google Lighthouse.
Tip number 4: Trial various messages in Google Ads (as a way of testing targeted messaging). Google provides many keyword suggestions on trending words and phrases that are worth considering.
Tip number 5: Combine qualitative and quantitative research to get to know how your users interact with your site — and keep testing on an ongoing basis.
Tip number 6: Don’t get too hung up on charts going up, or figures turning orange: do what works for you. If adding a video to your homepage slows it down a little but has an overall positive effect on your conversion, then it’s worth the tradeoff.
Tip number 7: Prioritize the needs of your target customers and focus every build and design choice around them.
Nitropack: speed up your site if you’ve not built it for speed from the beginning.
Google Optimize: run A/B tests
HotJar: see how people use your site via heat mapping and behaviour analytics.
Pingdom / GTMetrix: measure site speed (using both is better if you want to make sure you meet everyone’s requirements).
Google Analytics: find drop-off points, track conversion, A/B test, set goals.
Qualaroo: poll your visitors while they are on your site with a popup window.
Google Consumer Surveys: create a survey, Google recruits the participants and provides results and analysis.
Moz Pro: Identify top landing pages when you connect this tool to your Google Analytics profile to create custom reports.
Treat your website like your car: it needs regular little tweaks to keep it purring, and occasional deeper inspections to make sure there are no problems lurking just out of sight. Here’s what we do:
We look at Google Analytics monthly. It helps to understand what’s working, and what’s not.
We use goal tracking in GA to keep things moving in the right direction.
We use Pingdom's free service to monitor the availability and response time of our site.
We regularly ask people what they think about the site and its messaging (keeping the qualitative research coming in).
Spending money on marketing is a good thing, but when you don’t have a good conversion rate, that’s when your website’s behaving like a leaky bucket. Your website is one of your strongest sales tools, so it really does pay to make sure it’s working at peak performance.
I’ve shared a few of my favorite tools and techniques, but above all, my one bit of advice is to consider your own requirements. You could improve your site speed by removing all tags and keeping it plain, but that’s not what you want: the goal is finding the balance between creativity and performance, and that will always depend on what’s important to your business.
For us as a design agency, we need a site that’s beautiful and creative. Yes, having a moving background on our homepage slows it down a little bit, but it improves our conversions overall.
The bottom line: Consider your unique users, and make sure your website is in line with the goals of whoever you’re speaking with.
We can do all we want to please Google, but when it comes to sales and leads, it means more to have a higher-converting, more effective website. We did well in inquiries (actual phone calls and email leads) despite a rapid increase in site performance requirements from Google. That comes down to one thing: having an effective customer conversion framework for your site.
Question: 2020 was quite the year, what were you up to this past year? Any surprises or favorite projects you worked on?
Dana: Like many people I actually ended up moving! I now live on Vancouver Island and I can practically see the USA from my house — not that I’ve been able to visit. I also completed my first course, hopefully of many, for LinkedIn Learning on the topic of technical SEO. My next course is already in progress and will be covering how to transition from Google Analytics Universal to GA4, and that should be out in the fall.
At Kick Point, we grew over the past year and we're now a team of 12! Like a lot of agencies, we did see some good come out of an otherwise pretty awful time for many, and we’re very fortunate.
Q: What is the biggest shift you’ve seen in the SEO industry over the past year? How does that impact your work at Kick Point, if at all?
D: The biggest change this year was the rollout of Core Web Vitals, which as I write this, is only just happening now. I am extremely curious to see how it impacts SEO over the remainder of 2021. I don’t want to say too much more in case it ages badly!
Q: Last year, you discussed how to use a discovery process to turn red flags to green lights. Will we see any of the same themes come through in your presentation this year? How so?
D: Absolutely! This year is really a companion piece to last year. Last year I covered discovery for marketing projects, and this year I'm covering discovery for website projects. These discovery processes have made such incredible changes at Kick Point in terms of how we work with new clients — it's really been amazing. I hope that people who listen to my talk are able to take away some of the lessons that we've learned and apply them to their own processes.
Q: In your MozCon talk, you’ll be discussing how to build a website with a search-first mindset. What inspired you to discuss this topic at MozCon 2021?
D: I think it's a topic that isn't covered enough. We unfortunately often still see a real divide between the developer and SEO worlds and I'm hoping that we can work towards bridging that. Particularly with the advent of Core Web Vitals, these two teams need to work together more than ever before.
Additionally, this talk is based on a lot of our own learnings in terms of better ways to run website projects. Since we have adopted this process, website projects are just more fun — less stress, on time, on budget, all those things that we all want in a website project but seem impossible to achieve. I’m not saying this will magically fix everything but it’ll definitely put you on a happier path.
Q: What are some of the challenges SEOs face in the web development process?
D: Being taken seriously! I've been working in this field for 21 years now and I can't even tell you the number of times that I've been on a call with a developer or development team discussing SEO recommendations and just being completely dismissed — that these recommendations aren’t necessary, that we're wrong, or that the developer knows better. And it's an incredibly frustrating place to be in. I’m sure other SEOs reading this have had similar experiences.
Q: Why is it important for SEO to be at the forefront when it comes to website development? How has the relationship changed over the years?
D: Because it's so much easier and cheaper to get SEO recommendations added in at the beginning instead of trying to shove things in later after the site is done. I think that developers are more aware of SEO now but there is still a lot of mistrust. I think it’s important to set the tone that you aren’t there to throw the developer under the bus — they aren’t an SEO expert, and shouldn’t be expected to learn all this specific SEO stuff. Showing that you’re there to help right from the start can really help that relationship thrive.
Q: What’s your #1 tip for ensuring that SEO gets a seat at the table in a website rebrand?
D: You need to start with education. Either the leadership team that you're working with doesn't understand the power of SEO or they may have a really outdated understanding of what SEO is and what it can do. Tom Critchlow has an excellent article that he recently published on how to convince executives to care about SEO and I’d say that is required reading.
Q: What are the key takeaways you want the audience to walk away with?
D: I want people to understand that there is a space between waterfall and agile when it comes to website development processes. I hope that people will enjoy our blueprint process and it’ll help them make better website plans. Finally, I’m really excited to show off the keyword research presentation idea that I got from Rebekah Baggs and Chris Corak — it’s so good!
Q: Who in the MozCon lineup are you most excited to watch this year? Anything else you are looking forward to?
D: The talks by Dr. Pete, Areej AbuAli, and Britney Muller all look great! And of course I’ll be watching Brie Anderson’s talk since GA4 is very close to my heart. I’m also really looking forward to hopefully having an in-person MozCon next year! There is really nothing that can replace the experience of speaking to a live audience.
A big thank you to Dana for her time! To learn more about Dana’s upcoming presentation, see details on our other speakers, and to purchase your ticket, make sure you click the link below!
You have to make a distinction early on between voice searches that simply transcribe a voice prompt into a search bar and return a list of results, and voice actions that trigger a specific command from a digital assistant-style platform. Most content isn’t going to be able to accommodate optimizations for both the Google search bar and an Alexa voice command at the same time, and some content can’t be engaged by voice-enabled devices at all, like a screen-free home smart speaker that can’t display an article or play a video. Rather, if you want to reach audiences while they interact with voice-enabled devices, you can think of voice-optimized content as another arrow in your quiver.
Creating content specifically geared to be findable and consumable via voice search is going to be more important for some users than others. As screen-free devices and voice-enabled search become more ubiquitous, some sites and pages would likely benefit from becoming more Alexa-friendly. For example, location-based businesses have huge opportunities to increase their foot traffic by optimizing their online presence to be discoverable via voice search. There are more users to capture every day who are likely to ask Siri or Alexa to “find a pizza shop nearby,” compared to those who might navigate to Yelp or Google Maps and perform a text search for “pizza delivery.”
That said, voice searchability isn't necessarily what you should build your entire SEO strategy around, even for those users likely to benefit the most from high voice search rankings. That’s because voice isn’t exactly replacing text search — it’s supplementing it.
For example, Siri will update a user on the score of a game, but won’t narrate the action blow-by-blow. If you want a page to rank because you want to serve ads to users interested in sports commentary, then trying to optimize all of your content to accommodate voice may not be the most effective way to drive engagement.
However, if you want to boost foot traffic for a retail sandwich shop, then you can absolutely optimize the business listing to be easier to find when users ask for “lunch spots near me” via voice command while driving, and tailor your approach with that goal in mind.
Voice search is arriving quickly but has not yet hit critical mass, creating some low-hanging fruit for early adopters with specific content goals.
In July 2019, Adobe released a study suggesting that around 48% of consumers are using voice search for general web searches. The study did not differentiate between digital assistants on smartphones or smart speakers, but the takeaways are similar.
In Adobe’s study, 85% of those respondents used voice controls on their smartphones, and the top use case for voice commands was to get directions, with 52% of navigational searches performed via voice. Consistent with Adobe's findings, Microsoft also released a study in 2019 reporting that 72% of smartphone owners used digital assistants, with 65% of all road navigation searches being done by voice prompt.
A 2018 voice search survey conducted by BrightLocal broke out some common use cases by device:
58% of U.S. consumers had done a voice search for a local business on a smartphone
74% of those voice search users use voice to search for local businesses at least weekly
76% of voice search users search on smart speakers for local businesses at least once a week, with the majority doing so daily
Smart speaker adoption in US homes grew by 22% between 2018 and 2019, with an estimated 45% of homes having at least one smart speaker. Research released by OC&C Strategists projected that smart speakers would grow voice shopping into a $40 billion market by 2022 in the US and UK alone.
But mass adoption of voice tech is still lagging, despite inroads made during the COVID-19 pandemic. While the 2020 Smart Audio Report by NPR and Edison Research found that consumption of news and entertainment using these devices increased among a third of smart speaker owners in early 2020, a two-thirds majority of non-owners were “not at all likely” to purchase a voice-enabled speaker in the next six months, and nearly half of non-owners who use voice commands felt the same. According to Microsoft’s 2019 study, people who own smart speakers still perform lots of traditional text searches, and not everyone who has access to voice command tech likes to use it for every basic function.
Part of the delay in mass adoption may be attributed to unresolved trust and privacy questions that come with being asked to fill our homes with microphones. A majority of smart speaker owners (52%) and a majority of smartphone voice users (57%) are bothered that their smart speaker/smartphone is "always listening." However, a silver lining is that roughly the same numbers of users for each respective device trust the companies that make the smart speaker/smartphone to keep their information secure.
There are four major smart assistants processing the majority of voice search requests at the time of publication, each with their own search algorithms, but with some overlap and data sources in common.
Understanding the market share for each assistant can help you prioritize your optimization strategy to your top growth objectives. Each of these digital assistants are tied to different hardware brands with a slightly different appeal and user base, so you can likely focus your analytics tracking efforts to just one or two platforms depending on the audience you’re targeting.
The Microsoft 2019 Voice Report asked respondents to list which digital assistants they had used before, which provides a broad idea of how much voice search traffic we can expect to come from each of these engines. Siri and Google Assistant tied for first place, commanding 36% of the market each. Amazon Alexa accounted for 25% of all digital assistant usage, while Microsoft Cortana ranked third, powering 19% of devices.
An interesting thing to note here is that the engine powering Cortana leans largely on a partnership with Amazon Alexa. Cortana provides voice command functionality to laptops and personal computers, such as “Cortana, read my new emails”, while Alexa sees more smart-speaker requests like “Turn on the lights” or “Play NPR.”
Voice commands actually fall into two categories — voice search and voice actions — and each looks for different criteria to determine which response will be returned first for any given voice request. It’s really important to define which one you’re talking about when assessing an SEO plan for voice search, because they process content very differently.
A voice search essentially just replaces a keyboard input with a spoken search phrase to return results in a browser, such as using the “OK Google” command in a smartphone browser. This may impact how you tailor your keyword phrases, based on the user's tendency to phrase queries more conversationally when interacting with a voice AI.
Voice actions, on the other hand, are specific voice commands or questions from the user that trigger certain apps or automations, such as placing an order for takeout via smart speaker or checking the weather from your car. Screen-free devices like home smart speakers and some car assistants use voice actions. These commands don’t return a ranked page of results, but often a single spoken result, with a prompt for further action. If you ask an Echo Dot device for the weather, it will describe the weather out loud based on data pulled from a predetermined source. It can’t return a list of popular weather forecast sites, because there is no screen to display a Search Engine Results Page (SERP). This is an important distinction.
Smart assistants often pull data from secondary sites to return these vocal snippet results, like pinging WolframAlpha for mathematical conversions or Yelp for local business listings. One such use case would be a voice search for “order a pizza.” The AI would route the query to Yelp or Google Maps, and verbally return one result such as “I found a pizzeria nearby with five stars on Yelp. Would you like to call Joe’s Pizza to place an order or look up driving directions?” This is sometimes known as “position zero,” when a search engine returns an abstract or snippet from within the content itself to answer a direct question without necessarily sending the user to the page.
Ranking position zero for a voice action prompt depends on where those results are being pulled from. Improving the voice search ranking for driving directions to a specific physical storefront, for example, is often a matter of improving that business's visibility on listing sites like Google Maps and Yelp, which you may already be doing as part of your SEO plan anyway.
The data source depends on the platform running the voice search. Google and Android devices utilize Google Local Pack, while Siri crawls Yelp to return results when prompted for “the best” in any specific category, otherwise prioritizing the closest results. Since Alexa pulls local results from Bing, Yelp, and Yext, having filled-out profiles and robust listings on those platforms will help a business rank highly in Alexa search results.
Each assistant also pulls NAP data (the name, address, and phone number on a business’s online listing) for location-based results, drawing profiles from slightly different and sometimes overlapping sources:
Siri pulls local recommendations from the NAP profiles on Yelp, Bing, Apple Maps, and Trip Advisor
Android devices and Google Assistant pull NAP profiles from Google My Business
Alexa pulls NAP profiles from Yelp, Bing, and Yext
Cortana, powered by Alexa, pulls from Yelp and Bing
Someone hoping to optimize their business page for voice search will want to max out their NAP profiles across all platforms by making sure that their listings at business.google.com, bingmapsportal.com, and mapsconnect.apple.com are completely filled out. This is also where a reputation management product like Moz Local can help businesses looking to improve their rankings.
Again, many of the strategies you’d use to achieve first position on a text-based web search still apply to optimizing voice search. To improve voice performance specifically and appear in SERP features and voice snippets, on-page content should be structured so it’s easy to extract, basically reverse engineering the featured snippet you want to produce. But the question is, will ranking well in that kind of search actually help you? That depends on your goal.
If the page you’re optimizing is built to sell more pizza to local customers, then yes, a featured snippet that pulls your NAP data from Google My Business and provides the pizzeria’s phone number to a hungry local parked nearby is a very good thing. But if the page in question is intended to serve sponsored content about diabetes management to drive clicks to an affiliate link for glucose monitoring strips, then you don’t necessarily want to build a page that helps Siri define Type II diabetes aloud to an eighth grader completing their homework.
Structuring the content headings with a question, followed by a concise answer in the paragraph below, makes it more likely that Siri will recite content from a given page when asked a similarly worded question by the user. The first answers a digital assistant gives when responding to a voice search query are typically the same type of snippets that show up in SERP features such as “People Also Ask” and Knowledge Graph results from Google.
In other words, Siri is unlikely to return your website to answer the voice prompt “What is the chemical composition of sugar?”, but you could rank highly with a featured snippet to answer a search like “Is sugar really bad for children with ADHD?”
The most valuable content for those seeking on-page visitors is the kind that addresses questions that are hard to answer with a single spoken response.
Rand Fishkin made his predictions on the role of the vocal snippet in search results as voice search was ramping up in 2016, and provided some advice on how you can plan your content around it in this Whiteboard Friday. According to Fishkin, it depends on whether you’re in the “safe” or “dangerous” zone for the content you’re trying to rank for, based on how easily a voice response can address the user’s query without sending them to your page.
“I think Google and Apple and Amazon and Alexa and all of these engines that participate in this will be continuing to disintermediate simplistic data and answer publishers,” Fishkin wrote.
He advises publishers to question the types of information they’re publishing, adding that if X percent of queries that result in traffic can be answered in fewer than Y words, or with “a quick image or a quick graphic, a quick number,” then the engines are going to do it themselves.
“They don't need you, and very frankly they're faster than you are,” Fishkin summarized. “They can answer that more quickly, more directly than you can. So I think it pays to consider: Are you in the safe or dangerous portion of this strategic framework with the current content that you publish and with the content plans that you have out in the future?”
Voice-enabled devices are gradually becoming more embedded in consumers’ daily lives, but that doesn’t mean we should prioritize our content as though voice is bearing down on the traditional search engine results page, threatening to replace text search altogether. Even if smart assistants and voice-enabled devices continue to become more popular year over year, they still fill a relatively niche role in most consumers’ gadget ecosystems at this time. That could change as voice AIs become more sophisticated and talking to our gadgets starts to feel more normal, but the industry is still grappling with some serious growing pains.
Voice search and voice action technology still has some really exciting applications looming on the horizon, and marketers are already finding clever ways to insert their brands into the hands-free experience. Optimizing content for voice search is just one piece of that puzzle.
Give us your hottest takes and wildest predictions on where voice search is headed in 2021 in the comments!
Anyone who does SEO as part of their job knows that there’s a lot of value in analyzing which queries are and are not sending traffic to specific pages on a site.
The most common uses for these datasets are to align on-page optimizations with existing rankings and traffic, and to identify gaps in ranking keywords.
However, working with this data is extremely tedious because it’s only available in the Google Search Console interface, and you have to look at only one page at a time.
On top of that, to get information on the text included in the ranking page, you either need to manually review it or extract it with a tool like Screaming Frog.
You need this kind of view:
…but even the above view would only be viable one page at a time, and as mentioned, the actual text extraction would have had to be separate as well.
Given these apparent issues with the readily available data at the SEO community’s disposal, the data engineering team at Inseev Interactive has been spending a lot of time thinking about how we can improve these processes at scale.
One specific example that we’ll be reviewing in this post is a simple script that allows you to get the above data in a flexible format for many great analytical views.
Better yet, this will all be available with only a few single input variables.
The tool automatically compares the text on-page to the Google Search Console top queries at the page-level to let you know which queries are on-page as well as how many times they appear on the page. An optional XPath variable also allows you to specify the part of the page you want to analyze text on.
This means you’ll know exactly what queries are driving clicks/impressions that are not in your <title>, <h1>, or even something as specific as the first paragraph within the main content (MC). The sky's the limit.
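To give a feel for the core idea (this is not the repo’s actual code), the comparison boils down to counting how often each GSC query appears in the text extracted from the page elements you care about. A hedged sketch with pandas, using made-up queries and page text:

```python
import pandas as pd

# Made-up GSC export for a single page (normally pulled via the GSC API).
gsc = pd.DataFrame({
    "query": ["technical seo audit", "seo services melbourne", "what is a title tag"],
    "clicks": [42, 18, 3],
    "impressions": [1900, 870, 450],
})

# Made-up text extracted from the chosen page elements (e.g. <title> and <h1>).
page_text = "seo services melbourne: technical seo audits for growing businesses".lower()

gsc["count_on_page"] = gsc["query"].apply(page_text.count)
missing = gsc[gsc["count_on_page"] == 0].sort_values("impressions", ascending=False)
print(missing)  # queries driving impressions that never appear in the selected elements
```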
For those of you not familiar, we’ve also provided some quick XPath expressions you can use, as well as how to create site-specific XPath expressions within the "Input Variables" section of the post.
Once the process is set up, all that’s required is filling out a short list of variables and the rest is automated for you.
The output dataset includes multiple automated CSV datasets, as well as a structured file format to keep things organized. A simple pivot of the core analysis automated CSV can provide you with the below dataset and many other useful layouts.
Okay, not technically "new," but if you exclusively use the Google Search Console user interface, then you haven’t likely had access to metrics like these before: "Max Position," "Min Position," and "Count Position" for the specified date range – all of which are explained in the "Running your first analysis" section of the post.
To really demonstrate the impact and usefulness of this dataset, in the video below we use the Colab tool to:
[3 Minutes] — Find non-brand <title> optimization opportunities for https://www.inseev.com/ (around 30 pages in video, but you could do any number of pages)
[3 Minutes] — Convert the CSV to a more usable format
[1 Minute] – Optimize the first title with the resulting dataset
Okay, you’re all set for the initial rundown. Hopefully we were able to get you excited before moving into the somewhat dull setup process.
Keep in mind that at the end of the post, there is also a section including a few helpful use cases and an example template! To jump directly to each section of this post, please use the following links:
[Quick Consideration #1] — The web scraper built into the tool DOES NOT support JavaScript rendering. If your website uses client-side rendering, the full functionality of the tool unfortunately will not work.
[Quick Consideration #2] — This tool has been heavily tested by the members of the Inseev team. Most bugs [specifically with the web scraper] have been found and fixed, but like any other program, it is possible that other issues may come up.
If you encounter any errors, feel free to reach out to us directly at jmelman@inseev.com or info@inseev.com, and either myself or one of the other members of the data engineering team at Inseev would be happy to help you out.
If new errors are encountered and fixed, we will always upload the updated script to the code repository linked in the sections below so the most up-to-date code can be utilized by all!
Things you’ll need:
Google Drive
Google Cloud Platform account
Google Search Console access
Below you’ll find step-by-step editorial instructions in order to set up the entire process. However, if following editorial instructions isn’t your preferred method, we recorded a video of the setup process as well.
As you’ll see, we start with a brand new Gmail and set up the entire process in approximately 12 minutes, and the output is completely worth the time.
Keep in mind that the setup is one-off, and once set up, the tool should work on command from there on!
Download the files from Github and set up in Google Drive
Set up a Google Cloud Platform (GCP) Project (skip if you already have an account)
Create the OAuth 2.0 client ID for the Google Search Console (GSC) API (skip if you already have an OAuth client ID with the Search Console API enabled)
Add the OAuth 2.0 credentials to the Config.py file
1. Navigate here.
2. Select "Code" > "Download Zip"
*You can also use 'git clone https://github.com/jmelm93/query-optmization-checker.git' if you’re more comfortable using the command prompt.
If you already have a Google Colaboratory setup in your Google Drive, feel free to skip this step.
1. Navigate here.
2. Click "New" > "More" > "Connect more apps".
3. Search "Colaboratory" > Click into the application page.
4. Click "Install" > "Continue" > Sign in with OAuth.
5. Click "OK" with the prompt checked so Google Drive automatically sets appropriate files to open with Google Colab (optional).
1. Navigate to Google Drive and create a folder called "Colab Notebooks".
IMPORTANT: The folder needs to be called "Colab Notebooks" as the script is configured to look for the "api" folder from within "Colab Notebooks".
2. Import the folder downloaded from Github into Google Drive.
At the end of this step, you should have a folder in your Google Drive that contains the below items:
If you already have a Google Cloud Platform (GCP) account, feel free to skip this part.
1. Navigate to the Google Cloud page.
2. Click on the "Get started for free" CTA (CTA text may change over time).
3. Sign in with the OAuth credentials of your choice. Any Gmail email will work.
4. Follow the prompts to sign up for your GCP account.
You’ll be asked to supply a credit card to sign up, but there is currently a $300 free trial and Google notes that they won’t charge you until you upgrade your account.
1. Navigate here.
2. After you log in to your desired Google Cloud account, click "ENABLE".
3. Configure the consent screen.
Example below of minimum requirements:
4. In the left-rail navigation, click into "Credentials" > "CREATE CREDENTIALS" > "OAuth Client ID" (Not in image).
5. Within the "Create OAuth client ID" form, fill in:
Application Type = Desktop app
Name = Google Colab
Click "CREATE"
6. Save the "Client ID" and "Client Secret" — as these will be added into the "api" folder config.py file from the Github files we downloaded.
These should have appeared in a popup after hitting "CREATE"
The "Client Secret" is functionally the password to your Google Cloud (DO NOT post this to the public/share it online)
1. Return to Google Drive and navigate into the "api" folder.
2. Click into config.py.
3. Choose to open with "Text Editor" (or another app of your choice) to modify the config.py file.
4. Update the three areas highlighted below with your:
CLIENT_ID: From the OAuth 2.0 client ID setup process
CLIENT_SECRET: From the OAuth 2.0 client ID setup process
GOOGLE_CREDENTIALS: Email that corresponds with your CLIENT_ID & CLIENT_SECRET
5. Save the file once updated!
Congratulations, the boring stuff is over. You are now ready to start using the Google Colab file!
Running your first analysis may be a little intimidating, but stick with it and it will get easy fast.
Below, we’ve provided details regarding the input variables required, as well as notes on things to keep in mind when running the script and analyzing the resulting dataset.
After we walk through these items, there are also a few example projects and video walkthroughs showcasing ways to utilize these datasets for client deliverables.
Have you ever wanted to know every query driving clicks and impressions to a webpage that aren’t in your <title> or <h1> tag? Well, this parameter will allow you to do just that.
While optional, using this is highly encouraged and we feel it "supercharges" the analysis. Simply define site sections with Xpaths and the script will do the rest.
In the above video, you’ll find examples on how to create site specific extractions. In addition, below are some universal extractions that should work on almost any site on the web:
'//title' # Identifies a <title> tag
'//h1' # Identifies a <h1> tag
'//h2' # Identifies a <h2> tag
Site Specific: How to scrape only the main content (MC)?
Chaining XPaths – Add a "|" Between XPaths
'//title | //h1' # Gets you both the <title> and <h1> tag in 1 run
'//h1 | //h2 | //h3' # Gets you the <h1>, <h2>, and <h3> tags in 1 run
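Before plugging an expression like the ones above into the tool, you can preview what it captures on a live page. A minimal sketch, separate from the tool itself and assuming requests and lxml are installed:

```python
import requests
from lxml import html

def preview_xpath(url, xpath_expression):
    """Print the text of every element a given XPath expression matches on a live page."""
    tree = html.fromstring(requests.get(url, timeout=10).content)
    for node in tree.xpath(xpath_expression):
        print(node.text_content().strip())

# e.g. check what a chained expression would capture before using it in the analysis
preview_xpath("https://www.example.com/", "//title | //h1")
```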
Here’s a video overview of the other variables with a short description of each.
'colab_path' [Required] – The path in which the Colab file lives. This should be "/content/drive/My Drive/Colab Notebooks/".
'domain_lookup' [Required] – Homepage of the website utilized for analysis.
'startdate' & 'enddate' [Required] – Date range for the analysis period.
'gsc_sorting_field' [Required] – The tool pulls the top N pages as defined by the user. The "top" is defined by either "clicks_sum" or "impressions_sum." Please review the video for a more detailed description.
'gsc_limit_pages_number' [Required] – Numeric value that represents the number of resulting pages you’d like within the dataset.
'brand_exclusions' [Optional] – The string sequence(s) that commonly result in branded queries (e.g., anything containing "inseev" will be branded queries for "Inseev Interactive").
'impressions_exclusion' [Optional] – Numeric value used to exclude queries that are potentially irrelevant due to the lack of pre-existing impressions. This is primarily relevant for domains with strong pre-existing rankings across a large number of pages.
'page_inclusions' [Optional] – The string sequence(s) that are found within the desired analysis page type. If you’d like to analyze the entire domain, leave this section blank.
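Pulled together, a filled-out set of variables might look something like the following. The values are purely illustrative, and the exact formats the script expects (dates, strings vs. lists) may differ, so check the notes in the Colab file itself:

```python
# Illustrative values only; adjust to your own site and analysis window.
colab_path = "/content/drive/My Drive/Colab Notebooks/"
domain_lookup = "https://www.example.com/"
startdate = "2021-01-01"
enddate = "2021-03-31"
gsc_sorting_field = "clicks_sum"        # or "impressions_sum"
gsc_limit_pages_number = 25             # pull the top 25 pages by the sorting field
brand_exclusions = "example"            # queries containing this string are treated as branded
impressions_exclusion = 10              # ignore queries with fewer than 10 impressions
page_inclusions = "/blog/"              # only analyze URLs containing this string; leave blank for the whole domain
```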
Keep in mind that once the script finishes running, you’re generally going to use the "step3_query-optimizer_domain-YYYY-MM-DD.csv" file for analysis, but there are others with the raw datasets to browse as well.
Practical use cases for the "step3_query-optimizer_domain-YYYY-MM-DD.csv" file can be found in the "Practical use cases and templates" section.
That said, there are a few important things to note while testing things out:
1. No JavaScript Crawling: As mentioned at the start of the post, this script is NOT set up for JavaScript crawling, so if your target website uses a JS frontend with client-side rendering to populate the main content (MC), the scrape will not be useful. However, the basic functionality of quickly getting the top XX (user-defined) queries and pages can still be useful by itself.
2. Google Drive / GSC API Auth: The first time you run the script in each new session it will prompt you to authenticate both the Google Drive and the Google Search Console credentials.
Google Drive authentication: Authenticate to whatever email is associated with the Google Drive with the script.
If you attempt to authenticate and you get an error that looks like the one below, please revisit Part 3, step 3 above (configuring the consent screen) and make sure the email(s) you’ll use with the Colab app have been added as "Test Users".
Quick tip: The Google Drive account and the GSC Authentication DO NOT have to be the same email, but they do require separate authentications with OAuth.
3. Running the script: Either navigate to "Runtime" > "Restart and Run All" or use the keyboard shortcut Ctrl + F9 to start running the script.
4. Populated datasets/folder structure: There are three CSVs populated by the script – all nested within a folder structure based on the "domain_lookup" input variable.
Automated Organization [Folders]: Each time you rerun the script on a new domain, it will create a new folder structure in order to keep things organized.
Automated Organization [File Naming]: The CSVs include the date of the export appended to the end, so you’ll always know when the process ran as well as the date range for the dataset.
5. Date range for dataset: Inside of the dataset there is a "gsc_datasetID" column generated, which includes the date range of the extraction.
6. Unfamiliar metrics: The resulting dataset has all the KPIs we know and love – e.g. clicks, impressions, average (mean) position — but there are also a few you cannot get directly from the GSC UI:
'count_instances_gsc' — the number of instances the query got at least 1 impression during the specified date range. Scenario example: GSC tells you that you were in an average position 6 for a large keyword like "flower delivery" and you only received 20 impressions in a 30-day date range. Doesn’t seem possible that you were really in position 6, right? Well, now you can see that was potentially because you only actually showed up on one day in that 30-day date range (e.g. count_instances_gsc = 1)
'max_position' & 'min_position' — the MAXIMUM and MINIMUM ranking position the identified page showed up for in Google Search within the specified date range.
Quick tip #1: Large variance in max/min may tell you that your keyword has been fluctuating heavily.
Quick tip #2: These KPIs, in conjunction with the "count_instances_gsc", can exponentially further your understanding of query performance and opportunity.
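To make those definitions concrete, here is a rough pandas sketch of how metrics like these can be derived from day-level GSC data. The column names are illustrative, not necessarily the ones the script uses:

```python
import pandas as pd

# Day-level GSC rows for one page + query (illustrative; GSC only returns days with impressions).
daily = pd.DataFrame({
    "date": ["2021-05-03", "2021-05-11", "2021-05-27"],
    "query": ["flower delivery"] * 3,
    "impressions": [12, 5, 3],
    "position": [6.2, 4.8, 9.1],
})

summary = daily.groupby("query").agg(
    impressions_sum=("impressions", "sum"),
    max_position=("position", "max"),
    min_position=("position", "min"),
    count_instances_gsc=("date", "count"),  # days on which the query got at least one impression
)
print(summary)
```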
Access the recommended multi-use template.
Recommended use: Download the file and use it with Excel. Subjectively speaking, I believe Excel has much more user-friendly pivot table functionality in comparison to Google Sheets — which is critical for using this template.
Alternative use: If you do not have Microsoft Excel or you prefer a different tool, you can use most spreadsheet apps that contain pivot functionality.
For those who opt for alternative spreadsheet software:
Below are the pivot fields to mimic upon setup.
You may have to adjust the Vlookup functions found on the "Step 3 _ Analysis Final Doc" tab, depending on whether your updated pivot columns align with the current pivot I’ve supplied.
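If you’d rather stay in Python than a spreadsheet, the same kind of pivot can be mimicked with pandas. A minimal sketch (the filename and column names are illustrative and should be matched to your own step 3 export):

```python
import pandas as pd

# Filename and column names are illustrative; match them to your own step 3 CSV.
df = pd.read_csv("step3_query-optimizer_example.com-2021-06-01.csv")

pivot = pd.pivot_table(
    df,
    index=["page", "query"],
    values=["clicks", "impressions", "count_instances_gsc"],
    aggfunc="sum",
).sort_values("impressions", ascending=False)

print(pivot.head(20))
```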
Project description: Locate keywords that are driving clicks and impressions to high value pages and that do not exist within the <title> and <h1> tags by reviewing GSC query KPIs vs. current page elements. Use the resulting findings to re-optimize both the <title> and <h1> tags for pre-existing pages.
Project assumptions: This process assumes that inserting keywords into both the <title> and <h1> tags is a strong SEO practice for relevancy optimization, and that it’s important to include related keyword variants into these areas (e.g. non-exact match keywords with matching SERP intent).
Project description: Locate keywords that are driving clicks and impressions to editorial pieces of content that DO NOT exist within the first paragraph within the body of the main content (MC). Perform an on-page refresh of introductory content within editorial pages to include high value keyword opportunities.
Project assumptions: This process assumes that inserting keywords into the first several sentences of a piece of content is a strong SEO practice for relevancy optimization, and that it’s important to include related keyword variants into these areas (e.g. non-exact match keywords with matching SERP intent).
We hope this post has been helpful and opened you up to the idea of using Python and Google Colab to supercharge your relevancy optimization strategy.
As mentioned throughout the post, keep the following in mind:
Github repository will be updated with any changes we make in the future.
There is the possibility of undiscovered errors. If these occur, Inseev is happy to help! In fact, we would actually appreciate you reaching out to investigate and fix errors (if any do appear). This way others don’t run into the same problems.
Other than the above, if you have any ideas on ways to Colab (pun intended) on data analytics projects, feel free to reach out with ideas.
When undertaking a link building campaign, it’s important to remember that much of the traffic you generate will be from first-time users. These users are less likely to wait around for slow loading content and less likely to return if you don’t make a good impression. So whether your campaign is a brand-focused PR push, content-focused outreach strategy, or something in between, it’s worth investing in a robust technical SEO framework that helps users connect to and engage with your brand, for years to come.
To make the most of campaign traffic, ensure that users can smoothly connect and engage with your site from a range of channels and sources. Your SEO priorities should be broadly divided into tasks that encourage social shareability, create opportunities for site-wide SEO gains, and maximize the viability of your campaign as part of your broader marketing activity.
Let's explore the SEO tactics you can use to optimize and improve the performance of your link building campaign. As part of your campaign, you should aim to:
Increase shareability by making the site load faster and display more consistently for the predominantly mobile audience that discovers your content via social media channels.
Improve opportunity for site-wide SEO gains through optimizations to internal linking, improvements to E-A-T indicators, and on-page SEO from landing pages.
Optimize for the viability of the campaign by improving tracking, channel integration, and planning for long-term link traffic.
In addition to being great for users, retention-based optimizations are efficient to manage because they can be planned and put in place well ahead of core campaigns. They’re also applicable to every type of link building campaign, and work in tandem with fundamental backlink management tactics.
Before undertaking the work to gain new backlinks, carry out a backlink audit for actionable data on the quantity of your existing backlinks and the quality of your referrers.
In the campaign planning stages, popular content identified in your audit can give you strategic insights into the kind of content that will perform well with your existing audiences and networks. The overall quantity of links will give you valuable benchmarks for evidencing campaign performance, and the quality and distribution of your inbound links will help your team set targets for which referrers, target pages, and anchor text will give you the most benefit.
Use a backlink audit tool to find out:
How many external sites are linking to your pages?
Which external sites are linking to your pages?
What is the quality of each external site's backlink profile?
Which pages have existing external backlinks?
Which backlinks are identified as "target errors" because they’re going to 404 pages?
Which backlinks are going to 301 (redirected) pages?
If your inbound links include a significant proportion of toxic or poor quality links, then you may wish to take corrective actions like creating and submitting a disavow list in Google Search Console before the campaign starts in earnest. But if your links are of decent quality, then your next focus should be to reclaim broken links to 404s and reduce links to 301 pages.
You should fix backlink errors to 301s and 404s to regain the link value and build momentum ahead of your campaign. When traffic arrives on a site via a 301 redirect link, you may lose some PageRank value from that connection and receive almost zero PageRank value from links pointing to a 404 page. Fixing these links re-establishes these connections and can help increase the organic performance of your campaign target pages by improving the overall domain authority.
These improvements also reduce user connection times and improve tracking data. Tools like Google Analytics find it difficult to attribute the original source of the click, often leading to referral traffic being incorrectly attributed as direct — which is less than helpful for marketers who want to know who their best performing referrers are.
During the link reclamation process, you will want to update links to the best possible new URL. To improve the value of links you already have, carry out these actions:
Assign 301 redirects to any backlink “target errors” that are going to 404 pages. When assigning the new page, try to match it with like-for-like content: an old link to a page about “shoes” should not be redirected to a page about “sharks”; it should go to a page that is also about shoes (an example redirect rule follows this list).
Ensure that any existing 301s are linking to the final destination. Where possible, redirect chains should be removed to maintain as much link value as possible, reduce demands on the server, and optimize for crawl budget.
Where you have control over the linking page — for instance, on social profiles, directory listings, internal project sites, or partner sites — update any 301 or 404 links to a relevant 200 URL. This is something that's often overlooked, but it's common for sites that have recently adopted HTTPS to have the old HTTP link on their social profiles and listings. You can and should update these.
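If your server runs Apache (an assumption; the same idea applies to Nginx or a CMS redirect plugin), a reclaimed backlink redirect can be a one-line rule in your .htaccess file. The paths below are placeholders for illustration only:

```apache
# Hypothetical example: an old shoe page that now 404s is pointed at its
# closest live equivalent, so the backlink's value is passed on.
Redirect 301 /old-shoe-guide/ https://www.example.com/shoes/buying-guide/
```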
Improving these links makes it easier for bots to crawl and index your site, improves user connection speeds, and gives more consistency for your brand messaging before you begin your outreach activity.
Optimizing for social media should be a key part of your link building strategy because social media drives around 25% of website referral traffic worldwide. This means that social media has become a core part of content promotion and should be an expected component of the link building process. You can optimize for this mobile-first audience by:
Securing network connections to reduce delays from referrals
Improving page speed for mobiles
Updating open graph data to improve social shareability
Security is an element of technical SEO that is often oversimplified. Yes, HTTPS is a ranking signal, but many believe this is achieved simply by obtaining an SSL certificate. That is part of it — you do need an SSL, but it’s only the start.
Security optimizations often involve server-side updates that can improve the speed and quality of your connections across the web by streamlining the security verification process.
This is particularly important for social media sites like Facebook, which have high levels of encryption and security on their end and equally high expectations for referred domains. Security layers can slow down connection times and affect user trust, so investing time here can help campaign performance overall.
Not all SSLs are equal, and the benefits of a secure website extend beyond achieving a lock icon in the search bar. An optimized SSL certificate will be using the latest transport layer security protocols and be error-free.
There are several tools that allow you to check the performance of your existing SSL. Each tool runs different diagnostics, so it's worth running them in tandem to get a full picture of where action should be taken. It's possible to pass an inspection on one tool and not on another, so taking a layered approach will yield better results.
For a quick snapshot, run your site through the SSL Checker from SSL Shopper. They give you a top-level rundown of expiry dates and whether your SSL is trusted by all web browsers. You want all green ticks here. Any errors should be flagged and addressed directly with your certificate issuer.
In my experience, resolving errors here can have a real impact on site speed. One client saw an almost 70% reduction in overall server connection time and over a 60% reduction in average page load time on Chrome by resolving certificate chain issues.
Run a full diagnostic
Using the SSL Server Test, you can get a more detailed diagnostic of your security configuration. This test gives your site a grade and assesses a range of security indicators. To support a link building campaign, you’ll want to confirm that your server is running the “modern” Transport Layer Security (TLS) protocols, specifically TLS 1.2 and ideally TLS 1.3.
TLS 1.3 came into use around 2018 and has since become the preferred connection protocol for large-scale CDNs and operating systems. Not only is it more secure, but TLS 1.3 can reduce connection latency by around 45% by cutting steps out of the connection handshake.
Cloudflare, Facebook, and Android use this protocol as a default, and a matching upgrade for your site could improve performance for a significant number of web users. As a point of reference, Cloudflare alone is used by around 16% of all websites worldwide and 81% of those with CDNs, so upgrading your TLS could help more users to quickly access your site from each new (and established) link.
With the introduction of Google's Core Web Vitals (CWV), speed metrics and the associated algorithm updates have put additional emphasis on mobile page loading times. In a link building campaign, you could see a spike in traffic to a single page. But Google is watching, and if your page doesn't deliver a high-quality UX as measured by CWV for at least 75% of users, this traffic could affect your page ranking.
Monitor your target pages in GA and Search Console to identify any challenges that should be addressed.
Social shares will almost certainly form part of your link building campaign, and with good reason. By updating your Open Graph (OG) tags, you can control how your pages render when shared and improve their performance.
If the hero image for your page doesn’t display when you post links to social media, then you need to update your OG meta tags. It could be that your page doesn’t have the tags in place, or that the fields are not populating automatically. In either case, this is something you can fix.
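As a point of reference, here's a minimal sketch of the core OG tags; the URLs and copy are placeholders, and your CMS or SEO plugin may populate these fields for you:

```html
<!-- Hypothetical Open Graph tags for a campaign landing page -->
<meta property="og:title" content="Example Campaign Landing Page" />
<meta property="og:description" content="A short, share-friendly summary of the page." />
<meta property="og:image" content="https://www.example.com/images/hero-1200x630.jpg" />
<meta property="og:url" content="https://www.example.com/campaign/" />
<meta property="og:type" content="website" />
```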
Using post validators for Facebook, LinkedIn, and Twitter, you can inspect individual pages for performance.
At scale, you can use a custom XPath extraction in Screaming Frog to crawl your site. Then review your site for the best rules to populate the fields automatically, and brief your dev team to update accordingly.
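For instance, extraction expressions along these lines (illustrative only) will pull each page's OG fields into the crawl report:

```
//meta[@property='og:title']/@content
//meta[@property='og:image']/@content
//meta[@property='og:description']/@content
```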
Your link building campaign can bring benefits to other pages on your site with careful planning.
Whether your link building campaign is based on a short-term campaign landing page, a thought leadership piece, or a core service, your on-page SEO should be optimized to encourage organic traffic as well.
This is because humans forget things.
So while a user may discover a campaign or promotion via direct link building activities, they should also be able to find it again or look it up to transfer any offline buzz into online knowledge.
Your target page should be supported by solid on-page SEO that includes:
Keyword-optimized content with H1s and H2s
Relevant images optimized with alt text, titles, and structured data
Metadata that is optimized for campaign keywords
If you’ve received a juicy backlink from a relevant source, don’t let the benefits stop at a single page. Ensure that your target page has links to other, relevant content across your site that will keep users coming back.
Map out your internal links to ensure that any pages being used as backlink landing pages include links to similarly-themed pages on your site. You'll get the most SEO benefit if your internal link structure includes links to pages with:
Optimizations for keywords that are lexicographically similar
A relevant parent folder
Good authority or low bounce rate
200 response codes for maximum crawlability
This allows users, and web crawlers that use natural language processing, to understand that your content is part of a wider bank of knowledge and expertise. It also makes it more likely that users will return to other content on your site in the future.
Internal linking goes both ways, so don’t forget to create internal links into campaign content. For short-term campaigns, marketers sometimes create bespoke campaign landing pages with minimal links across the wider site.
This can result in slower indexing for campaign pages, making it more difficult for users to find you via search. To address this, create an announcement-style blog post that links to the campaign target page. In this way, both pages can be entered into the updated sitemap and submitted for indexing.
As mentioned previously, humans may not remember every detail of your campaign target page when they try to find it again. However, they may remember the person that shared, created, or was featured in the content. So include information about the team behind the content or campaign in order to build Expertise, Authority, and Trust for your brand, and increase the impact of social shares. This is particularly useful for thought leadership campaigns, where expertise signals like author biographies can be optimized with structured data.
For short-term campaigns like those for new events or products, including trust indicators like dynamic reviews can assist with conversions.
You aren’t building links for links’ sake. You’re doing so to meet wider business objectives like driving sales, increasing market share, or generating leads.
Once you’ve built your links, traffic to your target page becomes an opportunity to generate valuable data for your conversions funnel. Your tracking should be designed to give you data that supports your overall business objectives.
During your campaign, you should keep your business goals in mind and understand how your target page contributes to those goals. Update the tracking for your target pages to include metrics that correspond to your aims and the content type. Users visiting a data rich 5,000-word industry report are at a different stage in the customer funnel than those visiting from a quality niche directory.
Creating target page metrics can also help with KPIs, reporting, and evidencing campaign ROI, which clients and managers adore.
Once your business metrics are in place, use the data you’ve collected from link traffic to inform performance on other channels. For example:
Scroll depth data from users who access a piece of high-quality content could be used for re-marketing content on YouTube or display advertising.
Demographic data from users who visit for niche relevant “awareness month” content could improve audience targeting for advertising or PR activity around the same topic.
Email signups for an outreach event can be added to Facebook as Custom and Lookalike audiences for more direct conversions.
As technical SEOs, be aware of your team's potential uses for customer data so that you can manage tags and script integration across the site. Tracking tools like Facebook Pixel and Google Tag Manager tend to require active third-party JavaScript management. Test and optimize any new scripts before your campaign gets underway.
Planning for long-term link traffic
Consider the lifecycle of your link building campaign target page.
Ideally, you want to be driving traffic to a URL that can accrue page authority over time. If you spend time creating traffic for a URL that needs to change soon after the campaign ends, you’ll eventually be driving traffic to a 301 link. As we discussed before, this doesn't give as much PageRank value for your site as an active 200 link does.
So, plan for ways to keep a consistent, live URL for an extended period of time. Depending on your link building strategy, you may be able to employ one of the following techniques:
Use evergreen URLs for long-term content: Thought leadership, cornerstone content like a white paper, or one-off reports are likely to remain on your site for a long time. For this content, consider removing dates from your URL to make the content more evergreen, as this allows for content to be updated and reduces future redirect requirements.
Create permanent pages for recurring campaigns: Landing pages for recurring outreach content like annual event sponsorship or awareness campaigns should become part of permanent navigation. This allows you to build links every year to a well-optimized, annually updated, static page, rather than starting from scratch with blogs to different URLs every year.
Avoid building links to PDFs: Put any downloadable resources into an HTML landing page and build links to that page. Links to PDFs can be difficult to redirect because of how they’re configured in your htaccess file.
Plan for any unavoidable redirects: For short-term campaigns like sales promotions, plan ahead for which page will become the permanent one. Include common copy on both URLs to help Google understand that the redirect target isn't a soft 404.
Technical SEO can help you gain and maintain backlinks, connect with mobile users, and improve the quality of your connections when you:
Secure network connections to reduce delays from referrals
Improve page speed for mobile users
Update open graph data to improve social shareability
Update on-page SEO
Optimize internal links
Include E-A-T optimizations
Customize tracking
Plan for channel integration
Plan for long-term link traffic
These tactics will help the links you build to add value for your customers, your rankings, and your business for many years to come.
What we can agree on is that — due to Google’s advancements in Natural Language Processing (NLP) — the long tail of search has exploded. However, I will argue that NLP has also imploded the long tail, and understanding how and why may save our collective sanity.
The long tail of search is the limitless space of low-volume (and often low-competition) keywords. Tactically, long-tail SEO centers on competing for a large number of low-volume keywords instead of focusing on a small set of high-volume keywords.
Long-tail SEO encourages us to let go of vanity, because high-volume, so-called “vanity” keywords are often out of reach or, at best, will empty our bank accounts. Low-volume keywords may be less attractive on the surface, but as you begin to compete on hundreds or thousands of them, they represent more traffic and ultimately more sales than a few vanity keywords.
You’ve probably seen a graph of the long tail like the one above. It’s a perfectly lovely power curve, but it’s purely hypothetical. And while you may smile and nod when you see it, it’s hard to translate this into a world of keywords. It might help to re-imagine the long tail of SEO:
I’m not sure the “reclining snowman of SEO” is ever going to catch on, but I think it helps to illustrate that — while head keywords are high-volume by themselves — the combined volume of the long tail eclipses the head or the middle. Like the familiar curve, this visualization dramatically underestimates the true scope of the long tail.
In the words of the ancient SEOs, “It doth depend.” Typically, long-tail keywords are low-volume, multi-word phrases, but the long-tail is relative to your starting point. Historically, any given piece of the long tail was assumed to be low-competition, but that’s changing as people realize the benefits of targeting specific phrases with clear intent (especially commercial intent).
Targeting “widgets” is not only expensive, but searcher intent is ambiguous. Targeting “buy blue widgets” narrows intent, and “where to buy Acme Widget LOL-42” laser-focuses you on a target audience. As searchers and SEOs adapt to natural language search, previously “long-tail” keywords may become higher volume and higher competition.
Google has told us that 15% of the searches they see every day are new. How is this possible? Are we creating that many new words? That’s sus, bruh!
I can explain it to you in a very short story. The other day, my (half-Taiwanese) 10-year-old daughter couldn’t remember what her Chinese zodiac sign was, so she asked Google Home:
Hey, Google, what's the animal for the Chinese new year calendar thingy for 2010?
It’s easy to get hung up on the voice-appliance aspect of this, but whether or not you believe in the future of voice appliances, the reality is that voice search in general has driven the need for natural language search, and as Google becomes better at handling natural language, we’re reverting to using it more often (it’s our default mode). This is especially evident in kids, who never had to learn to dumb down their searches for antiquated algorithms.
How can we hope to target keyword phrases that are literally evolving as we speak? Fortunately, NLP cuts both ways. As Google understands context better, the algorithm recognizes that many variations of the same phrase or question are essentially the same. Which leads us to...
Back in 2019, I did a keyword research case study at SearchLove London on UK mega-retailer, John Lewis. In my research, I was surprised to see how many searches Google was automatically redirecting. There’s the obvious, like Google assuming that people who searched for “Jon Lewis” in the UK probably meant “John Lewis” (sorry, Jon):
It’s interesting to note that Google has gradually, quietly moved from the previously more prevalent “Did you mean?” to the more assertive (some might say aggressive) “Showing results for…” In this case, optimizing for Jon Lewis in the UK is probably pointless.
I expected a rabbit hole, but I landed in a full-on bunny chasm. Consider this search:
Hjohjblewis?! I landed on this misspelling entirely by accident, but I imagine it involved an attention-starved cat and cat-adjacent keyboard. This level of rewriting/redirecting was shocking to me.
Misspellings are just the beginning, however. What about very similar long-tail phrases that don’t surface any kind of rewrite/redirect, but show very similar results?
Note that this same set of terms in the US overwhelmingly returns results about former US Representative and civil rights leader John Lewis, demonstrating not only how much intent can shift across localities, but also how dynamically Google's re-interpretations can change.
That same year, I did an experiment for MozCon targeting long-tail questions, such as “Can you reverse a 301-redirect?”, demonstrating that posts written around a specific question could often rank for many forms of that question. At the time, I didn’t have a way to measure this phenomenon, other than showing that the post ranked for variations of the phrase. Recently, I re-analyzed my 2019 keywords (with rankings from April 2021) using a simplified form of Rank-Biased Overlap (RBO) called RBOLite. RBOLite scores the similarity between two rank-ordered lists, yielding a score from 0-1. As the name implies, this score biases toward the higher-ranked items, so a shift at #1 will have more impact than a shift at #10.
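To make the metric concrete, here's a hedged sketch of a generic, top-weighted overlap score between two ranked lists. It follows the general rank-biased overlap idea and is illustrative only; it is not the exact RBOLite calculation used for the scores below:

```javascript
// Illustrative only: a simplified, top-weighted overlap between two ranked
// lists (e.g. two SERPs). Higher ranks carry more weight; output is 0-1.
function rankedOverlap(listA, listB, p = 0.9) {
  const depth = Math.min(listA.length, listB.length);
  if (depth === 0) return 0;
  const seenA = new Set();
  const seenB = new Set();
  let weightedSum = 0;
  let weightTotal = 0;
  for (let d = 1; d <= depth; d++) {
    seenA.add(listA[d - 1]);
    seenB.add(listB[d - 1]);
    // Agreement at depth d: proportion of overlap between the top-d items.
    let overlap = 0;
    for (const url of seenA) if (seenB.has(url)) overlap++;
    const agreement = overlap / d;
    const weight = Math.pow(p, d - 1); // a shift at #1 matters more than at #10
    weightedSum += agreement * weight;
    weightTotal += weight;
  }
  return weightedSum / weightTotal; // 0 = no overlap, 1 = identical ordering
}

// Usage: two hypothetical SERPs that share most of their top results.
console.log(rankedOverlap(
  ["a.com", "b.com", "c.com", "d.com"],
  ["a.com", "c.com", "b.com", "e.com"]
).toFixed(2));
```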
Here are the scores for a sampling of the phrases I tracked for the 2019 post, with the title of the post shown at the top (and having a perfect match of 1.0):
You can see visually how the similarity of the results diverges as you change and remove certain keywords, and how this creates a complex interaction. What’s fascinating to me is that changing the question phrase from “Can you” to “How do you” or “How to” made very little difference in this case, while removing either “301” or “redirect” had more impact. Switching “you” vs. “I” by itself was fairly low impact, but was additive with other changes. Even the SERPs with “undo” in place of “reverse” showed fairly high similarity, but this change showed the most impact.
Note that the week-over-week RBOLite score for the initial phrase was 0.95, so even the same SERP will vary over time. All of these scores (>0.75) represent a fair degree of similarity. This post ranked #1 for many of these terms, so these scores often represent shifts farther down the top 10.
Here’s another example, based on the question “How do I improve my domain authority?”. As above, I’ve charted the RBOLite similarity scores between the main phrase and variations. In this case, the week-over-week score was 0.83, suggesting some background flux in the keyword space:
One immediately interesting observation is that the difference between “improve” and “increase” was negligible — Google easily equated the two terms. My time spent debating which keyword to use could’ve been spent on other projects, or on eating sandwiches. As before, switching from “How do I” to “How do you” or even “How to” made relatively little difference. Google even understood that “DA” is frequently substituted for “Domain Authority” in our industry.
Perhaps counterintuitively, adding “Moz” made more of a difference. This is because it shifted the SERP to be more brand-like (Moz.com got more mentions). Is that necessarily a bad thing? No, my post still ranked #1. Looking at the entire first page of the SERPs, though, adding the brand name caused a pretty clear intent shift.
In the past decade, the long tail has exploded and then imploded (in many ways, due to the same forces), and yet somehow we’ve landed in a very different keyword universe. So, where does that leave us — the poor souls fated to wander that universe?
The goods news of this post (I hope) is that we don’t have to work ourselves to death to target the long tail of search. It doesn’t take 10,000 pieces of content to rank for 10,000 variants of a phrase, and Google (and our visitors) would much prefer we not spin out that content. The new, post-NLP long tail of SEO requires us to understand how our keywords fit into semantic space, mapping their relationships and covering the core concepts. While our tools will inevitably improve to meet this challenge (and I’m directly involved in such projects at Moz), our human intuition can go a long way for now. Study your SERPs diligently, and you can find the patterns to turn your own long tail of keywords into a chonky thorax of opportunity.
Certainly yes. Google even says so in their official How Search Works documents:
Exactly how Google uses engagement signals (i.e. clicks and interaction data) is subject to endless SEO debate. The passage above suggests Google uses engagement metrics to train their machine learning models. Google has also admitted to using click signals for both search personalization and evaluating new algorithms.
When pressed for specifics, though, Google typically responds with either forced denials ("We're not using such metrics") or carefully worded deflections ("clicks are noisy").
While many Googlers no doubt work hard to be helpful to the SEO community, they are also under pressure "not to reveal too much detail" about their algorithms, out of caution that SEOs will game search results. In reality, Google is never going to tell SEOs exactly how they use engagement metrics, no matter how many times we ask.
Most SEO debate focuses on whether Google uses organic Click-through Rates (CTR) in its ranking algorithms. If you are interested, AJ Kohn's piece is particularly outstanding, as is Rand Fishkin's Whiteboard Friday covering this topic. For a nuanced counter-view, I'd recommend reading this excellent post by Dan Taylor.
To be fair, I believe most of the debate around CTR up to this point has likely been far too simple. Whatever way SEOs think Google uses click data, how Google actually uses clicks is guaranteed to be far more sophisticated than anything we may conceive. This complexity gap gives Google easy deniability, and justification for calling otherwise reasonable SEO theories "made up crap." (Google may very well say something similar about this article, which is fine.)
At this point, you may think this is another post adding to the CTR debate, but in fact, it's not. THIS SIMPLY ISN'T THAT POST.
Arguing "if" Google uses click signals leads us down the wrong path. We know Google does, we simply don't know how. For example, are they direct signals, or used for machine learning training only? Are click signals used in the broader algorithm, or only for personalization?
Instead, let's propose something far more radical, and likely far more helpful to your SEO:
Not too long ago, Google patent guru Bill Slawski posted his discovery of a newish Google patent that described "Modifying search result ranking based on implicit user feedback."
The patent is fascinating from an SEO perspective because it explains how using click signals can be very "noisy" (as Google often says) but describes a process for calculating "long click" and "last click" metrics to cut through the noise and better rank search results.
To be fair, we have no evidence Google uses the processes described in this patent, and even if they did, it would likely be far more sophisticated/nuanced than the process described here.
That said, the patent is riveting because it supports many of the same SEO best practices we've advocated for years. So much so that, if you optimized for these metrics, you'd almost certainly improve your SEO traffic and rankings, regardless of whether Google uses these exact processes. Specifically:
More Clicks ("High CTR"): earns you more traffic no matter your rank, and initial clicks form the basis of all subsequent click metrics.
Improved Engagement ("Long Clicks"): almost always a positive sign from your users, and often an indicator of quality as well as being correlated with future visits.
User Satisfaction ("Last Click"): the holy grail of SEO, and ultimately the experience Google strives to deliver in its search results.
We can summarize these principles into 3 tenets of click-based engagement metrics for SEO: First, Long, and Last.
Let's explore each of these in turn.
As stated earlier, this isn't a debate if Google uses CTR. There's plenty of evidence that they monitor and consider clicks in a variety of ways. (And to be fair, there's evidence that they don't use CTR as extensively as many SEOs believe.)
As the Google patent US8661029B1 states:
Even if CTR isn't a ranking signal, having a higher CTR is almost always good for SEO, because it means getting more clicks and more eyeballs on your content.
Besides the inherent value of earning a high CTR, clicks also form the basis of subsequent click-based metrics, including long clicks and last clicks. So earning that first click is an essential step.
Your ability to earn a higher CTR comes down almost entirely to optimizing your appearance in Google search results. The name of the game is how your snippet stands out, in a sea of competing results, and gets noticed as a likely helpful, relevant answer.
You may think your options at influencing CTR in this way are quite limited, but in fact, you have many, many surprisingly powerful levers to pull in your favor, including:
Compelling, relevant Title Tags (My Master Class, definitely worth a watch)
Compelling, keyword-rich Meta Descriptions
Structured Data & Rich Snippet Markup
Keyword-rich URLs, which Google may use as breadcrumbs
What about artificially manipulating your CTR, either using bots or one of the many blackhat click services you can find on the web? More often than not, these tactics lead to disappointing results. One possible reason why is that Google is very skilled at sniffing out "unnatural" browsing behavior.
So high CTR can be a good thing, but the fact remains—as Google has told us countless times—CTR is a "noisy" signal to use for ranking. Should a result with a flashy title be rewarded simply because users click on it, even if the actual page provides a lackluster experience?
In truth, while earning clicks is one of the primary goals of SEO, the "noise" of the signal is probably why Google avoids using CTR as a direct ranking signal itself.
In fact, earning a high CTR if your content leads to a poor user experience may actually hurt you in the end. More on this below.
So first, we need to figure out if our clicks create a good user experience. Read on...
So what if you trick people into clicking your URL, but your page doesn't actually deliver what you promised, or even adequately answer the query?
This isn't good for users, or for Google. And it definitely isn't good for you.
One measure of content relevancy search engines can use is weighted viewing time, based on the concept that users typically spend longer on a page they find relevant than on one they don't find helpful. Within this framework, "long clicks" can carry more weight than "short clicks."
The patent explains it like this:
"But Cyrus," smart SEOs protest, "not every query needs a long click. Many searches, like the weather or the "highest mountains in Europe," can be answered very quickly, often in seconds. It doesn't make sense for these pages to have long clicks."
Those SEOs are right, of course. Fortunately, Google engineers understood not every query is the same and devised a clever solution: click scores can be weighted on a per-query basis, including language and country-specific click data.
"Note that such categories may also be broken down into sub-categories as well, such as informational-quick and informational-slow: a person may only need a small amount of time on a page to gather the information they seek when the query is “George Washington's Birthday”, but that same user may need a good deal more time to assess a result when the query is “Hilbert transform tutorial”
To dive a little deeper, it's not so much how long visitors stay on your page, but your ratio of long clicks (LC) to overall clicks (C), weighted on a per-query basis. This LC|C ratio could be used to re-rank results based on user engagement.
Take this a step further: results with good long-click ratios may rank higher, while results with poor long-click ratios may rank lower.
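Purely as an illustration of the idea (the threshold and any per-query weighting Google actually uses are unknown, so the numbers here are assumptions), the ratio itself is simple to express:

```javascript
// Purely illustrative: a long-click ratio in the spirit of the patent's
// description. The 60-second threshold is an assumption; in practice it
// would be weighted on a per-query basis.
function longClickRatio(dwellTimes) {
  const LONG_CLICK_SECONDS = 60; // assumed threshold, not Google's
  const longClicks = dwellTimes.filter((t) => t >= LONG_CLICK_SECONDS).length;
  return dwellTimes.length ? longClicks / dwellTimes.length : 0;
}

// Five clicks on one result for one query, with dwell times in seconds.
console.log(longClickRatio([5, 90, 120, 8, 200])); // 0.6
```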
So consider a situation where you "hacked" your CTR to earn more clicks, but the page itself doesn't deliver, resulting in more short clicks. In theory, this could actually hurt your rankings, even though you started with a higher CTR!
So be sure to back up your higher CTRs with great user experiences, e.g. long clicks.
Many SEOs refer to long clicks as analogous to improving your "dwell time", or simply the amount of time a user spends on your site. The signals associated with improving dwell time are often known as "UX" (User Experience) signals.
The golden rule of getting more long clicks is simply this: provide the most useful, complete, and engaging answer to a user search query, in the most attractive and effective format possible.
A note of distinction: because most pages rank for multiple keywords and keyword variations, all with possibly varying search intent, it's often helpful to target those various search intents on the same page.
For example, a user searching for information about meta descriptions may also be interested in "meta description length", "meta description format" and "how to write meta descriptions." Optimizing more completely for these varying search intents can improve your long click metrics.
Pro Tip: You don't need to optimize for every user intent on the same page. Linking to other resources on your site is fine, and even encouraged! Visitors don't have to stay on the same page for a search click to count as "long."
Aside from the quality of the content itself, there are a number of UX factors you can employ to encourage your visitors to engage with your content at a deeper level. While not an exhaustive list, a few examples may include:
Have a clean, easy-to-use navigation
Make your site easy to search
Place important content above the fold, where it's easy to find
Leverage high-quality videos (Moz's Whiteboard Friday pages have an average view time of nearly 10 minutes!)
Strive for 10x Content
Use attractive, modern design
Prominently link to closely related topics to cover multiple searcher intents. These can be internal links, or even external links.
Admittedly, there aren't a ton of good resources published on increasing engagement and improving long clicks. That said, I believe Brian Dean of Backlinko does an excellent job with this, and his resource on improving dwell time is worth checking out.
Yes, being the last click may be the holy grail of SEO.
A user clicks their way through a page of search results, not finding what they are looking for. Finally, they click on your URL and behold!.... You have the answer they sought.
It means you've satisfied the user query.
Put simply, being the last click means searchers don't return to Google to select another result (i.e., no pogo-sticking).
Even if Google doesn't use this as a ranking factor, you can see how it might benefit your SEO to be the user's last click as much as possible. Satisfying the user query means users are more likely to browse and share your content, as well as seek you out again in the future.
In my own SEO, there are fewer things I've seen associated with greater success than improving visitor satisfaction, and this is exactly what Google seeks to reward.
It's also damn difficult to achieve.
Sadly, a typical process in SEO is to give a content brief to a copywriter, expect them to cover all the salient points, hit publish, and hope for the best. But ask yourself honestly: does this content truly deserve to rank #1? Is it the first, last, and only result a user needs to click?
Years ago when working in a successful restaurant, a manager gave me advice about delivering 100% customer satisfaction that I will never forget: "Whatever happens, make sure they want to come back."
This is how you should treat SEO: make sure every visitor to your site wants to come back.
Exactly how to make sure your visitor wants to come back is going to vary based on each and every query, but generally, it means going the extra mile, answering questions more completely, and offering the user more resources and a better experience.
In short, deliver an experience superior to every one of your competitors.
Beyond this, I recommend these 3 resources when improving your content (all amazingly from Rand Fishkin):
To be honest, it's nearly impossible to accurately measure click-based signals, as Google holds all the data.
(Even if you could accurately measure your long click/click ratio, or last click metrics, calculating their actual value would be meaningless without an accurate account of every other Google search result, let alone on a per-query basis.)
That said, there are metrics that can help you directionally measure any progress you might make. These are all available either through Search Console or Google Analytics:
Keep in mind that there is no such thing as a "good" score for these numbers, as everything is relative to the specific query it appeared for, as well as every single one of your competitors.
Regardless, these metrics can be directionally useful indicators when making improvements to your content. For example, if you see a drop in bounce rate and increase in session duration after a major content update, you can take this as an indicator that things are moving in the right direction. And in fact, it's not unusual to see an increase in rankings/traffic after such a change accompanied by a positive shift in metrics.
While we can't directly see what Google might measure in terms of complex click metrics, we can often make educated guesses.
And even if Google isn't using these metrics exactly the way we speculate, we can still improve our SEO by paying attention to the user click behaviors we have influence over.
Thanks for making it this far. Remember:
Be First
Be Long
Be Last
Get those clicks, and earn them!
Google Posts That Local Results Are Influenced By Clicks, Then Deletes That
How Google Interferes With Its Search Algorithms and Changes Your Results
User Behavior and Local Search - Dallas State of Search 2014
Is CTR A Ranking Factor In Organic Results? (Negative result)
Queries & Clicks May Influence Google’s Results More Directly Than Previously Suspected
Test points to likely influence of click-through rate on search rankings
Google Brain Canada: Google Search Uses Click Data For Rankings?
So, when we think about how small and medium-sized enterprises (SMEs) can compete in today's ever-evolving SERP landscape, time and time again, well-implemented structured data is what makes the difference.
In this blog I’ll explain the following:
In my experience, well-implemented structured data is effective for websites of all sizes and in all verticals. For my own clients, schema implementation has enabled growth, improved performance on search and created opportunities to reach new audiences.
Though the target markets, objectives and audiences differed in each case, I was able to use schema markup as a strategic underpinning of a wider SEO and marketing strategy. This is because schema has become a fundamental element of scalable SEO.
On a website, structured data is a means of defining content with a uniform set of names and values, so that bots and machines can better read, index, understand, and serve the content of your site. While the phrase “structured data” can include elements like open graph for social media, microdata, or indeed any set of data that is organized uniformly (think of your CRM), generally when SEOs talk about structured data, we’re referring to structured data markup in JSON-LD as specified by Schema.org and recommended by Google.
Schema.org has become structured data HQ, because its framework — sets of vocabularies and relationships — was created and is maintained through a cross-platform partnership between Google, Microsoft, Yandex, and other major search engines. They regularly create new schema types and relationships aimed at making the information on the web more easily accessible to users.
Schema.org breaks content down into a common vocabulary of predefined @types, each with its own predefined properties, which can then be expressed using a common JavaScript notation (JSON-LD). Like the editors of the Oxford English Dictionary, the team behind Schema.org is constantly adding new @types and properties to keep pace with user needs. At present, there are 778 types, and that number will continue to grow. Each new type brings more clarity, consistency, and ease of access to the information on the web, which is brilliant for search engines and great for your traffic.
Sometimes when I'm explaining structured data to clients, I describe it as a means of essentially turning your beautiful website into a spreadsheet for robots. They can prioritize and process the critical information about the content of the page without having to understand the layout of your particular WordPress theme, parse reams of CSS, or navigate your Joomla configuration.
This means that information a bot has on a page can be more consistent and resilient, even if the content changes day-to-day. So, in the example of a retailer with seasonal specials and campaigns that change the front end home page layout, structured data tells Google the same information about the page in the same way every time:
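For example, a retailer's home page might always carry a block along these lines (a minimal, hypothetical sketch; the names and URLs are placeholders), no matter which seasonal campaign currently dominates the visible layout:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Retailer",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/images/logo.png",
  "sameAs": [
    "https://www.facebook.com/exampleretailer",
    "https://twitter.com/exampleretailer"
  ]
}
</script>
```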
Simply put, structured data gives you the chance to jump the queue on the SERP.
When we look at the ways in which Google has enhanced its SERPs over the last few years, what we see consistently is the use of JSON-LD structured data in combination with Google APIs to create new features and new channels for content. Rich snippet SERP features like Google for Jobs, Google Shopping, featured snippets, how-to instructions, recipe cards, knowledge panels, and other monumental changes to the SERP have all been driven or improved by the creation and utilization of structured data frameworks.
Users love these features because they're multi-media search enhancements, and are impossible to miss as they often take up the entire viewport on mobile.
Not a plain blue link in sight.
In many cases, your content cannot be included in these attractive rich snippets without structured data. So, if you literally want to get ahead of the competition, structured data needs to be a component in your SEO strategy.
Along with increased visibility, structured data implementation offers the following advantages for small businesses:
With almost 800 types of schema markup available to add to a website, it can be difficult to decide which are the best fit for your pages. To start, you can introduce new elements, or improve existing ones, to help you perform better online and complement your existing content or e-commerce SEO strategy.
There are certain sets of schema markup that apply to almost every site, and others, like Product and Job Posting markup, that are critical for specific niches. As a general rule of thumb, every time I get a new client, I run through the following initial checks:
If you answered “no” to any of these questions and the site doesn’t have the appropriate markup, then you should add schema markup to your site.
If the answer to these questions is “yes”, then it’s important to test the quality of the implementation before moving on to the next step. To do this, take a look at Search Console’s Rich Results Report to review pages at scale, or use their Structured Data Testing Tool and Rich Results Test to inspect individual pages. If you see errors, they should be addressed.
First, take a look at Search Console's Rich Results Report to review pages at scale and identify which content is being read as Valid, Valid with warnings, or Error.
Valid: If your markup is ‘Valid’, then it is being crawled and indexed correctly. These pages are unlikely to require further action.
Errors: Pages with markup that is identified with an ‘Error’ tend to have incorrect syntax, so you should review the individual page and correct the code as soon as possible. When the changes are complete, use the Validate Fix button to request reassessment.
Valid with warnings: If your content is showing as ‘Valid with warnings’, then your schema markup is likely missing a recommended field. These warnings do not make the page or the markup invalid, but they can make the page less competitive, because the content is less targeted. Review your content to ensure that your schema reflects as much of the on-page content as possible in order to reduce these warnings, and therefore increase the performance of your schema markup.
The Structured Data Testing Tool and Rich Results Test allow you to troubleshoot improvements to structured data on individual pages. In each of these tools, you can enter the URL in question and receive itemized information on any errors or warnings.
The missing fields highlighted here correspond to properties within the Event schema type. So, to improve this markup, you would look up the definitions of the relevant properties on schema.org and, where applicable, use their example HTML to guide your optimizations.
In this instance, to improve the performance of my schema, I may need to build new performer fields into the CMS, or to work with the dev team to add the content from existing CMS data fields into the schema regex.
In either case you will be making improvements that help you better target and serve users.
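To make that concrete, here's a hypothetical sketch of an Event block with the performer and organizer properties populated; every value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Product Launch Webinar",
  "startDate": "2021-09-01T18:00:00+10:00",
  "eventAttendanceMode": "https://schema.org/OnlineEventAttendanceMode",
  "location": {
    "@type": "VirtualLocation",
    "url": "https://www.example.com/webinar/"
  },
  "performer": {
    "@type": "Person",
    "name": "Jane Example"
  },
  "organizer": {
    "@type": "Organization",
    "name": "Example Retailer",
    "url": "https://www.example.com/"
  }
}
</script>
```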
If you need to add schema to your site, there are a few options for implementation.
For some single pages with largely static content, such as a Local Business page, an Organization page, or a single FAQ page, adding markup can be a straightforward process of generating the code and placing it into the HTML of the page. Major CMS platforms like Shopify and WordPress have plugins that make generating and implementing the markup for these pages easy. Those with custom CMS configurations can use tools like the Schema Markup Generator to generate the JSON-LD, then pass it on to the development team to push it live.
Bulk schema implementation is almost essential for high volume content creators. This applies to e-commerce shops, but also to those who regularly post standard format content like recipes, blogs, articles, job vacancies, events, training courses, etc.
For these pages, the most effective way to get the most out of the schema on your site is to automate the process by building it into the structure of your site. In most instances, this involves a four phase approach, working in coordination with your developers and clients.
The impact of schema markup that generates rich results can be easily monitored and measured in Search Console. Within the Enhancements tab, you can monitor the quality of your implementation and any current or recent errors.
To monitor impressions, rankings, clicks and CTR, visit the Search Appearance tab under Performance. This tab provides historic data that can be compared to earlier configurations of the site.
Within Google Analytics, your tracking and monitoring will depend upon your implementation. For instance, google-jobs-apply clicks may show as a separate source from standard search results within Organic. But I’ve also seen Google Shopping clicks show as part of the (other) channel. In either case, annotate your implementation dates to monitor relevant content for changes in clicks, impressions, and conversions.
For many small businesses, Search Console data should be sufficient, but there are also tools that can help you drill down further into the data.
Taking a strategic, integrated approach to structured data implementation helps SMEs to stay competitive in today’s search environment because of its scalability, versatility and measurability. Furthermore, the applicability of schema markup as the underpinning of a cohesive content and advertising strategy, brings much needed efficiencies for SME marketers who want to make the most out of their content.
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
The minimal HTML document that loads such an app is called an app shell, and it's the foundation for progressive web applications (PWAs). We'll explore this next.
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Here's an example. Modern web apps are built on JavaScript frameworks like Angular, React, and Vue, which allow developers to quickly build and scale interactive web applications. Let's take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
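A simplified sketch of the kind of served HTML you'd see for a default Angular build (bundle file names vary by version and build):

```html
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>App</title>
  <base href="/">
</head>
<body>
  <!-- All visible content is injected into <app-root> by JavaScript at runtime -->
  <app-root></app-root>
  <script src="runtime.js" defer></script>
  <script src="polyfills.js" defer></script>
  <script src="main.js" defer></script>
</body>
</html>
```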
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
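An illustrative snippet (not Google's verbatim example):

```html
<!-- Crawlable: a standard anchor tag with an href and descriptive anchor text -->
<a href="/technical-seo-guide/">Read our technical SEO guide</a>
```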
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
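By contrast, here's a hypothetical example of the kind of "pseudo" link Google warns about:

```html
<!-- Not reliably crawlable: no <a href>; navigation depends entirely on JavaScript -->
<span onclick="window.location.href='/technical-seo-guide/'">
  Read our technical SEO guide
</span>
```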
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
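A representative sketch (the data-src convention and selectors here are assumptions, not a specific library's API):

```javascript
// Representative sketch: scroll-based lazy-loading.
// Each image holds its real URL in data-src and is only loaded on scroll.
document.addEventListener('scroll', function () {
  document.querySelectorAll('img[data-src]').forEach(function (img) {
    var rect = img.getBoundingClientRect();
    if (rect.top < window.innerHeight) {
      img.src = img.dataset.src;       // swap in the real image
      img.removeAttribute('data-src'); // don't process this image again
    }
  });
});
```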
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
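A representative sketch using the same data-src convention as above:

```javascript
// Representative sketch: IntersectionObserver-based lazy-loading.
// The callback fires when an observed image enters the viewport, with no
// dependence on scroll events.
var observer = new IntersectionObserver(function (entries, obs) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.dataset.src; // load the real image once it's visible
      obs.unobserve(img);        // each image only needs to load once
    }
  });
});

document.querySelectorAll('img[data-src]').forEach(function (img) {
  observer.observe(img);
});
```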
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
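The attribute itself is a one-liner; the image path and alt text below are placeholders:

```html
<img src="/images/product-photo.jpg" loading="lazy" alt="Example product photo">
```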
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
JavaScript can also affect page load times, which are an official ranking factor in Google's mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it's important to make sure that any JavaScript served to clients is coded and delivered efficiently to help safeguard rankings.
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. Rather, the React components (like headers, footers, and body content) hosted on the root ./index.html file are simply reorganized to display different content. This is why they're called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
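As a rough sketch of what that can look like with react-helmet (the component, titles, and URLs here are hypothetical):

```jsx
import React from "react";
import { Helmet } from "react-helmet";

// A single "view" in an SPA, served with its own metadata.
function ProductView() {
  return (
    <div>
      <Helmet>
        <title>Example Product | Example Store</title>
        <meta name="description" content="A unique description for this view." />
        <link rel="canonical" href="https://www.example.com/products/example/" />
      </Helmet>
      <h1>Example Product</h1>
    </div>
  );
}

export default ProductView;
```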
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when Googlebot crawls a web page:
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
In other words, Google crawls and indexes content in two waves: in the first wave, the static HTML is crawled and indexed right away; in the second wave, Google returns to render and index the JavaScript-generated content once rendering resources become available.
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, Google has significantly improved its web crawler in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test them using Google's own webmaster tools, such as the URL Inspection Tool in Search Console and the Mobile-Friendly Test.
The goal is simply to compare the content visible in your browser with what is displayed in the tools, and look for any discrepancies.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
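For example (the domain and quoted phrase are placeholders; use a sentence copied from your own page):

```
site:example.com "your exact sentence from the page"
```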
Here’s what this looks like in the Google SERP:
Another way to test and debug JavaScript SEO issues is with the developer tools built into the Chrome web browser.
Right-click anywhere on a web page to open the context menu, then click “View Page Source” to see the static HTML document in a new tab.
You can also click “Inspect” after right-clicking to view the content that is actually loaded in the DOM, including anything injected by JavaScript.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
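If you’d rather compare the two versions as text instead of eyeballing them, one quick option in Chrome’s DevTools Console is the built-in copy() utility, which puts the fully rendered markup on your clipboard so you can paste it alongside the “View Page Source” output:

  // Run in the DevTools Console: copies the rendered DOM (after JavaScript) to the clipboard
  copy(document.documentElement.outerHTML);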
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal JavaScript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different rendering approaches that are more search-friendly than pure client-side rendering, because they avoid offloading all of the JavaScript work onto users and crawlers:
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
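To make the server-side option concrete, here is a minimal, hedged sketch using Express and React’s renderToString; ProductList is a hypothetical component, and a production setup would also handle data fetching, routing, and client-side hydration.

  // Minimal server-side rendering sketch with Express and React
  // "ProductList" is a hypothetical component; swap in your own app code
  const express = require('express');
  const React = require('react');
  const { renderToString } = require('react-dom/server');
  const ProductList = require('./ProductList');

  const app = express();

  app.get('/category/:slug', (req, res) => {
    // Render the component to HTML on the server, so crawlers receive
    // fully populated markup without having to execute any JavaScript
    const html = renderToString(React.createElement(ProductList, { slug: req.params.slug }));
    res.send('<!doctype html><html><body><div id="root">' + html + '</div></body></html>');
  });

  app.listen(3000);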
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— John (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that homes in on the nuts and bolts of technical SEO.
And there’s evidence to support this — usually, where a seven-year-old B2C company is getting 500K visitors per month from SEO, a B2B brand the same age could be seeing only 15K visitors per month. (This is assuming all other things are equal.)
Check out the example below comparing Zola.com (a B2C brand) and Yieldify.com (B2B):
These two sites were founded around the same time (2013) and have been publishing lots of content. Yet, the difference in their traffic numbers makes it look like Yieldify hasn’t been doing much SEO, but that’s not the case.
For instance, when I used the MozBar to analyze the on-page optimization they did on their article about trust badges, I could tell they’re at least following basic SEO principles, like having focus keywords in their URL, page titles, headers, and meta descriptions:
I’d say they’ve not been terrible at optimizing their content for SEO, assuming they optimize all their content the way they did this one on trust badges.
My point here is: B2C and e-commerce businesses (usually) have way more opportunities in SEO than B2B, especially in terms of search traffic.
But while that is true, it’s also true that no matter how few the search visits, there are still a lot of opportunities in SEO for B2B businesses.
Most of the time, what B2B brands lose in search traffic, they make up in revenue — since their products/services are usually more expensive than those in B2C.
Long story short: there are opportunities for B2B companies in search, and here’s how to capitalize on them in the year ahead.
Every funnel begins at the top, but if you want to generate results as quickly as possible, you should kick off your B2B SEO strategy by targeting customers at the bottom of the funnel.
Ready-to-buy customers are already at the bottom of the funnel (BoFu), searching for information that’ll help them make a purchase decision. They’re often searching with keywords like:
As a smart marketer, your strategy should be to prioritize reaching them with the bottom of funnel content they’re looking for.
You probably know what BoFu content looks like, but just so we’re on the same page as to what it really is, see these examples of BoFu content from SocialPilot ranking on page one:
I’m not affiliated with SocialPilot, so I don’t know if they kicked off their SEO content marketing with these BoFu topics (search terms).
But if they did, chances are they experienced quick success (in terms of relevant product awareness and sign-ups), since the articles are ranking on Google’s front page for searchers looking for “Buffer alternatives”.
Bottom line is, as a B2B brand, you’ll be better off prioritizing BoFu topics in your SEO strategy. It’s a much better approach than starting all the way at the top of the funnel, which would be targeting searchers who aren’t ready to make a purchase (or sign-up) decision.
If you think your strategy should be to first target visitors at the top of the funnel (ToFu), you’re probably assuming that your prospects will first consume your ToFu content before ever getting to the bottom.
That’s hardly ever the case in real life. What often happens is:
If you think back to the last purchase decision you made, this was probably the route you took.
So buyers won’t always start by reading your top-of-funnel content, discover your product, and then decide to consume your BoFu content. Sometimes they’re already at BoFu, and all it takes to convince them to buy your product is the right BoFu content.
You’re probably thinking, “what’s t-shaped content?”. Allow me to explain.
At my agency (Premium Content Shop), we use “t-shaped content” to describe the type of content that performs two functions at the same time: it genuinely benefits your audience, AND it benefits your business by generating demand for your product or service.
This little illustration below should help you better understand what our “t-shaped content framework” means:
In practice, this is an example of t-shaped content from Mailshake:
Right after the fifth paragraph of the article, they introduce a CTA:
This is a t-shaped content piece because:
I often advise clients not to introduce anything about their product/service until readers have scrolled about 40% of the way into the content, just to avoid coming across as overly promotional. I’m not saying a CTA that early in an article can never work (it can), but your readers should feel that you’re prioritizing the value they get from the content over selling your own stuff right off the bat.
In any case, creating and ranking t-shaped content helps you achieve two objectives:
One reason SEO gets a bad rap, especially among B2B marketers, is the sheer amount of low-quality B2B content ranking on page one of the SERPs. That’s because, while Google’s algorithm can recognize content that is well optimized for search, it still can’t judge whether a page is genuinely relevant and useful to a searcher, at least not from a human perspective.
So, it ends up ranking content on page one that meets Google’s ranking standards, but not always the searcher’s standards.
As a B2B marketer, you don’t just want to meet Google’s requirements and rank on page one. You need your content to rank AND impress your audience well enough to convert them into leads.
How do you do that? You need to write like professionals speaking to professionals.
Usually, this means you need to see what other industry professionals are saying or have published on any given topic and spell out:
Derek Gleason of CXL mirrors the same idea in a recent tweet:
And as an expert in your field, this is a no-brainer: you’ll almost always have a different opinion to share about popular topics in your industry.
For instance, as an SEO expert, you most likely have fact-based opinions about topics like Google ranking factors, B2B marketing, technical SEO, and so on. This knowledge of the topics in your industry is the “from-field-experience” material that’ll help you connect with customers on a deeper level.
And when you create content based on your original opinions, experience, thoughts, or convictions, you won’t sound like everyone else, and your content will stand out. Even if it covers similar ground to your competitors’ content, it will still carry your original ideas.
Your clients aren't all at the bottom of the funnel. While I’ve advised kicking off your SEO marketing strategy by addressing BoFu topics, many of your potential buyers are still at the top and middle of the funnel.
This means that, at the stage where they’re reading your “from-field-experience” content, they’re not thinking about your product at all. But with the right type of content, built on your original thoughts and ideas, you can move them from the top/middle to the bottom of the funnel.
So, if they’ve been consuming your ToFu content for a while, your brand will be front of mind when it’s time for them to consider making a purchase decision.
And yes, they’ll ultimately make a decision based on reviews and other BoFu content, but your ToFu and MoFu content will help you develop authority and trust with potential customers. This will often give you a leg up on your competitors when it’s time for ToFu/MoFu prospects to make a decision.
For example, Dom Kent of Mio once shared how people in the collaboration industry keep finding Mio whenever they search for anything related to their industry; that’s the kind of visibility ToFu and MoFu content builds for your brand.
It's like when you Google something about sales management, and Close’s content keeps showing up. When it’s time to buy — or even just recommend — a sales management tool, guess which product you’ll think of? That’s right, Close. It doesn’t always mean you’ll sign up for Close, but that’s at least one of the brands you’d think of first.
Often in B2B, your ideal buyers are experienced professionals. This means that most of the time, they don't need content on the basic topics that entry-level employees might.
If they're sales leaders, for instance, they seldom search for content on basic topics like "what is a sales script" or "how does CRM work?".
You're better off covering more important and sophisticated topics — regardless of whether those topics have high search volume or not.
For instance, CRM provider Copper currently ranks for “cold call script to get appointment”.
It’s a long-tail keyword with only about 500 searches per month.
The low search volume may look unattractive on the surface, but Copper’s target customers are the ones searching for it, and that matters more than ranking for a high-volume keyword like “what’s a sales pipeline?” that those customers rarely search for.
During your keyword research phase, it’s easy to get distracted by high search volume keywords that your target audience barely ever searches for on Google. Move past that distraction and focus on creating content for keywords your target buyers need content on — even if those keywords have low search volumes.
In my first four points, I covered things you need to know about high-quality content creation and the content strategy side of SEO, but I haven’t forgotten about the technical side.
You need to pay attention to technical SEO as well, as it can make or break the opportunities any B2B website can get from search.
Here are the most important parts of tech SEO that you should get in the habit of checking:
There are a lot of opportunities in SEO for B2B companies — even though the search volumes are often low. I’ve covered what you’d need to use search to your advantage as a B2B marketer.
To recap, you should kick off your SEO and content marketing by targeting BoFu prospects. And make your content T-shaped, so that it benefits your audience and business at the same time.
Also, don’t just rank content for organic search traffic, rank with “from-field-experience” content/ideas; this will help you generate demand and quality leads as readers will be drawn to your expertise.
And then avoid covering too many basic topics, especially when your target buyers are experienced professionals or C-level decision-makers. Finally, pay attention to the technical side of SEO, too; it can make or break your entire search engine optimization efforts.