Tuesday, September 29, 2020

Page Authority 2.0: An Update on Testing and Timing

Posted by rjonesx.

One of the most difficult decisions to make in any field is to consciously choose to miss a deadline. Over the last several months, a team of some of the brightest engineers, data scientists, project managers, editors, and marketers have worked towards a release date of the new Page Authority (PA) on September 30, 2020. The new model is superior in nearly every way to the current PA, but our last quality control measure revealed an anomaly that we could not ignore.

As a result, we’ve made the tough decision to delay the launch of Page Authority 2.0. So, let me take a moment to retrace our steps as to how we got here, where that leaves us, and how we intend to proceed.

Seeing an old problem with fresh eyes

Historically, Moz has used the same method over and over again to build a Page Authority model (as well as Domain Authority). This model's advantage was its simplicity, but it left much to be desired.

Previous Page Authority models trained against SERPs, trying to predict whether one URL would rank over another, based on a set of link metrics calculated from the Link Explorer backlink index. A key issue with this type of model was that it couldn’t meaningfully address the maximum strength of a particular set of link metrics.

For example, imagine the most powerful URLs on the Internet in terms of links: the homepages of Google, YouTube, Facebook, or the share URLs of followed social network buttons. There are no SERPs that pit these URLs against one another. Instead, these extremely powerful URLs often rank #1 followed by pages with dramatically lower metrics. Imagine if Michael Jordan, Kobe Bryant, and LeBron James each scrimmaged one-on-one against high school players. Each would win every time. But we would have great difficulty extrapolating from those results whether Michael Jordan, Kobe Bryant, or LeBron James would win in one-on-one contests against each other.

When tasked with revisiting Domain Authority, we ultimately chose a model with which we had a great deal of experience: the original SERPs training method (although with a number of tweaks). With Page Authority, we decided to go with a different training method altogether, predicting which page would have more total organic traffic. This model presented several promising qualities, like being able to compare URLs that don’t occur on the same SERP, but also presented other difficulties, like a page having high link equity but simply being in an infrequently-searched topic area. We addressed many of these concerns, such as enhancing the training set to account for competitiveness using a non-link metric.
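
To make the pairwise idea concrete, here is a minimal sketch of that general style of training setup — not Moz's actual model — using synthetic data and a simple logistic regression on feature differences. The two link metrics and the traffic target below are entirely hypothetical.

```python
# Minimal sketch of a pairwise training setup (not Moz's actual model).
# Each training example is a pair of URLs; the features are the differences
# between the two URLs' (hypothetical) link metrics, and the label says
# which URL has more total organic traffic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-URL link metrics, e.g. [log(linking root domains), log(total links)]
url_features = rng.normal(size=(1000, 2))
# Hypothetical total organic traffic per URL, loosely driven by those metrics
organic_traffic = np.exp(url_features @ np.array([1.5, 0.5]) + rng.normal(scale=0.5, size=1000))

# Sample random pairs and train on the feature differences
a = rng.integers(0, 1000, size=5000)
b = rng.integers(0, 1000, size=5000)
X = url_features[a] - url_features[b]
y = (organic_traffic[a] > organic_traffic[b]).astype(int)

model = LogisticRegression().fit(X, y)

# The learned weights yield a single-URL score, which lets you compare
# URLs that never appear together on the same SERP.
scores = url_features @ model.coef_[0]
print(scores[:5])
```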

Measuring the quality of the new Page Authority

The results were — and are — very promising.

First, the new model obviously predicted the likelihood that one page would have more valuable organic traffic than another. This was expected, because the new model was directed at this particular goal, while the current Page Authority merely attempted to predict whether one page would rank over another.

Second, we found that the new model predicted whether one page would rank over another better than the previous Page Authority. This was especially pleasing, as it laid to rest many of our concerns that the new model would underperform on old quality controls due to the new training model.

How much better is the new model at predicting SERPs than the current PA? At every interval — all the way down to position 4 vs 5 — the new model tied or outperformed the current model. It never lost.

Everything was looking great. We then started analyzing outliers. I like to call this the “does anything look stupid?” test. Machine learning makes mistakes, just as humans can, but humans tend to make mistakes in a very particular manner. When a human makes a mistake, we often understand exactly why the mistake was made. This isn’t the case for ML, especially neural nets. So we pulled URLs with high Page Authorities under the new model that happened to have zero organic traffic, and included them in the training set so the model could learn from those errors. We quickly saw bizarre 90+ PAs drop down to much more reasonable 60s and 70s… another win.

We were down to one last test.

The problem with branded search

Some of the most popular keywords on the web are navigational. People search Google for Facebook, Youtube, and even Google itself. These keywords are searched an astronomical number of times relative to other keywords. Subsequently, a handful of highly powerful brands can have an enormous impact on a model that looks at total search volume as part of its core training target.

The last test involves comparing the current Page Authority to the new Page Authority, in order to determine if there are any bizarre outliers (where PA shifted dramatically and without obvious reason). First, let’s look at a simple comparison of the LOG of Linking Root Domains compared to the Page Authority.
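
A comparison like this is straightforward to produce yourself. Here is a minimal sketch, assuming a hypothetical CSV export with columns named linking_root_domains and page_authority (not an actual Moz export format):

```python
# Sketch of the comparison described above, assuming a hypothetical export
# with columns "linking_root_domains" and "page_authority".
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("pa_sample.csv")  # placeholder file name

plt.scatter(np.log10(df["linking_root_domains"] + 1), df["page_authority"], s=4, alpha=0.3)
plt.xlabel("log10(Linking Root Domains)")
plt.ylabel("Page Authority")
plt.show()
```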

Not too shabby. We see a generally positive correlation between Linking Root Domains and Page Authority. But can you spot the oddities? Go ahead and take a minute…

There are two anomalies that stand out in this chart:

  1. There is a curious gap separating the main distribution of URLs and the outliers above and below.
  2. The largest variance for a single score is at PA 99. There are an awful lot of PA 99s with a wide range of Linking Root Domains.

Here is a visualization that will help draw out these anomalies:



The gray spaces between the green and red represent this odd gap between the bulk of the distribution and the outliers. The outliers (in red) tend to clump together, especially above the main distribution. And, of course, we can see the poor distribution at the top of PA 99s.

Bear in mind that these issues are not sufficient to make the new Page Authority model less accurate than the current model. However, upon further examination, we found that the errors the model did produce were significant enough that they could adversely influence the decisions of our customers. It’s better to have a model that is off by a little everywhere (because the adjustments SEOs make are not incredibly fine-tuned) than it is to have a model that is right mostly everywhere but bizarrely wrong in a limited number of cases.

Luckily, we’re fairly confident as to what the problem is. It seems that homepage PAs are disproportionately inflated, and that the likely culprit is the training set. We can’t be certain this is the cause until we complete retraining, but it is a strong lead.

The good news and the bad news

We are in good shape insofar as we have multiple candidate models that outperform the existing Page Authority. We’re at the point of bug squashing, not model building. However, we are not going to ship a new score until we are confident that it will steer our customers in the right direction. We are highly conscious of the decisions our customers make based on our metrics, not just whether the metrics meet some statistical criteria.

Given all of this, we have decided to delay the launch of Page Authority 2.0. This will give us the necessary time to address these primary concerns and produce a stellar metric. Frustrating? Yes, but also necessary.

As always, we thank you for your patience, and we look forward to producing the best Page Authority metric we have ever released.

Visit the PA Resource Center

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

My 8 Best Local SEO Tips for the 2020 Holidays

Posted by MiriamEllis



Image credit: DoSchu

“No place like home for the holidays.” This will be the refrain for the majority of your customers as we reach 2020’s peak shopping season. I can’t think of another year in which it’s been more important for local businesses to plan and implement a seasonal marketing strategy extra early, to connect up with customers who will be traveling less and seeking ways to celebrate at home.

Recently, it’s become trendy in multiple countries to try to capture the old Danish spirit of hygge, which the OED defines as: A quality of coziness and comfortable conviviality that engenders a feeling of contentment or well-being.

While this sometimes-elusive state of being isn’t something you can buy direct from a store, and while some shoppers are still unfamiliar with hygge by name, many will be trying to create it at home this year. Denmark buys more candles than any other nation, and across Scandinavia, fondness for flowers, warming foods, cozy drinks, and time with loved ones characterizes the work of weaving a gentle web of happiness into even the darkest of winters.

Whatever your business can offer to support local shoppers’ aspirations for a safe, comfortable, happy holiday season at home is commendable at the end of a very challenging 2020. I hope these eight local search marketing tips will help you make good connections that serve your customers — and your business — well into the new year.

1) Survey customers now and provide what they want

Reasonably priced survey software is worth every penny in 2020. For as little as $20/month, your local business can understand exactly how much your customers’ needs have changed this past year by surveying:

  • Which products locals are having trouble locating
  • Which products/services they most want for the holidays
  • Which method of shopping/delivery would be most convenient for them
  • Which hours of operation would be most helpful
  • Which safety measures are must-haves for them to transact with a business
  • Which payment methods are current top choices

Doubtless, you can think of many questions like these to help you glean the most possible insight into local needs. Poll your customer email/text database and keep your surveys on the short side to avoid abandonment.

Don’t have the necessary tools at the ready to poll people? Check out Zapier’s roundup of the 10 Best Online Survey Apps in 2020 and craft a concise survey geared to deliver insights into customers’ wishes.

2) Put your company’s whole heart into affinity

If I could gift every local business owner with a mantra to carry them through not just the 2020 holiday shopping season, but into 2021, it would be this:

It’s not enough to have customers discover my brand — I need them to like my brand.

Chances are, you can call to mind some brands of which you’re highly aware but would never shop with because they don’t meet your personal or business standards in some way. You’ve discovered these brands, but you don’t like them. In 2020, you may even have silently or overtly boycotted them.

On the opposite side of this scenario are the local brands you love. I can wax poetic about my local independent grocery store, stocking its shelves with sustainable products from local farmers, flying its Black Lives Matter and LGBTQ+ flags with pride from its storefront, and treating every customer like a cherished neighbor.

For many years, our SEO industry has put great effort into and emphasis on the discovery phase of the consumer journey, but my little country-town grocer has gone leaps and bounds beyond this by demonstrating affinity with the things my household cares about. The owners can consider us lifetime loyal customers for the ways they are going above-and-beyond in terms of empathy, diversity, and care for our community.

I vigorously encourage your business to put customer-brand affinity at the heart of its holiday strategy. Brainstorm how you can make meaningful changes that declare your company’s commitment to being part of the work of positive social change.

3) Be as accessible and communicative as possible

Once you’ve accomplished the above two goals, open the lines of communication about what your brand offers and the people-friendly aspects of how you operate across as many of the following as possible:

  • Website
  • Local business listings
  • Email
  • Social channels
  • Forms
  • Texts/Messaging
  • Phone on-hold marketing
  • Storefront and in-store signage
  • Local news, radio, and TV media

In my 17 years as a local SEO, I can confidently say that local business listings have never been a greater potential asset than they will be this holiday season. Google My Business listings, in particular, are an interface that can answer almost any customer who-what-where-when-why — if your business is managing these properly, whether manually or via software like Moz Local.

Anywhere a customer might be looking for what you offer, be there with accurate and abundant information about identity, location, hours of operation, policies, culture, and offerings. From setting special hours for each of your locations, to embracing Google Posts to microblog holiday content, to ensuring your website and social profiles are publicizing your USP, make your biggest communications effort ever this year.

At the same time, be sure you’re meeting Google’s mobile-friendly standards, and that your website is ADA-compliant so that no customer is left out. Provide a fast, intuitive, and inclusive experience to keep customers engaged.

With the pandemic necessitating social distancing, make the Internet your workhorse for connecting up with and provisioning your community as much as you can.

4) Embrace local e-commerce and product listings

Digital Commerce 360 has done a good job charting the 30%+ rise in online sales in the first half of 2020, largely resulting from the pandemic. The same publication summarizes the collective 19% leap in traffic to North America’s largest retailers. At the local business level, implementing even basic e-commerce functionality in advance of the holiday season could make a major difference, if you can offer the most-desired methods of delivery. These could include:

  • Buy-online, pick up in-store (BOPIS)
  • Buy-online, pick up curbside
  • Buy online for postal delivery
  • Buy online for direct home delivery by in-house or third-party drivers

Here’s an extensive comparison of popular e-commerce solutions, including which ones have free trials, and the e-commerce column of the Moz blog is a free library of expert advice on optimizing digital sales.

Put your products everywhere you can. Don’t forget that this past April, Google surprised everybody by offering free product listings, and that they also recently acquired Pointy, whose device lets you transform scanned barcodes into online inventory pages.

Additionally, in mid-September, Google took their next big product-related step by adding a “nearby” filter to Google Shopping, taking us closer and closer to the search engine becoming a source for real-time local inventory, as I’ve been predicting here in my column for several years.

Implement the public safety protocols that review research from GatherUp shows consumers are demanding, get your inventory onto the web, identify the most convenient ways to get purchases from your storefront into the customer’s hands, and your efforts could pave the way for increased Q4 profits.

5) Reinvent window shopping with QR codes

“How can I do what I want to do?” asked Jennifer Bolin, owner of Clover Toys in Seattle.

What she wanted to do was use her storefront window to sell merchandise to patrons who were no longer able to walk into her store. When a staff member mentioned that you could use a QR code generator like this one to load inventory onto pedestrians’ cell phones, she decided to give it a try.

Just a generation or two ago, many Americans cherished the tradition of going to town or heading downtown to enjoy the lavish holiday window displays crafted by local retailers. The mercantile goal of this form of entertainment was to entice passersby indoors for a shopping spree. It’s time to bring this back in 2020, with the twist of labeling products with QR codes and pairing them with desirable methods of delivery, whether through a drive-up window, curbside, or delivery.

“We’ve even gotten late night sales,” Bolin told me when I spoke with her after my colleague Rob Ousbey pointed out this charming and smart independent retail shop to me.

If your business locations are in good areas for foot traffic, think of how a 24/7 asset like an actionable, goodie-packed window display could boost your sales.
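
If you would rather generate the codes yourself than rely on an online generator, a few lines of Python with the open-source qrcode package are enough. The product URL and file name below are placeholders.

```python
# Minimal sketch: turn a product page URL into a printable QR code image.
# Requires: pip install "qrcode[pil]". The URL and file name are placeholders.
import qrcode

product_url = "https://example.com/products/wooden-train-set"
img = qrcode.make(product_url)       # build the QR code image
img.save("wooden-train-set-qr.png")  # print it and place it next to the product in the window
```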

6) Tie in with DIY, and consider kits

With so many customers housebound, anything your business can do to support activities and deliver supplies for domestic merrymaking is worth considering. Can your business tie in with decorating, baking, cooking, crafting, handmade gift-giving, home entertainment, or related themes? If so, create video tutorials, blog posts, GMB posts, social media tips, or other content to engage a local audience.

One complaint I am encountering frequently is that shoppers are feeling tired of trying to piece together components from the internet for something they want to make or do. Unsurprisingly, many people are longing for the days when they could leisurely browse local businesses in person, taking inspiration from their hands-on interaction with merchandise. I think kits could offer a stopgap solution in some cases. If relevant to your business, consider bundling items that could provide everything a household needs to:

  • Prepare a special holiday meal
  • Bake treats
  • Outfit a yard for winter play
  • Trim a tree or decorate a home
  • Build a fire
  • Create a night of fun for children of various age groups
  • Dress appropriately for warmth and safety, based on region
  • Create a handmade gift, craft, or garment
  • Winter prep a home or vehicle
  • Create a complete home spa/health/beauty experience
  • Plant a spring garden

Kits could be a welcome all-in-one resource for many shoppers. Determine whether your brand has the components to offer one.

7) Manage reviews meticulously

Free, near-real-time quality control data from your holiday efforts can most easily be found in your review profiles. Use software like Moz Local to keep a running tally of your incoming new reviews, or assign a staff member at each location of your business to monitor your local business profiles daily for any complaints or questions.

If you can quickly solve problems people cite in their reviews, your chances are good of retaining the customer and demonstrating responsiveness to all your profiles’ visitors. You may even find that reviews turn up additional, unmet local needs your formal survey missed. Acting quickly to fulfill these requests could win you additional business in Q4 and beyond.

8) Highly publicize one extra reason to shop local this year

“72% of respondents...are likely or very likely to continue to shop at independent stores, either locally or online, above larger retailers such as Amazon.” — Bazaarvoice

I highly recommend reading the entire survey of 12,000 global respondents by Bazaarvoice, quantifying how substantially shopping behaviors have changed in 2020. It’s very good news for local business owners that so many customers want to keep transacting with nearby independents, but the Amazon dilemma remains.

Above, we discussed the fatigue that can result from trying to cobble together a bunch of different resources to check everything off a shopping list. This can drive people to online “everything stores”, in the same way that department stores, supermarkets, and malls have historically drawn in shoppers with the promise of convenience.

A question every local brand should do their best to ask and answer in the runup to the holidays is: What’s to prevent my community from simply taking their whole holiday shopping list to Amazon, or Walmart, or Target this year?


My completely personal answer to this question is that I want my town’s local business district, with its local flavor and diversity of shops, to still be there after a vaccine is hopefully developed for COVID-19. But that’s just me. Inspiring your customers’ allegiance to keeping your business going might be best supported by publicizing some of the following:

  • The economic, societal, and mental health benefits proven to stem from the presence of small, local businesses in a community.
  • Your philanthropic tie-ins, such as donating a percentage of sales to worthy local causes — there are so many ways to contribute this year.
  • The historic role your business has played in making your community a good place to live, particularly if your brand is an older, well-established one. I hear nostalgia is a strong influencer in 2020, and old images of your community and company through the years could be engaging content.
  • Any recent improvements you’ve made to ensure fast home delivery, whether by postal mail or via local drivers who can get gifts right to people’s doors.
  • Uplifting content that simply makes the day a bit brighter for a shopper. We’re all looking for a little extra support these days to keep our spirits bright.

Be intentional about maximizing local publicity of your “extra reason” to shop with you. Your local newspaper is doubtless running a stream of commentary about the economic picture in your city, and if your special efforts are newsworthy, a few mentions could do you a lot of good.

Don’t underestimate just how reliant people have become on the recommendations of friends, family, and online platforms for sourcing even the basics of life these days. In my own circle, everyone is now regularly telling everyone else where to find items from hand sanitizer to decent potatoes. Networking will be happening around gifts, too, so anything you get noticed for could support extensive word-of-mouth information sharing.

I want to close by thanking you for being in or marketing businesses that will help us all celebrate the many upcoming holidays in our own ways. Your efforts are appreciated, and I’m wishing you a peaceful, profitable, and hyggelig finish to 2020.



Monday, September 28, 2020

How to Detect and Improve Underperforming Content: A Guide to Optimization

Posted by SamuelMangialavori


Content, content, and more content! That’s what SEO is all about nowadays, right? Compared to when I started working in SEO (2014), content is now consistently one of the most popular topics covered at digital marketing conferences, there are far more tools that focus on content analysis and optimization, and overall it seems to dominate most SEO news.

Don’t believe me? Here’s a nice Google Trends graph that may change your mind:

Google Trends screenshot for “content marketing” as a topic, set for worldwide interest.

But why is it that content is now dominating the SEO scene? How vital is content for your SEO strategy, actually? And most importantly: how can you be content with your site’s content? Puns aside, this post aims to help you figure out potential causes of your underperforming content and how to improve it.

Why content is key in SEO in 2020

Content is one of the most important factors in SEO. Just by paying close attention to what Google has been communicating to webmasters in the last few years, it’s clear that they’ve put a strong emphasis on “content” as a decisive ranking factor.

For instance, let’s have a look at this post, from August 2019, which talks about Google’s regular updates and what webmasters should focus on:

“Focus on content: pages that drop after a core update don’t have anything wrong to fix. We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.”

The article goes on, listing a series of questions that may help webmasters when self-assessing their own content (I strongly recommend reading the entire post).

That said, content alone cannot and should not be enough for a website to rank well, but it is a pretty great starting point!

Underperforming content: theory first

What is underperforming content?

When I say “underperforming content”, I’m referring to content, either on transactional/commercial pages or editorial ones, that does not perform up to its potential. This could be content that either used to attract a good level of organic traffic and now doesn’t, or content that never did generate any organic traffic despite the efforts you might have put in.

Over 90% of content gets no traffic from Google. Ninety bloody percent! This means that nine pages out of 10 are likely not to receive any organic traffic at all — food for thought.

What are the causes of underperforming content?

There could be many reasons why your content is not doing well, but the brutal truth is often simple: in most cases, your content is simply not good enough and does not deserve to rank in the top organic positions.

Having said that, here are the most common reasons why your content may be underperforming. They are in no particular order, and I will highlight the ones that, in my opinion, matter most.

Your content does not match the user intent

Based on my experience, this is a very important thing that even experienced marketers still get wrong. It may be the case that your content is good and relevant to your users, but does not match the intent that Google is showcasing in the SERP for the keywords of focus.

As SEOs, our aim should be to match user intent, which means we first need to understand the what and the who before defining the how. Whose intent we are targeting and what is represented in the SERP will define the strategy we use to get there.

Example: webmasters who hope to rank for a “navigational or informational” keyword with a transactional page, or vice versa.

Your content isn’t in the ideal format Google is prioritizing

Google may be favoring a certain type of format which your content doesn’t conform to, hence it isn’t receiving the expected visibility.

Example: you hope to rank with a text-heavy blog post for a “how to” keyword where Google is prioritizing video content.

Your content is way too “thin” compared to what is ranking

It doesn’t necessarily have to be a matter of content length (there is no proven content length formula out there, trust me) but more relevance and comprehensiveness. It may be the case that your content is simply not as compelling as other sites out there, hence Google prioritizing those over you.

Example: you hope to rank for heavily competitive informational keywords with a 200-word blog post.

Your content isn’t as up-to-date

If your content is very topical, and such a topic heavily depends on information which may change with time, then Google will reward sites that put effort into keeping the content fresh and up-to-date. Apart from search engines themselves, users really care about fresh content — no one wants to read an “SEO guide to improve underperforming content” that was created in 2015!

Example: certain subjects/verticals tend to be more prone to this issue, but generally anything related to regulations/laws/guidelines which tend to change often.

Your content is heavily seasonal or tied to a past event/experience

Self-explanatory: if your content is about something that occurred in the past, generally the interest for that particular subject will gradually decrease over time. There are exceptions, of course (god save the 90s and my fav Netflix show “The Last Dance”), but you get the gist.

Example: topics such as dated events or experiences (Olympics 2016, past editions of Black Friday, and so on) or newsworthy content (2016 US election, Kanye running for president — no wait that is still happening...).

Your tech directives have changed the page’s indexation status

Something may have happened to your page that made it fall out of Google’s index. The most common issues include: an unexpected no-index tag, a canonical tag pointing elsewhere, incorrect hreflang tags, a change in page status, the page being removed with Google Search Console’s removal tool, and so on.

Example: after some SEO recommendations, your devs mistakenly put a no-index tag on your page without you realizing.

Your page is victim of duplication or cannibalization

If you happen to cover the same or similar keyword topic with multiple pages, this may trigger duplication and/or cannibalization, which ultimately will result in a loss of organic visibility.

Example: you launch a new service page alongside your current offerings, but the on-page focus (metadata, content, linking structure) isn’t different or unique enough and it ends up cannibalizing your existing visibility.

Your page has been subject to JavaScript changes that make the content hard to index for Google

Let’s not go into a JavaScript (JS) rabbit hole and keep it simple: if some JS stuff is happening on your page and it’s dynamically changing some on-page SEO elements, this may impact how Google indexes your content.

Example: a fictitious case where your site goes through a redesign, heavy JS now runs in the browser and changes a key part of your content in a way that Google cannot easily render — that is a problem!

Your page has lost visibility following drastic SERP changes

The SERP has changed extensively in the last few years, which means many features that are present now weren’t there before. This may cause disruption to previous rankings (and hence to your previous CTR), or make your pages fall out of Google’s precious page one.

Also, don’t forget to consider that the competition might have gotten stronger with time, so that could be another reason why you lose significant visibility.

Example: some verticals have been impacted more than others (jobs, flights, and hotels, for instance), where Google’s own snippets and tools now occupy the top of the SERP. If you are as obsessed with SERP changes, and in particular PAA, as I am and want more details, have a read here.

Your content doesn’t have any backlinks

Without going into too much detail on this point — it could be a separate blog post — for very competitive commercial terms, not having any/too few backlinks (and what backlinks represent for your site in Google’s eyes) can hold you back, even if your page content is compelling on its own. This is particularly true for new websites operating in a competitive environment.

Example: for a challenging vertical like fashion, for instance, it is extremely difficult to rank for key head terms without a good amount of quality (and naturally gained) backlinks to support your transactional pages.

How to find the issues affecting your content

We’ve covered the why above, so let’s now address the how: how to determine which issue is affecting your page/content. This part is especially dedicated to a not-too-savvy SEO audience (skip it and go straight to the next section if you are after the how-to recommendations).

I’ll go through a list of checks that can help you detect the issues listed above.

Technical checks

Google Search Console

Use the URL inspection tool to analyze the status of the page: it can help you answer questions such as:

  • Has my page been crawled? Are we even allowing Google to crawl the page?
  • Has my page been indexed? Are we even allowing Google to index the page?

By assessing the Coverage feature, you can see the information Google shares about the crawlability and indexability of the page.


Pay particular attention to the Indexing section, where they mention user-declared canonical vs. Google-selected canonical. If the two differ, it’s definitely worth investigating the reason, as this means Google isn’t respecting the canonical directives placed on the page — check official resources to learn more about this.
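
If you prefer scripting these spot checks, here is a minimal sketch using the requests and BeautifulSoup libraries (the URL is a placeholder) that reports a page's HTTP status, meta robots tag, and user-declared canonical. Keep in mind the Google-selected canonical is only visible in Search Console.

```python
# Minimal sketch: fetch a page and report its HTTP status, meta robots tag,
# and user-declared canonical. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-page/"
resp = requests.get(url, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

robots = soup.find("meta", attrs={"name": "robots"})
canonical = soup.find("link", rel="canonical")

print("HTTP status:       ", resp.status_code)
print("Meta robots:       ", robots["content"] if robots else "not set")
print("Declared canonical:", canonical["href"] if canonical else "not set")
```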

Chrome extensions

I love Chrome extensions and I objectively have way too many on my browser…

Some Chrome extensions can give you lots of info on the indexability status of the page with a simple click, checking things like canonical tags and meta robots tags.

My favorite extensions for this matter are:

JavaScript check

I’ll keep it simple: JavaScript is key in today’s environment as it adds interactivity to a page. By doing so, it may alter some key HTML elements that are very important for SEO. You can easily check how a page would look without JS by using this convenient tool by Onely: WWJD.

Realistically speaking, you need only one of the following tools in order to check whether JavaScript might be a problem for your on-page SEO:

All the above tools are very useful for any type of troubleshooting, as they showcase the rendered-DOM resources in real time (different from what the “view-source” of a page looks like).

Once you’ve run the test, click to see the rendered HTML and try and do the following checks:

  • Is the core part of my content visible?
    • Quick way to do so: pick a sentence from your content, then press CTRL + F and search for it to see if it’s present in the rendered version of the page.
  • Are internal links visible to Google?
    • Quick way to do so: find an internal link on the page, then press CTRL + F and search for its anchor text to see if it’s present in the rendered version of the page.
  • Can Google access other key elements of the page?
    • Check for things such as headers (example below with a Brainlabs article), products, pagination, reviews, comments, etc.
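
If you would rather script these checks than run them by hand, here is a minimal sketch using Playwright for Python (any headless browser would work; the URL and sentence are placeholders) that tests whether a snippet of content appears in the raw source versus the rendered DOM.

```python
# Minimal sketch: is a given sentence present in the raw HTML source,
# and is it still there after JavaScript rendering?
# Requires: pip install playwright && playwright install chromium
# The URL and sentence are placeholders.
import requests
from playwright.sync_api import sync_playwright

url = "https://example.com/some-page/"
sentence = "a sentence copied from your page content"

raw_html = requests.get(url, timeout=10).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print("In raw source:  ", sentence.lower() in raw_html.lower())
print("In rendered DOM:", sentence.lower() in rendered_html.lower())
```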

Intent and SERP analysis

By analyzing the SERP for key terms of focus, you’ll be able to identify a series of questions that relate to your content in relation to intent, competition, and relevance. All major SEO tools nowadays provide you with tons of great information about what the SERP looks like for whatever keyword you’re analyzing.

For the sake of our example, let’s use Ahrefs and the sample keyword below is “evergreen content”:

Based on this example, these are a few things I can notice:

  • This keyword triggers a lot of interesting SERP features (Featured Snippet, Top Stories, People also ask)
  • The top organic spots are owned by very established and authoritative sources (the Ahrefs blog, HubSpot, WordStream, etc.), which makes this keyword quite difficult to compete for

Here are quick suggestions on what types of checks I recommend:

  • Understand and classify the keyword of analysis, based on the type of results Google is showing in the SERP: any ads showing, or organic snippets? Are the competing pages mainly transactional or informational?
  • Check the quality of the sites ranking on page one: indicative metrics such as domain quality scores (DA/DR), the number of keywords those pages are visible for, and the estimated traffic per page can help you gauge the strength of each competitor.
  • Do a quick crawl of these pages to bulk-check the comprehensiveness of their content and metadata, or manually check a few if you prefer.

By doing most of these checks, you’ll be able to see if your content is underperforming for any of the reasons previously mentioned:

  • Content not compelling enough compared to what is ranking on page one
  • Content in the wrong format compared to what Google is prioritizing
  • Content is outdated, seasonal, or tied to a past event
  • Content is being overshadowed by SERP features

Duplication and cannibalization issues

Check out my 2019 post on this subject, which goes into a lot more detail. The quick version of the post is below.

Use compelling SEO tools to understand the following:

  • Whether, for tracked keywords of interest, two or more ranking URLs have been flip-flopping. That is a clear sign that search engines are confused and cannot “easily decide” which URL to rank for a certain keyword.
  • Whether, for tracked keywords of interest, two or more ranking URLs are appearing at the same time (not necessarily on page one of the SERP). That is a clear signal of duplication/cannibalization.
  • Check your SEO visibility by landing page: if different URLs rank for very similar keyword permutations, chances are there is a cannibalization risk (see the sketch after this list).
  • Last but not least: do a simple site search for keywords of interest to get an initial idea of how many pages covering a certain topic have been indexed by Google. This is an insightful preliminary exercise, and also useful to validate your worries.
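
For the first couple of checks above, a minimal pandas sketch like the one below can surface keywords with more than one ranking URL. The column names ("Keyword", "Ranking URL") and file name are placeholders you would swap for whatever your rank-tracking or Search Console export uses.

```python
# Minimal sketch: flag keywords for which more than one URL is ranking.
# "Keyword" and "Ranking URL" are placeholder column names; adjust them
# to match your rank-tracking or Search Console export.
import pandas as pd

df = pd.read_csv("tracked-keywords.csv")  # placeholder file name

urls_per_keyword = df.groupby("Keyword")["Ranking URL"].nunique()
possible_cannibalization = urls_per_keyword[urls_per_keyword > 1].sort_values(ascending=False)

print(possible_cannibalization.head(20))
```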

How to fix underperforming content

We’ve covered the most common cases of underperforming content and how to detect such issues — now let’s talk about ways to fix them.

Below is a list of suggested actions to take when improving your underperforming content, with some very valuable links to other resources (mostly from Moz or Google) that can help you expand on individual concepts.

Make sure your page can be crawled and indexed “properly”

  • Ensure that your page does not fall under any path blocked by robots.txt.
  • Ensure your page does not carry a no-index meta robots tag or a canonical tag pointing elsewhere (a self-referencing canonical tag is worth considering, but not compulsory).
  • Check whether other pages have a canonical tag pointing to your URL of focus. Irrelevant or poorly-done canonical tags tend to get ignored by Google — you can check if that is the case in the URL Inspection tool.
  • Ensure your site (not just your page) is free from any non-SEO friendly JavaScript that can alter key on-page elements (such as headers, body content, internal links, etc.).
  • Ensure your page is linked internally on the site and present in your XML sitemap.
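
A couple of the checks in this list are easy to script as well. This minimal sketch (placeholder URLs) uses Python's built-in robots.txt parser plus a plain-text search of the XML sitemap:

```python
# Minimal sketch: is the URL blocked by robots.txt, and does it appear
# in the XML sitemap? The URLs are placeholders.
import requests
from urllib.robotparser import RobotFileParser

site = "https://example.com"
url = site + "/some-page/"

rp = RobotFileParser()
rp.set_url(site + "/robots.txt")
rp.read()
print("Crawlable by Googlebot:", rp.can_fetch("Googlebot", url))

sitemap_xml = requests.get(site + "/sitemap.xml", timeout=10).text
print("Listed in sitemap.xml: ", url in sitemap_xml)
```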

Understand search intent

  • Search intent is a fascinating topic in and of itself, and there are a lot of great resources on the subject if you want to delve deeper into it.
  • Put simply, you should always research what the SERP looks like for the topic of interest: by analyzing the SERP and all its features (organic and non), you can get a much better understanding of what search engines are looking for in order to match intent.
  • By auditing the SERP, you should be able to answer the following questions:
    • What type of content is Google favoring here: transactional, navigational, informational?
    • How competitive are the keywords of focus and how authoritative are those competitors ranking highly for them?
    • What content format is Google showcasing in the SERP?
    • How comprehensive should my content be to get a chance to rank in page one?
    • What keywords are used in the competitor’s metadata?
    • What organic features should I consider addressing with my content (things like featured snippets, people also ask, top images, etc.)?
  • Hopefully all the questions above will also give you a realistic view of your chances of ranking on Google’s first page. Don’t be afraid to switch your focus to PPC for some very competitive keywords where your real possibility of ranking organically is slim.

Map your pages against the right keywords

  • This is a necessary step to make sure you have a clear understanding of not only what keywords you want to rank for, but also what keywords you are eligible to rank for.
  • Don’t overdo it and be realistic about your ranking possibilities: mapping your page against several keyword variations, all of which show very different SERPs and intents, is not realistic.
  • My suggestion is to pick two or three primary keyword variations and focus on getting your content as relevant as possible to those terms.

Write great metadata

  • Title tags are still an incredibly important on-page ranking factor, so dedicate the right time when writing unique and keyword-rich titles.
  • Meta descriptions are not a ranking factor anymore, but they still play a part in enticing the user to click on a search result. So from a CTR perspective, they still matter.
  • SEO keyword research is the obvious choice to write compelling metadata, but don’t forget about PPC ad copies — check what PPC ad copies work best for your site and take learnings from them.
  • Don’t change metadata too often, though: make sure you do your homework and give enough time to properly test new metadata, once implemented.

Make the right content amends

  • Based on the intent audit and keyword mapping insights, you’re now ready to work on your actual page content.
  • By now, you’ve done your homework, so you just need to focus on writing great content for the user (and not for Google).
  • Readability is a very important part of a page. Tricks that I’ve learned from colleagues over the years are the following:
    • Read the content out loud and try to objectively assess how interesting it is for your target audience.
    • Make sure to use enough spacing between lines and paragraphs. People’s attention span these days is very short, and chances are people will skim through your content rather than dedicating 100% of their attention to it (I’m sure some of YOU readers are doing it right now!).
    • Make sure your tone of voice and language match your target audience (if you can write things in plain English vs. highly technical jargon, do so and don’t over-complicate your life).
  • Make sure you’ve thought about all internal linking possibilities across the site. Not only for the same type of page (transactional page to transactional page, for instance) but also across different types (transactional page to video/blog post, if that helps people make a decision, for example).
  • Optional step: once everything is ready, request indexing of your page in Google Search Console with the URL inspection tool.

Final thoughts

Underperforming content is a very common issue and should not take you by surprise, especially considering that content is regarded as one of the most important (if not the most important) ranking factors in 2020. With the right tools and process in place, solving this issue is something everyone can learn: SEO is not black magic, and the answer tends to be logical.

First, understand the cause(s) for your underperforming content. Once you’re certain you’re compliant with Google’s technical guidelines, move on to determining what intent you’re trying to satisfy. Your research on intent should be comprehensive: this is what’s going to decide what changes you’ll need to make to your content. At that point, you’ll be ready to make the necessary SEO and content changes to best match your findings.

I hope this article is useful! Feel free to chat about any questions you may have in the comments or via Twitter or LinkedIn.


To help us serve you better, please consider taking the 2020 Moz Blog Reader Survey, which asks about who you are, what challenges you face, and what you'd like to see more of on the Moz Blog.

Take the Survey


Friday, September 25, 2020

Accessible Machine Learning for SEOs — Whiteboard Friday

Posted by BritneyMuller

Machine learning — a branch of artificial intelligence that studies the automatic improvement of computer algorithms — might seem far outside the scope of your SEO work. MozCon speaker (and all-around SEO genius) Britney Muller is here with a special edition of Whiteboard Friday to tell you why that's not true, and to go through a few steps to get you started. 

To see more on machine learning from Britney and our other MozCon 2020 speakers, check out this year's video bundle. 

Get my MozCon 2020 video bundle

Accessible Machine Learning


Video Transcription

Hey, Moz fans. Welcome to this special edition of Whiteboard Friday. Today we are taking a sneak peek at what I spoke about at MozCon 2020, where I made machine learning accessible to SEOs everywhere.

This is so, so exciting because it is readily at your fingertips today, and I'm going to show you exactly how to get started. 

So to kick things off, I learned about this weird concept called brood parasites this summer, and it's fascinating. It's basically where one animal tricks another animal of the same species to raise its young.

It's fascinating, and the more I learned about it, the more I realized: oh my gosh, I'm sort of like a brood parasite when it comes to programming and machine learning! I latch on and find these great models that do all the work — all of the raising — and I put in my data and my ideas, and it does things for me.

So we are going to use this concept to our advantage. In fact, I have been able to teach my dad most of these models that, again, are readily available to you today within a tool called Colab. Let me just walk you through what that looks like. 

Models to get you started

So to get started, if you want to start warming up right now, just start practicing clicking "Shift" and then click "Enter".

Just start practicing that right now. It's half the battle. You're about to be firing up some really cool models. 



All right. What are some examples of that? What does that look like? So some of the models you can play with today are things like DeOldify, which is where you repair and colorize old photos. It's really, really fun. 

Another one is a text generator. I created one with GPT-2 — super silly, it's this excuse generator. You can manipulate it and make it do different things for you. 

There's also a really, really great forecasting model, where you basically put in a chunk of time series data and it predicts what the future might have in store. It's really, really powerful and fun.

You can summarize text, which is really valuable. Think about meta descriptions, all that good stuff. 

You can also automate keyword research grouping, which I'll show you here in a second. 

You can do really powerful internal link analysis, set up a notebook for that.

Perhaps one of the most powerful things is you can extract entities and categories as Google perceives them. It's one of my favorite APIs. It's through Google's NLP API. I pull it into a notebook, and you basically put the URLs you want to extract this information from and you can compare how your URL compares to competitors.

It's really, really valuable, fun stuff. So most importantly, you cannot break any of this. Do not be intimidated by any of the code whatsoever. Lots of seasoned developers don't know what's happening in some of those code blocks. It's okay.

Using Colab

We get to play in this environment. It's hosted in Google Drive, and so there's no fear of this breaking anything on your computer or with your data or anything. So just get ready to dive in with me. Please, it's going to be so much fun. Okay, so like I said, this is through a free tool called Colab. So you know how Google basically took Excel and made Google Sheets?

They did the same thing with what's known as Jupyter Notebooks. So these were locally on computers. It's one of the most popular notebook environments. But it requires some setup, and it can be somewhat clunky. It gets confused with different versions and yada, yada. Google put that into the cloud and is now calling it Colab. It's unbelievably powerful.

So, again, it's free. It's available to you right now if you want to open it up in a new tab. There is zero setup. Google also gives you access to free GPU and TPU computing, which is great. It has a 12-hour runtime. 

One con is that you can hit limits. So I hit the limits, and now I'm paying $9.99 a month for the Pro version and I've had no problems.

Again, I'm not affiliated with this whatsoever. I'm just super passionate about it, and the fact that they offer you a free version is so exciting. I've already seen a lot of people get started in this. It's also something to note that it's probably not as secure or robust as Google's Enterprise solution. So if you're doing this for a large company or you're getting really serious about this, you should probably check out some other options. But if you're just kind of dabbling and want to explore and have fun, let's keep this party going. 

Using pandas

All right. So again, this is basically a cloud hosted notebook environment. So one thing that I want to really focus on here, because I think it's the most valuable for SEOs, is this library known as "pandas".

Pandas is a data frame library, where you basically run one — or two — lines of code. You can choose your file from your local computer, so I usually just upload CSVs. This silly example is one that I really did run with Google Search Console data.

So you run this in a notebook. Again, I'm sharing this entire notebook with you today. So if you just go to it and you do this, it brings you through the cells. It's not as intimidating as it looks. So if you just click into that first cell, even if it's just that text cell, "Shift + Enter", it will bring you through the notebook. 


So once you get past and once you fire up this chunk of code right here, upload your CSV. Then once you upload it, you are going to name your data frame. 


So these are the only two cells you need to really change or do anything with if you want. Well, you need to. 

So we are uploading your file, and then we are grabbing that file name. In this case, mine was just "gsc-example.csv". Again, once you upload it, you will see the name in that output here. So you just put that within this code block, run this, and then you can do some really easy lines of code to check to make sure that your data is in there.
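
For reference, a minimal version of the two cells being described — uploading a file and loading it into a data frame — looks roughly like this. This is a rough reconstruction rather than the exact notebook code, using the file name from the example above.

```python
# Cell 1: upload a CSV from your computer into the Colab environment.
from google.colab import files
import io
import pandas as pd

uploaded = files.upload()  # opens a file picker in the notebook output

# Cell 2: load the uploaded file into a pandas data frame.
df = pd.read_csv(io.BytesIO(uploaded["gsc-example.csv"]))
```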


So one of the first ones that most people do is "df". This is your data frame that you named with your file right here. So you just do "df.head()". This shows you the first five rows of your data frame. You can also do "df.tail()", and it shows you the last five rows of your data frame.

You can even put in a number in here to modify how many rows you want to explore. So maybe you do "df.head(30)", and then you see the first 30 rows. It's that easy just to get it in there and to see it. Now comes the really fun stuff, and this is just tip of the iceberg.

So you can run this really, really cool code cell here to create a filterable table. What's powerful about this, especially with your Google Search Console data, is you can easily extract and explore keywords that have high click-through rate and a low ranking in search. It's one of my favorite ways to explore keyword opportunities for clients, and it couldn't be easier.

So check that out. This is kind of the money part right here. 
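
As a rough illustration, a filter like the one described — decent click-through rate but a weak average position — can be a couple of lines of pandas. The column names assume a standard Search Console export, and the thresholds are arbitrary examples.

```python
# Minimal sketch: queries with a decent click-through rate but a weak
# average position. Column names assume a standard Search Console export;
# the thresholds are arbitrary examples.
import pandas as pd

df = pd.read_csv("gsc-example.csv")
# If CTR was exported as a string like "2.5%", convert it to a number first:
# df["CTR"] = df["CTR"].str.rstrip("%").astype(float) / 100

opportunities = df[(df["CTR"] > 0.02) & (df["Position"] > 10)]
print(opportunities.sort_values("Position").head(20))
```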

If you're doing keyword research, which can take a lot, right, you're trying to bucket keywords, you're trying to organize topics and all that good stuff, you can instantly create a new column with pandas with branded keyword terms.

So just to walk you through this, we're going "df["Branded"]". This is the name of the new column we're going to create. We have this query string "contains," and this is just regex, ("moz|rand|ose"). So any keywords that contain one of those words gets in the "Branded" column a "True".

So now that makes filtering and exploring that so much faster. You can even do this in ways where you can create an entirely different data frame table. So sometimes if you have lots and lots of data, you can use the other cell in that example. All of these examples will be in the notebook.

You can use that and export your keywords into buckets like that, and there's no stall time. Things don't freeze up like Excel. You can account for misspellings and all sorts of good stuff so, so easily with regular expressions. So super, super cool.
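
Spelled out, the branded-column step described above is essentially a one-liner. The "Query" column name is an assumption based on a typical Search Console export.

```python
# The branded-keyword column described above: True for any query containing
# "moz", "rand", or "ose". The "Query" column name is an assumption.
import pandas as pd

df = pd.read_csv("gsc-example.csv")
df["Branded"] = df["Query"].str.contains("moz|rand|ose", case=False, na=False)

branded = df[df["Branded"]]        # branded bucket
non_branded = df[~df["Branded"]]   # everything else
print(branded.head())
```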

Conclusion

Again, this is just tip of the iceberg, my friends. I am most excited to sort of plant this seed within all of you so that you guys can come back and teach me what you've been able to accomplish. I think we have so much more to explore in this space. It is going to be so much fun. If you get a kick out of this and you want to continue exploring different models, different programs within Colab, I highly suggest you download the Colab Chrome extension.

It just makes opening up the notebook so much easier. You can save a copy to your drive and play with it all you want. It's so much fun. I hope this kind of sparked some inspiration in some of you, and I am so excited to hear what all of you think and create. I really appreciate you watching.

So thank you so much. I will see you all next time. Bye.

Video transcription by Speechpad.com


Ready for more?

You'll uncover even more SEO goodness in the MozCon 2020 video bundle. At this year's special low price of $129, this is invaluable content you can access again and again throughout the year to inspire and ignite your SEO strategy:

  • 21 full-length videos from some of the brightest minds in digital marketing
  • Instant downloads and streaming to your computer, tablet, or mobile device
  • Downloadable slide decks for presentations

Get my MozCon 2020 video bundle



Tuesday, September 22, 2020

How to Turn One Piece of Content into Multiple for SEO

Posted by liambbarnes

As most SEO specialists have learned, you must create quality content to grow organically. The same thing can be said for businesses that are building a social media presence or a new newsletter following.

But as people consume more and more content each day, they become less receptive to basic content that doesn't provide a new perspective. To counter this issue, you must make sure that your content is native to each platform you publish on.

However, that doesn’t mean that you need to start from scratch. There's a way to take one content idea and turn it into multiple, which can scale across multiple platforms and improve your brand awareness.

It takes time to write a brand-new blog article every day, especially when you're an in-house team with limited resources and budget. The biggest challenge here is building a content strategy at scale.

So, how do you create a lot of great content?

You start with video.

If you have a video on a relevant topic, it can be repurposed into various individual pieces of content and distributed over a period of time across the right channels. Let’s walk through the process.

Using video to scale content

Did you know that the average person types at 41 words per minute (WPM), but speaks at about 150 WPM? That means speaking is roughly 3.5 times faster than typing. 

In fact, this article was transcribed.

For every article you write about, you must do extensive research, write out your first draft, edit, make changes, and more. It can consume an entire workday.

An easier way to do this? Record yourself on Loom or another video software, save it, and send the video file to an audio/video transcription service. There are so many tools, like Rev.com or TranscribeMe, that do this for relatively cheap.

Of course, even if you're relying on speech-to-text, there's still editing time to take into account, and some would argue it will take MORE time to edit a speech-to-text transcription. There isn’t a “best way” to create content; however, for those who aren't strong writers but are strong speakers, transcription will be a powerful way to move at a quicker pace.

The step-by-step process 

Once you write out your content, how do you ensure that people read it?

Like any other content strategy, make sure that the process of planning, creating, and executing is written down (most likely digitally in a spreadsheet or tracking tool) and followed.

Let’s break down how to get the most out of your content.

1. Grab attention with your topic

Sometimes, content ideation can be the most challenging part of the process. Depending on the purpose of your content, there are various starting points.

For example, if you're writing a top-of-funnel blog article where the goal is to drive high amounts of organic traffic, start by performing keyword research to craft your topic. Why? You need to understand what your audience searches for and how to ensure you’re in the mix of search results. 

If you're creating a breakdown of your product or service, you may want to start by interviewing a subject matter expert (SME) to gain real-life details on the product/service and the solutions it provides to your target audience. Why? Note what they’re saying are the most important aspects or if there is a new feature/addition for the audience. These points can be tied into a topic that might pique the target reader's interest.

2. Create an outline for the blog

When you're building out your blog structure, record a video similar to how you would write a blog article.

In this case, if you create an outline for the article with the questions you'll ask yourself, it will be easier to format the transcription and the blog post after you record.

3. Pick your poison (distribution strategy)

Now that you're ready to begin recording your video, decide where your content will be distributed.

The way you'll distribute your content heavily influences the way you record your video, especially if you're going to be utilizing the video as the content itself (Hello, YouTube!).

For example, if you run a business consultancy, the videos that you record should be more professional than if you run an e-commerce surf lifestyle brand. Or, if you know you’re going to be breaking the video up, leave time for natural “breaks” for easy editing later on.

By planning ahead of time, you give yourself a better idea of where the content will go, and how it will get there.

4. Your time to shine

There are numerous free video recording tools available, including Zoom and Loom.

With Zoom, you can record a video of yourself speaking into your camera, and you'll get an audio file after you end the call.

With Loom, you can use the Chrome extension, which allows you to record yourself on video while sharing your screen. If you have additional content, like a PowerPoint presentation or a walk-through, this might be the tool for you.

Regardless of the way that you record, you need an audio file to transcribe and transform into other content formats later on.

5. Transcribe your video

The average writer transcribes one hour of audio in around four hours, but some of the best transcribers can do it in as little as two hours.

To put that into perspective, the average one-hour audio file is about 7,800 words, which would take the average writer around three and a half hours to write.

Additionally, you have to add research time, internal linking, and many other factors on top of that, so on average it takes around an hour to write 1,000 words of a high-quality blog post.

Transcription shortens the length of this process.

When looking to transcribe your audio, you can send files out to transcription services such as Rev or TranscribeMe. Once you send them the audio file, you'll typically receive the transcript back within a few hours (depending on demand).
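
If you'd rather script a rough first pass yourself before (or instead of) sending files out, here's a minimal sketch using the open-source SpeechRecognition Python package. The file name is a placeholder, and a human service like the ones above will generally return cleaner text.

    # Minimal DIY transcription sketch (assumes: pip install SpeechRecognition)
    import speech_recognition as sr

    recognizer = sr.Recognizer()

    # "interview.wav" is a placeholder; AudioFile expects WAV/AIFF/FLAC
    with sr.AudioFile("interview.wav") as source:
        audio = recognizer.record(source)  # read the whole file into memory

    # Free Google Web Speech API; fine for short clips, but chunk long
    # recordings or use a paid service for hour-long interviews
    print(recognizer.recognize_google(audio))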

6. Alter transcription into blog format

You'll receive the transcribed content via email, broken out by speaker. This makes it much easier to format post-transcription.

If you properly outlined the blog prior to recording, then this editing process should be simple. Copy and paste each section into the desired area for your blog and add your photos, keywords, and links as desired.

7. Chop your video into digestible parts

Here’s where things get interesting.

If you're using your video for social media posts, shorten the video into multiple parts to be distributed across each platform (and make sure they’re built to match each platform's guidelines).

Additionally, quotes from the video can be used to create text graphics, text-based social posts, or entire articles themselves.

Think of the watering holes where your target audience consumes information on the internet:

  • Google
  • LinkedIn
  • Instagram
  • Facebook
  • Twitter
  • YouTube

Each platform requires creating a different experience that involves new, native content. But that doesn’t mean you have to start at zero.

If you have a 10-minute-long video, it can be transcribed into a blog post of roughly 1,300 to 1,500 words (at the speaking rates above) that takes around six minutes to read.

Boom. You have another resource to share, which can also include proper keywords so it ranks higher on the SERP.

Let’s say you end up editing the video down to about five minutes. From here, you can make:

  • A five-minute video to post on YouTube and your blog
  • Ten 30-second videos to post across several social media platforms
  • Twenty 100-word posts on LinkedIn
  • Thirty 50 to 60-word posts on Twitter

Woah.

Not to mention there are other platforms, like Reddit and Quora, as well as email marketing, where you can also distribute your content. (Turn one of the 100-word LinkedIn posts into the opening of your latest newsletter, and attach the full video for those who want to learn more!)

By starting off with an all-encompassing video, you extend your content capabilities from a regular blog article into 50+ pieces of content across multiple social media platforms and search engines.

For example, Lewis Howes (like many other brands and marketers) is famous for using this method.

Howes had an interview for his podcast with Mel Robbins, which he scaled across YouTube and podcast platforms; he then took a quote from her interview and repurposed it across Instagram, Twitter, and LinkedIn.

When you build out your content calendar, simply copy and paste certain sections into an Excel spreadsheet and organize them by date and platform. Make sure they make sense on the platform, add an extra line or two if needed, and work your magic.

This will save you hours of time in your planning process.

8. Distribute

Now that you have created your various forms of content, it’s time to make sure it appears before the right eyes.

Having a consistent flow of relevant content on your website and social media platforms is a crucial part of empowering your brand, building credibility, and showing that you’re worth trusting as a potential partner.

As you repurpose older content as well, you can repeat this process and pull together another 50+ pieces of content from a previously successful article.

Improving organic search visibility

"Discoverability" is a popular term in marketing. Another way to say it is "organic search visibility". Your brand’s search visibility is the percentage of clicks that your website gets in comparison to the total number of clicks for that particular keyword or group of keywords.

Normally, you can improve your visibility by writing a piece of content that best reflects a target keyword and building links to that page, which improves your rankings for that keyword and its long-tail variations.

However, as you begin to grow your business, you may begin heavily relying on branded search traffic.

In fact, one of the biggest drivers of organic traffic is branded traffic. If you don't have an authoritative brand, it's challenging to receive backlinks naturally, and therefore more difficult to rank organically.

One of the biggest drivers of brand awareness is social media. More than 4.5 billion people use the internet, and 3.8 billion use social media.

If you want more people to search for your brand, push relevant social media campaigns that do just that.

But even further than that, we're seeing more and more social media platforms, such as Pinterest, YouTube, and Twitter, showing up in search results and snippets. For example, the SERP for the keyword “how to make cookies” includes a series of YouTube videos.

And this SERP for the keyword “Moz“ has the most recent Tweets from Moz's Twitter.

Writing content that ranks will continue to be important — but as Google keeps integrating other forms of social media into the SERPs, make time to post on every social media platform to improve search visibility and make your brand discoverable. 

But, duplicate content?

Duplicate content can be defined as the same content used across multiple URLs, and can be detrimental to your website’s health. However, from what we have seen through multiple conversations with marketers in the SEO world, there is no indication that websites are getting penalized for duplicate content when reposting said content on social media platforms.

Conclusion

Say goodbye to the time drain of creating one piece of content at a time. The most effective way to create a successful content marketing strategy is to share thought-provoking and data-driven content. Take advantage of this process to maximize your output and visibility.

Here are some final tips to take away to successfully launch a content marketing strategy, using this method:

  1. Consistently analyze your results and double down on what works.
  2. Don’t be afraid to try new tactics to see what your audience is interested in (Check out a real-world content strategy I helped get results for here).
  3. Analyze the response from your audience. They'll tell you what is good and what is not!

Have other ideas? Let me know in the comments! 



Monday, September 21, 2020

Search Intent and SEO: A Quick Guide

Posted by DawnMacri

Understanding search intent can be the secret ingredient that takes your content strategy from okay to outstanding. At the digital marketing agency where I work as an SEO Strategist (Brainlabs), we often find clients on the brink of ranking success. They’re sitting on stellar content that simply isn’t ranking for their target keywords. Why? Oftentimes, the keywords and the intent simply don’t match.

Here we’ll discuss the different types of search intent, how to determine the best intent for given keywords, and how to optimize for search intent. First, let’s iron out the basics.

What is search intent?

Search intent (also known as user intent) is the primary goal a user has when searching a query in a search engine. Many times, users are searching for a specific type of answer or resource as they search.

Take pizza for example. Searching for a pizza recipe has a different intent than searching for a takeout pizza, which is also different from searching for the history of pizza. Though they all revolve around the same overall topic (pizza), these users all have different intents.

Why is search intent important for SEO?

Google cares about search intent

The short answer is: Satisfying search intent is a primary goal for Google, which in turn makes it a primary goal for SEOs. When a user searches for a specific term and finds irrelevant information, that sends a signal back to Google that the intent is likely mismatched.

For example, if a user searches “How to build a website,” and they’re shown a slew of product pages for CMS platforms and hosting sites, they’ll try another search without clicking on anything. This is a signal to Google that the intent of those results does not reflect the intent of the searcher.

Broaden your reach across funnel stages

When it comes to running a business and building a successful content marketing strategy, I can’t stress enough the importance of remembering search intent, and letting that be the driving force behind the pieces of content you create and how you create them.

And just why is this so important? The more specific your content is to various search intents, the more users you can reach, and at different stages of the funnel. From those who have yet to discover your brand to those looking to convert, you can increase your chances of reaching them all by focusing your efforts on matching search intent.

You can improve rankings

Since Google’s primary ranking factors are relevance, authority, and user satisfaction, it’s easy to connect the dots and see how improving your keyword targeting to mirror search intent can improve your overall rankings.

Relevance: This has to do with your user’s behavior. If they find the information they’re looking for on your site, they’re less likely to return to Google within seconds and explore a different result (pogo-sticking). You’ll notice a difference in such KPIs as click-through rate and bounce rate when your content is relevant to search intent.

Authority: While much of a site’s authority is connected to backlinks, to rank well it’s also important to develop a strong internal linking strategy that signals to Google, “I have a lot of content covering all angles and intents surrounding this topic.” Additionally, you can increase brand authority and visibility by creating valuable content that satisfies various intents around topics your brand is well versed in.

User satisfaction: Does the content you create provide value and is it relevant to your audience? End of story.

Types of search intent

While there are endless search terms, there are just four primary search intents:

  1. Informational
  2. Preferential/Commercial Investigation
  3. Transactional
  4. Navigational

Now you may be thinking: that’s all well and good, but what do they mean for my content? Luckily, I’ve broken each one down with example terms that suggest intent. Keep in mind, however, that searches are not binary: many will fall under more than one category.

Informational

As you may have guessed, searches with informational intent come from users looking for... information! This could be in the form of a how-to guide, a recipe, or a definition. It’s one of the most common search intents, as users can look for answers to an infinite number of questions. That said, not all informational terms are questions. Users searching for simply “Bill Gates” are most likely looking for information about Bill Gates.

Examples:

  • How to boil an egg
  • What is a crater
  • Ruth Bader Ginsburg
  • Directions to JFK Airport

Preferential/Commercial Investigation

Before they’re ready to make a purchase, users start their commercial investigation. This is when they use search to investigate products, brands, or services further. They’re past the informational stage of their research and have narrowed their focus to a few different options. Users here are often comparing products and brands to find the best solution for them.

Note: These searches often include non-branded localized terms such as “best body shop near me” or “top sushi restaurant NYC.”

Examples:

  • Semrush vs Moz
  • Best website hosting service
  • Squarespace reviews
  • Wordpress or wix for blog

Transactional

Transactional searchers are looking to make a purchase. This could be a product, service, or subscription. Either way, they have a good idea of what they’re looking for. Since the user is already in buying mode, these terms are usually branded. Users are no longer researching the product, they’re looking for a place to purchase it.

Examples:

  • Buy Yeti tumbler
  • Seamless coupon
  • Shop Louis Vuitton bags
  • Van’s high tops sale

Navigational

These searchers are looking to navigate to a specific website, and it’s often easier to run a quick search in Google than to type out the URL. The user could also be unsure of the exact URL or looking for a specific page, e.g. a login page. As such, these searches tend to be brand or website names and can include additional specifications to help users find an exact page.

Examples:

  • Spotify login
  • Yelp
  • MOZ beginner SEO
  • distilledU

How to determine search intent

Consider keyword modifiers

As we briefly noted above, keyword modifiers can be helpful indicators of search intent. But it’s not enough just to know the modifiers. When it comes to keyword research, how do you actually find these terms?

Thankfully, there are a range of trusted keyword research tools out there to use. Their filter features will be most useful here, as you can filter terms that include certain modifiers or phrases.

Additionally, you can filter keywords by SERP feature. Taking informational intent for example, you can filter for keywords that rank for knowledge panels, related questions, and featured snippets.
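
As a rough sketch of how you might bucket keywords by modifier before digging into a tool, here's a minimal Python example. The modifier lists are illustrative assumptions, not a definitive taxonomy, and real queries will often match more than one bucket.

    # Illustrative modifier lists -- extend these with your own research
    INTENT_MODIFIERS = {
        "informational": ["how to", "what is", "why", "guide", "tutorial"],
        "commercial": ["best", "top", "review", "vs", "comparison"],
        "transactional": ["buy", "coupon", "price", "cheap", "shop", "sale"],
        "navigational": ["login", "sign in"],  # plus your brand and product names
    }

    def classify_intent(keyword: str) -> list[str]:
        """Return every intent whose modifiers appear in the keyword."""
        kw = keyword.lower()
        matches = [intent for intent, mods in INTENT_MODIFIERS.items()
                   if any(mod in kw for mod in mods)]
        return matches or ["unclassified"]

    for kw in ["how to boil an egg", "semrush vs moz", "buy yeti tumbler", "spotify login"]:
        print(kw, "->", classify_intent(kw))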

Read the SERPs

Another way to determine search intent is to research the SERPs. Type the keyword you’re targeting into the search bar and see what Google comes up with. You’ll likely be able to tell from the types of results what Google deems the most relevant search intent for each term.

Let’s take a closer look at search results for each intent type.

SERP results for informational intent

As mentioned above, informational keywords tend to trigger SERP results that provide condensed information, including knowledge panels, featured snippets, and related questions. The top results are most likely organic and consist of Wikipedia entries, dictionary definitions, or informative blog posts.

SERP results for preferential/commercial research intent

Preferential intent is similar in that results may include a featured snippet, but they’ll also include paid results at the top of the SERP. The results will also likely provide information about the brands searched, rather than topical information.

In the example below, the organic results compare product features between competing site hosts, rather than explaining what site hosts are and how they function.

SERP results for transactional intent

Transactional SERPs are some of the most straightforward to spot. They usually lead with paid results and/or shopping results, shopping carousels, and reviews. The organic results are largely product pages from online and brick-and-mortar retailers and, depending on the search, can include maps to their locations.


SERP results for navigational intent

Since users with navigational intent already know which website they’re looking for, these results usually feature the most relevant page at the top: e.g. if the user searches “Spotify”, Spotify’s homepage will be the first result, whereas the login page will take first position for “Spotify login.”

Additional features such as site links, knowledge cards, and top stories may also be present, depending on the specific search.

Look at the full picture

Keep in mind that terms often have more than one search intent, so looking only at keywords or the SERP is rarely enough to truly define it. That said, taking this holistic approach will bring you closer to the most prominent intent.

It’s also important to note that SERPs are volatile, so while a keyword may rank for one intent this month, that could change next month.

How to optimize for search intent

Match metadata and content type to the intent

You’ve done your research and know which keywords you’re targeting with which pages. Now it’s time to optimize. A solid place to start is with your pages’ metadata: update your title tag, H1, and H2s to reflect your specific keyword targeting. To increase click-through rate, try giving your title tag some snappy copy (without creating clickbait).

Examine the competition

As with most competitions, it’s a good idea to suss out the current winners before the event. So, before jumping into creating new pages or reformatting existing content, take a look at the top-ranking pages and ask yourself the following questions:

  • How are they formatted?
  • What’s their tone?
  • Which points do they cover?
  • What are they missing?

You can now use your answers to create the best, most relevant piece of content on the topic.

Format content for relevant SERP features

Just as you used the SERP features as clues to search intent, they can also be used to inform your pages’ formatting and content. If the featured snippet contains a numbered list, for example, it’s safe to say that Google appreciates and rewards that format for that term.

In a similar vein, if the SERP returns related questions, be sure to answer those questions clearly and concisely in your content.

Key takeaways:

When creating SEO content around search intent, be sure to keep the following in mind:

  • Understand the search intent before optimizing content
  • When discovering new terms, use specific modifiers in your keyword research
  • Use the SERPs to determine optimal formatting and content options
  • Provide valuable, quality content every time

Creating SEO-optimized content for specific search intents is simple, but not easy. Follow these guidelines and you’ll be well on your way to giving users the content they need in a format that they want.

For a deeper dive on fulfilling search intent, be sure to check out this informative Whiteboard Friday from Britney Muller.



Friday, September 18, 2020

The Theory Behind Ranking Factors — Whiteboard Friday

Posted by rjonesx.

Since day one of SEO, marketers have tried to determine what factors Google takes into account when ranking results on the SERPs. In this brand new Whiteboard Friday, Russ Jones discusses the theory behind those ranking factors, and gives us some improved definitions and vocabulary to use when discussing them.



Video Transcription

Hi, folks. Welcome back to another Whiteboard Friday. Today, we're going to be talking about ranking factors and the theory behind them, and hopefully get past some of these — let's say controversies — that have come up over the years, when we've really just been talking past one another.

You see, ranking factors have been with us since pretty much day one of search engine optimization. We have been trying as SEOs to identify exactly what influences the algorithm. Well, that's what we're going to go over today, but we're going to try and tease out some better definitions and vocabulary so that we're not talking past one another, and we're not constantly beating each other over the heads about correlation and not causation, or some other kind of nuance that really doesn't matter.

Direct 

So let's begin at the beginning with direct ranking factors. This is the most narrow kind of understanding of ranking factors. It's not to say that it's wrong — it's just pretty restrictive. A direct ranking factor would be something that Google measures and directly influences the performance of the search result.

So a classic example would actually be your robots.txt file. If you make a change to your robots.txt file, and let's say you disallow Google, you will have a direct impact on your performance in Google. Namely, your site is going to disappear.
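
As a quick sanity check on that example, here's a minimal sketch using Python's built-in robotparser to confirm that a blanket Disallow really does shut Googlebot out; the example.com URL is just a placeholder.

    # Check whether a robots.txt rule blocks Googlebot (standard library only)
    from urllib import robotparser

    ROBOTS_LINES = [
        "User-agent: Googlebot",
        "Disallow: /",
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(ROBOTS_LINES)

    # Prints False: Googlebot may not fetch any page, so the site drops out of the crawl
    print(parser.can_fetch("Googlebot", "https://www.example.com/any-page/"))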

The same is true for the most part with relevancy. Now, we might not know exactly what it is that Google is using to measure relevancy, but we do know that if you improve the relevancy of your content, you're more likely to rank higher. So these are what we would call direct ranking factors. But there's obviously a lot more to it than that.

Google has added more and more features to their search engine. They have changed the way that their algorithm has worked. They've added more and more machine learning. So I've done my best to try and tease out some new vocabulary that we might be able to use to describe the different types of ranking factors that we often discuss in our various communities or online.

Indirect 

Now, obviously, if there are direct ranking factors, it seems like there should be indirect ranking factors. And these are just once-removed ranking factors or interventions that you could take that don't directly influence the algorithm, but they do influence some of the direct ranking factors which influence the algorithm.

I think a classic example of this is hosting. Let's say you have a site that's starting to become more popular and it's time to move off of that dollar-a-month cPanel hosting that you signed up for when you first started your blog. Well, you might choose to move to, let's say, a dedicated host that has a lot more RAM and CPU and can handle more threads so everything is moving faster.

Time to first byte is faster. Well, Google doesn't have an algorithm that's going out and digging into your server and identifying exactly how many CPU cores there are. But there are a number of direct ranking factors, those that are related perhaps to user experience or perhaps to page speed, that might be influenced by your hosting environment.

Subsequently, we have good reason to believe that improving your hosting environment could have a positive influence on your search rankings. But it wouldn't be a direct influence. It would be indirect. 
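
As a rough way to see the kind of signal a hosting upgrade can move, here's a minimal sketch that measures time to first byte with the Python requests library. The URL is just an example, and a real check would average many samples from multiple locations.

    import requests

    def time_to_first_byte(url: str) -> float:
        # stream=True avoids downloading the whole body; response.elapsed
        # measures from sending the request until the response headers are
        # parsed, which is a reasonable proxy for time to first byte
        response = requests.get(url, stream=True, timeout=10)
        response.close()
        return response.elapsed.total_seconds()

    print(f"TTFB: {time_to_first_byte('https://www.example.com/') * 1000:.0f} ms")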

The same would be true with social media. While we're pretty sure that Google isn't just going out and saying, "Okay, whoever is the most popular on Twitter is going to rank," there is good reason to believe that investing your time and your money and your energy in promoting your content on social media can actually influence your search results.

A perfect example of this would be promoting an article on Facebook, which later gets picked up by some online publication and then links back to your site. So while the social media activity itself did not directly influence your search results, it did influence the links, and those links influenced your search results.

So we can call these indirect ranking factors. For politeness' sake, please, when someone talks about social media as a ranking factor, just don't immediately assume that they mean that it is a direct ranking factor. They very well may mean that it is indirect, and you can ask them to clarify:  "Well, what do you mean? Do you think Google measures social media activity, or are you saying that doing a better job on social is likely to influence search results in some way or another?" 

So this is part of the process of teasing out the differences between ranking factors. It gives us the ability to communicate about them in a way in which we're not, let's say, confusing what we mean by the words.

Emergent 

Now, the third type is probably the one that's going to be most controversial, and I'm actually okay with that. I would love to talk in either the comments or on Twitter about exactly what I mean by emergent ranking factors. I think it's important that we get this one clear in some way, shape, or form because I think it's going to be more and more and more important as machine learning itself becomes more and more and more important as a part of Google's algorithm.

Many, many years ago, search engine optimizers like myself noticed that web pages on domains with strong link authority seemed to do well in organic search results, even when the page itself wasn't particularly good, had few or no external links, and didn't even have particularly good internal links.

That is to say it was a nearly orphaned page. So SEOs started to wonder whether or not there was some sort of domain-level attribute that Google was using as a ranking factor. We can't know that. Well, we can ask Google, but we can only hope that they'll tell us.

So at Moz, what we decided to do was try and identify a series of domain-level link metrics that actually predict the likelihood that a page will perform well in the search results. We call this an emergent ranking factor, or at least I call it an emergent ranking factor, because it is obviously the case that Google does not have a specific domain-authority-like feature inside their algorithm.

At the same time, they do have a lot of data about links pointing to different pages on that same domain. What I believe is going on is what I would call an emergent ranking factor: the combined influence of several different metrics, none of which was designed for that purpose, ends up being easier to measure and talk about as a single emergent factor than as all of its constituent elements.

Now, that was kind of a mouthful, so let me give you an example. When you're making a sauce if you're cooking, one of the most common parts of that would be the production of a roux. A roux would be a mix, normally of equal weights of flour and fat, and you would use this to thicken the sauce.

Now, I could write an entire recipe book about sauces and never use the word "roux".  Just don't use it, and describe the process of producing a roux a hundred times, but never actually use the word "roux", because "roux" describes this intermediate state. But it becomes very, very useful as a chef to be able to just say to another chef (or a sous-chef, or a cook in their cookbook), "produce a roux out of" and then whatever is the particular fat that you're using, whether it's butter or oil or something of that sort.

So the analogy here is that there isn't really a thing called a roux that's inside the sauce. What's in the sauce is the fat and the flour. But at the same time, it's really convenient to refer to it as a roux. In fact, we can use the word "roux" to know a lot about a particular dish without ever talking about the actual ingredients of flour and of fat.

For example, we can be pretty confident that if a roux is called for in a particular dish, that dish is likely not bacon because it's not a sauce. So I guess what I'm trying to get at here is that a lot of what we're talking about with ranking factors is using language that is convenient and valuable for certain purposes.

DA, for example, is valuable for helping predict search results, but it doesn't actually have to be a part of the algorithm in order to do that. In fact, I think there's a really interesting example going on right now — one where we're about to see a shift between categories — and that's Core Web Vitals.

Google has been pushing page speed for quite some time and has provided us several iterations of different types of metrics for determining how fast a page loads. However, what appears to be the case is that Google has decided not to promote individual, particular steps that a website could take in order to speed up, but instead wants you to maximize or minimize a particular emergent value that comes from the amalgamation of all of those steps.

We know that the three Core Web Vitals are: first input delay, largest contentful paint, and cumulative layout shift. So let's talk about the third one. If you've ever been on your cell phone and noticed that the text loads before certain other elements, and you start reading and try to scroll down, and as soon as you put your finger there an ad pops up because it took longer to load and it jostles the page, well, that's layout shift, and Google has learned that users just don't like it. So, even though Google doesn't know all of the individual factors underneath that are responsible for cumulative layout shift, it knows there's this one measurement that explains all of it, that is great shorthand, and that is a really effective way of determining whether or not a user is going to enjoy their experience on that page.

This would be an emergent ranking factor. Now, what's interesting is that Google has now decided that this emergent ranking factor is going to become a direct ranking factor in 2021. They're going to move these descriptive factors that are amalgamations of lots of little things and make them directly influence the search results.
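
If you want to watch those emergent numbers yourself, here's a minimal sketch that pulls field data from Google's PageSpeed Insights API (v5). The metric key names shown are assumptions about the response shape and are worth double-checking against the API documentation.

    import requests

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def core_web_vitals(url: str, api_key: str) -> dict:
        params = {"url": url, "strategy": "mobile", "key": api_key}
        data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
        # Real-user (field) data is reported under "loadingExperience";
        # the metric keys below are assumed -- verify them in the docs
        metrics = data.get("loadingExperience", {}).get("metrics", {})
        return {
            "LCP_ms": metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
            "FID_ms": metrics.get("FIRST_INPUT_DELAY_MS", {}).get("percentile"),
            "CLS_x100": metrics.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
        }

    print(core_web_vitals("https://www.example.com/", api_key="YOUR_API_KEY"))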

So we can see how these different types of ranking factors can move back and forth from categories. Back to the question of domain authority. Now, Google has made it clear they don't use Moz's domain authority — of course they don't — and they do not have a domain-authority-like metric. However, there's nothing to say that at some point they could not build exactly that, some sort of domain-level, link-based metric which is used to inform how to rank certain pages.

So an emergent ranking factor isn't stuck in that category. It can change. Well, that's enough about emergent ranking factors. Hopefully, we can talk more about that in the comments. 

Validating 

The next type I wanted to run through is what I would call a validating ranking factor. This is another one that's been pretty controversial: the Quality Rater Guidelines' list of things that matter, and probably the item that gets talked about the most is E-A-T: Expertise, Authoritativeness, and Trustworthiness.

Well, Google has made it clear that not only do they not measure E-A-T (or at least, as best as I've understood, they don't have metrics that are specifically targeted at E-A-T), but also, when quality raters report back on whether the SERPs they're looking at meet these qualifications, Google doesn't train its algorithm against that labeled data, which, to me, is surprising.

It seems to me like if you had a lot of labeled data about quality, expertise, and authoritativeness, you might want it trained against that, but maybe Google found out that it wasn't very productive. Nevertheless, we know that Google cares about E-A-T, and we also have anecdotal evidence.

That is to say webmasters have noticed over time, especially in "your money or your life" types of industries, that expertise and authority does appear to matter in some way, shape, or form. So I like to call these validating ranking factors because Google uses them to validate the quality of the SERPs and the sites that are ranking, but doesn't actually use them in any kind of direct or indirect way to influence the search results.

Now, I've got an interesting one here, which is what I would call user engagement, and the reason I've put it here is that this still remains a fairly controversial ranking factor. We're not quite sure exactly how Google uses it, although we do get some hints every now and then, like Core Web Vitals.

If that data is collected from actual user behavior in Chrome, then we've got an idea of exactly how user engagement could have an indirect impact on the algorithm because user engagement measures the Core Web Vitals, which, coming in 2021, are going to directly influence the search results.

Correlation 

So validating is this fourth category of ranking factors, and the last — the one that I think is the most controversial  — is correlates. We get into this argument every time: "correlation does not equal causation", and it seems to me to be the statement that the person who only knows one thing about statistics knows, and so they always say it whenever anything ever comes up about correlation.

Yes, correlation does not imply causation, but that doesn't mean it isn't very, very useful. So let's talk about social metrics. This is one of the classic ones. Several times we've run various studies of ranking factors and discovered a direct relationship — a strong relationship — between rankings and things like Facebook likes or Google plus-ones.

All right. Now, pretty much everyone immediately understood that the reason why a site would have more plus-ones in Google+ and would have more likes in Facebook would be because they rank. That is to say, it's not Google going out and depending on Facebook's API to determine how they're going to rank the sites in their search engine.

On the contrary, performing well in their search engine drives traffic, and that traffic then tends to like the page. So I understand the frustration there when customers start asking, "Well, these two things correlate. Why aren't you getting me more likes?"

I get that, but it doesn't mean that it isn't useful in other ways. So I'll give you a good example. If you are ranking well for a keyword but yet your social media metrics are poorer than your competitors', well, it means that there's something going on in that situation that is making your users engage better with your competitors' sites than your own, and that's important to know.

It might not change your rankings, but it might change your conversion rate. It might increase the likelihood that you get found on social media. It could even influence your search results eventually. Because when you recognize that the reason you're not getting any likes is broken code (the Facebook button isn't working), you fix it, people start sharing, engaging with, and linking to your content, and then we start having that indirect effect on your rankings.

So, yeah, correlation isn't the same as causation, but there's a lot of value there. There's a new area that I think is going to be really, really important for this. This is going to be natural language processing metrics. These are various different technologies that are on the cutting edge. Well, some are older. Some are newer. But they allow us to kind of predict how good content is. 

Now, chances are we are not going to guess the exact way that Google is measuring content quality. I mean, unless a leaked document or something shows up, we're probably not going to get that lucky. But that doesn't mean we can't be really productive if we have a number of correlates, and those correlates can then be used to guide us. 
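
To make that concrete, here's a minimal sketch that computes a Spearman rank correlation between SERP position and some metric of interest; the numbers are made up. A strong correlation tells you the two move together, not which one causes the other.

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical data: ten ranking URLs for one query, their SERP position,
    # and a metric we suspect is related (Facebook likes, an NLP-based
    # content-quality score, etc.)
    positions = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
    metric = np.array([1200, 640, 800, 150, 90, 300, 40, 25, 60, 10])

    rho, p_value = spearmanr(positions, metric)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
    # rho near -1 means better positions tend to have higher metric values;
    # it says nothing about causation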

So I drew a little map here to kind of serve as an example. Imagine that it's the evening and you're camping, and you decide to go on a quick hike, and you take with you, let's say, a flag or a series of flags, and you mark the trail as you go so that when it gets later, you can flick on your flashlight and just follow the flags, picking them up, to lead you back to camp.

But it gets super dark, and then you realize you left your flashlight back at camp. What are you going to do? Well, we need to find a way to guide ourselves back to camp. Now, obviously, the flags would have been the best situation, but there are lots of things that are not the camp itself and are not the path itself, but would still be really helpful in getting us back to camp. For example, let's say that you had just put out the fire after you left camp. Well, the smell of the smoke is a great way for you to find your way back to the camp, but the smoke isn't the camp. It didn't cause the camp. It didn't build the camp. It's not the path. It didn't create the path. In fact, the trail of smoke itself is probably quite off the path, but once you do find where it crosses you, you can follow that scent. Well, in that case, it's really valuable even though it just mildly correlates with exactly where you need to get.

Well, the same thing is true when we're talking about something like NLP metrics or social media metrics. While they might not matter in terms of influencing the search results directly, they can guide your way. They can help you make better decisions. The thing you want to stay away from is manipulating these types of metrics for their own sake, because we know that correlates are the furthest away from direct ranking factors — at least when we know that the correlate itself is not a direct ranking factor.

All right. I know that's a lot to stomach, a lot to take in. So hopefully, we have some material for us to discuss below in the comments, and I look forward to talking with you more. Good luck. Bye.

Video transcription by Speechpad.com

