The Biggest Mistake Digital Marketers Ever Made: Claiming to Measure Everything

Posted by willcritchlow

Digital marketing is measurable.

It’s probably the single most common claim everyone hears about digital, and I can’t count the number of times I’ve seen conference speakers talk about it (heck, I’ve even done it myself).

I mean, look at those offline dinosaurs, the argument goes. They all know that half their spend is wasted — they just don’t know which half.

Maybe the joke’s on us digital marketers, though: even in 2017, after years of strong growth, digital garnered only 41% of global ad spend.

Unfortunately, while we were geeking out about attribution models and cross-device tracking, we were accidentally triggering a common human cognitive bias that kept us anchored on small amounts, leaving buckets of money on the table and fundamentally reducing our impact and access to the C-suite.

And what’s worse is that we have convinced ourselves that it’s a critical part of what makes digital marketing great. The simplest way to see this is to realize that, for most of us, if you removed all our measurement ability, I very much doubt we’d reduce our digital marketing investment to nothing.

In truth, of course, we’re nowhere close to measuring all the benefits of most of the things we do. We certainly track the last clicks, and we’re not bad at tracking any clicks on the path to conversion on the same device, but we generally suck at capturing:

  • Anything that happens on a different device
  • Brand awareness impacts that lead to much later improvements in conversion rate, average order value, or lifetime value
  • Benefits of visibility or impressions that aren’t clicked
  • Brand affinity generally

The cognitive bias that leads us astray

All of this means that the returns we report on tend to be just the most direct returns. This should be fine — it’s just a floor on the true value (“this activity has generated at least this much value for the brand”) — but the “anchoring” cognitive bias means that it messes with our minds and our clients’ minds. Anchoring is the process whereby we fixate on the first number we hear and subsequently estimate unknowns closer to the anchoring number than we should. Famous experiments have shown that even showing people a totally random number can drag their subsequent estimates up or down.

So even if the true value of our activity was 10x the measured value, we’d be stuck on estimating the true value as very close to the single concrete, exact number we heard along the way.

This tends to result in the measured value being seen as a ceiling on the true value. Other biases, like the availability heuristic (which leads us to overstate the likelihood of things that are easy to remember), mean that we’re quick to factor in the obvious ways the direct value measurement could be overstating things, while leaving to one side all the unmeasured extra value.

The mistake became a really big one because fortunately/unfortunately, the measured return in digital has often been enough to justify at least a reasonable level of the activity. If it hadn’t been (think the vanishingly small number of people who see a billboard and immediately buy a car within the next week when they weren’t otherwise going to do so) we’d have been forced to talk more about the other benefits. But we weren’t. So we lazily talked about the measured value, and about the measurability as a benefit and a differentiator.

The threats of relying on exact measurement

Not only do we leave a whole load of credit (read: cash) on the table, but it also leads to threats to measurability being seen as existential threats to digital marketing activity as a whole. We know that there are growing threats to measuring accurately, including regulatory, technological, and user-behavior shifts.

Now, imagine that the combination of these trends meant that you lost 100% of your analytics and data. Would it mean that your leads stopped? Would you immediately turn your website off? Stop marketing?

I suggest that the answer to all of that is “no.” There’s a ton of value to digital marketing beyond the ability to track specific interactions.

We’re obviously not going to see our measurable insights disappear to zero, but for all the reasons I outlined above, it’s worth thinking about all the ways that our activities add value, how that value manifests, and some ways of proving it exists even if you can’t measure it.

How should we talk about value?

There are two pieces to the brand value puzzle:

  1. Figuring out the value of increasing brand awareness or affinity
  2. Understanding how our digital activities are changing said awareness or affinity

There’s obviously a lot of research into brand valuations generally, and while it’s outside the scope of this piece to think about total brand value, it’s worth noting that some methodologies place as much as 75% of the enterprise value of even some large companies in the value of their brands.


My colleague Tom Capper has written about a variety of ways to measure changes in brand awareness, which attacks a good chunk of the second challenge. But challenge #1 remains: how do we figure out what it’s worth to carry out some marketing activity that changes brand awareness or affinity?

In a recent post, I discussed different ways of building marketing models, and one of the methodologies I described might be useful here: so-called “top-down” modelling, which I defined as being about percentages and trends (as opposed to raw numbers and units of production).

The top-down approach

I’ve come up with two possible ways of modelling brand value in a transactional sense:

1. The Sherlock approach

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
Sherlock Holmes

The outline would be to take the total new revenue acquired in a period. Subtract from this any elements that can be attributed to specific acquisition channels; whatever remains must be brand. If this is in any way stable or predictable over multiple periods, you can use it as a baseline value from which to apply the methodologies outlined above for measuring changes in brand awareness and affinity.
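As a minimal sketch of that arithmetic (all figures below are made-up examples, not from any real report):

```python
# A minimal sketch of the Sherlock approach, with made-up period figures.
total_new_revenue = 1_000_000  # total new revenue acquired in the period

# Revenue you can attribute to specific acquisition channels
attributed = {
    "unbranded search": 320_000,
    "paid social": 180_000,
    "referral": 150_000,
}

# Whatever remains, however improbable, must be brand.
brand_baseline = total_new_revenue - sum(attributed.values())
print(f"Brand baseline this period: ${brand_baseline:,}")

# If this baseline is stable or predictable across periods, shifts in it can
# be read against your measured changes in brand awareness and affinity.
```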

2. Aggressive attribution

If you run normal first-touch attribution reports, the limitations of measurement (cleared cookies, multiple devices, etc.) mean that you will show first-touch revenue that seems somewhat implausible (e.g. email; email surely can’t be a first-touch source — how did they get on your email list in the first place?).


In the example report, we see that although first-touch dramatically reduces the influence of direct, for instance, direct still accounts for more than 15% of new revenue.

The aggressive attribution model takes total revenue and splits it between the acquisition channels (unbranded search, paid social, referral). A first pass would simply split it in proportion to the relative size of each of those channels, effectively normalizing them, though you could build more sophisticated models.

Note that there is no way of perfectly identifying branded vs. unbranded organic search, since keyword-level data is (not provided), so you’ll have to use a proxy like searches landing on the homepage vs. searches landing on deeper pages.

But fundamentally, the argument here would be that any revenue coming from a “first touch” of:

  • Branded search
  • Direct
  • Organic social
  • Email

…was actually acquired previously via one of the acquisition channels and so we attempt to attribute it to those channels.
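Here’s a minimal sketch of that first pass. Channel names and figures are illustrative only; the point is the proportional reallocation:

```python
# A minimal sketch of the aggressive attribution first pass (made-up figures).
first_touch_revenue = {
    # Acquisition channels
    "unbranded search": 300_000,
    "paid social": 200_000,
    "referral": 100_000,
    # Brand-dependent first touches whose revenue we reallocate
    "branded search": 150_000,
    "direct": 120_000,
    "organic social": 50_000,
    "email": 80_000,
}

acquisition = ["unbranded search", "paid social", "referral"]
brand_dependent = ["branded search", "direct", "organic social", "email"]

acquisition_total = sum(first_touch_revenue[c] for c in acquisition)
to_reallocate = sum(first_touch_revenue[c] for c in brand_dependent)

# Split the brand-dependent revenue across the acquisition channels in
# proportion to their relative size, effectively normalizing them.
adjusted = {
    c: first_touch_revenue[c]
    + to_reallocate * first_touch_revenue[c] / acquisition_total
    for c in acquisition
}

for channel, revenue in adjusted.items():
    print(f"{channel}: ${revenue:,.0f}")
```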

Even this under-represents brand value

Both of those methodologies are pretty aggressive — but they might still under-represent brand value. Here are two additional mechanics where brand drives organic search volume in ways I haven’t figured out how to measure yet:

Trusting Amazon to rank

I like reading on the Kindle. If I hear of a book I’d like to read, I’ll often Google the name of the book on its own and trust that Amazon will rank first or second so I can get to the Kindle page to buy it. This is effectively a branded search for Amazon (and if it doesn’t rank, I’ll likely follow up with a [book name amazon] search or head on over to Amazon to search there directly).

But because all I’ve appeared to do is search [book name] on Google and then click through to Amazon, there is nothing to differentiate this from an unbranded search.

Spotting brands you trust in the SERPs

I imagine we all have anecdotal experience of doing this: you do a search and you spot a website you know and trust (or where you have an account) ranking somewhere other than #1 and click on it regardless of position.

One time that I can specifically recall noticing this tendency growing in myself was when I started doing tons more baby-related searches after my first child was born. Up until that point, I had effectively zero brand affinity with anyone in the space, but I quickly grew to rate the content put out by babycentre (babycenter in the US) and I found myself often clicking on their result in position 3 or 4 even when I hadn’t set out to look for them.

It was fascinating to me to observe this behavior in myself because I had no real interaction with babycentre outside of search, and yet, by consistently ranking well across tons of long-tail queries and providing consistently good content and user experience, they earned my familiarity, my trust, and my clicks, even when they were outranked. I find this to be a great example because it is entirely self-contained within organic search. They built a brand effect through organic search and reaped the reward in increased organic search.

I have essentially no ideas on how to measure either of these effects. If you have any bright ideas, do let me know in the comments.

Budgets will come under pressure

My belief is that total digital budgets will continue to grow (especially as TV continues to fragment), but I also believe that individual budgets are going to come under scrutiny and pressure, making this kind of thinking increasingly important.

We know that there is going to be pressure on referral traffic from Facebook following the recent news feed announcements, but there is also pressure on trust in Google.

While I believe that the opportunity is large and still growing (see, for example, this slide showing Google growing as a referrer of traffic even as CTR has declined in some areas), it’s clear that the narrative is going to lead to more challenging conversations and budgets under increased scrutiny.

Can you justify your SEO investment?

What do you say when your CMO asks what you’re getting for your SEO investment?

What do you say when she asks whether the organic search opportunity is tapped out?

I’ll probably explore the answers to both these questions more in another post, but suffice it to say that I do a lot of thinking about these kinds of questions.

The first is why we have built our split-testing platform to make organic SEO investments measurable, quantifiable and accountable.

The second is why I think it’s super important to remember the big picture while the media is running around with hair on fire. Media companies saw Facebook overtake Google as a traffic channel (and are likely seeing that reverse right now), but most of the web still has Google as its largest, and growing, source of traffic and value.

The reality (from clickstream data) is that it’s really easy to forget how long the long-tail is and how sparse search features and ads are on the extreme long-tail:

  1. Only 3–4% of all searches result in a click on an ad, for example. Google’s incredible (and still growing) business is based on a small subset of commercial searches
  2. Google’s share of all outbound referral traffic across the web is growing (and Facebook’s is shrinking as they increasingly wall off their garden)

The opportunity is for smart brands to capitalize on a growing opportunity while their competitors sink time and money into a social space that is increasingly all about Facebook, and increasingly pay-to-play.

What do you think? Are you having these hard conversations with leadership? How are you measuring your digital brand’s value?


Using the Cross Domain Rel=Canonical to Maximize the SEO Value of Cross-Posted Content – Whiteboard Friday

Posted by randfish

Same content, different domains? There’s a tag for that. Using rel=canonical to tell Google that similar or identical content exists on multiple domains has a number of clever applications. You can cross-post content across several domains that you own, you can benefit from others republishing your own content, rent or purchase content on other sites, and safely use third-party distribution networks like Medium to spread the word. Rand covers all the canonical bases in this not-to-be-missed edition of Whiteboard Friday.



Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about the cross-domain rel=canonical tag. So we’ve talked about rel=canonical a little bit and how it can be used to take care of duplicate content issues, point Google to the right pages from potentially other pages that share similar or exactly the same content. But cross-domain rel=canonical is a unique and uniquely powerful tool that is designed to basically say, “You know what, Google? There is the same content on multiple different domains.”

So in this simplistic example, MyFriendSite.com/green-turtles contains this content that I said, “Sure, it’s totally fine for you, my friend, to republish, but I know I don’t want SEO issues. I know I don’t want duplicate content. I know I don’t want a problem where my friend’s site ends up outranking me, because maybe they have better links or other ranking signals, and I know that I would like any ranking credit, any link or authority signals that they accrue, to actually come to my website.”

There’s a way that you can do this. Google introduced it back in 2009. It is the cross-domain rel=canonical. So essentially, in the header tag of the page, I can add this link, rel=canonical href — it’s a link tag, so there’s an href — to the place where I want the link or the canonical, in this case, to point to and then close the tag. Google will transfer over, this is an estimate, but roughly in the SEO world, we think it’s pretty similar to what you get in a 301 redirect. So something above 90% of the link authority and ranking signals will transfer from FriendSite.com to MySite.com.
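To make that concrete, here’s roughly what the tag would look like in the head of the friend’s page, using the example domains from the whiteboard (the exact URLs are illustrative):

```html
<!-- In the <head> of https://myfriendsite.com/green-turtles -->
<link rel="canonical" href="https://mysite.com/green-turtles" />
```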

So my green turtles page is going to be the one that Google will be more likely to rank. As this one accrues any links or other ranking signals, that authority, those links should transfer over to my page. That’s an ideal situation for a bunch of different things. I’ll talk about those in a sec.

Multiple domains and pages can point to any URL

Multiple domains and pages are totally cool to point to any URL. I can do this for FriendSite.com. I can also do this for TurtleDudes.com and LeatherbackFriends.net and SeaTees.com and NatureIsLit.com. All of them can contain this cross-domain rel=canonical pointing back to the site or the page that I want it to go to. This is a great way to potentially license content out there, give people republishing permissions without losing any of the SEO value.

A few things need to match:

I. The page content really does need to match

That includes things like text, images, if you’ve embedded videos, whatever you’ve got on there.

II. The headline

Ideally, it should match. It’s a little less crucial than the page content, but you probably want that headline to match.

III. Links (in content)

Those should also match. Checking all three (page content, headline, and in-content links) is a good way to make sure that Google will count that rel=canonical correctly.

Things that don’t need to match:

I. The URL

No, it’s fine if the URLs are different. In this case, I’ve got NatureIsLit.com/turtles/p?id=679. That’s okay. It doesn’t need to be green-turtles. I can have a different URL structure on my site than they’ve got on theirs. Google is just fine with that.

II. The title of the piece

Many times the cross-domain rel=canonical is used with different page titles. So if, for example, SeaTees.com wants to publish the piece with a different title, that’s okay. I still generally recommend that the headlines stay the same, but it’s okay to have different titles.

III. The navigation

IV. Site branding

So all the things around the content. If I’ve got my page here and I have like nav elements over here, nav elements down here, maybe a footer down here, a nice little logo up in the top left, that’s fine if those are totally different from the ones that are on these other pages cross-domain canonically. That stuff does not need to match. We’re really talking about the content inside the page that Google looks for.

Ways to use this protocol

Some great ways to use the cross-domain rel=canonical.

1. If you run multiple domains and want to cross-post content, choose which one should get the SEO benefits and rankings.

If you run multiple domains, for whatever reason, let’s say you’ve got a set of domains and you would like the benefit of being able to publish a single piece of content, for whatever reason, across multiples of these domains that you own, but you know you don’t want to deal with a duplicate content issue and you know you’d prefer for one of these domains to be the one receiving the ranking signals, cross-domain rel=canonical is your friend. You can tell Google that Site A and Site C should not get credit for this content, but Site B should get all the credit.

The one caution here: don’t try to split this across multiple canonical targets. So don’t say, “Oh, Site A, why don’t you rel=canonical to B, and Site C, why don’t you rel=canonical to D, and I’ll try and get two things ranked in the top.” Don’t do that. Make sure all of them point to one. That is the best way to make sure that Google respects the cross-domain rel=canonical properly.

2. If a publication wants to re-post your content on their domain, ask for it instead of (or in addition to) a link back.

Second, let’s say a publication reaches out to you. They’re like, “Wow. Hey, we really like this piece.” My wife, Geraldine, wrote a piece about Mario Batali’s sexual harassment apology letter and the cinnamon rolls recipe that he strangely included in this apology. She baked those and then wrote about it. It went quite viral, got a lot of shares from a ton of powerful and well-networked people and then a bunch of publications. The Guardian reached out. An Australian newspaper reached out, and they said, “Hey, we would like to republish your piece.” Geraldine talked to her agent, and they set up a price or whatever.

One of the ways you can benefit from this, beyond just getting a link from The Guardian or some other newspaper, is to say, “Hey, I will be happy to be included here. You don’t even have to give me author credit or link credit if you don’t want to, but I do want that sweet, sweet rel=canonical.” This is a great way to maximize the SEO benefit of being posted on someone else’s site, because you’re not just receiving a single link. You’re receiving credit from all the links that that piece might generate.

Oops, I did that backwards. You want it to come from their site to your site. This is how you know Whiteboard Friday is done in one take.

3. Purchase/rent content from other sites without forcing them to remove the content from their domain.

Next, let’s say I am in the opposite situation. I’m the publisher. I see a piece of content that I love and I want to get that piece. So I might say, “Wow, that piece of content is terrific. It didn’t do as well as I thought it would do. I bet if we put it on our site and broadcast it to our audience, it would do incredibly well. Let’s reach out to the author of the piece and see if we can purchase it, or rent it for a time period, say two years: for the next two years we put the cross-domain rel=canonical on their site pointing back to us, and we host that content. After two years, they can have it back. They can own it again.”

The key is that you’re not forcing them to remove the content from their site. You’re saying, “You, the publisher, the author, can keep it on your site. We don’t mind. We’d just like this tag applied, and we’d like to be able to have republishing permissions on our website.” Now you can get the SEO benefits of that piece of content, and they can, in exchange, get some money. So your site sends them some dollars; their site sends you the rel=canonical and the ranking authority and the link equity and all those beautiful things.

4. Use Medium as a content distribution network without the drawback of duplicate content.

Number four, Medium. Medium is a great place to publish content. It has a wide network, people who really care about consuming content. Medium is a great distribution network with one challenge. If you post on Medium, people worry that they can’t post the same thing on their own site because you’ll be competing with Medium.com. It’s a very powerful domain. It tends to rank really well. So duplicate content is an issue, and potentially losing the rankings and the traffic that you would get from search and losing that to Medium is no fun.

But Medium has a beautiful thing. The cross-domain rel=canonical is built into their import tool. So if you go to Medium.com/p/import and you are logged in to your Medium account, you can enter in the URL field the content that you’ve published on your own site. Medium will republish it on your account, and they will include the cross-domain rel=canonical back to you. Now you can start thinking of Medium as essentially a distribution network without the penalties or problems of duplicate content issues. Really, really awesome tool. Really awesome that Medium is offering this. I hope it sticks around.

All right, everyone. I think you’re going to have some excellent additional ideas for the cross-domain rel=canonical and how you have used it. We would love you to share those in the comments below, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Reading Between the Lines: A 3-Step Guide to Reviewing Web Page Content

Posted by Jackie.Francis

In SEO, reviewing content is an unavoidable yet extremely important task. As the driving factor that brings people to a page, best practice dictates that we do what we can to ensure that the work we’ve invested hours and resources into creating remains impactful and relevant over time. This requires occasionally going back and re-evaluating our content to identify areas that can be improved.

That being said, if you’ve ever done a content review, you know how surprisingly challenging this is. A large variety of formats and topics, alongside the challenge of defining “good” content, makes it hard to pick out the core elements that matter. Without these universal focus areas, you may end up neglecting an element (e.g. tone of voice) in one instance but paying special attention to that same element in another.

Luckily there are certain characteristics — like good spelling, appealing layouts, and relevant keywords — that are universally associated with what we would consider “good” content. In this three-step guide, I’ll show you how to use these characteristics (or elements, as I like to call them) to define your target audience, measure the performance of your content using a scorecard, and assess your changes for quality assurance as part of a review process that can be applied to nearly all types of content across any industry.


Step 1: Know your audience

Arguably the most important step mentioned in this post, knowing your target reader will reveal the details that should make up the foundation of your content. This includes insight into the reader’s intent, the ideal look and feel of the page, and the goals your content’s message should be trying to achieve.

To get to this point, however, you first need to answer these two questions:

  1. What does my target audience look like?
  2. Why are they reading my content?

What does my target audience look like?

The first question relies on general demographic information such as age, gender, education, and job title. This gives a face to the ideal audience member(s) and the kind of information that would best suit them. For example, if targeting stay-at-home mothers between the ages of 35 and 40 with two or more kids under the age of 5, we can guess that she has a busy daily schedule, travels frequently for errands, and constantly needs to stay vigilant over her younger children. So, a piece that is personable, quick, easy to read on-the-go, and includes inline imagery to reduce eye fatigue would be better received than something that is lengthy and requires a high level of focus.

Why are they reading my content?

Once you have a face to your reader, the second question must be answered to understand what that reader wants from your content and if your current product is effectively meeting those needs. For example, senior-level executives of mid- to large-sized companies may be reading to become better informed before making an important decision, to become more knowledgeable in their field, or to use the information they learn to teach others. Other questions you may want to consider asking:

  • Are they reading for leisure or work?
  • Would they want to share this with their friends on social media?
  • Where will they most likely be reading this? On the train? At home? Waiting in line at the store?
  • Are they comfortable with long blocks of text, or would inline images be best?
  • Do they prefer bite-sized information or are they comfortable with lengthy reports?

You can find the answers to these questions and collect valuable demographic and psychographic information by using a combination of internal resources, like sales scripts and surveys, and third-party audience insight tools such as Google Analytics and Facebook Audience Insights. With these results you should now have a comprehensive picture of your audience and can start identifying the parts of your content that can be improved.


Step 2: Tear apart your existing content

Now that you understand who your audience is, it’s time to get to the real work: assessing your existing content. This stage requires breaking everything apart to identify the components you should keep, change, or discard. However, this task can be extremely challenging because the performance of most components — such as tone of voice, design, and continuity — can’t simply be bucketed into binary categories like “good” or “bad.” Rather, they fall into a spectrum where the most reasonable level of improvement falls somewhere in the middle. You’ll see what I mean by this statement later on, but one of the most effective ways to evaluate and measure the degree of optimization needed for these components is to use a scorecard. Created by my colleague, Ben Estes, this straightforward, reusable, and easy to apply tool can help you objectively review the performance of your content.

Make a copy of the Content Review Grading Rubric

Note: The card sampled here, and the one I personally use for similar projects, is a slightly altered version of the original.

As you can see, the card is divided into two categories: Writing and Design. Listed under each category are the elements that are universally needed to create good content and that should be examined. Each element is graded on a scale from 1 to 5, with 1 being the worst score and 5 the best.

To use it, start by choosing a part of your page to look at first. Order doesn’t matter, so whether you first check “spelling and grammar” or “continuity” is up to you. Next, assign it a score on a separate Excel sheet (or mark it directly on the rubric) based on its current performance. For example, if the copy has no spelling errors but some minor grammar issues, you would rank “spelling and grammar” as a four (4).

Finally, repeat this process until all elements are graded. Remember to stay impartial to give an honest assessment.

Once you’re done, look at each grade and see where it falls on the scale. Ideally, each element should score 4 or greater, although a 5 should only be given out sparingly. Tying back to my spectrum comment from earlier, a 5 is exclusively reserved for top-level work; it’s something to strive for, but it will typically take more effort to achieve than it’s worth. A grade of 4 is usually the highest reasonable goal, in most instances.

A grade of 3 or below indicates an opportunity for improvement and that significant changes need to be made.

If working with multiple pieces of content at once, the grading system can also be used to help prioritize your workload. Just collect the average writing or design score and sort them in ascending/descending order. Pages with a lower average indicate poorer performance and should be prioritized over pages whose averages are higher.
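Here’s a minimal sketch of that prioritization step (the pages and scores are hypothetical):

```python
# A minimal sketch of prioritizing pages by average scorecard grade.
# Lower average = poorer performance = higher priority. Scores are made up.
scorecards = {
    "/blog/post-a": {"spelling": 4, "tone of voice": 2, "layout": 3, "continuity": 2},
    "/blog/post-b": {"spelling": 5, "tone of voice": 4, "layout": 4, "continuity": 4},
    "/blog/post-c": {"spelling": 3, "tone of voice": 3, "layout": 2, "continuity": 3},
}

averages = {
    page: sum(scores.values()) / len(scores) for page, scores in scorecards.items()
}

# Ascending order: the weakest pages land at the top of the work queue.
for page, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{page}: average grade {avg:.2f}")
```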

Whether you choose to use this scorecard or make your own, what you review, the span of the grading scale, and the criteria for each grade should be adjusted to fit your specific needs and result in a tool that will help you honestly assess your content across multiple applications.

Don’t forget the keywords

With most areas of your content covered by the scorecard, the last element to check before moving to the editing stage is your keywords.

Before I get flak for this, I’m aware that the general rule of creating content is to do your keyword research first. But I’ve found that when it comes to reviews, evaluating keywords last feels more natural and makes the process a lot smoother. When first running through a page, you’re much more likely to notice spelling and design flaws before you pick up on whether a keyword is used correctly — so why not make note of those details first?

Depending on the outcomes stemming from the re-evaluation of your target audience and content performance review, you will notice one of two things about your currently targeted keywords:

  1. They have not been impacted by the outcomes of the prior analyses and do not need to be altered
  2. They no longer align with the goals of the page or needs of the audience and should be changed

In the first example, the keywords you originally target are still best suited for your content’s message and no additional research is needed. So, your only remaining task is to determine whether or not your keywords are effectively used throughout the page. This means assessing things like title tag, image alt attributes, URL, and copy.

In an attempt to stay on track, I won’t go into further detail on how to optimize keywords but if you want a little more insight, this post by Ken Lyons is a great resource.

If, however, your target keywords are no longer relevant to the goals of your content, before moving to the editing stage you’ll need to re-do your keyword research to identify the terms you should rank for. For insight into keyword research this chapter in Moz’s Beginner’s Guide to SEO is another invaluable resource.


Step 3: Evaluate your evaluation

At this point your initial review is complete and you should be ready to edit.

That’s right. Your initial review.

The interesting thing about assessing content is that it never really ends. As you make edits, you’ll tend to deviate more and more from your initial strategy. While that’s not always a bad thing, you must continuously monitor these changes to ensure that you stay on track to create a highly valued piece of content.

The best approach would be to reassess all your material when:

  • 50% of the edits are complete
  • 85% of the edits are complete
  • You have finished editing

At the 50% and 85% marks, keep the assessment quick and simple. Look through your revisions and ask the following questions:

  • Am I still addressing the needs of my target audience?
  • Are my target keywords properly integrated?
  • Am I using the right language and tone of voice?
  • Does it look like the information is structured correctly (hierarchically)?

If your answer is “Yes” to all four questions, then you’ve effectively made your changes and should proceed. For any question you answer “No,” go back and make the necessary corrections. The areas targeted here become more difficult to fix the closer you are to completion, so ensuring they’re correct throughout this stage will save a lot of time and stress in the long run.

When you’ve finished and think you’re ready to publish, run one last comprehensive review to check the performance status of all related components. This means confirming you’ve properly addressed the needs of your audience, optimized your keywords, and improved the elements highlighted in the scorecard.


Moving forward

No two pieces of content are the same, but that doesn’t mean there aren’t important commonalities. Being able to identify these similarities and understand the role they play across all formats and topics will lead the way to creating your own review process for evaluating subjective material.

So, when you find yourself gearing up for your next project, give these steps a try and always keep the following in mind:

  1. Your audience is what makes or breaks you, so keep them happy
  2. Consistent quality is key! Ensure all components of your content are performing at their best
  3. Keep your keywords optimized and be prepared to do additional research if necessary
  4. Unplanned changes will happen. Just remember to remain observant to keep yourself on track


The 2018 Local SEO Forecast: 9 Predictions According to Mozzers

Posted by MiriamEllis

It’s February, and we’ve all dipped our toes into the shallow end of the 2018 pool. Today, let’s dive into the deeper waters of the year ahead, with local search marketing predictions from Moz’s Local SEO Subject Matter Expert, our Marketing Scientist, and our SEO & Content Architect. Miriam Ellis, Dr. Peter J. Meyers, and Britney Muller weigh in on what your brand should prepare for in the coming months in local.


WOMM, core SEO knowledge, and advice for brands both large and small

Miriam Ellis, Moz Associate & Local SEO SME

LSAs will highlight the value of Google-independence

Word-of-mouth marketing (WOMM) and loyalty initiatives will become increasingly critical to service area businesses whose results are disrupted by Google’s Local Service Ads. SABs aren’t going to love having to “rent back” their customers from Google, so Google-independent lead channels will have enhanced value. That being said, the first small case study I’ve seen indicates that LSAs may be a winner over traditional AdWords in terms of cost and conversions.

Content will be the omni-channel answer

Content will grow in value, as it is the answer to everything coming our way: voice search, Google Posts, Google Questions & Answers, owner responses, and every stage of the sales funnel. Because of this, agencies which have formerly thought of themselves as strictly local SEO consultants will need to master the fundamentals of organic keyword research and link building, as well as structured data, to offer expert-level advice in the omni-channel environment. Increasingly, clients will need to become “the answer” to queries… and that answer will predominantly reside in content dev.

Retail may downsize but must remain physical

Retail is being turned on its head, with Amazon becoming the “everything store” and the triumphant return of old-school home delivery. Large brands failing to see profits in this new environment will increasingly downsize to the showroom scenario, significantly cutting costs, while also possibly growing sales as personally assisted consumers are dissuaded from store-and-cart abandonment, and upsold on tie-ins. Whether this will be an ultimate solution for shaky brands, I can’t say, but it matters to the local SEO industry because showrooms are, at least, physical locations and therefore eligible for all of the goodies of our traditional campaigns.

SMBs will hold the quality high card

For smaller local brands, emphasis on quality will be the most critical factor. Go for the customers who care about specific attributes (e.g. being truly local, made in the USA, handcrafted, luxury, green, superior value, etc.). Evaluating and perfecting every point of contact with the customer (from how phone calls are assisted, to how online local business data is managed, to who asks for and responds to reviews) matters tremendously. This past year, I’ve watched a taxi driver launch a delivery business on the side, grow to the point where he quit driving a cab, hire additional drivers, and rack up a profusion of 5-star, unbelievably positive reviews, all because his style of customer service is memorably awesome. Small local brands will have the nimbleness and hometown know-how to succeed when quality is what is being sold.


In-pack ads, in-SERP features, and direct-to-website traffic

Dr. Peter J. Meyers, Marketing Scientist at Moz

In-pack ads to increase

Google will get more aggressive about direct local advertising, and in-pack ads will expand. In 2018, I expect local pack ads will not only appear on more queries but will make the leap to desktop SERPs and possibly Google Home.

In-SERP features to grow

Targeted, local SERP features will also expand. Local Service Ads rolled out to more services and cities in 2017, and Google isn’t going to stop there. They’ve shown a clear willingness to create specialized content for both organic and local. For example, 2017 saw Google launch a custom travel portal and jobs portal on the “organic” side, and this trend is accelerating.

Direct-to-website traffic to decline

The push to keep local search traffic in Google properties (i.e. Maps) will continue. Over the past couple of years, we’ve seen local packs go from results that link directly to websites, to having a separate “Website” link, to local sites being buried 1–2 layers deep. In some cases, local sites are being almost completely supplanted by local Knowledge Panels, some of which (hotels being a good example) have incredibly rich feature sets. Google wants to deliver local data directly on Google, and direct traffic to local sites from search will continue to decline.


Real-world data and the importance of Google

Britney Muller, SEO & Content Architect at Moz

Relevance drawn from the real world

Real-world data! Google will leverage device and credit card data to get more accurate information on things like foot traffic, current gas prices, repeat customers, length of visits, gender-neutral bathrooms, type of customers, etc. As the most accurate source of business information to date, why wouldn’t they?

Google as one-stop shop

SERPs and Maps (assisted by local business listings) will continue to grow as a one-stop shop for local business information. Small business websites will still be important, but they’re more likely to serve as one data source among many, rather than the only place to get business information, supplemented by more in-depth data like the real-world signals above.


Google as friend or foe? Looking at these expert predictions, that’s a question local businesses of all sizes will need to continue to ask in 2018. Perhaps the best answer is “neither.” Google represents opportunity for brands that know how to play the game well. Companies that put the consumer first are likely to stand strong, no matter how the nuances of digital marketing shift, and education will remain the key to mastery in the year ahead.

What do you think? Any hunches about the year ahead? Let us know in the comments.


New Research: 35% of Competitive Local Keywords Have Local Pack Ads

Posted by Dr-Pete

Over the past year, you may have spotted a new kind of Google ad on a local search. I saw one on a search for “oil change” from my Pixel phone in the Chicago suburbs.

These ads seem to appear primarily on mobile results, with some limited testing on desktop results. We’ve heard rumors about local pack ads as far back as 2016, but very few details. How prevalent are these ads, and how seriously should you be taking them?

11,000 SERPs: Quick summary

For this study, we decided to look at 110 keywords (in 11 categories) across 100 major US cities. We purposely focused on competitive keywords in large cities, assuming, based on our observations as searchers, that the prevalence rate for these ads was still pretty low. The 11 categories were as follows:

  • Apparel
  • Automotive
  • Consumer Goods
  • Finance
  • Fitness
  • Hospitality
  • Insurance
  • Legal
  • Medical
  • Services (Home)
  • Services (Other)

We purposely selected terms that were likely to have local pack results and looked for the presence of local packs and local pack ads. We collected these searches as a mobile user with a Samsung Galaxy 7 (a middle-ground choice between iOS and a “pure” Google phone).

Why 11 categories? Confession time – it was originally 10, and then I had the good sense to ask Darren Shaw about the list and realized I had completely left out insurance keywords. Thanks, Darren.

Finding #1: I was very wrong

I’ll be honest – I expected, from casual observations and the lack of chatter in the search community, that we’d see fewer than 5% of local packs with ads, and maybe even numbers in the 1% range.

Across our data set, roughly 35% of SERPs with local packs had ads.

Across industry categories, the prevalence of pack ads ranged wildly, from 10% to 64%.

For the 110 individual keyword phrases in our study, the presence of local ads ranged from 0% to 96%. Here are the keywords with >=90% local pack ad prevalence:

  • “car insurance” (90%)
  • “auto glass shop” (91%)
  • “bankruptcy lawyer” (91%)
  • “storage” (92%)
  • “oil change” (95%)
  • “mattress sale” (95%)
  • “personal injury attorney” (96%)

There was no discernible correlation between the presence of pack ads and city size. Since our study was limited to the top 100 US cities by population, though, this may simply be due to a restricted data range.

Finding #2: One is the magic number

Every local pack with ads in our study had one and only one ad. This ad appeared in addition to regular pack listings. In our data set, 99.7% of local packs had three regular/organic listings, and the rest had two listings (which can happen with or without ads).

Finding #3: Pack ads land on Google

Despite their appearance, local pack ads are more like regular local pack results than AdWords ads, in that they’re linked directly to a local panel (a rich Google result). On my Pixel phone, the Jiffy Lube ad at the beginning of this post linked to just such a panel.

This is not an anomaly: 100% of the 3,768 local pack ads in our study linked back to Google. This follows a long trend of local pack results linking back to Google entities, including the gradual disappearance of the “Website” link in the local pack.

Conclusion: It’s time to get serious

If you’re in a competitive local vertical, it’s time to take local pack ads seriously. Your visitors are probably seeing them more often than you realize. Currently, local pack ads are an extension of AdWords, and require you to set up location extensions.

It’s also more important than ever to get your Google My Business listing in order and make sure that all of your information is up to date. It may be frustrating to lose the direct click to your website, but a strong local business panel can drive phone calls, foot traffic, and provide valuable information to potential customers.

Like every Google change, we ultimately have to put aside whether we like or dislike it and make the tough choices. With more than one-third of local packs across the competitive keywords in our data set showing ads, it’s time to get your head out of the sand and get serious.


Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.



Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding out whether there is actually a problem. This matters most on larger websites. If we’re talking about a site that’s 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
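If you can export your landing-page data, a quick way to isolate a section looks something like this (a hypothetical pandas sketch; the CSV name and column names are assumptions about your export):

```python
import pandas as pd

# Hypothetical export: one row per (date, landing_page) with a sessions count.
df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])

# Segment to the /cities/ section only.
cities = df[df["landing_page"].str.startswith("/cities/")]

# Weekly organic sessions for just this section. A cliff here can hide inside
# sitewide totals if the rest of the site is trending up and to the right.
weekly = cities.set_index("date")["sessions"].resample("W").sum()
print(weekly.tail(12))
```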

From there, I’m going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or ’17 or ’18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.
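For spot checks at scale, the unofficial pytrends library can pull the same curves programmatically. A sketch (pytrends is a third-party wrapper around Google Trends, not an official API, and the keyword here is just an example):

```python
from pytrends.request import TrendReq

# Pull five years of search interest for a hypothetical target topic.
pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["denver population"], timeframe="today 5-y")
interest = pytrends.interest_over_time()

# If this curve is flat or rising while your traffic fell, the problem is
# almost certainly something you did, not falling search demand.
print(interest["denver population"].tail(12))
```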

C. Perform some diagnostic queries or use your rank tracking data if you have it on these types of things.

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus “Denver population growth,” so My Site or MySite.com Denver population growth. If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page and search for it, in order, in Google, not in quotes. I do not want to use quotes here, and I want to see how it performs. This might be several lines of text here.

4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in Google in quotes. If I’m not ranking for the string without quotes, but I am ranking for it in quotes, I might surmise this is probably not duplicate content; it’s probably something to do with my content quality, or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, or meta robots issue that’s preventing Google from getting there. Or maybe they can’t get there at all, and the search returns zero results, which means Google hasn’t even indexed the page. Now we have another type of problem.

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you’ve done your diagnostics. Now it’s time to offer some hypotheses. Now that we understand which problem we might have, we want to understand what could be causing it. There are basically two situations you can have: rankings have stayed stable or gone up but traffic has fallen, or rankings and traffic have both fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. A bunch of featured snippets have entered the search results (in our example, the city population growth results), and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from over here that the crawl was the problem, I wasn’t getting indexed, or Google hasn’t updated my pages in a long time, I might look into accessibility things, maybe speed, maybe I’m having problems like letting Googlebot in, HTTPS problems, or indexable content, maybe Google can’t see the content on my page anymore because I made some change in the technology of how it’s displayed, or crawlability, internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff.

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate or engagement issues, meaning pogo-sticking: people come to your site but click back to the search results because they weren’t satisfied, maybe because their expectations have changed or the market has changed.
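
To make the technical-issues item concrete, here’s a minimal sketch of the two accidental blocks mentioned above. These are hypothetical examples; the paths and values are illustrative, not taken from any real site:

    # robots.txt — an overly broad rule like this blocks Googlebot site-wide:
    User-agent: Googlebot
    Disallow: /

    # A rule scoped to one directory quietly blocks just that section:
    User-agent: *
    Disallow: /cities/

    <!-- And a meta robots tag accidentally left over from a staging site
         keeps a page out of the index even when crawling is allowed: -->
    <meta name="robots" content="noindex, nofollow">

If rankings and traffic fell right after a site update, these files are among the first places to look.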

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should resolve after a new crawl + index.

So if we’re fixing on-page and technical issues, they should usually resolve pretty fast, especially on small sections of sites. As soon as Google has crawled and indexed the pages, you should generally see performance improve. But this can take a few weeks for a large section of a site (many thousands of pages), because Google has to crawl and index all of them before it registers that things are fixed. And since the traffic is spread long-tail across many different pages, you won’t see an instant gain; it will build as the section gets re-indexed.
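
One quick manual check (a hypothetical example using Google’s standard site: search operator; the URL is made up to match the /cities example used later in this post):

    site:mysite.com/cities/milwaukee

If the result’s snippet still shows the old content, Google hasn’t re-indexed the page yet, and it’s too early to judge whether your fix worked.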

B. Link issues and spam penalty problems can take months to show results.

Look, if you have crappier links or a weaker link profile than your competitors, growing it can take months or even years to fix. Penalty and spam problems are the same; Google can sometimes take a long time. You’ve seen a lot of spam experts on Twitter saying, “Oh, well, all my clients who had issues over the last nine months are suddenly ranking better today,” because Google made some fix in their latest index rollout or changed their algorithm, effectively rewarding people for the fixes they’ve made. Sometimes those rewards arrive in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and fix /cities/Milwaukee. You determined from your diagnostics that the problem is content quality, so you update those pages with new content that serves searchers much better. You’ve tested it, and people really love it. You fixed two city pages, Milwaukee and Denver, to test it out, but you’ve left the 5,000 other city pages untouched.

Sometimes Google will sort of say, “No, you know what? We still think your city pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we think of this whole section of your site as a unit and grade it as a section.” That is a real thing we’ve observed happening in Google’s results.

Because of this, one of the things I would urge you to do is this: if you’re seeing good results from the pages you’re testing and you’re pretty confident, roll out the changes to a significant subset (30%, 50%, even 70% of the pages) rather than only a tiny sample.

D. Sometimes when you encounter these issues, a remove-and-replace strategy works better than simply upgrading old URLs.

So if Google has decided your /cities section is just awful, with all sorts of problems, not performing well on a number of fronts, you might actually 301 redirect those pages to a new URL path, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that section. Google then goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”
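
As a rough sketch, that remove-and-replace might look like this at the server level, assuming an Apache server with mod_rewrite enabled (nginx or your CMS’s redirect manager would use different syntax):

    # .htaccess — permanently redirect every /cities/ URL to its
    # /location/ equivalent, preserving the rest of the path:
    RewriteEngine On
    RewriteRule ^cities/(.*)$ /location/$1 [R=301,L]

The 301 (permanent) status code tells search engines the move is intentional, so the old URLs’ signals can be passed along to the new section.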

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Rewriting the Beginner’s Guide to SEO, Chapter 1: SEO 101

Posted by BritneyMuller

Back in mid-November, we kicked off a campaign to rewrite our biggest piece of content: the Beginner’s Guide to SEO. You offered up a huge amount of helpful advice and insight on our outline, and today we’re here to share our draft of the first chapter.

In many ways, the Beginner’s Guide to SEO belongs to each and every member of our community; it’s important that we get this right, for your sake. So without further ado, here’s the first chapter — let’s dive in!


Chapter 1: SEO 101

What is it, and why is it important?

Welcome! We’re excited that you’re here!

If you already have a solid understanding of SEO and why it’s important, you can skip to Chapter 2 (though we’d still recommend skimming the best practices from Google and Bing at the end of this chapter; they’re useful refreshers).

For everyone else, this chapter will help build your foundational SEO knowledge and confidence as you move forward.

What is SEO?

SEO stands for “search engine optimization.” It’s the practice of increasing both the quality and quantity of website traffic, as well as exposure to your brand, through non-paid (also known as “organic”) search engine results.

Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are searching for online, the answers they are seeking, the words they’re using, and the type of content they wish to consume. Leveraging this data will allow you to provide high-quality content that your visitors will truly value.

Here’s an example. Frankie & Jo’s (a Seattle-based vegan, gluten-free ice cream shop) has heard about SEO and wants help improving how and how often they show up in organic search results. In order to help them, you need to first understand their potential customers:

  • What types of ice cream, desserts, snacks, etc. are people searching for?
  • Who is searching for these terms?
  • When are people searching for ice cream, snacks, desserts, etc.?
    • Are there seasonality trends throughout the year?
  • How are people searching for ice cream?
    • What words do they use?
    • What questions do they ask?
    • Are more searches performed on mobile devices?
  • Why are people seeking ice cream?
    • Are individuals looking for health-conscious ice cream specifically, or just looking to satisfy a sweet tooth?
  • Where are potential customers located — locally, nationally, or internationally?

And finally — here’s the kicker — how can you help provide the best content about ice cream to cultivate a community and fulfill what all those people are searching for?

Search engine basics

Search engines are answer machines. They scour billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query.

Search engines do all of this by discovering and cataloguing all available content on the Internet (web pages, PDFs, images, videos, etc.) via a process known as “crawling and indexing.”

What are “organic” search engine results?

Organic search results are search results that aren’t paid for (i.e. not advertising). These are the results that you can influence through effective SEO. Traditionally, these were the familiar “10 blue links.”

Today, search engine results pages — often referred to as “SERPs” — are filled with both more advertising and more dynamic organic results formats (called “SERP features”) than we’ve ever seen before. Some examples of SERP features are featured snippets (or answer boxes), People Also Ask boxes, image carousels, etc. New SERP features continue to emerge, driven largely by what people are seeking.

For example, if you search for “Denver weather,” you’ll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for “pizza Denver,” you’ll see a “local pack” result made up of Denver pizza places. Convenient, right?

It’s important to remember that search engines make money from advertising. Their goal is to better solve searchers’ queries (within SERPs), to keep searchers coming back, and to keep them on the SERPs longer.

Some SERP features on Google are organic and can be influenced by SEO. These include featured snippets (a promoted organic result that displays an answer inside a box) and related questions (a.k.a. “People Also Ask” boxes).

It’s worth noting that there are many other search features that, even though they aren’t paid advertising, can’t typically be influenced by SEO. These features often have data acquired from proprietary data sources, such as Wikipedia, WebMD, and IMDb.

Why SEO is important

While paid advertising, social media, and other online platforms can generate traffic to websites, the majority of online traffic is driven by search engines.

Organic search results cover more digital real estate, appear more credible to savvy searchers, and receive way more clicks than paid advertisements. For example, of all US searches, only ~2.8% of people click on paid advertisements.

In a nutshell: SEO has ~20X more traffic opportunity than PPC on both mobile and desktop.

SEO is also one of the only online marketing channels that, when set up correctly, can continue to pay dividends over time. If you provide a solid piece of content that deserves to rank for the right keywords, your traffic can snowball over time, whereas advertising needs continuous funding to send traffic to your site.

Search engines are getting smarter, but they still need our help.

Optimizing your site will help deliver better information to search engines so that your content can be properly indexed and displayed within search results.

Should I hire an SEO professional, consultant, or agency?

Depending on your bandwidth, willingness to learn, and the complexity of your website(s), you could perform some basic SEO yourself. Or, you might discover that you would prefer the help of an expert. Either way is okay!

If you end up looking for expert help, it’s important to know that many agencies and consultants “provide SEO services,” but can vary widely in quality. Knowing how to choose a good SEO company can save you a lot of time and money, as the wrong SEO techniques can actually harm your site more than they will help.

White hat vs black hat SEO

“White hat SEO” refers to SEO techniques, best practices, and strategies that abide by search engine rules, with a primary focus on providing more value to people.

“Black hat SEO” refers to techniques and strategies that attempt to spam/fool search engines. While black hat SEO can work, it puts websites at tremendous risk of being penalized and/or de-indexed (removed from search results) and has ethical implications.

Penalized websites have bankrupted businesses. It’s just another reason to be very careful when choosing an SEO expert or agency.

Search engines share similar goals with the SEO industry

Search engines want to help you succeed. They’re actually quite supportive of efforts by the SEO community. Digital marketing conferences, such as Unbounce, MNsearch, SearchLove, and Moz’s own MozCon, regularly attract engineers and representatives from major search engines.

Google assists webmasters and SEOs through their Webmaster Central Help Forum and by hosting live office hour hangouts. (Bing, unfortunately, shut down their Webmaster Forums in 2014.)

While webmaster guidelines vary from search engine to search engine, the underlying principles stay the same: Don’t try to trick search engines. Instead, provide your visitors with a great online experience.

Google webmaster guidelines

Basic principles:

  • Make pages primarily for users, not search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging.

Things to avoid:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content (i.e. copied from somewhere else)
  • Cloaking — the practice of showing search engine crawlers different content than you show visitors.
  • Hidden text and links
  • Doorway pages — pages created to rank well for specific searches to funnel traffic to your website.

Full Google Webmaster Guidelines version here.

Bing webmaster guidelines

Basic principles:

  • Provide clear, deep, engaging, and easy-to-find content on your site.
  • Keep page titles clear and relevant.
  • Links are regarded as a signal of popularity and Bing rewards links that have grown organically.
  • Social influence and social shares are positive signals and can have an impact on how you rank organically in the long run.
  • Page speed is important, along with a positive, useful user experience.
  • Use alt attributes to describe images, so that Bing can better understand the content.
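
For instance, that last point about alt attributes might look like this in practice (a hypothetical image, borrowing the ice cream shop example from earlier in this chapter):

    <img src="charcoal-ice-cream-cone.jpg"
         alt="Vegan charcoal ice cream cone from Frankie & Jo's in Seattle">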

Things to avoid:

  • Thin content, pages that mostly show ads or affiliate links, or pages that otherwise redirect visitors away to other sites will not rank well.
  • Abusive link tactics that aim to inflate the number or nature of inbound links, such as buying links or participating in link schemes, can lead to de-indexing.
  • Messy URL structures: keep URLs clean, concise, descriptive, and keyword-inclusive where possible, and avoid non-letter characters; dynamic parameters can dirty up your URLs and cause duplicate content issues (see the example after this list).
  • Burying links in JavaScript/Flash/Silverlight; keep content out of these as well.
  • Duplicate content
  • Keyword stuffing
  • Cloaking — the practice of showing search engine crawlers different content than you show visitors.
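
To illustrate the URL guidance above, here’s a made-up before-and-after (both URLs are hypothetical):

    Avoid:   https://example.com/index.php?id=392&sessionid=83f2a&sort=asc
    Prefer:  https://example.com/ice-cream/vegan-flavors

The second URL is short, descriptive, keyword-inclusive, and free of the dynamic parameters that can create duplicate content.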

Guidelines for representing your local business on Google

These guidelines govern what you should and shouldn’t do in creating and managing your Google My Business listing(s).

Basic principles:

  • Be sure you’re eligible for inclusion in the Google My Business index; you must have a physical address (even if it’s your home address), and you must serve customers face-to-face, either at your location (like a retail store) or at theirs (like a plumber).
  • Honestly and accurately represent all aspects of your local business data, including its name, address, phone number, website address, business categories, hours of operation, and other features.

Things to avoid:

  • Creation of Google My Business listings for entities that aren’t eligible
  • Misrepresentation of any of your core business information, including “stuffing” your business name with geographic or service keywords, or creating listings for fake addresses
  • Use of PO boxes or virtual offices instead of authentic street addresses
  • Abuse of the review portion of the Google My Business listing, via fake positive reviews of your business or fake negative ones of your competitors
  • Costly, novice mistakes stemming from failure to read the fine details of Google’s guidelines

Fulfilling user intent

Understanding and fulfilling user intent is critical. When a person searches for something, they have a desired outcome. Whether it’s an answer, concert tickets, or a cat photo, that desired content is their “user intent.”

If a person performs a search for “bands,” is their intent to find musical bands, wedding bands, band saws, or something else?

Your job as an SEO is to quickly provide users with the content they desire in the format in which they desire it.

Common user intent types:

Informational: Searching for information. Example: “How old is Issa Rae?”

Navigational: Searching for a specific website. Example: “HBOGO Insecure”

Transactional: Searching to buy something. Example: “where to buy ‘We got y’all’ Insecure t-shirt”

You can get a glimpse of user intent by Googling your desired keyword(s) and evaluating the current SERP. For example, if there’s a photo carousel, it’s very likely that people searching for that keyword are looking for photos.

Also evaluate what content your top-ranking competitors are providing that you currently aren’t. How can you provide 10X the value on your website?

Providing relevant, high-quality content on your website will help you rank higher in search results, and more importantly, it will establish credibility and trust with your online audience.

Before you do any of that, though, you first have to understand your website’s goals in order to execute a strategic SEO plan.

Know your website/client’s goals

Every website is different, so take the time to really understand a specific site’s business goals. This will not only help you determine which areas of SEO you should focus on, where to track conversions, and how to set benchmarks, but it will also help you create talking points for negotiating SEO projects with clients, bosses, etc.

What will your KPIs (Key Performance Indicators) be to measure the return on SEO investment? More simply, what is your barometer to measure the success of your organic search efforts? You’ll want to have it documented, even if it’s this simple:

For the website ________________________, my primary SEO KPI is _______________.

Here are a few common KPIs to get you started:

  • Sales
  • Downloads
  • Email signups
  • Contact form submissions
  • Phone calls

And if your business has a local component, you’ll want to define KPIs for your Google My Business listings, as well. These might include:

  • Clicks-to-call
  • Clicks-to-website
  • Clicks-for-driving-directions

Notice how “Traffic” and “Ranking” are not on the above lists? This is because, for most websites, ranking well for keywords and increasing traffic won’t matter if the new traffic doesn’t convert (to help you reach the site’s KPI goals).

You don’t want to send 1,000 people to your website a month and have only 3 convert to customers; that’s a 0.3% conversion rate. You’d much rather send 300 people to your site a month and have 40 convert: more than 13%.

This guide will help you become more data-driven in your SEO efforts. Rather than haphazardly throwing arrows all over the place (and getting lucky every once in a while), you’ll put more wood behind fewer arrows.

Grab a bow (and some coffee); let’s dive into Chapter 2 (Crawlers & Indexation).


We’re looking forward to hearing your thoughts on this draft of Chapter 1. What works? Anything you feel could be added or explained differently? Let us know your suggestions, questions, and thoughts in the comments.
