Sunday, May 8, 2016

What day is Mother’s Day 2016? Today’s Google Doodle leads to a direct answer


Google is celebrating moms today with a Mother’s Day doodle illustrated by doodler Sophia Diao.

“As we get older, we forget how heavily we once relied on our mothers and mother-figures. Today’s doodle for Mother’s Day harkens back to a time in my youth when following Mom around was all I knew,” writes Diao on the Google Doodle blog.

The Mother’s Day-themed logo leads to a search for “What day is Mother’s Day 2016?” and includes the usual sharing icon so that users can post the doodle on their social pages.

Search Engine Land wishes all the moms out there a very Happy Mother’s Day!



Friday, May 6, 2016

SearchCap: Google APIs, global SEO & public relations


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Local & Maps

Link Building

SEO

SEM / Paid Search

Search Marketing



5 reasons to keep doing mobile SEO even though ads are everywhere


“Look,” says your boss, directing your attention to the Google search results on her iPhone. She just searched for “life insurance quotes,” and the organic listings are nowhere to be seen. She actually has to scroll down past the third ad before she gets to the first organic listing.


“Why would we continue to pay any attention to organic search on mobile when Google is only showing ads?” she says. “I just read a report from a prominent agency that said organic search visits were down seven percent year over year in Q1, as increased monetization of mobile results is pushing more traffic to paid listings, and that mobile traffic share has been flat for organic search in the past year, but it is up 10 points for paid search. Let’s just shift the budget into paid and be done with it.”

If you’re interested in growing your traffic overall, you should resist that suggestion.

While it’s true that organic search visits are down overall, according to recent reports, there are many reasons you should continue doing mobile SEO in 2016. Here are five of them.

1. The first organic listing in mobile still gets 73% more clicks than the first and second sponsored listings combined.

The first three points I’m going to mention come from research done last month by Mediative, a Montreal-based digital marketing agency and originator of the Golden Triangle study. Their white paper called “How do consumers conduct searches on Google using a mobile device?” is definitely worth downloading (registration required) if you’re interested in solid research on mobile search behavior.

[See the full story on Marketing Land]

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



Search in Pics: Karl Urban’s “I’m a doctor, not an SEO,” make SEO great again hat & Google wine club


In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.

Google wine club:

Source: Twitter

Make SEO great again hat:

Source: Twitter

Karl Urban in Star Trek: I’m a doctor, not an SEO:

Source: Facebook

Google New York LEGO wall:

Source: Twitter



Google updates the Google My Business API to version 3.0


Google has released version 3.0 of the Google My Business API. Google has not yet formally announced the release, but version 3.0 is marked as new in the changelog, and the new features page documents the updated API.

The original Google My Business API was released in December of last year, giving businesses the ability to automate the management of their business listings.

Version 3.0 “adds new functionality for people who manage locations at scale,” Google said. The “key new features include the ability to read and respond to customer reviews and provide additional attributes for locations, such as whether a restaurant accepts reservations, serves brunch, or has outdoor seating,” Google added.

Here is the changelog for version 3.0:

  • Attributes: Provide additional, category-specific information about locations.
  • Find Matching Location: Find and manually associate existing Maps locations with your business location.
  • Transfer Location: A new :transfer action on Location allows transferring a location from one account (business or personal) to another (see the sketch after this list).
  • Preferred Photo: Indicate which photo you’d prefer to show up first in Google Maps and Search.
  • New Search Filters: New search filters include any_google_updates, is_suspended, and is_duplicate.
  • New Location States: Location states now also include is_verified and needs_reverification.
  • Photo URL Improvements: The API now accepts photo URLs without an image format suffix.
  • Backwards-incompatible changes: Photos can now only be updated for locations with a Google+ page (these were previously accepted and silently dropped). The location_name and category_name fields are now output only; use category IDs when setting categories. Field masks no longer require the location. prefix for included fields. Create/update operations now take the location as the body payload; other parameters have moved to the query string.
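
To make the new :transfer action concrete, here is a minimal sketch in Python using plain HTTP. It assumes you already hold an OAuth 2.0 access token with the My Business scope; the account and location IDs are hypothetical placeholders, and the exact endpoint path and body shape should be verified against Google’s v3 reference before use.

```python
# Minimal sketch of the v3 :transfer action. All IDs and the token are
# hypothetical placeholders; check the request shape against Google's docs.
import requests

ACCESS_TOKEN = "ya29.example-token"            # OAuth 2.0 token (placeholder)
SOURCE_ACCOUNT = "accounts/1234567890"         # account currently owning the location
LOCATION = f"{SOURCE_ACCOUNT}/locations/987"   # location to move (placeholder ID)
TARGET_ACCOUNT = "accounts/0987654321"         # destination account (placeholder)

resp = requests.post(
    f"https://mybusiness.googleapis.com/v3/{LOCATION}:transfer",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"toAccount": TARGET_ACCOUNT},
)
resp.raise_for_status()
print("Location transferred to", TARGET_ACCOUNT)
```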


Thursday, May 5, 2016

SearchCap: Google featured snippets, Viv assistant & AdWords redesign preview


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Local & Maps

Link Building

Searching

SEO

Search Marketing



Son of Siri: Viv aims to go way beyond today’s digital assistants


There’s a confluence of technology advancements that are dramatically changing “search”: mobile, artificial intelligence, big data and natural language processing. From Siri and Alexa to Facebook M and Jibo, voice UIs and virtual assistants are the future.

Ahead of its public unveiling on Monday, The Washington Post ran a story on next-generation virtual assistant Viv. Viv could be described as Son of Siri or Siri 2.0, with much more focus on AI and commerce. It’s built by the same people who launched Siri before Apple acquired it, including co-founder Dag Kittlaus.

Believe it or not, Siri launched way back in 2009 with the goal of advancing the search experience using a natural language interface and delivering actionable/transactional results rather than a SERP. The Post article uses the example of ordering pizza from a nearby restaurant to showcase Viv’s conversational-transactional potential:

“Get me a pizza from Pizz’a Chicago near my office,” one of the engineers said into his smartphone. It was their first real test of Viv, the artificial-intelligence technology that the team had been quietly building for more than a year. Everyone was a little nervous. Then, a text from Viv piped up: “Would you like toppings with that?”

In fact, this was always the vision for Siri. The idea was to enable people to speak their questions and objectives, which would then be fulfilled by third-party providers via back-end API integration, thereby cutting out the SERP. However, that vision was only partly realized before Apple acquired Siri. And while Cupertino has certainly improved Siri’s functionality and usability, it hasn’t invested in enabling Siri to achieve its full potential.

Now Viv hopes to pick up where Siri left off.

The company has been building its technology for several years. But rather than present itself as a next-gen search engine or even a digital assistant, Viv’s positioning is much more focused on the AI angle. The company’s website says it “radically simplifies the world by providing an intelligent interface to everything.”

If this doesn’t sound like Google, or a replacement for Google, I’m not sure what does.

The Post article says that there have already been acquisition offers from Google and others. It also says that Facebook’s Mark Zuckerberg is an indirect investor. If Viv can deliver anything approaching its lofty ambitions, it will be bought in short order, though Kittlaus and his co-founders might resist, hoping to see what the technology can achieve further along.

Another intriguing angle in the Post story is the way that Viv (and related technologies) might not only displace search but might equally disrupt apps. With a voice-powered virtual assistant that can fulfill transactions (“order a pizza,” “get Uber,” “make a hotel reservation”), apps hypothetically become less necessary, if not unnecessary.

The issue, as with Siri, is deciding who fulfills the request. However, I’m sure Kittlaus and his team have thought carefully about this.

In the original vision for Siri, users would specify a favorite provider (e.g., OpenTable, Kayak, etc.) to handle fulfillment. But because voice is an imperfect interface and complex transactions cannot be fulfilled by speech and voice prompts alone, it’s likely that apps (and the mobile web) will stick around for the foreseeable future.

According to 2015 research from MindMeld, use of voice search and virtual assistants is growing dramatically. In addition, Amazon Echo (with assistant Alexa) has proven to be the company’s most popular hardware device. And Microsoft just announced that Cortana “has helped answer over 6 billion questions since launch.”

All these developments show significant momentum for voice and virtual assistants. As that continues, powered by AI and better results (including predictive results), major questions will arise for publishers, developers and advertisers. For example, what will happen to SEM and the search ad model? How can publishers and brands optimize content for voice search?

Nothing will change in the near term. Yet the combination of the technology developments I mentioned above all but guarantees that search, content retrieval and commerce will look radically different in a few years than they do today.



Google featured snippets now with related topics, extending the information in those snippets


Google is now showing extended featured snippets, where they add more information to the top featured snippet box for some queries. Featured snippets are often seen in the Google search results when Google is confident they can answer your query by extracting content from a specific web page.

Now, it seems Google is showing extended versions of them showing “related topics” that hyperlink to additional queries in Google.

Here are two screen shots, one for [birth control] and the other for [personal loan]. Both those queries currently bring up extended featured snippets for me:

[Screenshots: extended featured snippets for “birth control” and “personal loan”]

It is not surprising to see Google place more content and information in the featured snippet box.



Wednesday, May 4, 2016

Updated: Google penalizes mobile sites using sneaky redirects


In October 2015, Google warned webmasters not to trick mobile users by redirecting them to an unexpected website. Well, today, Google announced on Google+ and Twitter that it has been “taking action on sites that sneakily redirect mobile users to spammy domains.” Google later issued a correction to Search Engine Land, clarifying that it did not issue any new manual actions recently and that the post was simply a reminder to webmasters not to use sneaky redirects.

Google wrote, “As mentioned in Webspam Report 2015, spam reports from users are an important part of our spam-fighting efforts. They often help us surface issues that frustrate users – like the trend of websites redirecting mobile users to other, often spammy domains.” Google added, “to combat this trend, we have been taking action on sites that sneakily redirect users in this way.”

Sneaky redirects are never a good thing, and Google has penalized websites for sending users from the search results to a site they did not expect to reach. The same applies to users on mobile devices searching Google’s mobile results. Google wants users to land on the site they expect based on the snippet Google shows.

Here is an illustration of that behavior:

[Image: illustration of a sneaky mobile redirect]

Google added that “if your site has been affected, read this Help Center article on detecting and getting rid of unwanted sneaky mobile redirects.”
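
If you want to check a page yourself, one quick test is to request it with a desktop user agent and again with a mobile one, then compare where each request ends up. Below is a minimal sketch using Python’s requests library; the test URL and user-agent strings are placeholders. Note that this only catches HTTP-level redirects: JavaScript-based redirects, which many sneaky setups use, require a headless browser to detect.

```python
# Compare the final URL a desktop browser and a mobile browser are sent to.
# A mismatch that lands mobile users on an unrelated domain is a red flag
# for the sneaky redirects described above. URL and UA strings are examples.
import requests

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 9_3 like Mac OS X) "
             "AppleWebKit/601.1 (KHTML, like Gecko) Mobile/13E233")

def final_url(url: str, user_agent: str) -> str:
    # Follow HTTP redirects and report the URL we actually land on.
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return resp.url

url = "http://example.com/some-landing-page"  # hypothetical page to test
desktop, mobile = final_url(url, DESKTOP_UA), final_url(url, MOBILE_UA)
if desktop != mobile:
    print(f"Possible sneaky redirect: desktop -> {desktop}, mobile -> {mobile}")
else:
    print("Desktop and mobile land on the same URL.")
```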

We are in the process of getting more details from Google on this announcement.

Postscript: Google has updated us, telling us that this is an old notification and that no new manual actions have been sent out today.



SearchCap: Google rich snippet spam, AMP report & Bing ads data


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Link Building

Searching

SEO

SEM / Paid Search

Search Marketing



Bing Ads makes it easier to segment performance data like time of day and device 


Performance analysis in Bing Ads just got easier with the addition of a new Segment tab on the Campaigns page.

You can segment your campaign data by time (of day, day of week, and even by month, quarter, or year), network, device, and top vs. other from the main interface rather than having to navigate to the Reports section.

To download segmented campaign data, you’ll need to select the option you want from the Download report window (just like in AdWords).

Note that if you’re segmenting by time, there are some limits. For example, if you select “Day,” the maximum date range you can look at is 16 days.

The feature is now available to accounts in the US and UK.



Google updates the AMP report in the Google Search Console


Google’s John Mueller announced on Google+ that the Google Search Console report for AMP pages has been updated to improve categorization and “better group similar issues.” It will also now give you more information on the individual problems Google discovered while crawling your AMP pages.

Here is a screen shot of the updated report:

[Screenshot: the updated AMP report in Google Search Console]

Google first added this report in January 2016 to help publishers debug their AMP errors. This would be the first update to the report since then.

John Mueller added, “if you’ve set up AMP, even if you just installed the AMP plugin for your blog, I’d recommend checking it out!” You can access the report in Google Search Console by selecting a verified site from your profiles.



Jane Jacobs’ Google Doodle marks 100th birthday of famous NYC community organizer


[Image: Jane Jacobs Google Doodle]
Today’s Google Doodle celebrates urban activist and community organizer Jane Jacobs on what would have been her 100th birthday.

Known for her research and work around urban development, Jacobs believed a city is at its best when its residents can interact with each other on the streets and patronize local businesses.

“She stood by beloved neighborhoods that were unjustly slated for ‘renewal’ and revealed political biases in the permit process for new projects,” reports the Google Doodle blog.

Today’s Doodle honors the 100th birthday of this fierce protector of New York City’s urban landscape.

Referred to as a “self-taught journalist,” Jacobs penned a number of books on urban development, including “The Death and Life of Great American Cities,” “Dark Age Ahead” and “The Economy of Cities.”

Google’s Jane Jacobs Doodle leads to a search for “Jane Jacobs” and includes a sharing icon to post the image on social networks.

Along with a brief overview of Jacobs’ work, the Google Doodle Blog also included the following initial sketches of today’s Jane Jacobs Doodle:
[Images: initial sketches of the Jane Jacobs Doodle]



Performing a manual backlink audit, step by step


It might be every SEO’s least favorite job: the backlink audit. This is not because the work itself is horrible (though it can be tedious on sites with large link footprints), but because it’s almost always performed when a domain is in trouble.

Whether you’re reading this article because you’re an SEO looking at new strategies or a site owner who has received a link-based penalty, I hope you find the methodology below helpful.

I should note before proceeding that I prefer robust datasets, and so I’ll be using four link datasets in the example. They are:

  • Google Search Console
  • Majestic
  • Ahrefs
  • Moz (Open Site Explorer)

Though I have paid accounts with all of the tools above (except for the Search Console, which is free), each offers a way to get the data for free — either via a trial account or free data for site owners. There are also other link sources you can use, like Spyfu or SEMrush, but the above four combined tend to capture the lion’s share of your backlink data.

Now, let’s begin …

Pulling data

The first step in the process is to pull the data from the above listed sources. Below, I will outline the process for each platform.

Google Search Console

  1. Once logged in, select the property you want to download the backlinks from.
  2. In the left navigation, under “Search Traffic,” click “Links to Your Site.”
  3. Under the “Who links the most” column, click “More.”
  4. Click the buttons, “Download more sample links” and “Download latest links,” then save each CSV to a folder.
[Screenshot: downloading backlinks from Search Console]

Majestic

  1. If you don’t have one, create an account, as you’ll need it to export data. If all you want is access to your own site’s data (which is what we want here), they’ll give you free access to it. You can find more information on that at http://ift.tt/10RfLxL. The rest of these instructions will follow the paid account process, but they are essentially the same.
  2. Enter your domain into the search box.
  3. Click the “Backlinks” tab above the results.
  4. In the options, make sure you have “All” selected for “backlinks per domain” and “Use Historic Index” (rather than “Use Fresh Index”).
  5. Click “Export Data,” and save the file to the folder created earlier.
  6. If you have a lot of data, you will be directed to create an “Advanced Report,” where you then need to create a “Domain Report.”
[Screenshot: exporting backlink data from Majestic]

Ahrefs

  1. If you don’t have an account, you can sign up for a free trial.
  2. Enter your domain into the search box.
  3. Click “Backlinks” at the top of the left navigation.
  4. Select “All links” in the options above the results.
  5. Click “Export,” and save the file to the folder created earlier.
[Screenshot: exporting backlink data from Ahrefs]

Moz

  1. If you don’t have an account, sign up for a free trial to get complete data.
  2. From the home page, click “Moz Tools” in the top navigation, then “View all Moz products” in the drop-down.
  3. Click “Open Site Explorer” on the resulting page.
  4. Enter your domain into the URL field.
  5. In the options above the results, select “this root domain” under the target.
  6. Click “Request CSV,” and when it’s available, save it to the same folder as the other files.
[Screenshot: exporting backlink data from Moz’s Open Site Explorer]

Conditioning your data

Next, we need to condition the data by getting all of the backlinks into one list and filtering out the known duplicates. Each spreadsheet you’ve downloaded is a little different. Here’s what you’re looking at:

  • Google Search Console. You’ll have two spreadsheets from the Search Console. Open them both, and copy all the URLs in the first column of each to a new spreadsheet, removing the header rows. Both will go into Column A, one after the other.
  • Majestic. In the Majestic download, you will find the link source URL in Column B. You will want to copy all of these URLs into the same spreadsheet that you’ve copied the Search Console links into. To do this, you will insert the Majestic data directly below the Search Console data in column A.
  • Ahrefs. Ahrefs puts the source URL in column D. Copy all of these URLs to the new spreadsheet, again in Column A, directly below the links you’ve already added.
  • Moz. Moz puts the source URL in Column A. Copy all of these URLs again into Column A of your new spreadsheet directly below the other data you have entered.

Now you should have a list of all the backlinks from all four sources in Column A of a new spreadsheet. You will then select Column A, click on the “Data” tab at the top (assuming you’re working in Excel) and click “Remove Duplicates.” This will remove the links that were duplicated between the various data sources.

The next step is to select all the remaining rows of data and copy them into a Notepad document, then save the document somewhere easily referenced.
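
If you’d rather script this conditioning step than click through Excel, a short Python script produces the same deduplicated list. The file names below are placeholders, and the column positions mirror the layout described above; verify them against your own exports, since these tools occasionally change their CSV formats.

```python
# Merge the source-URL columns from the four exports and dedupe them,
# mirroring the Excel steps above. File names are hypothetical placeholders;
# column indexes follow the layouts described in the text (0-based).
import csv

sources = [
    ("gsc_latest_links.csv", 0),   # Search Console: URLs in column A
    ("gsc_sample_links.csv", 0),   # Search Console: URLs in column A
    ("majestic_export.csv", 1),    # Majestic: source URL in column B
    ("ahrefs_export.csv", 3),      # Ahrefs: source URL in column D
    ("moz_export.csv", 0),         # Moz: source URL in column A
]

seen = set()
links = []
for filename, col in sources:
    with open(filename, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        for row in reader:
            if len(row) > col and row[col] and row[col] not in seen:
                seen.add(row[col])
                links.append(row[col])

with open("all_backlinks.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(links))

print(f"Wrote {len(links)} unique backlink URLs.")
```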

And now the fun part…

You’ve now got a list of all of your inbound backlinks, but that’s not particularly useful. What we want to do next is to gather unified data for them all. That’s where URL Profiler comes in. For this step, you’ll have to download URL Profiler. Like the other tools I’ve mentioned thus far, URL Profiler has a free trial; so if it’s a one-off, you can stick with the trial.

Once downloaded and installed, there’s a bit of a setup process designed to aid you in a speedy analysis. The first thing you’ll need to do is click the “Accounts” menu, which will bring up the windows to enter your API keys from the various tools discussed previously.

Helpfully, each tab gives you a link to the step-by-step instructions on getting your various API keys, so I won’t cover that here. That leaves me to get to the good part…

You will now be presented with a screen that looks like this:

[Screenshot: URL Profiler]

The first step is to right-click the large, empty URL List box on the right and select “Import From File.” From there, choose the Notepad document you created with the links from your spreadsheet above.

You’ll now see a list of all your backlinks in the box, and you’ll need to select all the data that you want to collect from the boxes on the left. The more data you want, the longer it will take, and the more you’ll have to weed through — so you generally only want to select the data relevant to the task at hand. When I am looking for low-quality links, I tend to select the following:

Domain-level data

  • Majestic [Paid]
  • Moz
  • Ahrefs
  • Social Shares
  • Site Type
  • IP Address

URL-level data

  • Majestic [Paid]
  • Moz
  • Ahrefs
  • HTTP Status
  • Social Shares

In the “Link Analysis” field at the bottom, you will enter your domain. This will leave you with a screen similar to this one:

[Screenshot: URL Profiler, ready to go]

Click “Run Profiler.” At this stage, you can go grab a coffee. Your computer is hard at work on your behalf. If you don’t have a ton of RAM and you have a lot of links to crawl, it can bog things down, so this may require some patience. If you have a lot to do, I recommend running it overnight or on a machine dedicated to the task.

Once it’s completed, you’ll be left with a spreadsheet of your links. This is where combining all of the data from all of the backlink sources and then unifying the information you have on them pays off.

So, let’s move on to the final step…

Performing your backlink audit

Once URL Profiler is done, you can open the spreadsheet with the results. It will look something like this:

[Screenshot: URL Profiler output]

Now, the first thing I tend to do is delete all of the tabs except “All.” I love tools that collect data, but I’m not a fan of automated grading systems. I also like to get a visual on everything, even the items I will later be moving into tabs similar to the ones I am deleting at this stage (more below).

With those extra tabs removed, you are left with a spreadsheet of all your backlinks and unified data. The next step is to remove the columns you don’t want cluttering your spreadsheet. It’s going to be wide enough as-is without extra columns.

While the columns you select to keep will depend on specifically what you’re looking for (and which data you decided to include), I tend to find the following to be globally helpful:

  • URL
  • Server Country
  • IP Address
  • Domains On IP
  • HTTP Status Code (and if you don’t know your codes, HTTP Status)
  • Site Type
  • Link Status
  • Link Score
  • Target URL
  • Anchor Text
  • Link Type
  • Link Location
  • Rel Nofollow
  • Domain Majestic CitationFlow
  • Domain Majestic TrustFlow
  • Domain Mozscape Domain Authority
  • Domain Mozscape Page Authority
  • Domain Mozscape MozRank
  • Domain Mozscape MozTrust
  • Domain Ahrefs Rank
  • URL Majestic CitationFlow
  • URL Majestic TrustFlow
  • URL Mozscape Page Authority
  • URL Mozscape MozRank
  • URL Mozscape MozTrust
  • URL Ahrefs Rank
  • URL Google Plus Ones
  • URL Facebook Likes
  • URL Facebook Shares
  • URL Facebook Comments
  • URL Facebook Total
  • URL LinkedIn Shares
  • URL Pinterest Pins
  • URL Total Shares

And for those who have ever made fun of me because my desk looks like …

[Photo: Dave Davies’ desktop]

… now you know why! While doable on a single monitor, it would require a lot of scrolling. I recommend at least two monitors (and preferably three) if you have a lot of backlinks to go through. But that’s up to you.

Now, back to what we do with all these rows of backlinks.

The first step is to create three new tabs. I name mine: nofollow, nolink and nopage.

  1. Step one: Sort by HTTP status, and remove the rows that don’t produce a 200 code. Essentially, these pages did once exist and don’t anymore. I will occasionally run them through the URL Profiler again just to make sure a site is not temporarily down, but for most uses, this isn’t necessary. Move these to the “nopage” sheet in your Excel doc.
  2. Step two: Sort by Link Status. We only need the links that are actually found. The databases (especially Majestic’s historic) will hold any URL that had a link to you. If that link has been removed, you obviously don’t want to have to think about it in the auditing process. Move these to the “nolink” tab.
  3. Step three: Sort by Rel Nofollow. In most cases, you don’t need to spend time on the nofollowed links, so it’s good to get them out of the data you will be going through. Move these to “nofollow.” (A scripted version of these three steps is sketched after this list.)
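
For those who prefer scripting to manual sorting, here is a sketch of the same three passes in Python with pandas. The column names follow the URL Profiler headings listed earlier, but the values being matched (“Found,” “Yes”) are assumptions that should be checked against your actual export.

```python
# Split the URL Profiler export into the buckets described above. Column
# names follow the article's list; the matched values are assumptions.
import pandas as pd

df = pd.read_excel("url_profiler_output.xlsx", sheet_name="All")

# Step one: pages that no longer return a 200 go to "nopage."
nopage = df[df["HTTP Status Code"] != 200]
df = df[df["HTTP Status Code"] == 200]

# Step two: links the crawler could not actually find go to "nolink."
nolink = df[df["Link Status"] != "Found"]
df = df[df["Link Status"] == "Found"]

# Step three: nofollowed links go to "nofollow."
nofollow = df[df["Rel Nofollow"] == "Yes"]
df = df[df["Rel Nofollow"] != "Yes"]

# Write everything back out as one workbook with the four tabs.
with pd.ExcelWriter("backlink_audit.xlsx") as writer:
    df.to_excel(writer, sheet_name="All", index=False)
    nopage.to_excel(writer, sheet_name="nopage", index=False)
    nolink.to_excel(writer, sheet_name="nolink", index=False)
    nofollow.to_excel(writer, sheet_name="nofollow", index=False)

print(f"{len(df)} links left to review manually.")
```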

In the site I am using in this example, I started with 10,883 rows of links. After these three steps, I am left with 5,393. I now have less than half the links I initially had to sort through.

Working with the remaining data

What you do with your data now will depend on specifically what you are looking for. I can’t possibly list off all the various use cases here, but following are a few of the common sorting systems I use to speed up the review process and reduce the number of individual pages I need to visit when trying to locate unnatural links:

  • Sort by anchor text first, then URL. This will give you a very solid picture of anchor text overuse. Where you see heavy use of specific anchors or suspicious ones (“payday loans,” anyone?), you know you need to focus in on those and review the links. By grouping by domain secondarily, you won’t accidentally visit 100 links from the same domain just to make the same decision. It will also make run-of-site issues far more apparent. (This sort and the next are sketched in code after this list.)
  • Sort by domains on IP first, then URL. This will give you a very quick grasp of whether your backlink profile is part of a low-end link scheme. If you’ve bought cheap links, you might want to start with this one.
  • Sort by site type first, then Majestic, Ahrefs or Moz scores. I’ll leave it up to you which score you trust more, though none should be taken as gospel. These scores are based on algorithms built by some very smart people, but not Google. That said, if you see good scores on all three, you can at least review the site knowing this.
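
Continuing the pandas sketch from above, the first two sorts might look like this; again, the column names come from the URL Profiler headings listed earlier.

```python
# Two of the sorts described above, applied to the filtered spreadsheet.
import pandas as pd

df = pd.read_excel("backlink_audit.xlsx", sheet_name="All")

# Sort 1: anchor text first, then URL, to surface anchor-text overuse
# and run-of-site links without visiting the same domain repeatedly.
by_anchor = df.sort_values(["Anchor Text", "URL"])

# Sort 2: domains on IP (descending) first, then URL, to flag links from
# cheap hosting with hundreds of sites sharing one IP address.
by_ip_density = df.sort_values(["Domains On IP", "URL"], ascending=[False, True])

print(by_anchor[["Anchor Text", "URL"]].head(20))
print(by_ip_density[["Domains On IP", "IP Address", "URL"]].head(20))
```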

As I’m reviewing — and before visiting a link — I tend to scan the various scores, the social shares for the URL and the Link Location. This will tell me a lot about what I’m likely to find and where I’m likely to find it.

Over time, you’ll develop instincts on which links you need to visit and which you don’t. I tend to view perhaps more than I need to, but I often find myself working on link penalty audits, so diligence there is the key.

If you simply want to review your backlinks periodically to make sure nothing problematic is in there, then you’ll likely be able to skip more of the manual reviewing and base more decisions on data.

In conclusion

The key to a good audit of any type is to collect reliable data and place it in a format that is as easy as possible to digest and work with. While there’s a lot of data to deal with in these spreadsheets, any less and you wouldn’t have as full a picture.

Though this process isn’t automated (as you’re now well aware), it dramatically speeds up the process of conducting a backlink audit, reduces the time you need to spend on any specific page judging your links (thanks to the developer for adding the “Link Location” in) and allows for faster bulk decisions.

For example, in simply sorting by URL, I quickly scanned through the list and found directory.askbee.net linking 4,342 times due to some major technical issues with a low-quality directory. Now, we’re down to 1,051 links to contend with.

Again, each need requires different filters, but as you play with different sorting based on what you need to accomplish, you’ll quickly discover that a manual backlink audit, while taxing and time-consuming, doesn’t have to be the nightmare it can often seem.

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



Brand bidding & PPC optimization: future of brand protection (part 8)


Welcome to the conclusion of my eight-part series on brand bidding. If you’ve stuck with me this far, you’ve seen why brand bidding deserves the 10,000+ words I’ve written on the topic.

As an ad monitoring platform with global reach, The Search Monitor (disclosure: my employer) is privy to a wealth of performance data on brand bidding. We’ve had two major “aha” moments lately:

  1. The top PPC performers are vigorously defending their branded terms, while simultaneously bidding smartly on others’ branded searches.
  2. Many marketers pay little attention to their own branded searches and are outright missing opportunities to bid on their competitors’ branded terms.

Before we introduce today’s topic, let’s review what we’ve discussed so far:

  • Part 1: How We Got Here. A nostalgic walk through the history of PPC bidding, starting with the good old five-cent click days.
  • Part 2: Value of Keywords. Filled with juicy stats on the value of brand bidding — make a case for your boss to spend more here!
  • Part 3: Best Practices. Provides detailed implementation tactics — the how-tos of brand bidding. Most important read of the list.
  • Part 4: Partner Relationships. Discusses different partner options, the benefits of working with them and effective tactics for blocking out competitors on branded keywords.
  • Part 5: Reducing Competition. How to deal with brand bidding competitors, including a case study for Avery office supplies with impressive brand protection results.
  • Part 6: Enforcement Options. Discusses your legal options and how to enact them, including search engine complaints, pacts and agreements, and the dreaded lawsuit.
  • Part 7: Effective Bidding Techniques. Focuses on the most effective brand bidding techniques we’ve seen in 2016, including screen shots from top PPC advertisers across five different products.

Today’s article focuses on the future of brand protection. I will provide five important trends in brand bidding, each accompanied by a tip for taking advantage of the trend. So, let’s go! 

Trend #1: Agencies will incorporate brand bidding into best practices 

I predict that brand bidding as an optimization tool will grow into an agency best practice. Agencies have a huge opportunity to use brand bidding to improve client campaign performance, increase retention and boost their business development efforts.

For campaign performance, just look at the results produced for Avery in this brand optimization case study. You’ll see how an agency used brand bidding and optimization to boost Avery’s clicks and bring CPC (cost per click) down.

Many of our agency clients have told us how they routinely highlight their brand protection services (and actual results) when approaching a new client and actually challenge the new client to find similar results with other agencies.

I expect demand for brand optimization skills to continue to grow, especially when you pitch stronger brands. So, start working on those results slides now! 

Trend #2: Regulated industries monitor beyond the SERP 

The Search Monitor has seen an uptick in demand for what we call Content Monitoring: broad monitoring of landing pages, websites, blogs and email based on rules created by the brand owner.

Highly regulated industries like finance (e.g., credit cards, mortgages and educational loans) and pharmaceuticals have to abide by strict marketing rules from government agencies such as the Financial Industry Regulatory Authority (FINRA), the Consumer Financial Protection Bureau (CFPB) and the Food & Drug Administration (FDA).

Even retail, to some extent, has to be careful, because the Federal Trade Commission (FTC) now requires disclaimers on review and blog sites that promote retail products.

The volume of content on the internet can quickly put a regulated brand out of compliance. Even if the violations are accidental and performed by an affiliate, the result can be punishments for the advertiser, including heavy fines.

It will be common practice in the future for agencies and marketing departments working with finance, educational, pharma and retail advertisers to monitor beyond the search results. They will need to expand coverage to web pages, blogs and email in order to fully protect themselves against any potential government fines.

Trend #3: Manufacturers will adopt MAP compliance

The Search Monitor has also seen a rise in inquiries from manufacturers who need help monitoring their retailers for minimum advertised price (MAP) violations. Retailers sometimes lower prices below MAP to attract customers and stay competitive. When one retailer breaks price parity, it causes a ripple effect among the others, and very quickly, a premium brand is selling at prices below MAP.

Each year, we see more vendors providing MAP compliance service. We even saw a Harvard Business School research paper that tested different approaches to increasing MAP compliance (spoiler alert: enhanced monitoring and more credible punishments were the most effective at curbing violations).

I predict that MAP Compliance will be viewed not only as a brand defense tool, but also as a revenue driver for manufacturers, boosting sales by keeping reseller prices in check and preventing parity. 

Trend #4: URL hijacking continues to impact brands

URL hijacking (aka direct linking) is a form of brand bidding where an unauthorized advertiser uses your URL as their display URL.

Who does this? Some (not all) affiliates use this tactic to get easy commissions without having to create a website and brand of their own. Other common offenders are phishing sites that want your traffic and manage to slip by the engines’ editorial review.

If URL hijacking is happening to you, your ads get bumped and replaced by the hijacker’s ads, which will greatly impact your metrics and optimization efforts.

While the search engines could put a stop to this activity with URL ownership verification, I do not see the engines making an effort toward this any time soon.  The best defense, instead, is to monitor, quantify the impact, and then use enforcement techniques discussed in Part 6 of this series.

Brian Wensel, digital media director at R2C Group, shared how his agency quantifies the impact of URL hijacking for their clients: “R2C Group uses The Search Monitor’s Knock-Out statistic to augment our impression share (IS) data from Google. We’ve learned that Google’s IS measurement does not account for hijacker activity, even when we know hijacking is happening. Only the knock-out stat alerts us right away to the possibility of URL hijacking.” 

Trend #5: Hotel brands will adopt price parity compliance

Another brand protection issue we’ve seen on the rise is unique to hotel brands. Similar to manufacturers, they need to make sure their resellers, the online travel agencies (OTAs), are in compliance with their listed room rates.

In particular, we’re seeing increasing adoption of The Search Monitor’s hotel price parity reports. These monitor the hotel listings module on Google, looking for price parity for the same property between resellers such as Expedia, Kayak and Travelocity.

Usually, a parity violation is an accident or oversight, but because it can cost hotel brands clicks lost to their OTAs or money lost on an incorrect room rate, forward-thinking hotel brands will focus on controlling price parity in the future.

Final thoughts on brand bidding & PPC optimization

I started this series by showing how brand bidding is the latest in a long timeline of PPC growth tactics. But this tactic has a defined shelf life, which is why I created this series, so you can jump on board now.

I’ve provided all the tips you need to protect your brand and bid effectively on others. For some last-minute tips before I conclude, check out articles on keyword selection, working with partners, legal options and effective bidding techniques, and an eye-opening brand bidding case study from Avery.

“The Shawshank Redemption” famously told us to “get busy livin’, or get busy dyin’.” Advertisers have a similar choice. They can stand up and actively protect their valuable branded searches and nurture their potential. Or they can remain complacent and watch their competitors steal their clicks, letting their performance deteriorate. So, what are you waiting for?

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



5 super-common SEO mistakes content marketers make


Without a sufficient amount of link authority, Google isn’t going to give your site the time of day. If it seems that despite all your content marketing efforts, the needle just isn’t moving in the Google SERPs, then you’re probably making one of these five avoidable SEO mistakes:

1. Hustling for likes instead of links

Your social strategy is completely misaligned if you aren’t incorporating SEO goals into it. SEO must underpin your social media strategy. It’s not the likes, retweets, shares and plus-ones that are going to prop you up in Google; it’s the links. As such, getting influential bloggers to link to your site should be a primary raison d’être for your social campaigns.

Consider, for example, Old Spice’s gag website TheFlatteringMan.com. The site got plenty of press, but did they put even a single link back to OldSpice.com to leverage their link authority? Nope.

[Screenshots: Old Spice muscle shirt page and its HTML source]

2. Misplacing the content

Remarkable content needs a home where it will attract the most links to your main site, where the links lead directly to your site and not through an intermediary site with lots of links to other folks’ sites, and where the links to your site won’t be nofollowed.

Thus, a social site like YouTube, Pinterest, Twitter, LinkedIn or Facebook is a less-than-ideal home for your content because external links are nofollowed. Even a third-party site like Huffington Post or BuzzFeed can be less than ideal, since you are at the mercy of their editorial guidelines and the number of competing links on the page.

If you wrote a great listicle, you’d think it would be a huge win to get it published on BuzzFeed. It’s not, at least not from an SEO perspective. That’s because BuzzFeed doesn’t allow you to drop links to your site, even if you are a paying advertiser such as Victoria’s Secret.

Victoria’s Secret may have received quite a few views on their “12 Things Women Do Every Day That Are Fearless” piece in BuzzFeed, but take a look, and you will see that there isn’t a single link back to the Victoria’s Secret website. It’s only after you click the author link that you see a link to the main site.

For those of us trying to build up our link authority, the best spot for hosting our linkworthy content will almost always be on our own site.

[Screenshot: HTML source of the BuzzFeed article]

3. Targeting the wrong audience

This can be tough to wrap your head around, but you already blew it if your content marketing campaign is laser-targeted to your ideal customer.

From an SEO perspective, your most important audience isn’t your customers, it’s the linkerati, i.e., the online influencers who have the most authority in the eyes of Google. Yes, you are going to get the most out of your campaign by targeting those who can link back to you from trusted, authoritative, important sites. If you are only writing content pieces for your customers, you are missing the boat.

Now of course, relevance is still a factor here. If you’re getting a link from a big player in the game design community and you sell yoga clothes, it isn’t necessarily going to be as helpful to you.

To target the linkerati, it becomes as simple (and as complex) as creating content that people want to link to. That sometimes means you’re going to need to think outside the box and branch out a bit from your traditional approach to content. Simply being helpful, useful or educational is not going to cut it. You need to create remarkable content — content worth spreading.

Caterpillar hit it out of the park with their giant Jenga campaign, which featured a video of two CAT machines playing the world’s largest game of Jenga with massive wooden blocks weighing eight tons in total. The video was hosted on YouTube (with almost 3.5 million views!), and CAT cleverly created a page on their site featuring the video to attract links.

[Screenshots: Caterpillar’s Jenga stack challenge page and YouTube video]

ShipServ, on the other hand, lost the content game before it even started with their explainer video made with Legos. Cute concept, but a big disconnect: only serious prospects of their software would want to watch a video explaining how their solution worked.

It didn’t appeal to the army of influencers online, and it shows, with views in the thousands rather than the millions.

[Screenshot: ShipServ’s YouTube video]

LifeInsure.com successfully tapped into an unexpected goldmine of links and buzz with their article, “19 Things You Probably Didn’t Know About Death.” It doesn’t seem like the type of politically correct fare a life insurance brokerage might feature, right?

But, brilliantly, that article wasn’t targeted to customers. In fact, you would never find the article by poking around on their site. It’s an orphan page. The intended audience for the article was the linkerati, and it was seeded into social media sites where the linkerati hang out.

The approach paid off in spades. The lifeinsure.com home page ranked on page 1 for “life insurance” in Google, Bing and Yahoo — for years.

[Screenshot: LifeInsure.com home page]

4. Being activity-focused

Many SEO practitioners, unfortunately, are task-oriented. They believe that just because it’s a “best practice,” it deserves to be on the to-do list. I challenge that thinking. I’d argue you probably have items on your SEO to-do list that aren’t worth doing and should be removed; they simply aren’t going to move the needle enough.

Or they may be a second-order activity for when you have time after finishing all the first-order activities. Meta descriptions would fit in that category. They don’t influence your rankings, thus they don’t deserve to be prioritized up there with title tags.

Instead, I suggest being outcome-focused: creating a big, hairy, audacious goal, making sure everyone on your team is on board with the goal and systematically working to achieve it. Once the desired outcome has been achieved, come up with a new goal, rather than working through the rest of the to-dos.

5. No help from a power user

Power users are a link builder’s secret weapon. Power users are bloggers, social media mavens, journalists or celebrities with a huge following on social media, and thus, huge reach.

The amplification that power users can provide is game-changing. You don’t need an army of them. All you really need is one power user in your hip pocket.

It can be a challenge to recruit that power user, but once you do, that power user can provide the initial push that starts the snowball effect you need to go viral. In fact, the primary reason for the success of the aforementioned LifeInsure.com campaign was a power user.

Be prepared to pay for that power user, either in cash or in favors. Nothing’s free in this world. If you don’t know how to find that power user, look to your SEO or social media consultant. They may already have a relationship with one.

Over the years, I have developed relationships with power users on reddit, StumbleUpon, Facebook, Twitter, Pinterest and Instagram, among others. Those relationships are worth their weight in gold.

Case in point: Power user Jeremy Schoemaker, aka Shoemoney. I asked him to run a contest in conjunction with my client, OvernightPrints.com, and he agreed. My campaign idea was that an entrant could “Win Free Business Cards for Life” by designing Jeremy’s new business card.

Jeremy is a major online influencer. His promoting the contest and my client on his blog, on YouTube and so on, made a huge and lasting impact. That contest got my client to #2 in Google for “business cards” — and they were buried deep in the SERPs prior to this campaign!

[Screenshot: Shoemoney’s website]

There’s also a surprisingly effective way to do cold outreach to influencers via email, but I’ll save that for my next column.

In that article, I’ll also discuss the wrong language to use in your campaigns, how to collect intel on your competitors and various inadvertent ways to destroy the SEO value of your campaigns.

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.

