Monday, May 9, 2016

Voice search reporting coming to Google Search Console’s Search Analytics report


Google webmaster trends analyst John Mueller said on Friday in a Google hangout, at the 23-minute mark, that Google is looking for ways to show webmasters in Google Search Console how people are finding their pages through voice search.

John explained that Google wants to provide a way in the Search Analytics report to segment out how people search for your site using a keyboard versus voice search. John said Google wants to “kind of make it easier to pull out what people have used to search on voice and what people are using by typing. Similar to how we have desktop and mobile set up separately.”

John added that it is “tricky” because many voice searches are phrased as much longer sentences; by default, Search Analytics may not see enough volume for such a query and will group it with other low-volume keywords, thus not showing it in the report. But he said they have discussed internally how to go about separating out voice searches in that report.

Here is the Q&A I transcribed:

(Q) Does Google plan to include voice search in Search Console reports in the future?

(A) I don’t know what the exact plans are there, but we have discussed something like that. To kind of make it easier to pull out what people have used to search on voice and what people are using by typing. Similar to how we have desktop and mobile set up separately. I think some of that might be trickier because in practice voice queries are more long form, they’re more like sentences and real questions. And sometimes those exact queries don’t get used that often, so we might be filtering them out in Search Console. But it’s definitely something we’ve talked about; we’ve looked into different other types of queries or search results as well to see if there is something that we could be doing there differently. If you have any explicit examples of specifically how you think this type of feature, or any other feature in Search Console, could make it easier to really make high-quality websites, to really get some value out of Search Console in a way that makes sense for you to improve your service for users, then we’d really love to see those examples.

You can hear it yourself at the 23-minute mark.

John added later, at the 26-minute mark, that Google wants to segment out AMP results as well.



Google widely testing title links in black instead of the traditional blue hyperlink color


Over the weekend, Google began testing a widely noticed change to its search results page: the titles in the search listings appear in a black color instead of the traditional blue.

Here is one of the many screenshots of this in action, this one from @matibarnes:

[Screenshot: Google search results with black title links]

Compare the above screenshot to what most people see, the blue links, and you can see why so many people are complaining:

[Screenshot: the traditional blue title links]

We emailed Google for a comment about this on Saturday but we have yet to hear back. I suspect the response will be something like, “we are constantly testing new ways to improve the user experience, and this is just one of those many tests.”

Overall, the feedback we’ve been hearing about the change in link color has been mostly negative.



Sunday, May 8, 2016

What day is Mother’s Day 2016? Today’s Google Doodle leads to a direct answer


Google is celebrating moms today with a Mother’s Day doodle illustrated by doodler Sophia Diao.

“As we get older, we forget how heavily we once relied on our mothers and mother-figures. Today’s doodle for Mother’s Day harkens back to a time in my youth when following Mom around was all I knew,” writes Diao on the Google Doodle blog.

The Mother’s Day-themed logo leads to a search for “What day is Mother’s Day 2016?” and includes the usual sharing icon so that users can post the doodle on their social pages.

Search Engine Land wishes all the moms out there a very Happy Mother’s Day!



Friday, May 6, 2016

SearchCap: Google APIs, global SEO & public relations


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Local & Maps

Link Building

SEO

SEM / Paid Search

Search Marketing



5 reasons to keep doing mobile SEO even though ads are everywhere


“Look,” says your boss, directing your attention to the Google search results on her iPhone. She just searched on “life insurance quotes,” and the organic listings are nowhere to be seen. She actually has to scroll down past the third ad before she gets to the first organic listing.

[Screenshot: mobile search results for “life insurance quotes”]

“Why would we continue to pay any attention to organic search on mobile when Google is only showing ads?” she says. “I just read a report from a prominent agency that said organic search visits were down seven percent year over year in Q1, as increased monetization of mobile results is pushing more traffic to paid listings, and that mobile traffic share has been flat for organic search in the past year, but it is up 10 points for paid search. Let’s just shift the budget into paid and be done with it.”

If you’re interested in growing your traffic overall, you should resist that suggestion.

While it’s true that organic search visits are down overall, according to recent reports, there are many reasons you should continue doing mobile SEO in 2016. Here are five of them.

1. The first organic listing in mobile still gets 73% more clicks than the first and second sponsored listings combined.

The first three points I’m going to mention come from research done last month by Mediative, a Montreal-based digital marketing agency and originator of the Golden Triangle study. Their white paper called “How do consumers conduct searches on Google using a mobile device?” is definitely worth downloading (registration required) if you’re interested in solid research on mobile search behavior.

[See the full story on Marketing Land]

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.



Search in Pics: Karl Urban’s “I’m a doctor, not an SEO,” make SEO great again hat & Google wine club


In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.

Google wine club:

Google wine club
Source: Twitter

Make SEO great again hat:

Make SEO great again hat
Source: Twitter

Karl Urban in Star Trek: I’m a doctor, not an SEO:

I'm a doctor, not an SEO
Source: Facebook

Google New York LEGO wall:

Google New York LEGO wall
Source: Twitter



Google updates the Google My Business API to version 3.0


Google has released version 3.0 of the Google My Business API. Google has not yet announced it, but you can see version 3.0 marked as new in the changelog and on the new features page, which documents the new API.

The original Google My Business API was released in December of last year, giving businesses the ability to automate the management of their business listings.

Version 3.0 “adds new functionality for people who manage locations at scale,” Google said. The “key new features include the ability to read and respond to customer reviews and provide additional attributes for locations, such as whether a restaurant accepts reservations, serves brunch, or has outdoor seating,” Google added.

Here is the changelog for version 3.0:

  • Attributes: Provide additional, category-specific information about locations.
  • Find Matching Location: Find and manually associate existing Maps locations with your business location.
  • Transfer Location: New :transfer action on Location. Allows transferring a location from one account (business or personal) to another.
  • Preferred Photo: Indicate which photo you’d prefer to show up first in Google Maps and Search.
  • New Search Filters: New search filters include any_google_updates, is_suspended, and is_duplicate.
  • New Location States: Location states now also include is_verified and needs_reverification.
  • Photo URL Improvements: The API now accepts photo URLs without an image format suffix.
  • Backwards-incompatible changes: Photos can now only be updated for locations with a Google+ page (these were accepted and silently dropped before). The location_name and category_name fields are now output only; use only category IDs when setting categories. Field masks no longer require the location. prefix for included fields. Create/update operations now take the location as the body payload; other parameters are moved to the query string.
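
If you work with the API programmatically, the new review functionality might be exercised along the lines of the Python sketch below. This is a minimal illustration, not code from Google’s documentation: the endpoint path, account and location IDs, and field names are assumptions to verify against the official v3 reference.

    # Hypothetical sketch: listing reviews with the Google My Business API v3.
    # Endpoint path, IDs and field names are assumptions -- verify them
    # against Google's official v3 reference before relying on this.
    import requests

    ACCESS_TOKEN = "ya29.example-oauth2-token"  # obtained via OAuth 2.0
    ACCOUNT_ID = "1234567890"                   # placeholder account ID
    LOCATION_ID = "9876543210"                  # placeholder location ID
    BASE = "https://mybusiness.googleapis.com/v3"

    url = "{}/accounts/{}/locations/{}/reviews".format(BASE, ACCOUNT_ID, LOCATION_ID)
    resp = requests.get(url, headers={"Authorization": "Bearer " + ACCESS_TOKEN})
    resp.raise_for_status()

    # Print a short summary of each review returned.
    for review in resp.json().get("reviews", []):
        print(review.get("starRating"), (review.get("comment") or "")[:80])

Replying to a review would presumably be a write request against a similar reviews resource; again, check the v3 changelog and reference for the exact shape.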


Thursday, May 5, 2016

SearchCap: Google featured snippets, Viv assistant & AdWords redesign preview


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Local & Maps

Link Building

Searching

SEO

Search Marketing



Son of Siri: Viv aims to go way beyond today’s digital assistants


There’s a confluence of technology advancements that are dramatically changing “search”: mobile, artificial intelligence, big data and natural language processing. From Siri and Alexa to Facebook M and Jibo, voice UIs and virtual assistants are the future.

Ahead of its public unveiling on Monday, The Washington Post ran a story on next-generation virtual assistant Viv. Viv could be described as Son of Siri or Siri 2.0, with much more focus on AI and commerce. It’s built by the same people who launched Siri before Apple acquired it, including co-founder Dag Kittlaus.

Believe it or not, Siri launched way back in 2009 with the goal of advancing the search experience using a natural language interface and delivering actionable/transactional results rather than a SERP. The Post article uses the example of ordering pizza from a nearby restaurant to showcase Viv’s conversational-transactional potential:

“Get me a pizza from Pizz’a Chicago near my office,” one of the engineers said into his smartphone. It was their first real test of Viv, the artificial-intelligence technology that the team had been quietly building for more than a year. Everyone was a little nervous. Then, a text from Viv piped up: “Would you like toppings with that?”

In fact, this was always the vision for Siri. The idea was to enable people to speak their questions and objectives, which would then be fulfilled by third-party providers via back-end API integration, thereby cutting out the SERP. However, that vision was only partly realized before Apple acquired Siri. And while Cupertino has certainly improved Siri’s functionality and usability, it hasn’t invested enough to enable Siri to achieve its full potential.

Now Viv hopes to pick up where Siri left off.

The company has been building its technology for several years. But rather than present itself as a next-gen search engine or even a digital assistant, Viv’s positioning is much more focused on the AI angle. The company’s website says it “radically simplifies the world by providing an intelligent interface to everything.”

If this doesn’t sound like Google or a replacement for Google I’m not sure what does.

The Post article says that there have already been acquisition offers from Google and others. It also says that Facebook’s Mark Zuckerberg is an indirect investor. If Viv can deliver anything approaching its lofty ambitions, it will be bought in short order, though Kittlaus and his co-founders might resist, hoping to see what the technology can achieve further along.

Another intriguing angle in the Post story is the way that Viv (and related technologies) might not only displace search but might equally disrupt apps. With a voice-powered virtual assistant that can fulfill transactions (“order a pizza,” “get Uber,” “make a hotel reservation”), apps hypothetically become less necessary, if not unnecessary.

The issue, as with Siri, is deciding who fulfills the request. However, I’m sure Kittlaus and his team have thought carefully about this.

In the original vision for Siri, users would specify a favorite provider (e.g., OpenTable, Kayak, etc.) to handle fulfillment. But because voice is an imperfect interface, and complex transactions cannot be fulfilled by speech and voice prompts alone, it’s likely that apps (and the mobile web) will stick around for the foreseeable future.

According to 2015 research from MindMeld, use of voice search and virtual assistants is growing dramatically. In addition, Amazon Echo (with assistant Alexa) has proven to be the company’s most popular hardware device. And Microsoft just announced that Cortana “has helped answer over 6 billion questions since launch.”

All these developments show significant momentum for voice and virtual assistants. As that continues, powered by AI and better results (including predictive results), major questions will arise for publishers, developers and advertisers. For example, what will happen to SEM and the search ad model? How can publishers and brands optimize content for voice search?

Nothing will change in the near term. Yet the combination of the technology developments mentioned above all but guarantees that search, content retrieval and commerce will look radically different in a few years than they do today.



Google featured snippets now with related topics, extending the information in those snippets


Google is now showing extended featured snippets, adding more information to the top featured snippet box for some queries. Featured snippets often appear in Google search results when Google is confident it can answer your query by extracting content from a specific web page.

Now, it seems, Google is showing extended versions of them with “related topics” that hyperlink to additional queries in Google.

Here are two screenshots, one for [birth control] and the other for [personal loan]. Both queries currently bring up extended featured snippets for me:

[Screenshots: extended featured snippets for [birth control] and [personal loan]]

It is not surprising to see Google place more content and information in the featured snippet box.



Wednesday, May 4, 2016

Updated: Google penalizes mobile sites using sneaky redirects


In October 2015, Google warned webmasters not to trick mobile users by redirecting them to a website they weren’t expecting. Well, today, Google announced on Google+ and Twitter that it has been “taking action on sites that sneakily redirect mobile users to spammy domains.” Google later issued a correction with Search Engine Land, saying it did not issue any new manual actions recently and that the Twitter post was just a reminder to webmasters not to use sneaky redirects.

Google wrote, “As mentioned in Webspam Report 2015, spam reports from users are an important part of our spam-fighting efforts. They often help us surface issues that frustrate users – like the trend of websites redirecting mobile users to other, often spammy domains.” Google added, “to combat this trend, we have been taking action on sites that sneakily redirect users in this way.”

Sneaky redirects are never a good thing, and Google has penalized websites for directing users to a site they do not expect to reach from the search results. The same goes for those using a mobile device and searching Google’s mobile results. Google wants users to land on a site they expect to land on, based on the snippet Google shows.

Here is an illustration of that behavior:

[Illustration: a sneaky mobile redirect]

Google added that “if your site has been affected, read this Help Center article on detecting and getting rid of unwanted sneaky mobile redirects.”
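
If you want to spot-check a page for this behavior yourself, one simple approach is to request it with a desktop and a mobile User-Agent and compare where each request ends up. Below is a minimal Python sketch assuming the requests library; the URL and User-Agent strings are placeholders.

    # Minimal spot-check for user-agent-dependent (sneaky) redirects.
    # The URL and User-Agent strings are illustrative placeholders.
    import requests

    URL = "http://example.com/some-page"
    AGENTS = {
        "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "mobile": ("Mozilla/5.0 (Linux; Android 6.0; Nexus 5X) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/41.0.2272.96 Mobile Safari/537.36"),
    }

    for label, ua in AGENTS.items():
        resp = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=True)
        print("{:7s} -> {}".format(label, resp.url))

    # If the mobile request lands on a different domain than the desktop
    # request, that is exactly the pattern Google describes above.

Note that some sneaky redirects fire only via JavaScript or only for visitors referred from Google, so a clean result here doesn’t rule the problem out.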

We are in the process of getting more details from Google on this announcement.

Postscript: Google has updated us, saying that this is an old notification and no new manual actions have been sent out today.



SearchCap: Google rich snippet spam, AMP report & Bing ads data


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Link Building

Searching

SEO

SEM / Paid Search

Search Marketing



Bing Ads makes it easier to segment performance data like time of day and device 


Performance analysis in Bing Ads just got easier with the addition of a new Segment tab on the Campaigns page.

You can segment your campaign data by time (of day, day of week, and even by month, quarter, or year), network, device, and top vs. other from the main interface rather than having to navigate to the Reports section.

To download segmented campaign data, you’ll need to select the option you want from the Download report window (just like in AdWords).

Note: if you’re looking to see segmentation by time, there are some limits. For example, if you select “Day,” the maximum date range you can look at is 16 days.

The feature is now available to accounts in the US and UK.



Google updates the AMP report in the Google Search Console


Google’s John Mueller announced on Google+ that the Google Search Console report for AMP pages has been updated to improve categorization and “better group similar issues.” It will also now give you more information on the individual problems that Google discovered while crawling your AMP pages.

Here is a screen shot of the updated report:

[Screenshot: the updated AMP report in Google Search Console]

Google first added this report in January 2016 to help publishers debug their AMP errors. This would be the first update to the report since then.

John Mueller added, “if you’ve set up AMP, even if you just installed the AMP plugin for your blog, I’d recommend checking it out!” You can access it by clicking here and selecting a verified site in your Google Search Console profiles.



Jane Jacobs’ Google Doodle marks 100th birthday of famous NYC community organizer


Jane Jacobs Google Doodle
Today’s Google Doodle celebrates urban activist and community organizer Jane Jacobs on what would have been her 100th birthday.

Known for her research and work around urban development, Jacobs believed a city is at its best when its residents can interact with each other on the streets and patronize local businesses.

“She stood by beloved neighborhoods that were unjustly slated for ‘renewal’ and revealed political biases in the permit process for new projects,” reports the Google Doodle blog.

Today’s Doodle honors the 100th birthday of this fierce protector of New York City’s urban landscape.

Referred to as a “self-taught journalist,” Jacobs penned a number of books on urban development, including “The Death and Life of Great American Cities,” “Dark Age Ahead” and “The Economy of Cities.”

Google’s Jane Jacobs Doodle leads to a search for “Jane Jacobs” and includes a sharing icon to post the image on social networks.

Along with a brief overview of Jacobs’ work, the Google Doodle Blog also included the following initial sketches of today’s Jane Jacobs Doodle:
Jane Jacobs doodles



Performing a manual backlink audit, step by step


It might be every SEO’s least favorite job: the backlink audit. This is not because the work itself is horrible (though it can be tedious on sites with large link footprints), but because it’s almost always performed when a domain is in trouble.

Whether you’re reading this article because you’re an SEO looking at new strategies or a site owner who has received a link-based penalty, I hope you find the methodology below helpful.

I should note before proceeding that I prefer robust datasets, and so I’ll be using four link datasets in the example: Google Search Console, Majestic, Ahrefs and Moz.

Though I have paid accounts with all of the tools above (except for the Search Console, which is free), each offers a way to get the data for free — either via a trial account or free data for site owners. There are also other link sources you can use, like Spyfu or SEMrush, but the above four combined tend to capture the lion’s share of your backlink data.

Now, let’s begin …

Pulling data

The first step in the process is to pull the data from the above listed sources. Below, I will outline the process for each platform.

Google Search Console

  1. Once logged in, select the property you want to download the backlinks from.
  2. In the left navigation, under “Search Traffic,” click “Links to Your Site.”
  3. Under the “Who links the most” column, click “More.”
  4. Click the buttons, “Download more sample links” and “Download latest links,” then save each CSV to a folder.
Downloading backlinks from Search Console

Majestic

  1. If you don’t have one, create an account, as you’ll need it to export data. If all you want is access to your own site’s data (which is what we want here), they’ll give you free access to it. You can find more information on that at http://ift.tt/10RfLxL. The rest of these instructions will follow the paid account process, but they are essentially the same.
  2. Enter your domain into the search box.
  3. Click the “Backlinks” tab above the results.
  4. In the options, make sure you have “All” selected for “backlinks per domain” and “Use Historic Index” (rather than “Use Fresh Index”).
  5. Click “Export Data,” and save the file to the folder created earlier.
  6. If you have a lot of data, you will be directed to create an “Advanced Report,” where you then need to create a “Domain Report.”
Exporting backlink data from Majestic.

Ahrefs

  1. If you don’t have an account, you can sign up for a free trial.
  2. Enter your domain into the search box.
  3. Click “Backlinks” at the top of the left navigation.
  4. Select “All links” in the options above the results.
  5. Click “Export,” and save the file to the folder created earlier.
Exporting backlink data from ahrefs.

Moz

  1. If you don’t have an account, sign up for a free trial to get complete data.
  2. From the home page, click “Moz Tools” in the top navigation, then “View all Moz products” in the drop-down.
  3. Click “Open Site Explorer” on the resulting page.
  4. Enter your domain into the URL field.
  5. In the options above the results, select “this root domain” under the target.
  6. Click “Request CSV,” and when it’s available, save it to the same folder as the other files.
Exporting backlink data from Moz's Site Explorer.

Conditioning your data

Next, we need to condition the data by getting all of the backlinks into one list and filtering out the known duplicates. Each spreadsheet you’ve downloaded is a little different. Here’s what you’re looking at:

  • Google Search Console. You’ll have two spreadsheets from the Search Console. Open them both, and copy all the URLs in the first column of each to a new spreadsheet, removing the header rows. Both will go into Column A, one after the other.
  • Majestic. In the Majestic download, you will find the link source URL in Column B. You will want to copy all of these URLs into the same spreadsheet that you’ve copied the Search Console links into. To do this, you will insert the Majestic data directly below the Search Console data in column A.
  • Ahrefs. Ahrefs puts the source URL in column D. Copy all of these URLs to the new spreadsheet, again in Column A, directly below the links you’ve already added.
  • Moz. Moz puts the source URL in Column A. Copy all of these URLs again into Column A of your new spreadsheet directly below the other data you have entered.

Now you should have a list of all the backlinks from all four sources in Column A of a new spreadsheet. You will then select Column A, click on the “Data” tab at the top (assuming you’re working in Excel) and click “Remove Duplicates.” This will remove the links that were duplicated between the various data sources.

The next step is to select all the remaining rows of data and copy them into a Notepad document, then save the document somewhere easily referenced.
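
If you’d rather script the combine-and-dedupe step, it reduces to a few lines of pandas. Here is a sketch following the column layout described above; the filenames are placeholders for your actual exports, and the column positions are the ones noted in the list.

    # Sketch: merge the four exports and dedupe the backlink source URLs.
    # Filenames are placeholders; column positions follow the list above.
    import pandas as pd

    sources = [
        ("gsc_sample_links.csv", 0),  # Search Console: URL in column A
        ("gsc_latest_links.csv", 0),  # Search Console: URL in column A
        ("majestic_export.csv", 1),   # Majestic: source URL in column B
        ("ahrefs_export.csv", 3),     # Ahrefs: source URL in column D
        ("moz_export.csv", 0),        # Moz: source URL in column A
    ]

    urls = []
    for filename, col in sources:
        df = pd.read_csv(filename)
        urls.append(df.iloc[:, col].dropna())

    combined = pd.concat(urls).drop_duplicates()

    # Save as a plain text file, ready for URL Profiler to import.
    combined.to_csv("all_backlinks.txt", index=False, header=False)

The output file serves the same purpose as the Notepad document described above.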

And now the fun part…

You’ve now got a list of all of your inbound backlinks, but that’s not particularly useful. What we want to do next is to gather unified data for them all. That’s where URL Profiler comes in. For this step, you’ll have to download URL Profiler. Like the other tools I’ve mentioned thus far, URL Profiler has a free trial; so if it’s a one-off, you can stick with the trial.

Once downloaded and installed, there’s a bit of a setup process designed to aid you in a speedy analysis. The first thing you’ll need to do is click the “Accounts” menu, which will bring up the windows to enter your API keys from the various tools discussed previously.

Helpfully, each tab gives you a link to the step-by-step instructions on getting your various API keys, so I won’t cover that here. That leaves me to get to the good part…

You will now be presented with a screen that looks like this:

URL Profiler

The first step is to right-click the large, empty URL List box on the right and select “Import From File.” From there, choose the Notepad document you created with the links from your spreadsheet above.

You’ll now see a list of all your backlinks in the box, and you’ll need to select all the data that you want to collect from the boxes on the left. The more data you want, the longer it will take, and the more you’ll have to weed through — so you generally only want to select the data relevant to the task at hand. When I am looking for low-quality links, I tend to select the following:

Domain-level data

  • Majestic [Paid]
  • Moz
  • Ahrefs
  • Social Shares
  • Site Type
  • IP Address

URL-level data

  • Majestic [Paid]
  • Moz
  • Ahrefs
  • HTTP Status
  • Social Shares

In the “Link Analysis” field at the bottom, you will enter your domain. This will leave you with a screen similar to this one:

URL Profiler: Ready to go.

Click “Run Profiler.” At this stage, you can go grab a coffee. Your computer is hard at work on your behalf. If you don’t have a ton of RAM and you have a lot of links to crawl, it can bog things down, so this may require some patience. If you have a lot to do, I recommend running it overnight or on a machine dedicated to the task.

Once it’s completed, you’ll be left with a spreadsheet of your links. This is where combining all of the data from all of the backlink sources and then unifying the information you have on them pays off.

So, let’s move on to the final step…

Performing your backlink audit

Once URL Profiler is done, you can open the spreadsheet with the results. It will look something like this:

URL Profiler output.

Now, the first thing I tend to do is delete all of the tabs except “All.” I love tools that collect data, but I’m not a fan of automated grading systems. I also like to get a visual, even on the items I will later move back into tabs similar to the ones I’m deleting at this stage (more below).

With those extra tabs removed, you are left with a spreadsheet of all your backlinks and unified data. The next step is to remove the columns you don’t want cluttering your spreadsheet. It’s going to be wide enough as-is without extra columns.

While the columns you select to keep will depend on specifically what you’re looking for (and which data you decided to include), I tend to find the following to be globally helpful:

  • URL
  • Server Country
  • IP Address
  • Domains On IP
  • HTTP Status Code (and if you don’t know your codes, HTTP Status)
  • Site Type
  • Link Status
  • Link Score
  • Target URL
  • Anchor Text
  • Link Type
  • Link Location
  • Rel Nofollow
  • Domain Majestic CitationFlow
  • Domain Majestic TrustFlow
  • Domain Mozscape Domain Authority
  • Domain Mozscape Page Authority
  • Domain Mozscape MozRank
  • Domain Mozscape MozTrust
  • Domain Ahrefs Rank
  • URL Majestic CitationFlow
  • URL Majestic TrustFlow
  • URL Mozscape Page Authority
  • URL Mozscape MozRank
  • URL Mozscape MozTrust
  • URL Ahrefs Rank
  • URL Google Plus Ones
  • URL Facebook Likes
  • URL Facebook Shares
  • URL Facebook Comments
  • URL Facebook Total
  • URL LinkedIn Shares
  • URL Pinterest Pins
  • URL Total Shares

And for those who have ever made fun of me that my desk looks like …

Dave Davies' desktop.

… now you know why! While doable on a single monitor, it would require a lot of scrolling. I recommend at least two monitors (and preferably three) if you have a lot of backlinks to go through. But that’s up to you.

Now, back to what we do with all these rows of backlinks.

The first step is to create three new tabs. I name mine: nofollow, nolink and nopage.

  1. Step one: Sort by HTTP status, and remove the rows that don’t produce a 200 code. Essentially, these pages did once exist and don’t anymore. I will occasionally run them through URL Profiler again just to make sure a site is not temporarily down, but for most uses, this isn’t necessary. Move these to the “nopage” sheet in your Excel doc.
  2. Step two: Sort by Link Status. We only need the links that are actually found. The databases (especially Majestic’s historic index) will hold any URL that ever had a link to you. If that link has been removed, you obviously don’t want to have to think about it in the auditing process. Move these to the “nolink” tab.
  3. Step three: Sort by Rel Nofollow. In most cases, you don’t need to spend time on the nofollowed links, so it’s good to get them out of the data you will be going through. Move these to “nofollow.” (If you’d rather script these three passes, see the sketch after this list.)
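
Here is what those three passes look like scripted, if you’ve exported the URL Profiler results to CSV. The column names and values (“HTTP Status Code,” “Link Status,” “Rel Nofollow”) are assumptions about the export format; match them to your actual file.

    # Sketch: split URL Profiler output into nopage / nolink / nofollow tabs.
    # Column names and values are assumptions -- adjust to your export.
    # Writing .xlsx requires the openpyxl package.
    import pandas as pd

    df = pd.read_csv("url_profiler_output.csv")

    nopage = df[df["HTTP Status Code"] != 200]    # step one
    nolink = df[df["Link Status"] != "Found"]     # step two
    nofollow = df[df["Rel Nofollow"] == "Yes"]    # step three

    remaining = df[
        (df["HTTP Status Code"] == 200)
        & (df["Link Status"] == "Found")
        & (df["Rel Nofollow"] != "Yes")
    ]

    with pd.ExcelWriter("backlink_audit.xlsx") as writer:
        remaining.to_excel(writer, sheet_name="review", index=False)
        nopage.to_excel(writer, sheet_name="nopage", index=False)
        nolink.to_excel(writer, sheet_name="nolink", index=False)
        nofollow.to_excel(writer, sheet_name="nofollow", index=False)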

In the site I am using in this example, I started with 10,883 rows of links. After these three steps, I am left with 5,393. I now have less than half the links I initially had to sort through.

Working with the remaining data

What you do with your data now will depend on specifically what you are looking for. I can’t possibly list off all the various use cases here, but following are a few of the common sorting systems I use to speed up the review process and reduce the number of individual pages I need to visit when trying to locate unnatural links:

  • Sort by anchor text first, then URL. This will give you a very solid picture of anchor text overuse. Where you see heavy use of specific anchors or suspicious ones (“payday loans,” anyone?), you know you need to focus in on those and review the links. By grouping by domain secondarily, you won’t accidentally visit 100 links from the same domain just to make the same decision. It will also make run-of-site issues far more apparent. (A scripted version of this anchor-text check follows this list.)
  • Sort by domains on IP first, then URL. This will give you a very quick grasp of whether your backlink profile is part of a low-end link scheme. If you’ve bought cheap links, you might want to start with this one.
  • Sort by site type first, then Majestic, Ahrefs or Moz scores. I’ll leave it up to you which score you trust more, though none should be taken as gospel. These scores are based on algorithms built by some very smart people, but not Google. That said, if you see good scores on all three, you can at least review the site knowing this.
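
That first sort, anchor text overuse, is also easy to quantify before you visit a single link. A quick frequency count such as the sketch below (column names again assumed) surfaces suspicious anchors immediately.

    # Sketch: count anchor-text frequency to surface overused anchors.
    # "Anchor Text" and "URL" are assumed column names -- adjust to your export.
    import pandas as pd

    df = pd.read_csv("url_profiler_output.csv")

    # Mirror the manual sort: anchor text first, then the linking URL.
    df = df.sort_values(["Anchor Text", "URL"])

    # Frequency table: which anchors dominate the profile?
    print(df["Anchor Text"].value_counts().head(25))

    # Many links carrying the same anchor from one domain often indicate
    # run-of-site links you can review (and decide on) as a group.

Anchors that are both frequent and commercial (“payday loans,” exact-match money terms) are the first candidates for manual review.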

As I’m reviewing — and before visiting a link — I tend to scan the various scores, the social shares for the URL and the Link Location. This will tell me a lot about what I’m likely to find and where I’m likely to find it.

Over time, you’ll develop instincts on which links you need to visit and which you don’t. I tend to view perhaps more than I need to, but I often find myself working on link penalty audits, so diligence there is the key.

If you simply want to review your backlinks periodically to make sure nothing problematic is in there, then you’ll likely be able to skip more of the manual review and base more decisions on data.

In conclusion

The key to a good audit of any type is to collect reliable data and place it in a format that is as easy as possible to digest and work with. While there’s a lot of data to deal with in these spreadsheets, any less and you wouldn’t have as full a picture.

Though this process isn’t automated (as you’re now well aware), it dramatically speeds up the process of conducting a backlink audit, reduces the time you need to spend on any specific page judging your links (thanks to the developer for adding the “Link Location” in) and allows for faster bulk decisions.

For example, in simply sorting by URL, I quickly scanned through the list and found directory.askbee.net linking 4,342 times due to some major technical issues with a low-quality directory. Now, we’re down to 1,051 links to contend with.

Again, each need requires different filters, but as you play with different sorting based on what you need to accomplish, you’ll soon discover that a manual backlink audit, while taxing and time-consuming, doesn’t have to be the nightmare it can often seem.

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.

