Competitive Link Building Analysis For Your Industry

Contributor Patrick Stox argues that the death of link building has been greatly exaggerated and shares a process to analyze a given industry's links in advance of performing outreach.



It seems like every day someone in our industry is saying link building is dead or that we should stop link building. Many of the current recommendations for modern SEO involve highly subjective suggestions like “Write quality content” and “Provide a good user experience,” along with messaging such as “Content is king.”

What do those even mean? Who determines “quality” and “good”?

Many future SEOs are being set up for failure. They are being told that SEO is easy and are misled into believing that all you have to do is create great content. Think of all the signals Google has available to rank a website, and tell me why it would choose to use only a few of them in its algorithms.

There is no one strategy or tactic that guarantees success in SEO, and if one did exist, Google would nerf it as it has done in the past. With more than 200 ranking factors, some of which have more than 50 variations within a single factor, it’s all the little things that add up. Each piece of SEO forms part of the puzzle, and it’s wrong to tell people to put the puzzle together with missing pieces.

Before I come off as the ultimate hater of content and writers in general, I want to say that I am a fan of, and have seen many times over the absolute power of, properly targeted content.

Take a look at my post, Planting The Seeds For An Evergreen Content Strategy, for a view on how I target content. I have increased several well-established companies’ organic search traffic and conversions over twenty-fold in the last year.

While I do credit much of the success to content, I’ve been doing this long enough to know that if I had only been writing content for those campaigns, I would have seen a fraction of the results.

Why Are People Afraid Of Links?

The core of Google, that which made it great, was its search algorithm that factored in links. The PageRank algorithm, which relies on links as a quality signal and a measure of the importance of a website, really set Google’s search results apart from competing search engines that relied more heavily on on-page signals or human editing.
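For reference, the commonly cited formulation of PageRank (a simplification, and certainly not the whole of what Google runs today) treats each link as a weighted vote:

PR(u) = (1 - d) / N + d × Σ PR(v) / L(v), summed over every page v that links to u

Here, d is a damping factor (typically 0.85), N is the total number of pages, and L(v) is the number of outbound links on page v. The more important the linking page, and the fewer other links it casts, the more value its vote carries.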

It was almost unanimously agreed that Google had the best search results, but it wasn’t just one thing that made Google a success. Google’s simple design and speed also set it apart, as did its constant drive to improve and provide the best results. The company also managed to avoid the paid placement scandals that plagued some of the other search engines.

Any time one thing worked too well to influence the search results, Google made a change. Keyword stuffing, meta-tag stuffing, invisible text, hidden content, duplicate content, thin content, dynamic content, reciprocal links, link farms, link networks, directories, article directories, press releases, guest posts, footer links, infographics, blogroll/sidebar links, widgets, paid links, blog comments, forum posts, social bookmarking and more were all abused — and then nerfed because they worked too well.

Today, even the mention of submitting to a directory will get you called lots of “fun” names by other SEOs, and guest posting is one of the most controversial topics in SEO. I could make a legitimate use case for half of the tactics listed above, and I don’t think Google would mind my use at all.

Algorithms like Panda and Penguin put the fear of God into SEOs. Google took away many easy wins, where you could just do “one thing” and run a successful SEO campaign, and penalized websites that were being manipulative or spammy.

The Truth About Links

Directory links are not all bad. Local citation websites, which are at the core of local SEO strategy, could be classified as directories. I wouldn’t consider a link from an association’s list of all the licensed dentists in the US to be a bad link, and I don’t believe Google does, either.

Even blog comments and forum posts are fine by me, and I encourage them, as long as they are on topic and industry-relevant and I’m contributing something useful to the conversation.

The problem really comes down to abuse. All of the tactics listed earlier were done at scale and in a ridiculous way. You could submit to tens of thousands of general directories that all had bad practices like reciprocal linking, paid placements, allowing keyword-rich anchor text, thin content and more. You could make hundreds of thousands of blog comments on completely unrelated sites and sign up for tens of thousands of forums, even if they weren’t in your niche or relevant to you at all.

Article directories would take any barely readable content with any links you wanted, and many press release sites would take garbage that wasn’t actually newsworthy or in the format of a press release and allow you to add links to it. Which is why we can’t have nice things.


The Future Of Links

Google viewed links as a quality signal and a sign of trust when it started; it still does, and I’d argue that it will for the foreseeable future. I believe Google now has many negative signals for links as well, which makes perfect sense when trying to determine the best-quality results. But that doesn’t mean targeted links that are niche-specific and relevant are bad just because they are of a certain type.

When I imagine the future, I think Google will still be using links in its algorithm — some will be weighted more heavily, and others will be nerfed further. But I think the biggest factor in determining good links from bad in the future will be relevance. Relevance, not just from one page linking to another, but from one domain linking to another and all of the signals that indicate the two are related and in the same niche.

I personally believe larger sites are given too much weight in their current rankings just because of who they are, and I hope that topical relevance will play more of a role in the future. I hate seeing some of the major news sites rank well in search results when they only have a few mediocre articles about a topic. Just because they have a well-known brand name, they shouldn’t outrank a smaller guy who wrote better content on a domain that only writes about that one niche.

Think about it. Should Forbes be on the first page for [SEO] with an article that’s only 1,300 words, has fewer than 500 social shares, and really only covers local citations? And what about the article from The Atlantic that really just talks about headlines? I only see 71 articles on The Atlantic total that even mention SEO.

I could think of 1,000 sites and articles more deserving of that position, and the fact that sites like these can rank for anything they want just because of their overall strength is a joke. It’s a travesty, and it perpetuates the myth that all you need to do is create great content.

Write Quality Content To Attract Links

I loathe the phrase “write quality content to attract links.” It’s not an invalid observation (though I would add that you also need to do outreach to promote the content), but as a complete SEO strategy it leaves a lot to be desired.

There are many types of links that you will never attract through great content alone. Do you believe you’ll get a local citation, or a link from an industry association’s members directory, because of your great content?

With SEO, you have to find the gaps between you and your competitors, whether it’s on-site or off-site. If you’re missing out on many of the links in your industry because you expect them to magically happen, your competitors who have these links have a competitive advantage over you.

Even Moz, in its latest Search Engine Ranking Factors Study, found that, of all the ranking factors, links still show the highest correlation with rankings.

Content may be king, but links still reign supreme.

Analyzing What Links Make A Competitor Rank

Before I go further, I do want to clarify that I don’t believe all links are good links, and I wouldn’t even want links from some of the sites that link to the top guys in a niche. I think my fellow author and North Carolinian Julie Joyce covered some of the pitfalls in her article, Why I Never Mine Competitors’ Backlinks, but she also made many valid points on why you should analyze your competitors’ links.

I think my methodology, which is shared below, takes a similar approach at a greater scale and tends to weed out many of the bad sites. Most people will just look at a few competitors’ backlinks in order to copy their links; I’m trying to look at the links across an entire industry to understand why Google ranks these websites the way it does, and to figure out what I’m missing.

From my own experience, the links that move the needle the most in a campaign are those that are relevant in the industry and are shared by the top competitors in an industry. What I look for are sites that are in the same link neighborhoods, where links intersect between top competitors. All else being equal, if links were the determining factor I’d want to make sure I was in good company and that my link profile had many similarities to the top sites in my niche.

For example, for my own SEO business, I’m going to look at searches for [SEO + cityname] in the top 50 US markets and pull the links of any company in the top 10 search results. I’m going to combine their linking domains using Excel to obtain a list of the most common links that these top companies have, which are likely the driving force behind their rankings.

I’ll describe this in a way that anyone with access to one of the major link indexes, such as OSE, Majestic, Ahrefs or SEMrush (I have no affiliation with any of these tools), can follow along. Plan for a few minutes or a few hours, depending on the scale of research you want. All of these tools have different indexes and only provide a fraction of the total picture, but each is indicative of the overall landscape.

If you want to be seriously thorough, you could pull the links from more than one of these tools, or from all of them. With a SERP scraper and API access to one or more of these tools, you could do this research much more quickly, but that would also involve a higher monetary investment.

A few quick notes. This approach isn’t limited to local businesses; you can compare your own site to all the sites in your niche nationally or internationally as well. The method is also highly flexible: you could look only at your local competitors, go deeper in the search results, cover more cities, cover more keywords in each area, or even pull links from companies in different niches in your city to find city-specific links that may add local relevancy.

How deep you want to go down the rabbit hole of research is up to you, but what we will find are link intersects, or websites that link to multiple competitors, to other sites in our niche, or to sites in local markets. You can also use the data about each individual site to figure out their link-building strategy, and you can even sometimes intuit their general business strategy.

You’ll find different types of links from associations, magazines, trade shows, tools used by your niche, conferences, forums, blogs, niche directories and many more. If you find some really outstanding sites during the research phase, it may be worth checking into what type of content or resources they are using to attract links.

How To Perform The Analysis

I’m going to use the Ahrefs toolbar to make this easier. Basically, just Google a city plus your key phrase. With the Ahrefs toolbar active, you can either open each result and click RD, as shown below, or click the RD number right in the search results to get to the page where we need to be.
[Screenshot: Ahrefs toolbar showing linking root domains]

On the landing page, you’ll want to Export these linking Root Domains as shown below.

[Screenshot: Ahrefs export of competitor linking root domains]

Once you have all of the .csv files exported and saved, you’ll want to open Command Prompt.

On a Windows machine, you can do so by clicking “Start” and typing “cmd” in the search box, or by going to “Start” > “All Programs” > “Accessories” > “Command Prompt.” You’ll then need to navigate in Command Prompt to the folder where you saved your exported .csv files and type:

copy *.csv filename.csv

To do the same thing on an OSX machine, you would open the Terminal application and navigate to the folder where you saved your exported .csv files. Then you’d type:

cat *.csv >filename.csv
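For example, if the exports were saved in a hypothetical folder named link-exports under Downloads, the full sequence on Windows would be:

cd C:\Users\you\Downloads\link-exports
copy *.csv filename.csv

and on OSX:

cd ~/Downloads/link-exports
cat *.csv >filename.csv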

The copy *.csv and cat *.csv commands both tell the machine to merge every .csv file in the current folder into a single file, which you can call filename.csv, as in the examples above, or whatever name you like. This simple technique will let you merge hundreds, even thousands, of .csv files in seconds.
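One caveat: merging this way also stacks each file’s header row into the combined file (step 3 below filters those out in Excel). If you’d rather strip the headers at merge time, here’s a minimal alternative using awk, which ships with OSX and is available on Windows through Git Bash or Cygwin; the pattern FNR > 1 skips the first line of every input file:

awk 'FNR > 1' *.csv > merged-no-headers.txt

The output gets a .txt extension on purpose, so re-running the command won’t sweep the merged file up with the *.csv wildcard.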

If you check in the folder where you downloaded the .csv files, you should find this new file you created.

  1. Open the new file you created in Excel and delete the extra columns you won’t need, such as “Backlinks Count,” “Referring Pages,” “First Seen” and “Last Check.”
  2. Go to “Insert” > “Table,” and make sure “My table has headers” is checked.
  3. Under the filter drop-down for the “Domain Rating” column, uncheck the literal value “Domain Rating” so that the repeated header rows from the merged files are filtered out.
  4. To get the count of appearances, enter the formula =COUNTIF(B:B,B2) in cell C2, assuming your root domains are now in Column B. Give this time to run, as it can take a while, and interrupting it will stop the count early.
  5. Once this has calculated, copy the whole of “Column C” and paste it right over itself using “Paste Special” > “Values,” so that what was once a formula is now a plain number.
  6. In “Column B” (Root Domains), use the “Remove Duplicates” tool under the “Data” tab to leave only one listing of each root domain.
  7. Finally, use the “Sort” tool under the “Data” tab and sort by “Column C” (the count), “Largest to Smallest.” This leaves us with the websites that link to sites in our industry the most, prioritized by how many of the top websites have a link from that root domain.
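If you’re more at home on the command line than in Excel, a rough equivalent of steps 4 through 7 is a single pipeline. This is only a sketch: it assumes you created merged-no-headers.txt as above, that the root domain sits in the second comma-separated field of your export (adjust -f2 if yours differs), and that no fields contain embedded commas:

cut -d',' -f2 merged-no-headers.txt | sort | uniq -c | sort -rn | head -50

This prints the 50 root domains that appear in the most competitor link profiles, with a count next to each.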

If you have the .csv export of your target company, you can do a quick gap analysis with a VLOOKUP function to determine which domains already link to you and which ones you are still lacking.
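As a sketch, assuming you’ve pasted your own site’s referring domains into a sheet named MyDomains (a hypothetical name, with the domains in column A), a formula like this in a new column will flag the gaps:

=IF(ISNA(VLOOKUP(B2,MyDomains!A:A,1,FALSE)),"Gap","Already linking")

VLOOKUP with FALSE forces an exact match and returns #N/A when a domain isn’t found in your own profile; ISNA turns that miss into a “Gap” label you can filter on.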

Here are the top 50 referring websites for SEOs in the top 50 cities. The entire list is linked below. There were 445 .csv exports combined to make this list, and while a lot of the sites are ones I expected, there are quite a few pure spam sites in this list, too.

Domain Rating Root Domain Intersect
94 plus.google.com 148
76 moz.com 137
60 prlog.ru 118
77 prweb.com 98
71 searchenginejournal.com 97
46 madaali.de 92
79 bizjournals.com 85
73 business2community.com 78
40 the-globe.com 74
39 theglobe.net 74
36 theglobe.org 74
54 lipperhey.com 73
53 mylocally.com 73
67 swkong.com 69
50 intently.co 69
40 schreifels.github.io 69
50 seo-ranking.com 68
35 wesipa.com 65
61 upcity.com 64
38 seo-company-in.com 64
64 topseos.com 62
62 inbound.org 59
76 dmoz.org 58
72 scoop.it 57
72 dir.yahoo.com 57
60 ezlocal.com 57
40 zhyaoke.cn 56
67 folkd.com 55
46 hostgeni.com 55
1 allbuilt.com.br 54
74 diigo.com 51
64 seobythesea.com 51
79 hubspot.com 46
53 dig.do 46
62 mirabilis.com 45
58 joeant.com 45
72 digitaljournal.com 44
60 freewebsitedirectory.com 44
50 viettrendss.appspot.com 44
79 meetup.com 43
65 botw.org 43
52 getfreelisting.com 43
43 5brh.com 43
73 visual.ly 42
65 clicksor.com 42
69 redorbit.com 41
67 kiwibox.com 41
55 expressbusinessdirectory.com 41
68 quicksprout.com 40
67 blogarama.com 40

You can find the entire list of SEO Company Links here in a Google Sheet.

The challenge will be figuring out how to get links on certain websites and, of course, determining which websites you don’t want links from. With all the information available from your competitors, how could you willingly choose not to use it?


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

Patrick Stox
Contributor
Patrick Stox is a Product Advisor, Technical SEO, & Brand Ambassador at Ahrefs. He was the lead author for the SEO chapter of the 2021 Web Almanac and is a reviewer for the 2022 SEO chapter. He’s an organizer for the Raleigh SEO Meetup, the Raleigh SEO Conference, and the Beer & SEO Meetup. He also runs a Technical SEO Slack group and is a moderator for /r/TechSEO on Reddit.
