Penguin 4: what we know and how to react

Since the release of the latest version of Google Penguin last week, the industry has been awash with speculation and opinion on what it means for how we handle link-related issues. In this post I hope to set out what we know, what that means for you, and what we have yet to learn.

As we are all busy people, there is a one-sentence summary for each point so you can absorb the key messages and come back to read the rest at your leisure. I strongly suggest you read the whole piece when you can: this is an important change from Google, and it's vital that we all understand how it will impact what we do.

What has Google actually confirmed about this latest Penguin release?

Major points

  1. Penguin used to operate at site level but is now much more granular, operating at a page or link level
  2. Penguin now runs when a page is crawled rather than as a separate routine
  3. Penguin no longer demotes pages; it removes the value from links that are considered spammy

In previous versions, Google ran the Penguin routine periodically and sites were swept into or out of its influence. When a site was caught, the whole site would typically be suppressed in the search results.

This latest version takes that external routine and moves it into the core search algorithms. That means that when a page is crawled Penguin now makes an assessment and applies its results directly at that point.

Google have also now confirmed that the way they apply Penguin has changed. Formerly, Penguin set a flag against a domain as a whole, and the site would fall in rankings until the issues were corrected and Google ran the routine again (often many months between runs). Even when the issues had been corrected, sites would often struggle to regain their previous rankings after Penguin was run again. Now, though, Google say that when they crawl a page they make their Penguin assessment, and if they see links that violate their webmaster guidelines, the value that those links would pass is removed. They devalue the link rather than demote the site as a whole.

The intention is that, as links are such a core part of the search algorithm, removing that link equity should suppress that page's ability to rank.

Google has also indicated that the need to use their Disavow Tool to help correct Penguin issues has now reduced but that their advice on its use in general has not changed.

The original Penguin announcement post in 2012 (interestingly, linked to in the new announcement post too) gave us some information on what Penguin would impact. That original post indicated that the reach of Penguin covered not just link violations (link schemes) but also other manipulative tactics like keyword stuffing. So far there is no evidence that the new Penguin 4 release targets anything other than link violations. That may be because Google haven't confirmed it, or it may be that tactics like keyword stuffing are now handled well by other parts of the algorithm (it is 2016, after all).

So does that mean that the disavow is now no longer needed?

Major points

  1. Google’s advice is still to create and submit a disavow file when you become aware of links that you feel violate their guidelines.
  2. Google confirm that manual actions for link-related problems still exist; Penguin may make it easier for them to detect and punish these issues.
  3. Even if Google are devaluing links they consider problematic, they will still be issuing manual actions for people whose link profiles contain a lot of violations.

The disavow tool was one of the ways in which Google suggested webmasters deal with violations of their guidelines in relation to links. You were advised to audit the links you had and then create a file containing the links or domains you no longer wanted to contribute to your rankings, and Google would then 'disavow' those links.

The disavow tool has long been the main method for webmasters to correct link problems; actual removal of bad links has proven hard for many, and the disavow provided a convenient way to turn off that risk.

Now that Google are devaluing links they consider spammy before those links can impact a site, the need to disavow for Penguin-related issues has diminished. That does NOT mean the disavow is not needed: it seems likely that if Google see you persistently violating their link guidelines and you haven't submitted a disavow to correct it, manual actions will still be issued.


The disavow tool remains your insurance against Google action, and the process of producing the file still requires a full audit of your link profile. That audit gives huge insight into what issues you might have and what you should do to correct any potential threats to your ranking ability.
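For anyone who hasn't built one for a while, the disavow file itself is just a plain text file uploaded through Search Console: one URL per line, a domain: prefix to disavow an entire domain, and lines starting with # treated as comments. The snippet below is purely illustrative and the domains are made up:

```text
# Illustrative disavow file (the domains below are invented examples)
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
domain:paid-links.example.net
# Disavow a single specific URL:
http://low-quality-blog.example.org/widget-links-page.html
```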

But if Google are devaluing links rather than demoting sites now, why aren't we seeing movement and recoveries?

Major points

  1. In previous versions of Penguin recovery was often slow or non-existent.
  2. Webmasters have not reported big recoveries from this version so far.
  3. Google seem to retain the memory of the original intent of the link placed; even after correction, the value doesn't seem to come back in many cases.

Now that Google are removing the equity that spammy links could pass before those links can impact the target page's ability to rank, you would expect rankings to improve once issues are corrected. Sadly we haven't seen much of that so far, and that could be for a number of reasons.

The impact of Penguin 4 is only applied once the page is recrawled. Google have not confirmed whether Penguin applies when the page linked to is crawled or when the link itself is crawled.

This is an important point to consider:

If we have to wait until the page is recrawled, then the visibility impact of any problematic links would arrive in line with when each page is next cached by Google. If that were the case, we would see incremental movements as pages were crawled, typically over days or weeks for most sites.

If we have to wait until the link pointing at the page is crawled again, then the impact of Penguin 4 will take months or even years to be fully felt. Some of the links targeted by Penguin will only be crawled by Google rarely, as they have already been determined to be low value.

I believe it's likely that Penguin is acting at link level, and therefore we will see a very slow slide for pages that rely on problematic links over an extended period as those links are recached by Google.

This will make it harder to diagnose as a specific problem and therefore heightens the need to gain complete visibility over the issue now rather than wait.

Can we rely on Google to do the discounting of links?

Major points

  1. In short, no. Google have been shown to be quite inconsistent in how they diagnose problematic links.
  2. If you rely on Google alone you will find it hard to diagnose Penguin as the reason for any slow trending loss in visibility.

Having looked at more manual action reconsideration requests than probably anyone, it is clear to me that Google aren't as good at understanding problematic links as we might hope. When a site is given a manual action (you don't get notifications for Penguin, but for link issues the problem is the same), you are invited to submit a reconsideration request once the problem has been corrected. If that request is not successful, Google will often include some example links they consider problematic as a guide to the webmaster on what areas have not yet been resolved.

This data has been our only real insight into what Google actually thinks about problematic links, and it's from that data we could see that their opinion on what counts as manipulation was patchy. We would often see links that were perfectly natural flagged as part of the problem. Some examples:

  • Links from forum posts where the link was in context and in a post which was on topic and genuinely helpful.
  • Links from resource pages on sites, for example local authorities linking from their tourism section to sites that provided information or services that would help tourists.
  • Links from sites where the link was clearly editorially given, in context and without any evidence (or likely expectation) that the link was provided in return for some incentive.

If the hit-and-miss nature of these example links is also what Penguin operates on, then it is likely that webmasters will still need to understand what they have in their link profile and take steps to insure themselves against the risks posed by the worst links found.

Link quality is now central to the ability of sites to rank

Major points

  1. Link quality is now a central part of the Google search algorithm
  2. Link quality is now a massively important part of the SEO’s work

For many years we suspected that Google had the ability to prevent sites from passing link equity, or even to turn off the value at the level of an individual link, and that they did so. It was believed that some of that was automated and some was manual exclusion by search quality engineers. Penguin 4 means we now have confirmation that equity is removed for links that violate the Google webmaster guidelines, and as such we are left with some unknowns:

  • How many of the links we have pointing at our sites will actually pass value?
  • What risk do the links we have pose to our visibility?

It is likely that many sites now have some links that Google has stripped the value from (via Penguin or other methods). It is also likely that, even if Google is stripping the value those links can pass, any accumulation of bad links will still put the site at risk of a manual action.

Even if you don't consider manual action to be a significant threat to your site's visibility, the fact that Penguin is removing equity from the links you have can have a similar impact on your visibility.

It may be that the impact takes a long time to show itself, but we now certainly have another reason why certain sites or pages suffer long-term downward or static trends in their visibility.

Knowing the quality of the links you have is now more important than it's ever been. Without that visibility, how can you understand the risk, or the reason for visibility changes if they are related to links?

People are saying that this represents a return to the days of volume links and hoping that some count. It's not.

Pre-2012 and the first Penguin, some people (me included, to some extent) would see links as a volume game. Some counted and some didn't, so you gathered as many as you could and hoped that enough would count to mean you came out on top.

There have been a number of people suggesting that this latest Penguin update takes us back to that world.

It does not

Google will remove the ability of the bad links to pass value, so no benefit to your rankings will be gained. Google will also continue to use the accumulation of bad links as a signal to apply manual actions to your page or domain.

The rules haven't changed, and the reality hasn't changed. All that has changed is that Penguin now acts in real time rather than periodically.

It’s as hard to spam Google now as it’s ever been, if not harder.

I've also seen people saying that negative SEO is now easier. Is it?

In short, not really, but we will have to wait and see.

Negative SEO assumes the ability to negatively impact a site's rankings through external factors.

For many years Google insisted that you couldn't impact another site's visibility by simply throwing bad links at it (or redirecting bad sites into it). They have since modified that advice to suggest there is 'almost' nothing you can do to impact a site's visibility through such tactics.

In reality I see a viable negative SEO attempt probably once a year. I see a lot of attempts, but I don't see them achieving what the attacker intends.

The theory goes, you create a raft of really bad links pointed at a site. Google spots these bad links and penalises the site for the infringement.

An additional theory is that you find sites already penalised by Google and you redirect them to a site you want to knock out of the SERPs.

In practice, gaining links that Google will view as a genuine intent to manipulate their results is expensive and unreliable (See earlier points about Google being patchy on what they seem to consider problematic).

It also seems very likely that the penalty will not follow a 301 redirect (the link value may simply be removed for penalised domains), so no impact would be observed. I think that view is reinforced by the observation that there isn't a thriving black market for penalised domains to use for this.

I therefore think that Penguin 4 is unlikely to help the naughty webmaster perform negative SEO attacks against rival sites.

What you should take from this latest update

There is much still to test and consider from this latest update, but there are some clear suggestions for every webmaster:

  • Whilst the disavow is now not required as much for Penguin-related issues, it is still your best insurance against all link-related risks and issues.
  • Knowing what possible exposure you have to bad links is even more relevant now than it was before Penguin 4; it's this visibility and understanding that dictates a lot of what we plan as SEOs.
  • The impact of Penguin 4 is likely to be felt slowly over an extended period of months, so it now has to be one of the considered reasons for a steady reduction in visibility for any site.
  • Google still has all the processes in place to prevent and punish the accumulation of bad links and so it is vitally important that all site owners continue to identify and manage their exposure to those risks.
  • Anyone looking after the organic visibility of a brand or an agency client needs to make sure that they have in place processes and plans to manage their risk and to protect their site from factors which could impact visibility.

This is a post from Paul Madden, Co-Founder of Kerboo. Paul has been making a living from links and the management of links for over a decade. As a co-founder of Kerboo (a link analysis and data platform), Paul has extensive access to data on Google's most important signal: links.

Post from State of Digital


Optimizing for RankBrain… Should We Do It? (Is It Even Possible?) – Whiteboard Friday

Posted by randfish

If you’ve been stressing over how to optimize your SEO for RankBrain, there’s good news: you can’t. Not in the traditional sense of the word, at least. Unlike the classic algorithms we’re used to, RankBrain is a query interpretation model. It’s a horse of a different color, and as such, it requires a different way of thinking than we’ve had to use in the past. In today’s Whiteboard Friday, Rand tackles the question of what RankBrain actually is and whether SEOs should (or can) optimize for it.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about RankBrain SEO and RankBrain in general. So Google released this algorithm or component of their algorithm a while ago, but there have been questions for a long time about: Can people actually do RankBrain SEO? Is that even a thing? Is it possible to optimize specifically for this RankBrain algorithm?

I’ll talk today a little bit about how RankBrain works just so we have a broad overview and we’re all on the same page about it. Google has continued to release more and more information through interviews and comments about what the system does. There are some things that potentially shift in our SEO strategies and tactics around it, but I’ll show why optimizing for RankBrain is probably the wrong way to frame it.

What does RankBrain actually do?

So what is it that RankBrain actually does? A query comes in to Google. Historically, classically Google would use an algorithm, probably the same algorithm, at least they’ve said sort of the same algorithm across the board historically to figure out which pages and sites to show. There are a bunch of different ranking inputs, which we’ve talked about many times here on Whiteboard Friday.

But if you search for this query today, what Google is saying is with RankBrain, they’re going to take any query that comes in and RankBrain is essentially going to be a query interpretation model. It’s going to look at the words in that query. It’s potentially going to look at things possibly like location or personalization or other things. We’re not entirely sure whether RankBrain uses those, but it certainly could. It interprets these queries, and then it’s going to try and determine the intent behind the query and make the ranking signals that are applied to the results appropriate to that actual query.

So here’s what that means. If you search today — I did this search on my mobile device, I did it on my desktop device — for “best Netflix shows” or “best shows on Netflix” or “What are good Netflix shows,” “good Netflix shows,” “what to watch on Netflix,” notice a pattern here? All five of these searches are essentially asking for the very same thing. We might quibble and say “what to watch on Netflix” could be more movie-centric than shows, which could be more TV or episodic series-centric. That’s okay. But these five are essentially, ” What should I watch on Netflix?”

Now, RankBrain is going to help Google understand that each of these queries, despite the fact that they use slightly different words and phrasing or completely different words, with the exception of Netflix, that they should all be answered by the same content or same kinds of content. That’s the part where Google, where RankBrain is determining the searcher intent. Then, Google is going to use RankBrain to basically say, “Now, what signals are right for me, Google, to enhance or to push down for these particular queries?”

Signals

So we’re going to be super simplistic, hyper-simplistic and imagine that Google has this realm of just a few signals, and for this particular query or set of queries, any of these, that…

  • Keyword matching is not that important. So minus that, not super important here.
  • Link diversity, neither here nor there.
  • Anchor text, it doesn’t matter too much, neither here nor there.
  • Freshness, very, very important.

Why is freshness so important? Well, because Google has seen patterns before, and if you show shows from Netflix that were on the service a year ago, two years ago, three years ago, you are no longer relevant. It doesn’t matter if you have lots of good links, lots of diversity, lots of anchor text, lots of great keyword matching. If you are not fresh, you are not showing searchers what they want, and therefore Google doesn’t want to display you. In fact, the number one result for all of these was published, I think, six or seven days ago, as of the filming of this Whiteboard Friday. Not particularly surprising, right? Freshness is super important for this query.

  • Domain authority, that is somewhat important. Google doesn’t want to get too spammed by low-quality domains even if they are publishing fresh content.
  • Engagement, very, very important signal here. That indicates to Google whether searchers are being satisfied by these particular results.

This is a high-engagement query too. So on low-engagement queries, where people are looking for a very simple, quick answer, you expect engagement not to be that big. But for something in-depth, like “What should I watch on Netflix,” you expect people are going to go, they’re going to engage with that content significantly. Maybe they’re going to watch a trailer or some videos. Maybe they’re going to browse through a list of 50 things. High engagement, hopefully.

  • Related topics, Google is definitely looking for the right words and phrases.

If you, for example, are talking about the best shows on Netflix and everyone is talking about how hot — I haven’t actually seen it — “Stranger Things” is, which is a TV program on Netflix that is very much in the public eye right now, well, if you don’t have that on your best show list, Google probably does not want to display you. So that’s an important related topic or a concept or a word vector, whatever it is.

  • Content depth, that’s also important here. Google expects a long list, a fairly substantive page of content, not just a short, “Here are 10 items,” and no details about them.

As a result of interpreting the query, using these signals in these proportions, these five were basically the top five or six for every single one of those queries. So Google is essentially saying, “Hey, it doesn’t matter if you have perfect keyword targeting and tons of link diversity and anchor text. The signals that are more important here are these ones, and we can interpret that all of these queries essentially have the same intent behind them. Therefore, this is who we’re going to rank.”

So, in essence, RankBrain is helping Google determine what signals to use in the algorithm or how to weight those signals, because there’s a ton of signals that they can choose from. RankBrain is helping them weight them, and they’re helping them interpret the query and the searcher intent.
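To make that weighting idea concrete, here is a purely hypothetical sketch, not from the transcript and certainly not Google's implementation; the intents, signal names, and weights are all invented to illustrate how "different signal weights for different query intents" could work:

```python
# Purely illustrative: re-weighting ranking signals by inferred query intent.
# None of these intents, signals, or weights come from Google.

SIGNAL_WEIGHTS_BY_INTENT = {
    "fresh_list": {"keyword_match": 0.05, "link_diversity": 0.05, "anchor_text": 0.05,
                   "freshness": 0.40, "domain_authority": 0.15, "engagement": 0.20,
                   "content_depth": 0.10},
    "transactional": {"keyword_match": 0.30, "link_diversity": 0.25, "anchor_text": 0.20,
                      "freshness": 0.05, "domain_authority": 0.10, "engagement": 0.05,
                      "content_depth": 0.05},
}

def interpret_query(query: str) -> str:
    """Toy 'interpretation model': map many phrasings to one intent."""
    netflix_phrasings = {"best netflix shows", "best shows on netflix",
                         "good netflix shows", "what to watch on netflix"}
    return "fresh_list" if query.lower().strip() in netflix_phrasings else "transactional"

def score_page(page_signals: dict, query: str) -> float:
    weights = SIGNAL_WEIGHTS_BY_INTENT[interpret_query(query)]
    return sum(weights[signal] * page_signals.get(signal, 0.0) for signal in weights)

# A fresh, in-depth list outranks a stale, link-heavy page for this intent.
fresh_page = {"freshness": 0.9, "engagement": 0.8, "content_depth": 0.7, "keyword_match": 0.4}
stale_page = {"keyword_match": 0.9, "link_diversity": 0.9, "anchor_text": 0.9, "freshness": 0.1}
print(score_page(fresh_page, "best shows on Netflix") > score_page(stale_page, "best shows on Netflix"))  # True
```

The point of the toy example is simply that the same page scores differently depending on which intent the query is mapped to, which is why you can't "optimize for RankBrain" directly, only for the signals that matter most for your queries.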

How should SEOs respond?

Does that actually change how we do SEO? A little bit. A little bit. What it doesn’t do, though, is it does not say there is a specific way to do SEO for RankBrain itself. Because RankBrain is, yes, helping Google select signals and prioritize them, you can’t actually optimize for RankBrain itself. You can optimize for these signals, and you might say, “Hey, I know that, in my world, these signals are much more important than these signals,” or the reverse. For a lot of commercial, old-school queries, keyword matching and link diversity and anchor text are still very, very important. I’m not discounting those. What I’m saying is you can’t do SEO for RankBrain specifically or not in the classic way that we’ve been trained to do SEO for a particular algorithm. This is kind of different.

That said, there are some ways SEOs should respond.

  1. If you have not already killed the concept, the idea of one keyword, one page, you should kill it now. In fact, you should have killed it a long time ago, because Hummingbird really put this to bed way back in the day. But if you’re still doing that, RankBrain does that even more. It’s even more saying, “Hey, you know what? Condense all of these. For all of these queries you should not have one URL and another URL and another URL and another URL. You should have one page targeting all of them, targeting all the intents that are like this.” When you do your keyword research and your big matrix of keyword-to-content mapping, that’s how you should be optimizing there.
  2. It's no longer the case, as it was probably five, six years ago, that one set of fixed inputs governs every single query. Because of this weighting system, some queries are going to demand signals in different proportion to other ones. Sometimes you're going to need fresh content. Sometimes you need very in-depth content. Sometimes you need high engagement. Sometimes you don't. Sometimes you will need tons of links with anchor text. Sometimes you will not. Sometimes you need high authority to rank for something. Sometimes you don't. So that's a different model.
  3. The reputation that you get as a website, a domain earns a reputation around particular types of signals. That could be because you’re publishing lots of fresh content or because you get lots of diverse links or because you have very high engagement or you have very low engagement in terms of you answer things very quickly, but you have a lot of diverse information and topics on that, like a Dictionary.com or an Answers.com, somebody like that where it’s quick, drive-by visits, you answer the searcher’s query and then they’re gone. That’s a fine model. But you need to match your SEO focus, your brand of the type of SEO and the type of signals that you hit to the queries that you care about most. You should be establishing that over time and building that out.

So RankBrain, yes, it might shift a little bit of our strategic focus, but no, it’s not a classic algorithm that we do SEO against, like a Panda or a Penguin. How do I optimize to avoid Panda hitting me? How do I optimize to avoid Penguin hitting me? How do I optimize for Hummingbird so that my keywords match the query intent? Those are very different from RankBrain, which has this interpretation model.

So, with that, I look forward to hearing about your experiences with RankBrain. I look forward to hearing about what you might be changing since RankBrain came out a couple of years ago, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Let’s drop the agency nonsense

Let me start off by saying that I’m definitely an agency girl. I’ve worked on both sides, and through this I know that I’m much more suited to working on multiple clients in a creative environment. I’m not one for the often more corporate approach of working where there are discussions around the water cooler, and there are physical locks holding all computer equipment to the desk.

I’m exaggerating here a little bit, but you know what I mean. Some people are suited to this environment, but my personality just doesn’t fit. Then again, agencies aren’t all sweetness and light either, and can be pretty tough places at times.

Many of the places I’ve worked in have seen fairly inexperienced people getting positions they probably wouldn’t get if the business was in another industry. Some teams have been made up of people mostly fresh out of university or internships, bringing down the average number of years of experience.

Hiring young, passionate people is no bad thing though; they can bring different views and skills to the table. Yet if there's a change to the team structure, or more senior people leave the business, it's not unusual for juniors to be promoted into manager or senior executive roles. Without proper training for the new role, this can cause issues for the wider team.

Whilst things won’t get better overnight, there must be more effort made to improve training, management and processes. Otherwise the sad reality is poor staff retention and a less than satisfactory working environment. After all, agencies are supposed to be vibrant places full of wonderful ideas to inspire staff to work their hardest for their clients.

Nobody is perfect, but really – let’s cut some of the agency bullsh!t. I’ve asked around and have identified some of the biggest culprits around in agencies today, so see if you can identify with any of them throughout your agency career. Oh, and make sure you push for training where you see it lacking, particularly for managers!

Micromanagement

My ultimate foe is a colleague or manager who is determined to micromanage at all costs. Look at the signs that you’re a micromanager in this article from the Harvard Business Review:

  • You’re never quite satisfied with deliverables.
  • You often feel frustrated because you would’ve gone about the task differently.
  • You laser in on the details and take great pride and/or pain in making corrections.
  • You constantly want to know where all your team members are and what they’re working on.
  • You ask for frequent updates on where things stand.
  • You prefer to be cc’d on emails.

There’s a good chance that you’ve experienced this in the past, or maybe you put up with this now. Or you could even be the micromanager. If so, listen to the article. “The bottom line is: you need to stop. It’s harming your team’s morale and – ultimately – their productivity.”

Making promises that can’t be kept

In some agencies, it is common practice to promise clients the world without really having the resources to do so. In most cases, this is to do with deadlines. It will be something along the lines of “Sure, we can get that (huge) technical audit (that we haven’t started) to you by mid-week”. Then all staff become overworked and irritated, as planning should be better than this.

Does the work really need to be done for mid-week? Would it be better quality if there were a couple more days spent on it? There are always cases where work is agreed upon at the last minute, but many other times when a bit of organisation and project management could’ve worked a treat.

Glory hunters

Picture the scene: four people work together on an audit. Sweat, tears and hours of investigation go into it. Then the document goes across to the client with one name on it, a middle finger to the other three. This has definitely happened to me in the past. It also happened to someone who wrote in to Business Insider with this problem; read what happened and the advice for what to do if you're ever in this position.

I’m a big fan of shouting about the great work done by all members of the team, both junior and senior, so it makes me sad when this happens. Building confidence (not cockiness) is important to actually develop the managers of tomorrow, and this can start with giving people the credit they deserve.

Awkward communication

The introduction of platforms such as Slack in offices has made it easier than ever for teams to stay connected when they're on the move or working externally. It's also made it easier for tone to be interpreted wrongly; nothing beats a face-to-face chat.

Then again, the skill of holding an effective meeting or discussion doesn't come easily either; many people come away not fully understanding the actions, objectives and next steps. It can also be awkward when you're not involved in a private chat, but those who are decided that a good place to hold it was the corridor or the office kitchen.


(If you like to fill your days with catch-ups and chats, pick your time and place properly please.)

Offering perks over happiness

It takes a lot to get an agency operating as one; with so many different personalities and approaches, things are bound to get heated from time to time. I do believe that taking a look at some of the fundamentals above could help businesses on their way though. I mean, how many times have you heard about someone leaving their job just because they had a conflict with their manager?

According to a new story from Employee Benefits, 57% of millennials cite reward for effort as a priority at work. Reward comes in many forms, but it isn't simply a case of making perks available. In my mind, something like a Perkbox isn't a substitute for proper feedback and progression plans for the team.

What do you think?

Whether you are agency or client side, there will always be little niggles when you’re working with your wider team, but it’d be interesting to know the areas where you draw the line.

How can agencies actually go about making improvements in this area? Do you think we need to move away from the trend of promoting people too early to try and avoid some of the common agency pitfalls? Let me know in the comments or on Twitter.

Post from Hannah Butcher


How to Build Backlinks Using Your Competitors’ Broken Pages

Posted by TomCaulton

We all know building backlinks is one of the most important aspects of any successful SEO and digital marketing campaign. However, I believe there is an untapped resource out there for link building: finding your competitors’ broken pages that have been linked to by external sources.

Allow me to elaborate.

Finding the perfect backlink often takes hours, and it can take days, weeks, or even longer to acquire. That's where the link building method I've outlined below comes in. I use it on a regular basis to build relevant backlinks from competitors' 404 pages.

Please note: In this post, I will be using Search Engine Land as an example to make my points.

Ready to dive in? Great, because I’m going to walk you through the entire link building process now.

First, you need to find your competitor(s). This is as easy as searching for the keyword you’re targeting on Google and selecting websites that are above you in the SERPs. Once you have a list of competitors, create a spreadsheet to put all of your competitors on, including their position in the rankings and the date you listed them.

Next, download Screaming Frog SEO Spider [a freemium tool]. This software will allow you to crawl your competitors' websites, revealing all their 404 pages. To do this, simply enter your competitors' URLs in the search bar one at a time.

Once the crawl is complete, click “Response Codes.”


Then, click on the dropdown arrow next to “filter” and select “Client Error 4xx.”


Now you’ll be able to see the brand’s 404 pages.

Once you've completed the step above, simply press the "Export" button to export all of their 404 pages into a file. Next, import this file into a spreadsheet in Excel or Google Docs. On this part of the spreadsheet, create tabs called "Trust Flow," "Citation Flow," "Referring Domains," and "External Backlinks."

Now that you've imported all of their 404 pages, you need to filter out the images and external links, if there are any. A quick way to do this is to highlight the cell block by pressing on the specific cell at the top, then press "Filter" under the "Data" tab. Look for the drop-down arrow on the first cell of that block. Click the drop-down arrow, and underneath "Filter by values," you will see two links: "Select all" and "Clear."

Press "Clear." This will clear all preset options. Now, type in the URL of the competitor's website in the search box and click "Select all."

This will filter out all external links and just leave you with their 404 pages. Go through the whole list, highlighting the pages you think you can rewrite.

Now that you have all of your relevant 404 pages in place, run them through Majestic [a paid tool] or Moz’s Open Site Explorer (OSE) [a freemium tool] to see if their 404 pages actually have any external links (which is what we’re ultimately looking for). Add the details from Majestic or Moz to the spreadsheet. No matter which tool you use (I use OSE), hit “Request a CSV” for the backlink data. (Import the data into a new tab on your spreadsheet, or create a new spreadsheet altogether if you wish.)
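If you would rather do the filtering and matching in code than by hand in a spreadsheet, here is a minimal sketch using Python and pandas. The file names are placeholders and every column name is an assumption, so rename them to match the headers your Screaming Frog and Moz/Majestic exports actually contain.

```python
# A sketch of the spreadsheet steps in code, using pandas.
# File names are placeholders; column names ("Address", "Status Code",
# "Target URL", "Domain Authority") are assumptions to adjust for your exports.
import pandas as pd

crawl = pd.read_csv("competitor_response_codes.csv")   # Screaming Frog "Response Codes" export
backlinks = pd.read_csv("backlink_export.csv")         # Moz OSE / Majestic CSV export

# Keep only the competitor's client-error (4xx) pages.
broken_pages = crawl[crawl["Status Code"].between(400, 499)]

# Join backlinks to the dead pages they point at, so each row represents
# a live external link pointing at a broken competitor page.
opportunities = backlinks.merge(
    broken_pages[["Address"]],
    left_on="Target URL",
    right_on="Address",
    how="inner",
)

# Optionally prioritise outreach by the linking site's authority.
opportunities = opportunities.sort_values("Domain Authority", ascending=False)
opportunities.to_csv("broken_page_link_opportunities.csv", index=False)
```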

Find the relevant backlinks linking to (X's) website. Once you've found all of the relevant websites, you can either highlight them or remove the ones that aren't relevant from your spreadsheet.

Please note: It’s worth running each of the websites you’re potentially going to be reaching out to through Majestic and Moz to find out their citation flow, trust flow, and domain authority (DA). You may only want to go for the highest DA; however, in my opinion, if it’s relevant to your niche and will provide useful information, it’s worth targeting.

With the 404s and link opportunities in hand, focus on creating content that’s relevant for the brands you hope to earn a link from. Find the contact information for someone at the brand you want the link from. This will usually be clear on their website; but if not, you can use tools such as VoilaNorbert and Email Hunter to get the information you need. Once you have this information, you need to send them an email similar to this one:


Hi [THEIR NAME],

My name is [YOUR NAME], and I carry out the [INSERT JOB ROLE – i.e., MARKETING] at [YOUR COMPANY’S NAME or WEBSITE].

I have just come across your blog post regarding [INSERT THEIR POST TITLE] and when I clicked on one of the links on that post, it happened to go to a 404 page. As you’re probably aware, this is bad for user experience, which is the reason I’m emailing you today.

We recently published an in-depth article regarding the same subject of the broken link you have on your website: [INSERT YOUR POST TITLE].

Here’s the link to our article: [URL].

I was wondering if you wouldn’t mind linking to our article instead of the 404 page you’re currently linking to, as our article will provide your readers with a better user experience.

We will be updating this article so we can keep people provided with the very latest information as the industry evolves.

Thank you for reading this email and I look forward to hearing from you.

[YOUR NAME]


Disclaimer: The email example above is just an example and should be tailored to your own style of writing.

In closing, remember to keep detailed notes of the conversations you have with people during outreach, and always follow up with people you connect with.

I hope this tactic helps your SEO efforts in the future. It’s certainly helped me find new places to earn links. Not only that, but it gives me new content ideas on a regular basis.

Do you use a similar process to build links? I’d love to hear about it in the comments.



Top Tips for Creating your Integrated Digital Marketing Strategy for 2017

Now is the perfect time for planning and refining your 2017 digital marketing strategy, and before putting pen to paper (or more likely mindmap to spreadsheet), take a step back and think about the integrated opportunities that can drive your strategic digital success in 2017 and beyond.

In this article, I share some of my favourite and most successful practical tips for making the most of integration within digital marketing, planning, and strategic thinking. At this stage, let me set expectations of what I mean by 'integrated marketing' and offer some information on why businesses need a strategy:

Integrated Marketing is an approach to creating a unified and seamless experience for consumers to interact with the brand/enterprise; it attempts to meld all aspects of marketing communication such as advertising, sales promotion, public relations, direct marketing, and social media, through their respective mix of tactics, methods, channels, media, and activities, so that all work together as a unified force. It is a process designed to ensure that all messaging and communications strategies are consistent across all channels and are centered on the customer. The DMA.org.

Why integration matters online

Before exploring tips and advice for creating an integrated strategy for your digital marketing, it's important to consider the value that integrated thinking can bring to your business. This is even more imperative in the online industry, as many experts are, by design, very much single-specialist orientated, which can lead to isolated working environments and segmented team approaches to marketing.

What you are looking to achieve with integrated working is the added value derived by combining the expertise of specialists with the outcomes of group thinking and effective teamwork.

Here’s an example of what integrated working can look like in practice, and where digital specialists can sit within the information seeking and buying process.

Integrated working model - Vertical Leap

Source: Vertical Leap ‘How companies win when SEO, UX and CX talk‘.

Creating an integrated working environment

When you have distinct industry specialists (SEO, PPC, CRO, etc.) working together effectively, it is not unrealistic to achieve a force-multiplier effect in the outcomes gained. Some of the core focus areas that support successful joint working between digital experts include:

  • Making data accessible to all parties.
  • Confirming expected project working and ‘team think’ requirements.
  • Having an identified cross team / project leader.
  • Being proactive with communication and facilitating easy team collaboration.
  • Ensuring everyone understands team roles, responsibilities, and tasks.

In addition to the above, make it clear from the outset what an integrated approach looks like and set/manage the expectations. You will need to propose, agree, and communicate expected end results, ensure you have the right people in place needed to deliver on them, and throughout the process reassess how successfully you are progressing towards achieving them.

Planning to succeed from the outset


The basics of an integrated website or digital strategy are built on the same foundations as any other business strategy – you need to:

Have a clear mission statement / objective. What do you want to achieve? If this is longer than a single paragraph (or two at most), you are likely over-thinking this. The more refined the objective the greater the chance of people understanding it, and being able to contribute towards it. An example objective could be ‘Increase total unique visitors to your website in 2017 by 70%’.

Set the overarching strategy. The strategy provides an easy to understand overview of how you will deliver the objective. An example strategy tied to the above objective could be – ‘Turn the website into a content rich, information hub for business to grow non-brand visibility and traffic’.

Decide core tactics. This will enable experts to add value at the ideation stages of an integrated digital strategy, support cross-team buy-in from the start, and set expectations on what people are required to deliver. Remaining with this theoretical example, tactics may include: creating new content hubs, paid content promotion, social PR, email marketing, SMS marketing, and expanding the current affiliate network.

Agree KPIs and metrics. It's important that everyone involved in the project understands what success looks like, and how it is measured. Key performance indicators for this project would likely include benchmarks and targets set for:

  • traffic by source
  • social sharing
  • non-brand impressions
  • topic/theme traffic
  • topic/theme rankings
  • and more

Understanding and maximising specialist roles

Whether you are working in-house or with an external agency, freelancers, or other resources, it is unlikely that every digital marketing skill is fully covered by a single person. Your ability to understand and play to the strengths of the resources you have at your disposal (and, by contrast, to overcome many of the barriers) is important from the start of your digital strategy planning.

By understanding and making the most of the specialist roles at your disposal, you are able to create the perfect search anatomy for the project's needs.

The anatomy of the perfect search marketer - Vertical Leap

Practical integrated strategy creation tips


Here are a few of my integrated strategy creation tips which I hope you will find useful when creating your next digital strategy. Please do comment and add to this with tactics and tips you have found beneficial too.

  • Make sure that you are placing the end user at the forefront of the strategy. It can be surprisingly easy to focus so much on integrated working that you overlook the end goal.
  • Have a plan, but don’t rigidly adhere to it. An important element of any strategy is evaluation and refinement. Changing course is not failure, it is an imperative aspect of making the most out of changeable data, tactic successes and potential impact.
  • Remember integration requires teamwork. Integrated working is not a dictatorship, but a collaborative environment for sharing and nurturing ideas, enriching end results, and empowering skilled staff to deliver results.
  • Set review deadlines throughout the delivery of your integrated digital strategy. You need to know what you are achieving, and how everyone in the project team is contributing towards the objective throughout. Team scrums can work well for this as they are informal, require minimal planning, and often take 15 minutes or less to complete.

Images:
Images included have been sourced from https://www.pexels.com/discover/ – “All photos on Pexels are free for any personal and commercial purpose.” – see https://www.pexels.com/photo-license/.

All other images have been used with permission from Vertical Leap.

Post from Lee Wilson


How a Single Piece of Content Increased Our DA by +7 Points [Case Study]

Posted by sergeystefoglo

Content marketing has been discussed and researched more in the last 5 years than ever before.

Source: Google Trends

There are various kinds of content marketing strategies out there. Blog promotion, infographics, video strategies, and creative content are some. Depending on your goals, some are more effective than others.

At Distilled, we’ve been fortunate enough to work on many creative content pieces with some incredible clients. This article is going to focus on a piece of content that my team and I created for a client. We’ll take a look at both the creation process and the tangible results of the piece we made.

Note: In general, you don’t want to rely on one piece of content for link acquisition. It’s recommended to focus on multiple pieces throughout the year to add link diversity and give your content pieces a good chance to succeed. The following is simply a case study of one piece of content that worked well for my client.


Client backstory: We need links!

Our client is Ginny’s (shoutout to Matt and Cailey). Ginny’s is an ecommerce business based in the beautiful state of Wisconsin.

We knew that regardless of how much optimization was done on the site, their lack of incoming links would be a huge barrier to success. This quickly became a topic of discussion for us.

The general rule of thumb: the more linking root domains (LRDs) your site has, the stronger the domain authority should be. And the stronger the linking root domains are, the better it is for your DA. In other words, it’s better to get 1 strong link (DA 80+) than 10 weak links (DA 20-). Kudos if the links are topically relevant to your website/brand.

So, my team and I sat down and started thinking of different ways we could accomplish the task of increasing LRDs and (hopefully) DA for my client.


The process of creating a link-worthy story

Here are the steps my team and I went through for this particular client.

Note: For an extensive look at creating creative content, please see the following articles:

  • How to Make Award-Winning Creative Content – Part 1
  • How to Make Award-Winning Creative Content – Part 2

Ideation

The first step in the creative process is ideation, because without great ideas you can't have a great piece of content. It's important to give yourself enough time for ideation. Don't rush it, and be sure to include various team members with different backgrounds to get as many ideas as possible. Note: stock up on coffee/Red Bull and snacks for this.

Validation

Typically after an ideation session you’ll have many potential ideas. It’s important to go through and validate them. When I say “validate,” I mean making sure others haven’t already done something similar, or that creating the piece is actually possible (you have access to the right data, etc.)

Note: For more information on researching and validating your creative ideas, read this post titled “Researching Creative Ideas: 10 Dos and Don’ts.”

Pitching

At this point you’ll have a handful of ideas that are not only on-brand and interesting, but have great potential in being picked up by various sources. Put together a nice deck and pitch your ideas to the client. The goal is to get your client to pick one (or a few, depending on the budget).

Note: Here’s an awesome write-up on a framework for pitching creative ideas to your clients.

Gathering the data

Once your client signs off on a piece, it’s time to dive into the data! Depending on the piece you’re creating, this might look like scraping websites and doing a ton of research to get the right data you need. Take your time on this, as you want to make sure your data is accurate and relevant.

Design

During this part of the process, it’s a great idea to start mocking up some potential designs. If your piece is smaller, this might be a quick and simple task. If you have a data visualization, this will be longer. Typically, it’s a good idea to create 2–3 mockups and give your client some options.

Development

Once your client signs off on a particular design, it’s time to dive into development.

Copy

The actual copy for the piece doesn't strictly have to happen after development, but it's usually a good idea, as it allows the copywriter to see how much space they have to work with. What you don't want is for your copywriter to write 500 words when the designer has made space for 100. Communication is key in this process.

Testing

Once the piece is built, it’s important to test it out on various browsers and devices. Ask people to give it a run and try to fix as many errors/bugs as possible.

Promotion

Depending on your timeline, you might want to start promotion sooner than this. The important thing to note is to consider pre-pitching and reaching out to contacts to gauge their interest in the piece as soon as possible. Keep your contacts updated and be sure to give them everything they need for their stories.

Note: For further reference on pitching journalists, please see this post titled, “Beyond the Media List: Pro-Active Prospecting for Pitching Creative Content.”

Launch

It’s time to launch!

Push

On the day the piece launches, be sure that you are reminding journalists, reaching out to contacts, sharing the piece on social media, and making your social campaigns live.

Celebrate

There are a lot of steps to building a creative piece, so don’t underestimate the work that goes into it! After you launch the piece be sure to have a beer, give yourself a pat on the back, or do whatever it is you need to do to celebrate.


Post-ideation: What we came up with

After the process outlined above, our team came up with 50 States of Bacon.

The idea was simple: Everyone likes bacon, but who likes it the most? Ginny’s caters to a lot of people who love deep frying, so this was on-brand. We decided to use Instagram’s (now difficult to access) API to extract 33,742 photos that were tagged with #bacon and located within the USA. To normalize for population distribution and Instagram usage, we also collected 64,640 photos with the tags #food, #breakfast, #lunch, and #dinner.
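The post doesn't spell out the normalisation maths, so here is a rough sketch of the general idea with invented per-state counts (not necessarily the exact method used for the piece): compare each state's #bacon volume against its general food-photo volume so that large, Instagram-heavy states don't win on raw numbers alone.

```python
# Illustrative only: the counts are invented and this is just one reasonable
# way to normalise hashtag volumes, not necessarily the method used for the piece.
bacon_counts = {"Wisconsin": 820, "California": 4100, "Vermont": 95}      # photos tagged #bacon
food_counts = {"Wisconsin": 1400, "California": 11200, "Vermont": 240}    # #food/#breakfast/#lunch/#dinner

def bacon_affinity(state: str) -> float:
    """Bacon photos per general food photo, so big states don't win on volume alone."""
    return bacon_counts[state] / food_counts[state]

for state in sorted(bacon_counts, key=bacon_affinity, reverse=True):
    print(f"{state}: {bacon_affinity(state):.2f} bacon photos per food photo")
```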

To make this data more visual, we made it interactive and included some fun facts for each state.


What happened after we launched the piece?

So, what happened after we launched the piece? Let’s dive in.

Here are some of the larger websites 50 States of Bacon got picked up on (with their Domain Authority):

  • US News (DA 94) – Tweeted from account (115K+)
  • Mashable (DA 96) – Tweeted from account (6.95M+)
  • AOL Lifestyle (DA 98) – Referred 1,200+ visitors
  • Eater (DA 85)
  • Daily Dot (DA 85) – Tweeted from account (274K+)

Here is what the LRDs and DA looked like before we launched the piece, and then after 4 months of it being live:

  • Linking Root Domains: 450 before launch, 600 four months later
  • Domain Authority: 29 before launch, 36 four months later

Let’s break this down by metric. Here’s a graph of the LRDs over time (we launched the piece at about the start of the uplift).

The domain authority didn’t budge until about 4 months after we launched the piece. We weren’t actively pursuing any other link-based campaigns during this time, so it’s safe to say the creative piece had a lot to do with this boost in DA.

Note: Since DA is refreshed with new pools of data, this observation wouldn’t have been as valid if the DA only moved one or two positions. But, since it moved 7 positions so close to the launch of this piece, I feel like it’s safe to assume the piece contributed greatly.

Does this mean if you do a similar piece that your DA will also increase? No. Does it give us a good example on what can happen? Absolutely.


A note on LRDs, DA, and setting expectations

Setting expectations with clients is hard. That’s even more true when you both know that links may be even more important than user engagement with your campaign. To make sure expectations are reasonable, you may want to encourage them to see this campaign as one of many over a long period of time. Then there’s less pressure on any individual piece.

So, it’s important to set expectations upfront. I would never tell a client that we can guarantee a certain number of links, or that we guarantee an increase in domain authority.

Instead, we can guarantee a piece of content that is well-built, well-researched, and interesting to their target audience. You can go one step further and guarantee reaching out to X amount of contacts, and you can estimate how many of those contacts will respond with a “yes” or “no.”

In fact, you should set goals. How much traffic would you like the piece to bring? What about social shares? What seems like a reasonable number of LRDs to gain from a piece like this? Benchmark where you currently are, and set some reasonable goals.

The point I’m trying to make is that you shouldn’t promise your client a certain amount of links because, frankly, you’d be lying to them. Be upfront about what this looks like and show examples of work you’ve done before, but make sure to set their expectations correctly up front to avoid any conflicts down the road.


Conclusion

There’s a lot to be learned from the results of creative campaigns. The goal of this article is to share one piece that I’ve worked on with a client while highlighting some things that I learned/observed along the way. If you’d like to see more campaigns we’ve worked on at Distilled, take a look at our creative roundup for last year.

To wrap things up, here are the key takeaways:

  • Creative pieces take a lot of thought, work, and time. Don’t underestimate the task at hand.
  • Don’t frame the project as only focused on gaining links. Instead, aim for creating a compelling piece of content that is on-brand and has the potential to gain traction.
  • Oftentimes it’s best not to put all your eggs in one basket. Plan multiple pieces throughout the year.
  • If your research is right and you pitch the piece to the correct people, this is a strategy that can gain your domain some very strong LRDs. In this particular case, 110 linking root domains (and counting).
  • …But those links won’t come easy. You need to pre-pitch, remind, and re-pitch your contacts. There are many great pieces of content being published daily; you need to be proactive about ensuring your spots online.
  • There are other benefits to doing pieces like this aside from links. Social shares, brand awareness, and referral traffic are some other metrics to look at.
  • It is possible to increase your DA by doing a piece like this, but it takes time. Be patient, and continue doing great work in the meantime.

Other thoughts

  • There are some arguments to be made that a piece of content like this only has spikes and doesn’t do any good for a brand. I don’t believe this to be true. The way I see it, if a piece is too evergreen, it might not gain as many strong links. At the same time, if a piece is completely left-field and doesn’t fit with the brand, the links might not be as impactful. I think there’s a fine line here; it should be up to your best judgment on the pieces you should create.
  • This piece could potentially be updated every year to gain more links or traction (although it would be a lot more difficult with Instagram drastically limiting their API).
  • It’s possible that this piece didn’t have a direct impact on DA, but because there were no other link acquisition strategies during the 4 months, we can safely assume the two are correlated.
  • There’s an argument to be made that jumping from the 20s to the 30s is much easier than from 40s to 50s when you’re speaking of DA. We know that it gets more difficult to increase DA as it gets higher, so do keep that in mind.


46 Updated DeepCrawl tweaks to Make a Website Soar in Google Search Results

Optimizing a website for your target audience can be tricky. Optimizing a large website for your target audience can be even trickier. Since the last article, DeepCrawl has launched a significant update: a brand new crawler that is a lot faster, along with a bunch of new reports.

Below are 46 new and updated tips and tricks to optimise the search signals for your websites in Google’s organic search.

Spend more time making recommendations and changes

(and less time trawling through data)

1. Crawl MASSIVE Sites

If you have a really large website with millions of pages you can scan unlimited amounts with the custom setting – so long as you have enough credits in your account!

How To Do It:

  • From the crawl limits in step 3 of crawl set up, adjust the crawl limit to suit your target domain’s total URLs
  • Crawl up to 10 million using pre-fabricated options from the dropdown menu
  • For a custom crawl, select “custom” from the dropdown menu and adjust max URLs and crawl depth to suit your reporting needs

2. Compare Previous Crawls

Built into DeepCrawl is the ability to compare your current crawl with the most recent one. This is useful for tracking changes as they are implemented, and for providing rich data to show your organization the (hopefully positive) impact of your SEO changes on the site. You’ll also be able to see all of your previous crawls.

How To Do It:

  • On step 4 of your crawl set up, you can select to compare your new crawl to a previous one


3. Monitor Trends Between Crawls

Tracking changes between crawls gives you powerful data to gauge site trends, get ahead of any emerging issues, and spot potential opportunities. DeepCrawl highlights these for you through the Added, Removed, and Missing reports. These are populated once a project has been re-crawled, and appear in every metric reported upon.

Once a follow-up crawl is finished, the new crawl shows your improved stats in green and potential trouble areas in red.

In addition to calculating the URLs which are relevant to a report, DeepCrawl also calculates the changes in URLs between crawls. If a URL appears in a report where it did not appear in the previous crawl, it will be included in the ‘Added report’. If the URL was included in the previous crawl, and is present in the current crawl, but not in this specific report, then it is reported within the ‘Removed report’. If the URL was in the previous crawl, but was not included in any report in the current crawl, it is included in the ‘Missing report’ (e.g. the URL may have been unlinked since last crawled).
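To picture how those three buckets relate, here is a rough illustration of the underlying set logic (not DeepCrawl’s actual code), using made-up URL sets for a single report across two crawls:

```python
# Illustrative only: how Added / Removed / Missing can be derived from two crawls.
# The variable names and URLs are made up for the example.
previous_report = {"/a", "/b", "/c"}   # URLs in this report in the previous crawl
current_report  = {"/b", "/d"}         # URLs in this report in the current crawl
current_crawl   = {"/b", "/c", "/d"}   # every URL found anywhere in the current crawl

added   = current_report - previous_report                      # new to this report
removed = (previous_report - current_report) & current_crawl    # still crawled, no longer in this report
missing = previous_report - current_crawl                       # not found at all this crawl (e.g. unlinked)

print(sorted(added), sorted(removed), sorted(missing))
# ['/d'] ['/c'] ['/a']
```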


4. Filters, filters, filters!

DeepCrawl reports are highly customisable. With well over 120 filtering options, you can really drill down into the data and isolate exactly what you are looking for in relation to your specific SEO strategy and needs – whether that’s highlighting pages with high load times or GA bounce rates, missing social tags, broken JS/CSS, or a low content-to-HTML ratio. You can even save your filters by creating tasks within reports, so the SEO issues that matter most to you are flagged every time you recrawl, making both the thorns in your side and your progress easy to monitor!


5. Assign Tasks, Ease Workflows

After you’ve gone through a big audit, assigning and communicating all your to-do’s can be challenging for any site owner. By using the built-in task manager, a button on the top right of each area of your crawl, you can assign tasks as you go along to your team members, and give each task a priority. This system helps you track actions from your site audit in the Issues area, easing team workflows by showing you what is outstanding, what you’ve achieved so far etc. You can also add deadlines and mark projects as fixed, all from the same screen in the DeepCrawl projects platform. Collaborators receiving tasks do not need a DeepCrawl account themselves, as they’ll have access to the specific crawl report shared as guests.


6. Share Read-Only Reports

This is one of my favourite options: sharing reports with C-levels and other decision makers without giving them access to other sensitive projects is easily doable with DeepCrawl. Generate an expiring URL to give them a high-level view of the site crawl as a whole or to kick out a PDF that focuses on a granular section, including content, indexation and validation. This also doubles up for sharing links externally when prospecting clients, especially if you’ve branded your DeepCrawl reports with your name and company logo.


7. DeepCrawl is now Responsive

With DeepCrawl’s new responsive design, crawl reports look great across devices. This means you can also set off crawls on the go, useful when setting off a quick site audit or pitch. Whilst you set off crawls from the palm of your hand you can also edit the crawls while you monitor them in real-time, in case you need to alter the speed or levels etc.


8. Brand your Crawl Reports

Are you a freelancer or an agency? In order to brand DeepCrawl reports with your business information/logo (or a client’s logo), and serve data to your team or client that is formatted to look as though it came directly from your shop, you can white-label them.

How To Do It:

  • Go to Account Settings
  • Select from Theme, Header colour, Menu colour, Logo and Custom proxy
  • Make the report yours!


Optimise your Crawl Budget

9. Crawl Your Site like Googlebot

Crawl your website just like search engine bots do. Getting a comprehensive report of every URL on your website is a mandatory component for regular maintenance. Crawl and compare your website, sitemap and landing pages to identify orphan pages, optimise your sitemaps and prioritise your workload.


10. Optimise your Indexation

DeepCrawl gives you the versatility to get high-level and granular views of indexation across your entire domain. Check if search engines can see your site’s most important pages from the Indexation report, which sits just under the Dashboard on the left hand navigation area. Investigate no-indexed pages to make sure you’re only blocking search engines from URLs when it’s absolutely necessary.


11. Discover Potentially Non-indexable Pages

To stop you wasting crawl budget and to help identify wrongly canonicalised content, the Indexation reports show you a list of all no-indexed pages on your site and give you details about their meta tags, e.g. nofollow, rel canonical, noindex etc. Pages with noindex directives in the robots meta tag, robots.txt or the X-Robots-Tag header should be reviewed, as they can’t be indexed by search engines.


12. Discover Disallowed URLs

The Disallowed URLs report, nested under Uncrawled URLs within Indexation, contains all the URLs that were disallowed in the robots.txt file on the live site, or from a custom robots.txt file in the Advanced Settings. These URLs cannot be crawled by Googlebot, which generally keeps them out of search results. Review them to ensure that none of your valuable pages are being disallowed – it’s good to get an idea of which URLs may not be crawled by search engines.
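If you want to sanity-check a handful of URLs against your robots.txt outside of DeepCrawl, Python’s standard library can do a quick spot check; the domain and paths below are placeholders:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")   # placeholder domain
rp.read()

for url in [
    "https://example.com/",
    "https://example.com/checkout/basket",
    "https://example.com/blog/some-post",
]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "DISALLOWED"
    print(verdict, url)
```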


13. Check Pagination

Pagination is crucial for large websites with lots of products or content, as it makes sure the right pages display for the relevant categories on the site. You’ll find the first pages in a series in the Pagination menu, and you can also view unlinked paginated pages, which is really useful for hunting down pages that might have rel=next and rel=prev implemented wrongly.


Understand your Technical Architecture

14. Introducing Unique Internal Links & Unique Broken Links

The Unique Internal Links report shows you the number of instances of every anchor text DeepCrawl finds in your crawl, so you can maximise your internal linking structure and spread your link juice out to rank for more terms! The links in this report can be analysed by anchor text, as well as by the status of the target URL.

DeepCrawl’s Unique Broken Links report shows your site’s links with unique anchor text and target URL where the target URL returns a 4xx or 5xx status. Naturally, these links can result in poor UX and waste crawl budget, so they can be updated to a new target page or removed from the source page. This nifty new report is unique to DeepCrawl!


15. Find Broken Links

Fixing 404 errors reduces the chance of users landing on broken pages and makes it easier on the crawlers, so they can find the most important content on your site more easily. Find 404s in DeepCrawl’s Non-200 Pages report. This gives you a full list of 4xx errors on the site at the time of the audit, including each page’s title, URL, source code and the link on the page found to return a 404.

You’ll also find pages with 5xx errors, any unauthorised pages, non-301 redirects, 301 redirects, and uncategorised HTTP response codes, or pages returning a text/html content type and an HTTP response code which isn’t defined by W3C – these pages won’t be indexed by search engines and their body content will be ignored.
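For a quick, single-page spot check outside of a full crawl, a short script can pull the links from a page and flag anything returning a 4xx or 5xx. This sketch assumes the third-party requests and beautifulsoup4 packages and uses a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://example.com/"   # placeholder URL
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, javascript:, fragments etc.
    try:
        # HEAD is lighter than GET; some servers reject it, so treat errors as "check manually"
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(status, link, "| anchor text:", a.get_text(strip=True))
```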


16. Fix Broken Links

Having too many broken links to internal and external resources on your website can lead to a bad user experience, as well as give the impression your website is out of date. Go to the Broken Pages report from the left hand menu to find them. Go fix them.


17. Check Redirects – Including Redirection Loops

Check the status of temporary and permanent redirects on the site in the Non-200 Status report, where your redirects are nested. You can download 301 and 302 redirects, or share a project link with team members to start the revision process.

You can also find your Broken Redirects, Max Redirects and Redirect Loops. The Max Redirects report defaults to showing pages that hop more than 4 times; this number can be customised on step 4 of your crawl set up in Report Settings, nested under Report Setup within the Advanced Settings.

The new Redirect Loop report contains URL chains which redirect back to themselves. These chains result in infinite loops, causing errors in web browsers for users and preventing crawling by search engines. In short, fixing them helps bots and users, and prevents the loss of important authority. Once found, you can update your redirect rules to remove the loops!
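To see what a loop looks like in practice, here is a small, hedged sketch (again using the requests package and a placeholder URL) that follows Location headers one hop at a time and stops if a URL repeats:

```python
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    chain = [url]
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            return chain, "final status %d" % resp.status_code
        url = urljoin(url, resp.headers.get("Location", ""))
        chain.append(url)
        if chain.count(url) > 1:
            return chain, "redirect loop detected"
    return chain, "too many hops"

chain, verdict = trace_redirects("http://example.com/old-page")  # placeholder URL
print(" -> ".join(chain))
print(verdict)
```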


18. Verify Canonical Tags

Incorrect canonical tags can waste crawl budget and cause the bots to ignore parts of your site, leaving your site in danger, as search engines may have trouble correctly identifying your content. View canonicalised pages in your DeepCrawl audit from the Indexation report. The Non-Indexable report gives you the canonical’s location and page details (like hreflang, links in/out, any duplicate content/size, fetch time etc).


19. Verify HREFLANG Tags

If your website is available in multiple languages, you need to validate the site’s HREFLANG tags. You can test HREFLANG tags through the validation tab in your universal crawl dashboard.

If you have HREFLANG tags in your sitemaps, be sure to include Web Crawl in your crawl set up, as this includes crawling your XML sitemaps. DeepCrawl reports on all HREFLANG combinations, working/broken, and/or unsupported links as well as pages without HREFLANG tags.

How To Do It:

  • The Configuration report gives you an overview of HREFLANG implementation
  • In the lower left menu, the HREFLANG section breaks down all the aspects of HREFLANG implementation into categorised pages
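Outside of DeepCrawl, you can also spot-check reciprocity on a single URL with a few lines of Python – each alternate should link back to the page that references it. This is a sketch using requests and BeautifulSoup with a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    """Return {hreflang value: href} for the <link rel="alternate" hreflang> tags on a page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return {
        link.get("hreflang"): link.get("href")
        for link in soup.find_all("link", rel="alternate")
        if link.get("hreflang") and link.get("href")
    }

page = "https://example.com/en/"   # placeholder URL
for lang, alt_url in hreflang_map(page).items():
    reciprocal = page in hreflang_map(alt_url).values()
    print(lang, alt_url, "->", "reciprocal" if reciprocal else "MISSING return tag")
```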


20. Optimise Image Tags

By using the custom extraction tool you can extract a list of images that don’t have alt tags across the site which can help you gain valuable rankings in Google Image Search.

How To Do It:

  • Create custom extraction rules using Regular Expressions
  • Hint: Try /(<img(?!.*?alt=(['"]).*?\2)[^>]*)(>)/ to catch images that have alt attribute errors or don’t have alt attributes at all (see the sketch after this list)
  • Paste your code into “Extraction Regex” from the Advanced Settings link on step 4 of your crawl set up
  • Check your reports from your projects dashboard when the crawl completes. DeepCrawl gives two reports when using this setting: URLs that followed at least one rule from your entered syntax and URLs that returned no rule matches
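To check what that hint regex actually catches before adding it to a crawl, you can run it locally against a snippet of HTML. The sketch below uses Python’s re module with the corrected pattern (straight quotes and the \2 backreference restored):

```python
import re

# Flags <img> tags with no alt attribute (or a malformed one); alt="" still passes.
IMG_NO_ALT = re.compile(r"""(<img(?!.*?alt=(['"]).*?\2)[^>]*)(>)""")

html = """
<img src="/a.png" alt="decorative divider">
<img src="/b.png">
<img src="/c.png" alt>
"""

for match in IMG_NO_ALT.finditer(html):
    print("Missing or broken alt:", match.group(0))
# Flags /b.png and /c.png, but not /a.png
```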


21. Is Your Site Mobile-Ready?

Since “Mobilegeddon”, SEOs have become keenly aware of the constant growth of mobile usage around the world, where 70% of site traffic comes from a mobile device. To optimise for the hands of the users holding those mobile devices, and for the search engines connecting them to your site, you have to deliver content to mobile users in the best way possible, and fast!

DeepCrawl’s new Mobile report shows you whether pages have any mobile configuration, and if so whether they are configured responsively, or dynamically. The Mobile report also shows you any desktop mobile configurations, mobile app links, and any discouraged viewport types.


22. Migrating to HTTPS?

Google has been calling for “HTTPS everywhere” since 2014, and it has been considered a ranking signal. It goes without saying that sooner or later most sites will have to switch to the secure protocol. By crawling both HTTP and HTTPS, DeepCrawl’s HTTPS report will show you:

  • HTTP resources on HTTPS pages
  • Pages with HSTS
  • HTTP pages with HSTS
  • HTTP pages
  • HTTPS pages

Highlighting any HTTP resources on HTTPS pages enables you to make sure your protocols are set up correctly, avoiding issues when search engines and browsers try to identify whether your site is secure. Equally, your users won’t see a red lock in the address bar instead of a green one, or get a browser warning telling them to proceed with caution because the site is insecure – not exactly what people want to see when they are about to make a purchase, because when they see it, they probably won’t…
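A rough way to eyeball mixed content on a single HTTPS page, without waiting for a full crawl, is to list every resource that is still requested over plain HTTP. This sketch uses requests and BeautifulSoup with a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

page = "https://example.com/"   # placeholder URL
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for tag in soup.find_all(["img", "script", "link", "iframe", "source"]):
    resource = tag.get("src") or tag.get("href")
    if resource and resource.startswith("http://"):
        print("Insecure resource on HTTPS page:", tag.name, resource)
```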


23. Find Unlinked Pages driving Traffic

DeepCrawl really gives you a holistic picture of your site’s architecture. You can incorporate up to 5 sources into each crawl. By combining a website crawl with analytics you’ll get a detailed gap analysis, and URLs which have generated traffic but aren’t linked internally – also known as orphan pages – will be highlighted for you.

Pro tip: even more so if you add backlinks and lists to your crawl too! You can link your Google Analytics account and sync 7 or 30 days of data, or you can manually upload up to 6 months worth of GA or any other analytics (like Omniture) for that matter. This way you can work on your linking structure, and optimise pages that would otherwise be missed opportunities.
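The underlying gap analysis is simple set arithmetic: pages that appear in your analytics export but never show up in the crawl are orphan candidates. A minimal sketch, assuming two hypothetical CSV exports with one URL per row:

```python
import csv

def url_set(path):
    with open(path, newline="") as f:
        return {row[0].rstrip("/") for row in csv.reader(f) if row}

crawled = url_set("crawled_urls.csv")        # hypothetical export from your crawler
landing = url_set("ga_landing_pages.csv")    # hypothetical landing-page export from analytics

for url in sorted(landing - crawled):
    print("Orphan candidate (gets traffic, not found in crawl):", url)
```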


24. Sitemap Optimisation

You can opt to manually add sitemaps into your crawl and/or let DeepCrawl automatically find them for you. It’s worth noting that if DeepCrawl does not find them, then it’s likely that search engines won’t either! By including sitemaps into your website crawl, DeepCrawl identifies the pages that either aren’t linked internally or are missing from sitemaps.

By including analytics in the crawl you can also see which of these generate entry visits, revealing gaps in the sitemaps. It also sheds light on your internal linking structure by highlighting where the two don’t match up: you’ll see the specific URLs that are found in the sitemaps but aren’t linked, and likewise those that are linked but not found in your sitemaps.

Performing a gap analysis to optimise your sitemaps with DeepCrawl enables you to visualise your site’s structure from multiple angles, and find all of your potential areas and opportunities for improvement. TIP: You can also use the tool as an XML sitemaps generator.


25. Control Crawl Speed

You can crawl at rates as fast as your hosting environment can handle, which should be used with caution to avoid accidentally taking down a site. DeepCrawl boasts one of the most nimble audit spiders available for online marketers working with enterprise level domains.

Whilst appreciating the need for speed, accuracy is what’s most important! That said, you can change crawl speeds by URLs crawled per second when setting up, or even during a live crawl. Speeds range from 0-50 URLs crawled per second.


Is your Content being found?

26. Identifying Duplicate Content

Duplicate content is an ongoing issue for search engines and users alike, but can be difficult to hunt down. Especially on really, really large sites. But, these troublesome pages are easily identified using DeepCrawl.

Amongst DeepCrawl’s duplicate reporting features lies the Duplicate Body Content report. Unlike the previous version of DeepCrawl, the new DeepCrawl doesn’t require the user to adjust sensitivity. All duplicate content is flagged, helping you avoid repeated content that can confuse search engines, cause original sources to fail to rank, and doesn’t really give your readership added value.

Finding Duplicate Titles, Descriptions, Body Content, and URLs with DeepCrawl is an effortless time-saver.
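Conceptually, duplicate grouping boils down to normalising each page’s text and clustering pages that hash to the same value. Here is a toy sketch of that idea (not DeepCrawl’s algorithm), using a hypothetical URL-to-text mapping:

```python
import hashlib
from collections import defaultdict

# Hypothetical {url: extracted body text} mapping
page_texts = {
    "/shoes?sort=price": "Red running shoes. Free delivery.",
    "/shoes":            "Red running shoes.  Free delivery.",
    "/about":            "We are a small team of shoemakers.",
}

clusters = defaultdict(list)
for url, text in page_texts.items():
    normalised = " ".join(text.lower().split())   # collapse whitespace, ignore case
    clusters[hashlib.md5(normalised.encode("utf-8")).hexdigest()].append(url)

for urls in clusters.values():
    if len(urls) > 1:
        print("Duplicate cluster:", urls)
```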


27. Identify Duplicate Pages, Clusters, Primary Duplicates & Introducing True Uniques

Clicking into Duplicate Pages from the dashboard gives you a list of all the duplicates found in your crawl, which you can easily download or share. DeepCrawl now also gives you a list of Duplicate Clusters so you can look at groups of duplicates and try to find the cause or pattern behind these authority-draining pages.

There is also a new report entitled True Uniques. These pages have no duplicates coming off them in any way, shape or form, are the most likely to be indexed, and are naturally very important pages on your site.

Primary Duplicates have duplicates coming off them – as the name implies – but have the highest internal link weight within each set of duplicated pages. Though signals like traffic and backlinks need to be reviewed to assess the most appropriate primary URL, these pages should be analysed, as they are the most likely to be indexed.

Duplicate Clusters are pages sharing an identical title and near-identical content with another page found in the same crawl. Duplicate pages often dilute authority signals and social shares, affecting potential performance and reducing crawl efficiency on your site. You can optimise clusters of these pages by removing internal links to their URLs and redirecting the duplicate URLs to the primary URL, or by adding canonical tags pointing to the primary URL.

How To Do It:

  • Click “Add Project” from your main dashboard
  • Under the crawl depth setting tell DeepCrawl to scan your website at all its levels
  • Once the crawl has finished, review your site’s duplicate pages from the “issues” list on your main dashboard or search for ‘duplicate’ in the left nav search bar


28. Sniff Out Troublesome Body Content

Troublesome content impacts UX and causes negative behavioral signals like bouncing back to the search results. Review your page-level content after a web crawl by checking out empty or thin pages, and digging into duplication. DeepCrawl gives you a scalpel’s precision in pinpointing the problems right down to individual landing pages, which enables you to direct your team precisely to the source of the problem.

How To Do It:

  • Click on the Body Content report in the left hand menu
  • You can also select individual issues from the main dashboard
  • Assign tasks using the task manager or share with your team


29. Check for Thin Content

Clean, efficient code leads to fast loading sites – a big advantage in search engines and for users. Search engines tend to avoid serving pages that have thin content and extensive HTML in organic listings. Investigate these pages easily from the Thin Pages area nested within the Content report.


30. Avoid Panda, Manage Thin Content

In a post-Panda world it’s always good to keep an eye on any potentially thin content which can negatively impact your rankings. DeepCrawl has a section dedicated to thin and empty pages in the body content reports.

The Thin Pages report will show all of your pages with less than the minimum content size specified in Advanced Settings > Report Settings (this defaults to 3 KB, and you can choose to customise it). Empty Pages are all your indexable pages with less content than the Content Size setting specified (default 0.5 KB) in Advanced Settings > Report Settings.
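If you want to spot-check a single URL against a similar threshold outside the tool, you can roughly approximate “content size” as the weight of the extracted text. A sketch using requests and BeautifulSoup with a placeholder URL; the threshold simply mirrors the 3 KB default mentioned above:

```python
import requests
from bs4 import BeautifulSoup

THIN_THRESHOLD_KB = 3.0   # mirrors the 3 KB default mentioned above; adjust to taste

url = "https://example.com/some-page"   # placeholder URL
html = requests.get(url, timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

content_kb = len(text.encode("utf-8")) / 1024
ratio = len(text) / max(len(html), 1)   # rough content-to-HTML ratio

print("Extracted content: %.1f KB, content-to-HTML ratio: %.0f%%" % (content_kb, ratio * 100))
if content_kb < THIN_THRESHOLD_KB:
    print("Potentially thin page:", url)
```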

How To Do It:

  • Typing content in the main menu will give you the body content report
  • Clicking on the list will give you a list of pages you can download or share


31. Optimise Page Titles & Meta Descriptions

Page titles and meta descriptions are often the first point of contact between users coming from the search results and your site. Well-written, unique titles and descriptions can have a big impact on click-through rates and user experience. Through the Content report, DeepCrawl gives you an accurate count of duplicate, missing and short meta descriptions and titles.
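If you just want to sanity-check a few key URLs between crawls, a short script can flag missing, short or duplicate titles and descriptions. The URLs and length thresholds below are arbitrary examples, not DeepCrawl’s definitions:

```python
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/contact"]   # placeholder URLs
seen_titles = {}

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""

    if not title:
        print(url, "-> missing <title>")
    elif len(title) < 30:                 # arbitrary example threshold
        print(url, "-> short title:", title)
    if title and title in seen_titles:
        print(url, "-> duplicate title, also used on", seen_titles[title])
    seen_titles.setdefault(title, url)

    if not description:
        print(url, "-> missing meta description")
    elif len(description) < 70:           # arbitrary example threshold
        print(url, "-> short meta description")
```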


32. Clean Up Page Headers

Cluttered page headers can impair the click through rate if users’ expectations are not being managed well. CTRs can vary by wide margins, which makes it difficult to chart the most effective path to conversion.

If you suspect your page headers are cluttered, run a crawl with Google Analytics data: you can assess key SEO landing pages and gain deeper insights by combining crawl data with powerful analytics data, including bounce rate, time on page, and load times.


Other nuggets that DeepCrawl gives

33. How Does Your Site Compare to Competitors?

Set up a crawl using the “Stealth Crawl” feature to perform an undercover analysis of your competitor’s site, without them ever noticing. Stealth Crawl randomises IPs and user agents and adds delays between requests, making it virtually indistinguishable from regular traffic. Analyse their site architecture and see how your site stacks up in comparison, and in so doing discover areas for improvement.

How To Do It:

  • Go to the Advanced Settings in step 4 of your crawl setup and select and tick Stealth Mode Crawl nested under the Spider Settings


34. Test Domain Migration

There are always issues with newly migrated websites, which usually generate page display errors or downtime. By checking status codes post-migration in DeepCrawl you can keep an eye on any unexpected server-side issues as you crawl.

In the Non-200 Pages report you can see the total number of non-200 status codes, including 5xx and 4xx errors that DeepCrawl detected during the platform’s most recent crawl.


35. Test Individual URLs

Getting a granular view over thousands of pages can be difficult, but DeepCrawl makes the process digestible with an elegant Pages Breakdown pie chart on the dashboard that can be filtered and downloaded for your needs. The pie chart (along with all graphs and reports) can be downloaded in the format of your choice, whether CSV/PNG/PDF etc.

View Primary Pages by clicking the link of the same name in your dashboard overview. From here, you can see a detailed breakdown of each and every unique, indexable URL across up to 200 metrics, including DeepRank (an internal ranking metric), clicks in, load time, content-to-HTML ratio, social tags, mobile optimisation (or lack thereof!), pagination and more.


36. Make Landing Pages Awesome and Improve UX

To help improve conversion and engagement, use DeepCrawl metrics to optimise page-level factors like compelling content and pagination, which are essential parts of your site’s marketing funnel that assist in turning visitors into customers.

You can find content that is missing key parts through the page content reports, helping you engage visitors faster, deliver your site’s message more clearly, and increase your chances of converting and exceeding user expectations.


37. Optimise your Social Tags

To increase shares on Facebook (Open Graph) and Twitter and get the most out of your content and outreach activities, you need to make sure your Twitter Cards and Open Graph tags are set up – and set up correctly.

Within DeepCrawl’s Social Tagging report you will see pages with and without social tags, whether the tags that do exist are valid, and OG:URL/canonical mismatches – pages where the Open Graph URL is different from the canonical URL. These should be identical, otherwise shares and likes might not be aggregated against your chosen URL in your Open Graph data but spread across your URL variations.
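Checking an individual page for that mismatch is straightforward outside the tool as well; this sketch (requests + BeautifulSoup, placeholder URL) compares the two tags directly:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/article"   # placeholder URL
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

canonical_tag = soup.find("link", rel="canonical")
og_tag = soup.find("meta", property="og:url")

canonical = canonical_tag.get("href") if canonical_tag else None
og_url = og_tag.get("content") if og_tag else None

if not og_url:
    print("No og:url tag found on", url)
elif canonical and og_url != canonical:
    print("OG:URL / canonical mismatch:", og_url, "vs", canonical)
else:
    print("og:url and canonical agree:", og_url)
```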


Want Customised Crawls?

38. Schedule Crawls

You can schedule crawls using DeepCrawl to automatically run them in the future and adjust their frequency and start time. This feature can also be used to avoid times of heavy server load. Schedules range from every hour to every quarter.

How To Do It:

  • In step 4 of your crawl set up click on Schedule crawl


39. Run Multiple Crawls at Once

You can crawl multiple sites (up to 20 at any given time) really quickly, as DeepCrawl is cloud-based, spanning millions of URLs at once while you are still able to use your computer to evaluate other reports. This means you can run your pitch crawl, SEO audit and competitor analysis at the same time.


40. Improve Your Crawls with (Google) Analytics Data

By authenticating your Google Analytics accounts into DeepCrawl you can understand the combined SEO and analytics performance of key pages in your site. By overlaying organic traffic and total visits on your individual pages you can prioritise changes based on page performance.

How To Do It:

  • On step 2 of your crawl set up, go to Analytics, click add Google account
  • Enter your Google Analytics username and password to sync your data permissions within DeepCrawl
  • Click the profile you want to share for the DeepCrawl project
  • DeepCrawl will sync your last 7 or 30 days of data (your choice), or you can upload up to 6 months’ worth of data as a CSV file, whether from Google Analytics, Omniture or any other provider


41. Upload Backlinks and URLs

Identify your best linked sites by uploading backlinks from Google Search Console, or lists of URLs from other sources to help you track the SEO performance of the most externally linked content on your site.


42. Restrict Crawls

Restrict crawls for any site using DeepCrawl’s max URL setting, the exclude URL list, or the page grouping feature, which lets you restrict pages based on their URL patterns. With page grouping you can choose to crawl, say, 10% of a particular folder – or of each folder on your site – if you’re looking for a quick snapshot. Once you’ve re-crawled (so long as you keep the same page grouping settings), DeepCrawl will recrawl the same 10% so you can monitor changes.

Aside from Include/Exclude Only rules, you can restrict your crawls by Starting URLs and by limiting the depth and/or number of URLs you’d like to crawl on your given site.

How to Do It:

  • In Advanced Settings nested in step 4 of your crawl set up click “Included / Excluded URLs” or “Page Grouping” and/or “Start URLs”


43. Check Implementation of Structured Data

Access Google’s own Structured Data Testing Tool to validate Schema.org markup by adding a line or two of code to your audit through DeepCrawl’s Custom Extraction. This tool helps you see how your rich snippets may appear in search results, where errors in your markup prevent it from being displayed, and whether or not Google interprets your code, including rel=publisher and product reviews, correctly.

How To Do It:

  • In Advanced Settings of step 4 of your crawl set up
  • Click on “custom extraction”
  • Add the Custom Extraction code found here to get DeepCrawl to recognise Schema markup tags and add the particular line of code you want for your crawl: ratings, reviews, person, breadcrumbs, etc

itemtype="http://schema.org/([^"]*)
itemprop="([^"]*)
(itemprop="breadcrumb")
(itemtype="http://schema.org/Review")
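To get a feel for what those patterns return before adding them to a crawl, you can run the first two against a page’s HTML locally. A small sketch with Python’s re module and a placeholder URL:

```python
import re
import requests

html = requests.get("https://example.com/product", timeout=10).text   # placeholder URL

itemtypes = re.findall(r'itemtype="http://schema\.org/([^"]*)', html)
itemprops = re.findall(r'itemprop="([^"]*)', html)

print("Schema.org types used:", sorted(set(itemtypes)))
print("itemprop values used:", sorted(set(itemprops)))
```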


44. Using DeepCrawl as a Scraper

Add custom rules to your website audit with DeepCrawl’s Custom Extraction tool. You can tell the crawler to perform a wide array of tasks, including paying more attention to social tags, finding URLs that match a certain criteria, verifying App Indexing deeplinks, or targeting an analytics tracking code to validate product information across category pages.

For more information about Custom Extraction syntax and coding, check out this tutorial published by the DeepCrawl team.

How To Do It:

  • Enter your Regular Expressions syntax into the Extraction Regex box in the advanced settings of step 4 of your crawl
  • View your results by checking the Custom Extraction tab in your project’s crawl dashboard or at the bottom of the navigation menu


45. Track Migration Changes

Migrations happen for a lot of reasons, and are generally challenging. To help developers, SEOs and decision makers from the business come together and minimise risks, use DeepCrawl to compare staging and production environments in a single crawl, spot any issues before you migrate, make sure no one missed their assignments, and ensure the migration goes smoothly.

For site migrations and/or redesigns, testing changes before going live can show you whether your redirects are correctly set up, whether you’re disallowing or no-indexing valuable pages in your robots.txt, and so on. Being careful does pay off!


46. Crawl as Googlebot or Your Own User Agent

If your site auditor can’t crawl your pages as a search engine bot, then you have no chance of seeing the site through the search engine’s eyes. DeepCrawl can also mimic spiders from other search engines, social networks and browsers. Select your user agent in the advanced settings when setting up or editing your crawl.

Your options are:

  • Googlebot (7 different options)
  • Applebot
  • Bingbot
  • Bingbot mobile
  • Chrome
  • Facebook
  • Firefox
  • Generic
  • Internet Explorer 6 & 8
  • iPhone
  • DeepCrawl
  • Custom User Agent


Last but not least

This top 46 list is by no means complete. In fact, there are many more possibilities to utilise DeepCrawl for enhancing site performance and the tool undergoes constant improvements. This list is a starting point to understanding your website as search engines do and making improvements for users and search engines alike.

DeepCrawl is a very powerful tool and conclusions drawn using the data it provides must be based on experience and expertise. If applied to its full effect DeepCrawl can bring an online business to the next level and significantly contribute to user expectations management, brand building and most importantly driving conversions.

What are your favourite DeepCrawl features? Your opinion matters, share it in the comments section below.

Post from Fili Wiese


Why Google celebrates its 18th birthday, not on its birthday

If you go to Google’s homepage today you will see a small celebration happening. It’s Google celebrating their own birthday. Every year since 2002 they celebrate it with a doodle:

Today marks the day that Google turns 18. In some countries, that means you’re an adult. There is some confusion about the date, though. Doesn’t Google know its own birthday? That doesn’t sound like an adult.

The confusion

In sports, some athletes’ birthdates are a matter of discussion; some are older than they appear to be. With Google it’s not about years, it’s about days. Because when is Google’s birthday, really? The 27th? The fourth? Or an entirely different day?

You would say Google knows its own birthday, so we could ‘assume’ that the 27th is the right day. But why did Google then celebrate its birthday on the 26th in 2005? It only started celebrating on the 27th the year after. To make it even stranger, in 2004 Google celebrated 20 days earlier, on September 7 – which was itself a day later than the September 6 date they used the year before…

What is the right date?

Well actually, the right date is September 4th… at least, that’s the day Google filed for incorporation in California (according to their own history timeline). Most businesses will see that as the ‘birth date’.

So why the 27th?

It’s a bit of a mystery, not many sites actually seem to have the answer. Back in 2007, Danny Sullivan tried to answer the question but only got to “Google’s birthday is whenever it wants to be”. And today many sites are wondering about it again.

The real answer seems to be in Marketing and PR.

In 2005 Google wanted to announce something big. A nice “PR trick” or “marketing trick” is to make things come together. So they decided to “attach” their birthday to a big announcement. That of the new index.

And not just any index. In a blog post on the Google blog, Software Engineer Anna Patterson announced:

“For our seventh birthday, we are giving you a newly expanded web search index that is 1,000 times the size of our original index.”

Interestingly enough, the size of the index isn’t a ‘thing’ anymore these days. Where in the past Google proudly showed the size of the index, they don’t mention that anymore.

Back then it was a big thing, though. And since they wanted to announce the increase in the size of the index, they needed a ‘reason’ so they picked the birthday.

As you can see, though, they wrote that article on September 26, not the 27th – which they then ‘made’ their birthday, so they had a gift for themselves to celebrate on the 27th. For convenience, they have stuck to that date. That’s why they celebrate on the 27th, and not on the 4th of September.

So we should say today:


Belated happy birthday Google!


Post from Bas van den Beld

3 Surprising Lessons From Building 26,000 Links

Posted by KelseyLibert

The Fractl team has worked on hundreds of content marketing projects. Along the way, we’ve kept track of a lot of data, including everywhere our client campaigns have been featured, what types of links each campaign attracted, and how many times each placement was shared.

While we regularly look back on our data to evaluate performance per campaign and client, until now we’d never analyzed all of these data in aggregate. After combing through 31,000 media mentions and 26,000 links, here’s what we found.

Most high-authority links don’t receive a lot of social shares.

Most marketers assume that if they build links on high-authority sites, the shares will come. In a Whiteboard Friday from last year, Rand talks about this trend. BuzzSumo and Moz analyzed 1 million articles and found that over 75 percent received no social shares at all. When they looked at all links – not just articles – this number rose to around 90 percent.

We (wrongfully) assumed this wouldn’t be the case with high-quality links we’ve earned. It turns out, even the majority of our links on sites with a high Domain Authority (DA) didn’t get any social shares:

  • 52 percent of links with a DA over 89 received zero shares.
  • 50 percent of links with a DA over 79 received zero shares.
  • 54 percent of links with a DA over 59 received zero shares.

On average, our campaigns get 110 placements and 11,000 social shares, yet a single link accounts for about 63 percent of total shares. This means that if you exclude the top-performing link from every campaign, our average project would only get 4,100 social shares.

Since most links don’t yield social shares, marketers with goals of both link building and social engagement should consider a strategy for gaining social traction in addition to a strategy for building a diverse link portfolio.

The social strategy can be as simple as targeting a few key websites that routinely yield high social shares. It’s also helpful to look at target sites’ social media accounts. When they post their own articles, what kind of engagement do they get?

Of all the sites that covered our campaigns, the following five sites had the highest average social shares for our content. We know we could depend on these sites in the future for high social engagement.

[Chart: the five sites with the highest average social shares per placement]

Exceptions to the rule

Some content can definitely accomplish both high engagement and social shares. The BuzzSumo and Moz study found that the best types of content for attracting links and social shares are research-backed content or opinion pieces. Long-form content (more than 1,000 words) also tends to attract more links and shares than shorter content. At Fractl, we’ve found the same factors – an emotional hook, a ranking or comparison, and a pop culture reference – tend to encourage both social sharing and linking.

Few sites will always link to you the same way.

To ensure you’re building a natural link portfolio, it’s important to keep track of how sites link to your content. You’ll learn if you’re earning a mix of dofollow links, nofollow links, cocitation links, and brand mentions for each campaign. We pay close attention to which types of links our campaigns earn. Looking back at these data, we noticed that publishers don’t consistently link the same way.

The chart below shows a sample of how 15 high-authority news sites have linked to our campaigns. As you can see, few sites have given dofollow links 100 percent of the time. Based on this, we can assume that a lot of top sites don’t have a set editorial standard for link types (although plenty of sites will only give nofollow links).

[Chart: link types given by 15 high-authority news sites]

While getting a site to cover your content is something to be celebrated, not every placement will result in a dofollow link. And just because you get a dofollow link from a site once doesn’t mean you should always expect that type of link from that publisher.

Creating a lot of visual assets is a waste of time in certain verticals.

There’s an ongoing debate within Fractl’s walls over whether or not creating a lot of visual assets positively impacts a campaign’s reach enough to justify the additional production time. To settle this debate, we looked at our 1,300 top placements to better understand how publishers covered our campaigns’ visual assets (including both static image and video). This sample was limited to articles on websites with a DA of 70 or higher that covered our work at least four times.

We found that publishers in different verticals had divergent tendencies regarding visual asset coverage. The most image-heavy vertical was entertainment, and the least was education.

[Chart: average number of visual assets used per publisher vertical]

Some of the variation in asset counts is based on how many assets were included in the campaign. Although this does skew our data, we do receive useful information from this analysis. The fact that top entertainment publishers used an average of nine assets when they cover our campaigns indicates a high tolerance for visual content from outside sources. Verticals with lower asset averages may be wary of external content or simply prefer to use a few key visuals to flesh out an article.

Keeping these publisher vertical preferences in mind when developing content can help your team better allocate resources. Rather than spending a lot of effort designing a large set of visual assets for a campaign you want to be placed on a finance site, your time may be better spent creating one or two awesome visualizations. Similarly, it’s worthwhile to invest in creating a variety of visual assets if you’re pitching entertainment and health sites.

Analyzing our entire link portfolio taught us a few new things that challenged our previous assumptions:

  • High DA sites don’t necessarily attract a lot of social engagement. Just because a site that linked to you has a huge audience doesn’t mean that audience will share your content.
  • Most sites don’t consistently use the same types of links. Got a dofollow link from a site one time? Don’t expect it to be the norm.
  • Certain publisher verticals are more likely to feature a lot of visual assets. Depending on which verticals you’re targeting, you might be wasting time on designing lots of visuals.

While I hope you’ve learned something from Fractl’s internal study, I want you to see the broader lesson: the value of measuring and analyzing your own content campaign data as a means to improve your process. If you’ve done a similar analysis of links earned from content marketing, I’d love to hear from you in the comments.


Android Antitrust Cases in the EU and Russia: Updates & Overview

State of Digital readers in the know are already familiar with a number of events relating to antitrust cases against Google for its practices on Android. Since 2013, this topic has drawn our public attention as search advertisers, as mobile app developers, as handset manufacturers, and most importantly, as users.

A number of investigations in different jurisdictions worldwide question whether Google’s restrictions on Android breach competition laws. At the core of the subject lies concern over the future of the mobile ecosystem, including platform development, innovation and consumer choice.

In recent news, we have seen updates in the EU and Russia and new concerns arising in other jurisdictions. In the EU, Google recently asked for a final deadline extension to respond to the EU Commission’s statement of objections, which was granted until October 7th. Over a week ago in Russia, the FAS officially dismissed Google’s request for an extension to change its practices in Russia. Google is now required to commence execution of the Russian FAS decisions or face additional fines for non-compliance. In the past week, Japan’s Ministry of Economy, Trade and Industry also alleged that Google’s practices undermine competition in the smartphone app market.

As Russia’s number one search provider and one of Europe’s largest Internet companies with a focus on empowering the consumer with choice, Yandex has been involved with both the Russian and EU case. Given the timely topic, involvement of Yandex, and similarity of these particular cases, I’m taking this opportunity to clarify developments with the cases that otherwise haven’t been summarized together.

Investigations into Google’s practices on Android:

Android’s overwhelming popularity makes these investigations especially important. In 2013, Yandex first made submissions to the European Commission, which has been investigating this topic since FairSearch filed a similar complaint against Google that same year. The EU opened a formal investigation in April 2015, which outlined the following major concerns: licensing of Google Play on the condition that Google Chrome is pre-installed and Google search is set as the default search service, anti-fragmentation agreements that prevent consumers from accessing other potential operating systems, and exclusivity deals with manufacturers for Google search pre-installation.

Back in February 2015 Yandex submitted a request to the Russian Federal Antimonopoly Service (FAS) to investigate Google’s restrictive practices on Android. The request focused on Google tying its mobile search engine and other end-user applications and services with its Google Play mobile application store for Android OS. In particular, in 2014 Google prohibited pre-installation of any Yandex services on Fly, Explay, and Prestigio mobile devices in its agreements with respective Android device manufacturers. These pre-installation restraints were not limited to specific Yandex applications competing with Google apps but all Yandex applications.

Investigations in the US, South Korea and, most recently, Japan question Android practices for various reasons.

Russian FAS Decision & European Commission Statement of Objections:

The case filed with FAS resulted in the historic September 2015 decision that Google did break Russian antitrust laws through abuse of dominance, by providing Google Play to handset manufacturers on the condition that a bundle of other applications would be pre-installed on certain beneficial terms. As a result, last month FAS also fined Google $6.8 million for violation of Russian competition laws.

As per the FAS remedies order, Google is now obligated to adapt its practices in Russia as outlined by FAS: device manufacturers producing Android-operated smartphones and tablets for the Russian market should be able to obtain Google Play from Google without any restrictions and should also be entitled to:

  • Pre-install Google Play without any other applications and services by Google
  • Set another search engine as the default, including on the default home screen;
  • Select apps and services for pre-installation at their own discretion;
  • Place any other applications and widgets on the default home screen, including search
  • Enter into deals with developers of apps and services by Google’s competitors to replace any of the pre-installed Google’s apps, including Google Search, without restrictions.

In the case of the EU, Google has until October 7th to respond to the Commission’s April Statement of Objections alleging that Google has breached EU antitrust rules by:

  • Requiring manufacturers to pre-install Google Search and Google’s Chrome browser and requiring them to set Google Search as default search service on their devices, as a condition to license certain Google proprietary apps;
  • Preventing manufacturers from selling smart mobile devices running on competing operating systems based on the Android open source code;
  • Giving financial incentives to manufacturers and mobile network operators on condition that they exclusively pre-install Google Search on their devices.

The below image published by the EU Commission outlines their concern.

In Russia, Google is running out of extensions to respond and is under pressure to make adaptations while more developments are anticipated in the EU and elsewhere.

Impact of the Android Antitrust Cases:

Since the FAS ruling, there has been minor progress in Android dynamics in Russia. While there haven’t been any changes to Google’s bundling practices, there are new distribution deals and some loosening of the strictness of exclusivity agreements. For Yandex, these new distribution deals allow Yandex apps to be installed on the second and third screens of Android phones and, in very limited cases, pre-loaded on the default home screen. The loosening of exclusivity agreements has also eased the general ban on all Yandex apps.

Both cases are setting an example for increased global recognition of an ongoing issue: the unprecedented market power and control gained by owners of global mobile platforms. The ruling in Russia demonstrates that a fair and reasonable solution to the problem is achievable. To date, the EU and other cases have brought more attention to the topic.

According to a press release from the European Commission, Commissioner Margrethe Vestager, in charge of competition policy, explained the weight of the case in the EU,  “A competitive mobile internet sector is increasingly important for consumers and businesses in Europe. Based on our investigation thus far, we believe that Google’s behaviour denies consumers a wider choice of mobile apps and services and stands in the way of innovation by other players, in breach of EU antitrust rules. These rules apply to all companies active in Europe. Google now has the opportunity to reply to the Commission’s concerns.”

As we await change and Google’s responses, it’s important to remember the focus of these cases, irrespective of any politicized or sensationalized stories. Google is a well-respected company that has quite clearly brought significant value to the Internet ecosystem and set many positive examples, but there must be opportunity for others to do the same. These cases are not unique to Yandex but are of course relevant to other app developers and handset manufacturers. Regardless of a tech company’s global reach or size, the Internet ecosystem should provide fair opportunities for all businesses to offer their products and services, in addition to choice for the consumer. Especially in today’s globalized world and interconnected online community, the gravity of these cases comes down to ensuring widespread equality.

Post from Melissa McDonald
