The Best On-Page SEO Tool in the Business Now Has Unlimited Access via MozBar

Posted by Roxana_Nelson

[Estimated read time: 4 minutes]

This is going to be an exciting year for MozBar. I’m happy to announce that we’ve got something new that will help you save a ton of time tuning your on-page SEO.

(New to MozBar? Learn more about our free SEO toolbar extension available in the Chrome store.)

We’ve added Page Optimization to MozBar Premium!

If you want to dive right in, download MozBar for Chrome. To access the Page Optimization feature, you'll need to be logged in to your Moz Pro account. If you don't have a Moz Pro account, you can take a free 30-day trial to check out all the new goodies.

Currently in Moz Pro, you can run a Page Optimization report for any of the keywords you're tracking in your campaigns to get detailed information on how to optimize a page for that keyword, and on where you're getting things right.

We've taken this functionality and put it in the MozBar — except now you can check on-page optimization for ANY keyword on ANY page, not just ones you're tracking. It's like having Moz Pro on the go. As always, the keywords and pages you research via MozBar are limitless — that's right, they won't count against any of your limits in Moz Pro.


You now have the flexibility to check any keyword on any page. This opens the door to tons of new ideas for page optimization, paving the way for you to rank higher in the SERPs. Quickly check any page on your site for keywords you're tracking and see how optimized (or not) your page is for target keywords, in real time. You'll also get detailed suggestions on how to optimize your page. This can help you decide whether a keyword is worth optimizing for, or you may find you're already doing a great job and can move on to the next one.

Optimizing your page using MozBar

I’m always on the lookout for an article with good parenting tips, especially when it comes to traveling with children. I like the Honestly blog by the Honest Company, so let’s use this post about flying with kids as an example of how to optimize your page using MozBar.

Let’s say I’m targeting the keyword “flying with kids” and want to see how well this page is optimized. I chose this keyword because it feels like a query that can be answered by the content well, and after researching it in Keyword Explorer, I found that it has good search volume and a good Potential score.

1. First, we’ll open MozBar on the page. We’ll click on the new Page Optimization icon next to the Highlight tool.

2. Then, we’ll type in “flying with kids” in the text field. Click on “Get Score.”

3. Looking good! This page has a Page Optimization Score of 94 out of 100, which is great. There are three suggestions for improving this score even more, such as adjusting keyword placement in the page title and using the target keyword in the page's H1 headers.

Overall, a pretty well-optimized page!

Here are some other ways to use Page Optimization in MozBar:

  • Review competitors’ pages: Another benefit of being able to check any keyword + page URL combo is that you can see how well your competitors are optimizing for certain keywords, too. This insight could help you learn which keywords your competitors optimize for and how.
  • Improve client proposals: Prospecting for potential clients has gotten easier, too. Now you can audit a prospect’s pages and find opportunities for on-page optimization improvement — a valuable resource for any SEO putting together a site report for a potential client.

How it works

For each keyword + page URL combo you check, you’ll get a Page Optimization Score, just like in the campaign-based version of Page Optimization in Moz Pro. The Page Optimization Score is a metric we use to gauge how optimized a page is for the keyword you’ve paired it with.

  • We take twenty-seven factors that contribute to a well-optimized page and score your page against them for that particular keyword.
  • We’ve broken down your score by Suggestions (items you may want to tweak or optimize) and Solved (a list of factors that are positively influencing your score).
  • You’ll get the full list of both Suggestions and Solved factors that contribute to your score under the All Factors tab.
  • We also highlight the highest-priority items under Most Important Fixes. If you need more insight, expand the suggestion and you’ll get real advice from one of our SEO experts.
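
To make that scoring idea concrete, here's a minimal sketch of how a weighted factor score can be assembled and split into Solved items and Suggestions. The factor names, weights, and threshold below are hypothetical illustrations; this is not Moz's actual algorithm.

    # Hypothetical sketch of a weighted on-page score (not Moz's real formula).
    FACTOR_WEIGHTS = {
        "keyword_in_title": 20,
        "keyword_in_h1": 15,
        "keyword_in_url": 10,
        "keyword_in_body": 15,
        "adequate_word_count": 10,
    }

    def page_score(factor_scores):
        """Combine per-factor scores (0.0-1.0) into a single 0-100 score."""
        total = sum(FACTOR_WEIGHTS.values())
        weighted = sum(FACTOR_WEIGHTS[f] * factor_scores.get(f, 0.0)
                       for f in FACTOR_WEIGHTS)
        return round(100 * weighted / total, 1)

    def split_factors(factor_scores, threshold=0.8):
        """Mirror the report layout: 'Solved' factors vs. 'Suggestions' to fix."""
        solved = [f for f, s in factor_scores.items() if s >= threshold]
        to_fix = [f for f, s in factor_scores.items() if s < threshold]
        return solved, to_fix

    scores = {"keyword_in_title": 1.0, "keyword_in_h1": 0.5, "keyword_in_url": 1.0,
              "keyword_in_body": 0.9, "adequate_word_count": 1.0}
    print(page_score(scores))     # 87.1
    print(split_factors(scores))  # h1 lands in the Suggestions list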

What’s coming next?

And we've got another cool feature on the horizon. We'll be adding Related Topics to MozBar very soon. In addition to all the data you get from Page Optimization, Related Topics will surface a list of popular topics that your audience may be interested in, based on other topics that appear in the SERP. You'll also see the pages that talk about those topics. Pretty sweet!

Check it out!

You can start using Page Optimization in MozBar Premium today with your Moz Pro subscription, or try it free for 30 days. Also, starting today, you’ll need to be logged in with either your Community account or your Moz Pro account to access link metrics. Read more about it here in my Q&A post.

Enough talk — I want to download MozBar!

Give Page Optimization for MozBar Premium a spin today, and as always, we’d love to hear what you think of it!



What Google Trends Tells Us About Brexit, People & Politics

Unless you are completely uninterested in news or have been living under a rock for the past 12 months, you will know that the people of the UK have recently voted in the European Union referendum and have chosen to leave the EU. This news has sent economic shockwaves across the world, driven the pound to its lowest level in decades, divided the country, and caused national panic.

Let me be clear: this article will not go into the debate itself, nor will it try to predict what will happen over the next few months. Instead, it builds on a story The Express covered, delving deeper to see whether it reveals anything about people and politics.

I will start by summarising the Express article. In short, it shows that just hours after the confirmed result of the EU referendum, there was a huge surge in UK searches for "what is the EU" and "which countries are in the EU". On top of this, there was also a surge of searches from London for "move to Gibraltar".

I will let you form your own opinion on what this could mean. The one search I want to focus on is "what is the EU". The Express article suggests the spike could come from people on both sides of the referendum researching to understand what will happen.

Looking at it from another direction, it could also mean that people were voting without fully knowing what they were voting for, relying on broadcast media messages rather than researching for themselves. This article looks into this further to see whether it really means something.

All Google Trends data shown here is taken from the region/country where the public vote occurred.

EU Referendum 2016 Trends

The Google Trends screenshot below shows the "what is the EU" spike, which happened at around 7.30am, around the time the official result was announced.

Let's see if any other EU-related searches spiked. Here is the search term "what is the EU referendum":


Interestingly, this search term did not peak at 7.30am, after the public announcement. It actually peaked at around 4.30am, when it was announced that there was no longer any realistic chance of a Remain win. This was the first unofficial confirmation that the UK was leaving the EU.

Another search, "what happens if we leave the EU", spiked as soon as that unofficial confirmation was announced.


Another point worth mentioning: "what happens if we stay in the EU" peaked around midnight, when it was announced that Remain was likely to win. You can see a trend appearing of people searching for information after the events have occurred.


Finally, "what is brexit" peaked at 7.30am, once official confirmation was given.

I know Google Trends is not the most precise source, since it gives no absolute numbers, but it has produced one statement worth investigating across other public votes. That statement is:

People in general tend to research the implications of a political vote after the announcement is made.

Let's find out if this is true. To do this, I am going to compare Google Trends data across other large public votes to see if there is a similar trend.
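
If you want to pull this kind of data yourself rather than eyeballing screenshots, the unofficial pytrends library can query Google Trends from Python. A rough sketch, with the caveat that both the library and Google's endpoints change over time:

    # Pull UK interest-over-time for referendum queries via pytrends
    # (pip install pytrends; it's an unofficial Google Trends client).
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-GB")
    pytrends.build_payload(
        ["what is the EU", "what is brexit"],  # up to five terms per payload
        timeframe="2016-06-20 2016-06-27",
        geo="GB",  # restrict to searches made in the UK
    )

    interest = pytrends.interest_over_time()            # pandas DataFrame by date
    print(interest.drop(columns="isPartial").idxmax())  # when each term peaked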

UK General Election 2015

Polling Day: May 7th 2015

Let's look at the general election in 2015. To keep this as simple as possible, I have selected a few common searches from around that time, so we can see when they peaked and whether there are common trends across public votes. The search terms are:

  • What is the general election
  • Who should I vote for
  • What is *party*
  • Who is *party leader*

The first two searches should show spikes around the date people gathered information about the public vote and who to vote for. The final two show when people started searching for specifics about a party or its leader.

For this analysis I am going to look at May 2015. Polling day for the election was May 7th, and the official result was announced on May 8th.

Let’s look into each search.

Search term: What is the general election?


Because we are going back more than a few days, Google Trends unfortunately does not provide hourly data; we only have daily figures. Still, you can see that searches for "what is the general election" spiked on the day of voting, which makes sense.

Search term: Who should I vote for


Again as expected, searches spiked on polling day as people looked up who to vote for.

Search term: Who is David Cameron


Now we are looking at the questions around the party leaders. Interestingly, searches spiked the day after voting, when the announcement was made. This again shows people researching the finer details of their votes after confirmation.

I will accept that this could be a spike in people searching for David Cameron simply because he won the general election and was voted into power. So let's look at search trends for his main opposition.

Search term: Who is Ed Miliband


Search term: Who is Nigel Farage


As you can see, for both the Labour party leader and the UKIP party leader, searches spiked on the 8th, the day after voting.

Finally, we will look at searches specifically about the parties themselves.

Search term: What is conservative


Search term: What is UKIP


Search term: What is labour


"What is conservative" spiked on May 8th, the day after voting. However, "what is UKIP" and "what is labour" actually spiked on May 7th. Searches for the Conservatives spiked after they won the election, when people looked them up.

Summary of Comparison

To summarise, the search trends don't quite match the EU referendum. More people searched "what is the EU" AFTER polling day, whereas in the election people searched "what is the general election" ON polling day, to help make their decisions. People were also searching for information about specific parties on voting day.

However, people still searched for information about the party leaders and their parties after the election. This supports the statement that people look for the full details after voting day.

Let’s look at another public vote further afield to compare.

The US Election 2012

To give a good comparison, let's look at the US presidential election in 2012.

Election day: November 6th 2012

Common searches:

  • What is the US election?
  • Who should I vote for?
  • What is *party
  • Who is *party leader

Search term: What is the US Election?


Interestingly, searches for "what is the US election" spiked the day after voting closed. This is similar to what happened in the EU referendum.

Search term: Who should I vote for


As expected, the search term "who should I vote for" spiked on the day of the election.

Search term: What is republican


Search term: What is democratic


Looking at searches around the political parties themselves, people are again searching the day after voting.

Search term: Who is Barack Obama


Search term: Who is Mitt Romney


Looking at searches around the party leaders, we can again see spikes after election day for both the winner and the loser.

What does this all mean?

I wrote this article because of the recent EU referendum and the reported surge of people searching "what is the EU" AFTER the vote had taken place. This led me to see whether Google Trends could reveal a common pattern in people's behaviour around voting. Whilst this is by no means a fully researched study, it does start to suggest some things about how people vote.

People may be searching for information about their chosen party after they have voted

The data shown here does show that searches around the parties themselves and their leaders spike after voting has taken place. This could simply be people looking for information about the event. However, it could also suggest that people vote first and then research in more detail once they hear the result. Combine this with recent coverage of the UK's EU referendum, where some people who voted to leave the EU now regret their decision because of its effects, and it starts to support the idea that people need to be better educated on how important their vote is, and should really do their research.

As I said, Google Trends is not a conclusive dataset, since Google is very selective about how much data it gives you. However, I do think party leaders and their marketing teams should look at this. It could imply that a lot of people don't research enough and don't take the decision they vote for seriously. A lot of untrue information is paraded across all parties in elections and referendums. As a world, we need to find a way of ensuring the population can easily find the factual information they need to make a decision. There are charities, such as Full Fact, that offer unbiased information, but I think the world needs more. There needs to be a neutral body that sits alongside the campaigning parties and ensures they do not get carried away with the information they provide. Hopefully this would let the public research properly before voting day, and prevent any regret.

Post from Adam Mason


The Functional Content Masterplan – Own the Knowledge Graph Goldrush with this On-Page Plan

Posted by SimonPenson

[Estimated read time: 17 minutes]

On-page content is certainly not one of the sexier topics in digital marketing.

Lost in the flashing lights of “cool digital marketing trends” and things to be seen talking about, it’s become the poor relative of many a hyped “game-changer.”

I’m here to argue that, in being distracted by the topics that may be more “cutting-edge,” we’re leaving our most valuable assets unloved and at the mercy of underperformance.

This post is designed not only to make it clear what good on-page content looks like, but also how you should go about prioritizing which pages to tackle first based on commercial opportunity, creating truly customer-focused on-page experiences.

What is “static” or “functional” content?

So how am I defining static/functional content, and why is it so important to nurture in 2016? The answer lies in the recent refocus on audience-centric marketing and Google’s development of the Knowledge Graph.

Whether you call your on-page content “functional,” “static,” or simply “on-page” content, they’re all flavors of the same thing: content that sits on key landing pages. These may be category pages or other key conversion pages. The text is designed to help Google understand the relevance of the page and/or help customers with their buying decisions.

Functional content has other uses as well, but today we’re focusing on its use as a customer-focused conversion enhancement and discovery tactic.

And while several years ago it would have been produced simply to aid a relatively immature Google to “find” and “understand,” the focus is now squarely back on creating valuable user experiences for your targeted audience.

Google's ability to better understand and measure what "quality content" really looks like — alongside an overall increase in web usage and ease-of-use expectations among audiences — has made key page investment critical to success on many levels.

We should now be looking to craft on-page content to improve conversion, search visibility, user experience, and relevance — and yes, even as a technique to steal Knowledge Graph real estate.

The question, however, is “how do I even begin to tackle that mountain?”

Auditing what you have

For those with large sites, the task of even beginning to understand where to start with your static content improvement program can be daunting. Even if you have a small site of a couple of hundred pages, the thought of writing content for all of them can be enough to put you off even starting.

As with any project, the key is gathering the data to inform your decision-making before simply “starting.” That’s where my latest process can help.

Introducing COAT: The Content Optimization and Auditing Tool

To help the process along, we’ve been using a tool internally for months — for the first time today, there’s now a version that anyone can use.

This link will take you to the new Content Optimization and Auditing Tool (COAT), and below I'll walk through exactly how we use it to understand the current site and prioritize areas for content improvement. I'll also walk you through the manual step-by-step process, should you wish to take the scenic route.

The manual process

If you enjoy taking the long road — maybe you feel an extra sense of achievement in doing so — then let’s take a look at how to pull the data together to make data-informed decisions around your functional content.

As with any solid piece of analysis, we begin with an empty Excel doc and, in this case, a list of keywords you feel are relevant to and important for your business and site.

In this example, we’ll take a couple of keywords and our own site:

Keywords:

Content Marketing Agency
Digital PR

Site:

www.zazzlemedia.co.uk

Running this process manually is labor-intensive (hence the need to automate it!), and adding dozens more keywords creates a lot of work for little extra insight, but by focusing on a couple you can see how the fuller picture is built.

Stage one

We start by adding our keywords to our spreadsheet alongside a capture of the search volume for those terms and the actual URL ranking, as shown below (NOTE: all data is for google.co.uk).

Next we add in ranking position…

We then look at the page itself and give each of the key on-page elements a score based on our understanding of best practice. If you want to be really smart, you can score the most important factors out of 20 and the lesser ones out of 10.

In building our COAT tool to enable this to be carried out at scale across sites with thousands of pages, we made a list of many of the key on-page factors we know to affect rank and indeed conversion. They include:

  • URL optimization
  • Title tag optimization and clickability
  • Meta description optimization and clickability
  • H1, H2, and H3 optimization and clickability (as individual scores)
  • Occurrences of keyword phrases within body copy
  • Word count
  • Keyword density
  • Readability (as measured by the Flesch-Kincaid readability score)

This is far from an exhaustive list, but it’s a great place to start your analysis. The example below shows an element of this scored:

Once you have calculated a score for every key factor, your job is then to turn this into an average, weighted score out of 100. In this case, you can see I've done this across the listed factors and have a final score for each keyword and URL:
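
If spreadsheets aren't your thing, the same stage-one scoring can be sketched in a few lines of Python. The scoring rules and weights here are illustrative assumptions, not the exact rules we use:

    import re

    def score_title(title, keyword):
        """Out of 20: full marks if the keyword leads the title tag."""
        t, k = title.lower(), keyword.lower()
        if t.startswith(k):
            return 20
        return 10 if k in t else 0

    def keyword_density(body, keyword):
        """Percentage of body words accounted for by exact keyword matches."""
        words = re.findall(r"\w+", body.lower())
        kw = keyword.lower().split()
        hits = sum(words[i:i + len(kw)] == kw
                   for i in range(len(words) - len(kw) + 1))
        return 100 * hits * len(kw) / max(len(words), 1)

    def weighted_total(scores, maxima):
        """Turn per-factor scores into an average, weighted score out of 100."""
        return round(100 * sum(scores.values()) / sum(maxima.values()), 1)

    scores = {"title": score_title("Content Marketing Agency | Zazzle Media",
                                   "content marketing agency"),
              "h1": 10}                    # other factors scored the same way
    maxima = {"title": 20, "h1": 10}
    print(weighted_total(scores, maxima))  # 100.0 for this toy example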

Stage two

Once you have scores for a larger number of pages and keywords, it's possible to begin organizing your data in a way that helps prioritize action.

You can do this simply enough by using filters and organizing the table by any number of combinations.

You may want to sort by highest search volume and then by those pages ranking between, say, 5th and 10th position.

Doing this enables you to focus on the pages that may yield the most potential traffic increase from Google, if that is indeed your aim.

Working this way makes it much easier to deliver the largest positive net impact fastest.
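
If your scored sheet lives in a CSV rather than Excel, the same filter-and-sort takes a couple of lines of pandas (the column names here are assumptions):

    import pandas as pd

    df = pd.read_csv("keyword_scores.csv")  # assumed columns: keyword, volume, rank, score

    # "Striking distance" view: high-volume terms ranking 5th-10th,
    # sorted so the biggest traffic opportunities come first.
    striking = df[df["rank"].between(5, 10)].sort_values("volume", ascending=False)
    print(striking.head(10))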

Doing it at scale

Of course, if you have a large site with tens (or even hundreds) of thousands of pages, the manual option is almost impossible — which is why we scratched our heads and looked for a more effective option. The result was the creation of our Content Optimization and Auditing Tool. Here's how you can make use of it to paint a fuller picture of your entire site.

Here’s how it works

When it comes to using COAT, you follow a basic process:

  • Head over to the tool.
  • Enter your domain, or a sub-directory of the site if you'd like to focus on a particular section.
  • Add the keywords you want to analyze in a comma-separated list.
  • Click "Get Report," making sure you've chosen the right country.

Next comes the smart bit: adding target keywords to the system before it crawls enables the algorithm to cross-reference all pages against those phrases and score each combination against a list of critical attributes you'd expect the "perfect page" to have.

Let’s take an example:

You run a site that sells laptops. You enter a URL for a specific model, such as /apple-15in-macbook/, and a bunch of related keywords, such as “Apple 15-inch MacBook” and “Apple MacBook Pro.”

The system works out the best page for those terms and measures the existing content against a large number of known ranking signals and measures, covering everything from title tags and H1s to readability tests such as the Flesch-Kincaid system.

This outputs a spreadsheet that scores each URL or even categories of URLs (to allow you to see how well-optimized the site is generally for a specific area of business, such as Apple laptops), enabling you to sort the data, discover the pages most in need of improvement, and identify where content gaps may exist.

In a nutshell, it’ll provide:

  • What the most relevant target page for each keyword is
  • How well-optimized individual pages are for their target keywords
  • Where content gaps exist within the site’s functional content

It also presents the top-level data in an actionable way. An example of the report landing page can be seen below (raw CSV downloads are also available — more on that in a moment).

You can see the overall page score and simple ways to improve it. This is for our “Digital PR” keyword:

The output

As we’ve already covered in the manual process example, in addition to pulling the “content quality scores” for each URL, you can also take the data to the next level by adding in other data sources to the mix.

The standard CSV download includes data such as keyword, URL, and scores for the key elements (such as H1, meta, canonical use and static content quality).

This level of detail makes it possible to create a priority order for fixes based on lowest-scoring pages easily enough, but there are ways you can supercharge this process even more.

The first thing to do is run a simple rankings check using your favorite rank tracker for those keywords and add them into a new column in your CSV. It’ll look a little like this (I’ve added some basic styling for clarity):

I also try to group keywords by adding a third column using a handful of grouped terms. In this example, you can see I’m grouping car model keywords with brand terms manually.

Below, you’ll see how we can then group these terms together in an averaged cluster table to give us a better understanding of where the keyword volume might be from a car brand perspective. I’ve blurred the keyword grouping column here to protect existing client strategy data.

As you can see from the snapshot above, we now have a spreadsheet with keyword, keyword group, search volume, URL, rank, and the overall content score pulled in from the base Excel sheet we have worked through. From this, we can do some clever chart visualization to help us understand the data.
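
For anyone working outside Excel, here's a sketch of the same merge-and-cluster step in pandas; the file and column names are assumptions standing in for your own exports:

    import pandas as pd

    coat = pd.read_csv("coat_export.csv")       # keyword, url, content_score
    ranks = pd.read_csv("rank_tracker.csv")     # keyword, rank, volume
    groups = pd.read_csv("keyword_groups.csv")  # keyword, group (e.g. car brand)

    merged = coat.merge(ranks, on="keyword").merge(groups, on="keyword")

    # Averaged cluster table: mean optimization score and mean rank per group,
    # plus total search volume to size each opportunity.
    clusters = merged.groupby("group").agg(
        avg_score=("content_score", "mean"),
        avg_rank=("rank", "mean"),
        total_volume=("volume", "sum"),
    ).sort_values("total_volume", ascending=False)
    print(clusters)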

Visualizing the numbers

To really understand where the opportunity lies and to take this process past a simple I’ll-work-on-the-worst-pages-first approach, we need to bring it to life.

This means turning our table into a chart. We’ll utilize the chart functionality within Excel itself.

Here’s an example of the corresponding chart for the table shown above, showing performance by category and ranking correlation. We’re using dummy data here, but you can look at the overall optimization score for each car brand section alongside how well they rank (the purple line is average rank for that category):

If we focus on the chart above, we can begin to see a pattern between those categories that are better optimized and generally have better rankings. Correlation does not always equal causation, as we know, but it’s useful information.

Take the very first column, or the Subaru category. We can see that it’s one of the better-optimized categories (at 49%) and average rank is at 34.1. Now, these are hardly record-breaking positions, but it does point towards the value of well-worked static pages.
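
You don't have to build the chart in Excel, either. Here's a matplotlib sketch of the same score-bars-plus-rank-line view, seeded with dummy data in the shape of the clusters table from the previous snippet (only the Subaru figures come from the example above):

    import matplotlib.pyplot as plt
    import pandas as pd

    # One row per keyword group, as produced by the previous snippet.
    clusters = pd.DataFrame(
        {"avg_score": [49, 38, 44], "avg_rank": [34.1, 51.0, 42.3]},
        index=["Subaru", "Ford", "Toyota"],  # Ford/Toyota rows are dummy data
    )

    fig, ax_score = plt.subplots(figsize=(10, 5))
    ax_rank = ax_score.twinx()  # second y-axis for average rank

    ax_score.bar(clusters.index, clusters["avg_score"], color="steelblue")
    ax_rank.plot(clusters.index, clusters["avg_rank"], color="purple", marker="o")
    ax_rank.invert_yaxis()  # rank 1 is best, so show it at the top

    ax_score.set_ylabel("Optimization score (%)")
    ax_rank.set_ylabel("Average rank")
    ax_score.tick_params(axis="x", rotation=45)
    plt.tight_layout()
    plt.show()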

Making the categories as granular as possible can be very valuable here, as you can quickly build up a focused picture of where to put your effort to move the needle quickly. The process for doing so is an entirely subjective one, often based on your knowledge of your industry or your site information architecture.

Add keyword volume data into the mix and you know exactly where to build your static content creation to-do list.

Adding in context

Like any data set, however, it requires a level of benchmarking and context to give you the fullest picture possible before you commit time and effort to the content improvement process.

It’s for this reason that I always look to run the same process on key competitors, too. An example of the resulting comparison charts can be seen below.

The process is relatively straightforward: take an average of all the individual URL content scores, which will give you a “whole domain” score. Add competitors by repeating the process for their domain.

You can take a more granular view manually by following the same process for the grouped keywords and tabulating the result. Below, we can see how our domain sizes up against those same two competitors for all nine of our example keyword groups, such as the car brands example we looked at earlier.
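
The benchmarking arithmetic itself is just an average of averages. A quick sketch, again assuming one COAT-style export per domain:

    import pandas as pd

    domains = ["oursite", "competitor_a", "competitor_b"]
    domain_scores = {
        d: pd.read_csv(f"coat_{d}.csv")["content_score"].mean()  # whole-domain score
        for d in domains
    }
    print(domain_scores)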

With that benchmark data in place, you can move on to the proactive improvement part of the process.

The perfect page structure

Having identified your priority pages, the next step is to ensure you edit (or create them) in the right way to maximize impact.

Whereas a few years ago it was all about creating a few paragraphs almost solely for the sake of helping Google understand the page, now we MUST be focused on usability and improving the experience for the right visitor.

This means adding value to the page. To do that, you need to stand back and really focus in on the visitor: how they get to the page and what they expect from it.

This will almost always involve what I call “making the visitor smarter”: creating content that ensures they make better and more informed buying decisions.

To do that requires a structured approach to delivering key information succinctly and in a way that enhances — rather than hinders — the user journey.

The best way of working through what that should look like is to share a few examples of those doing it well:

1. Tredz Top 5 Reviews

Tredz is a UK cycling ecommerce business. They do a great job of understanding what their audience is looking for and ensuring they’re set up to make them smarter. The “Top 5” pages are certainly not classic landing pages, but they’re brilliant examples of how you can sell and add value at the same time.

Below is the page for the “Top 5 hybrids for under £500.” You can clearly see how the URL (http://www.tredz.co.uk/top-5-hybrids-under-500), meta, H tags, and body copy all support this focus and are consistently aligned:

2. Read it for me

This is a really cool business concept and they also do great landing pages. You get three clear reasons to try them out — presented clearly and utilizing several different content types — all in one package.

3. On Stride Financial

Finance may not be where you’d expect to see amazing landing pages, but this is a great example. Not only is it an easy-to-use experience, it answers all the user’s key questions succinctly, starting with “What is an installment loan?” It’s also structured in a way to capture Knowledge Graph opportunity — something we’ll come to shortly.

Outside of examples like these and supporting content, you should be aiming to create impactful headlines, testimonials (where appropriate), directional cues (so it's clear where to "go next"), and high-quality images to reflect the quality of your product or services.


Claiming Knowledge Graph

There is, of course, one final reason to work hard on your static pages. That reason? To claim a massively important piece of digital real estate: Google Featured Snippets.

Snippets form part of the wider Knowledge Graph, the tangible visualization of Google’s semantic search knowledge base that’s designed to better understand the associations and entities behind words, phrases, and descriptions of things.

The Knowledge Graph comes in a multitude of formats, but one of the most valuable (and attainable from a commercial perspective) is the Featured Snippet, which sits at the top of the organic SERP. An example can be seen below from a search for “How do I register to vote” in google.co.uk:

In recent months, Zazzle Media has done a lot of work on landing page design to capture featured snippets with some interesting findings, most notably the level of extra traffic such a position can achieve.

Having now measured dozens of these snippets, we see an average of 15–20% extra traffic from them versus a traditional position 1. That’s a definite bonus, and makes the task of claiming them extremely worthwhile.

You don’t have to be first

The best news? You don't even have to be in first position to be considered for a snippet. Our own research shows that almost 75% of the examples we track have been claimed by pages ranked between 2nd and 10th position. It's not yet robust enough for us to formalize a full report, but early indications across more than 900 claimed snippets (heavily weighted to the finance sector at present) support these findings.

Similar research by search data specialists STAT supports this theory, revealing that objective words are more likely to trigger snippets. General question and definition words (like "does," "cause," and "definition") as well as financial words (like "salary," "average," and "cost") are likely to trigger a featured snippet. Conversely, the word "best" triggered zero featured snippets in over 20,000 instances.

This suggests that writing in a factual way is more likely to help you claim featured results.

Measuring what you already have

Before you go into this two-footed, you must first audit what you may (or may not) already have. If you run a larger site, you may already have claimed a few snippets by chance, and as with any major project, it's important to benchmark before you begin.

Luckily, there are a handful of tools out there to help you discover what you already rank for. My favorite is SEMrush.

The paid-for tool makes it easy to find out if you rank for any featured snippets already. I’d suggest using it to benchmark and then measure the effect of any optimization and content reworking you do as a result of the auditing process.

Claiming Featured Snippets

Claiming your own Featured Snippet then requires a focus on content structure and on answering key questions in a logical order. This also means paying close attention to on-page HTML structure to ensure that Google can easily and cleanly pick out specific answers.

Let’s look at a few examples showing that Google can pick up different types of content for different types of questions.


1. The list

One of the most prevalent examples of Featured Snippets is the list.

As you can see, Media Temple has claimed this incredibly visual piece of real estate simply by creating an article with a well-structured, step-by-step guide to answer the question:

“How do I set up an email account on my iPhone?”

If we look at how the page is formatted, we can see that the URL matches the search almost exactly, while the H1 tag serves to reinforce the relevance still further.

As we scroll down we find a user-friendly approach to the content, with short sentences and paragraphs broken up succinctly into sections.

This allows Google to quickly understand relevance and extract the most useful information to present in search; in this case, the step-by-step how-to process to complete the task.

Here are the first few paragraphs of the article, highlighting key structural elements, followed by the list itself that's captured in the above Featured Snippet.
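
If you want to audit your own pages for this kind of structure at scale, a rough readiness check with requests and BeautifulSoup could look like the sketch below. The heuristics (a question-style H1 paired with an ordered list) are my own simplifications, not Google's criteria, and the URL is hypothetical.

    import requests
    from bs4 import BeautifulSoup

    def list_snippet_ready(url):
        """Rough check: does the page pair a question-style H1 with an ordered list?"""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        h1 = soup.find("h1")
        ol = soup.find("ol")
        steps = [li.get_text(strip=True) for li in ol.find_all("li")] if ol else []
        question_h1 = bool(h1) and h1.get_text().strip().lower().startswith(
            ("how", "what", "why", "when", "where"))
        return {"question_h1": question_h1,
                "ordered_list": bool(steps),
                "steps": len(steps)}

    # Hypothetical URL for illustration:
    print(list_snippet_ready("https://example.com/how-to-set-up-email-on-iphone/"))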


2. The table

Google LOVES to present tables; clearly there’s something about the logical nature of how the data is presented that resonates with its team of left-brained engineers!

In the example below, we see a site listing countries by size. Historically, this page may well not have ranked so highly (it isn't always the page in position one that claims the snippet). But because it has structured the information so well, Geohive will be enjoying a sizable spike in traffic to the page.

The page itself looks like this — clear, concise and well-structured:


3. The definition

The final example is the description, or definition snippet; it’s possibly the hardest to claim consistently.

It’s difficult for two key reasons:

  • There is lots of competition for the space, with many pages answering the search query in prose format.
  • It requires a focus on HTML structure and brilliantly crafted content to win.

In the example below, we can see a very good example of how you should be structuring content pages.

We start with a perfect URL (/what-is-a-mortgage-broker/), and this follows through to the H1 ("What is a Mortgage Broker"). The author then cleverly uses subheadings to extend the rest of the post into a thorough piece on the subject area. The subheadings cover the key How, What, Where, and When areas of focus that any good journalism tutor will lecture you on. Examples include:

  • So how does this whole mortgage broker thing work?
  • Mortgage brokers can shop the rate for you
  • Mortgage brokers are your loan guide
  • Mortgage broker FAQ

The result is a piece that leaves no stone unturned. Because of this, it's been shared plenty of times — a surefire signal that the article is viewed positively by readers.

Featured Snippet Cheatsheet

Not being one to leave you alone to figure this out though, I have created this simple Featured Snippet Cheatsheet, designed to take the guesswork out of creating pages worthy of being selected for the Knowledge Graph.

Do it today!

Thanks for making it this far. My one hope is for you to go off and put this plan into action for your own site. Doing so will quickly transform your approach to both landing pages and to your ongoing content creation plan (but that’s a post for another day!).

And if you do have a go, remember to use the free COAT tool and guides associated with this article to make the process as simple as possible.

Content Optimization and Auditing Tool: Click to access



Businesses Can Now Insure Against the Risk of Social Media

Warren Buffett said it well:

‘It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you’ll do things differently’.

If we look at the world in which businesses now operate, that's realistically the time it can take for a tweet, post, news story, or video to go viral.

In previous posts on State of Digital I’ve talked about the risks associated with the use of review based websites which might impact your business or even the dangers associated with live streaming apps and how Online Reputation training for personal, professional and organisational use is key.

It's now interesting to see that AIG, the global insurance corporation (88 million customers in 130 countries), is underwriting a new online reputation insurance product for businesses, aimed particularly at SMEs. The new product, which will be available to Irish businesses, will allow them to avail of insurance protection against the risks of social media and other online crises.

It's not big news that 'negative sentiment towards a business on social media can have a serious and long lasting effect on its bottom line', as Irish gift voucher supplier One4All found in 2013. With more and more businesses using social media platforms such as Facebook and Twitter (around 64%, according to Eurostat), it's no surprise that a recent survey by the Marketing Institute of Ireland and Edelman reported that one in 10 Irish businesses had had a social media crisis in the last year, and a fifth said they were unprepared to deal with it.

The survey also highlighted that, despite the increasing number of online issues, 82% of Irish marketers surveyed 'had not undertaken any social media crisis training in the last year' and 40% 'don't have any social media guidelines for employees' – an area covered in my previous post.

As I predicted in 2013, the importance of video cannot be overstated, and the same goes for 'real-time' marketing. It's heartening to see that 97% of Irish businesses intend to use video more in 2016, and that 60% are already engaging in real-time marketing.

As businesses, we can get caught up in pushing out as much content via social media channels as we possibly can. We can forget that social media works best as a two-way street, neglect to keep an eye on public reactions to our content, and fail to monitor the interactions our audiences are having with the messaging we create. With the growth of social media platforms, monitoring and staff training are two vital areas for businesses and brands.

Monitoring


It's happening already – maybe you just don't know it. Someone is saying something damaging about your business or your brand, and a customer who has had what they perceive as a negative experience can quickly turn into an enemy. By monitoring your business or brand's reputation on the web and social media, you should be able to identify and address any areas of concern and take appropriate steps. There are plenty of tools available – many of them free – to help you monitor your brand on social media:

  • Mention
  • Talk Walker Alerts
  • Social Mention
  • Hootsuite

On a positive note, monitoring social media and web results also helps you identify influencers who are fans of your business or brand and may be able to help you reach a larger audience – i.e. your tribe.

Staff Training


In short, your business needs to invest in it.

Everyone within a business or organisation needs to be aware of their personal, professional and organisational Online Reputation. Ultimately, what staff and stakeholders post and share online can and will impact the reputation of the brand or organisation they work for.

‘A business’s reputation is its greatest yet most intangible asset… One comment or video clip distributed through social media can destroy everything’

Declan Kavanagh MD – RiskEye

While having a robust and regularly updated social media policy in place is important, staff need to read it, understand it, and then feel responsible and accountable for following it. Procedures should be in place with your HR department to implement the policy throughout the organisation. It only takes one uninformed member of staff to post one ill-thought-out comment. Creating a training programme around this huge topic is recommended; it can't just be a tick-box exercise.

Here are my top 3 tips for an organisational training programme on social media:

  1. Assessing the level of information staff have on the benefits and risks, and their practical knowledge, is a good place to start. You may want to split training to accommodate non-starters and beginners separately from the more informed social media users, who could be very valuable to your organisation. This ensures you can pitch the sessions appropriately to engage your staff audiences, minimising confusion and/or boredom. The former, you hope, will become informed and competent; the latter's interest in social media can be harnessed to influence your market and build your tribe with relevant, useful, and creative positive content. The programme should include various aspects of digital intelligence, such as giving a good account of yourself personally and professionally online, reputation, privacy, basic online legislation, and brand voice consistency.
  2. The world of social media moves fast. A continuous flow of information may be necessary (platform privacy updates and new legislation are important for all), and ongoing training is essential. The depth and frequency depend on how integral individual staff are to your organisation's public profile and the impact their personal profiles could have on its reputation. Frequently educating key staff, or those who show a creative flair online, is money well spent.
  3. While getting experts in to motivate and educate is preferable, designing your own programme can also be effective. Use real-life examples of online faux pas, plenty of relevant imagery and interesting videos, and reinforce strong positive messaging. Above all, make it memorable and ensure the delivery is credible.

Investing in training that motivates, energises, and informs your staff to dig deeper into the benefits of connecting online, to build engagement, trust, and influence for the organisation, and to gain a deeper understanding of the internet and social media's limitless possibilities is key in today's business world. As in all areas of education on online risk, promoting positive and responsible use of social media within the workplace should organically minimise misuse and risk, and ultimately grow your community.

 

 

Post from Wayne Denner


The Balanced Digital Scorecard: A Simpler Way to Evaluate Prospects

Posted by EmilySmith

[Estimated read time: 10 minutes]

As anyone who’s contributed to business development at an agency knows, it can be challenging to establish exactly what a given prospect needs. What projects, services, or campaigns would actually move the needle for this organization? While some clients come to an agency with specific requests, others are looking for guidance — help establishing where to focus resources. This can be especially difficult, as answering these questions often requires large amounts of information to be analyzed in a small period of time.

To address the challenge of evaluating prospective clients and prioritizing proposed work, we’ve developed the Balanced Digital Scorecard framework. This post is the first in a two-part series. Today, we’ll look at:

  • Why we developed this framework,
  • Where the concept came from, and
  • Specific areas to review when evaluating prospects

Part two will cover how to use the inputs from the evaluation process to prioritize proposed work — stay tuned!

Evaluating potential clients

Working with new clients, establishing what strategies will be most impactful to their goals… this is what makes working at an agency awesome. But it can also be some of the most challenging work. Contributing to business development and pitching prospects tends to amplify this with time constraints and limited access to internal data. While some clients have a clear idea of the work they want help with, this doesn’t always equal the most impactful work from a consultant’s standpoint. Balancing these needs and wants takes experience and skill, but can be made easier with the right framework.

The use of a framework in this setting helps narrow down the questions you need to answer and the areas to investigate. This is crucial to working smarter, not harder — words which we at Distilled take very seriously. Often when putting together proposals and pitches, consultants must quickly establish the past and present status of a site from many different perspectives.

  • What type of business is this and what are their overall goals?
  • What purpose does the site serve and how does it align with these goals?
  • What campaigns have they run and were they successful?
  • What does the internal team look like and how efficiently can they get things done?
  • What is the experience of the user when they arrive on the site?

The list goes on and on, often becoming a vast amount of information that, if not digested and organized, can make putting the right pitch together burdensome.

To help our consultants understand both what questions to ask and how they fit together, we’ve adapted the Balanced Scorecard framework to meet our needs. But before I talk more about our version, I want to briefly touch on the original framework to make sure we’re all on the same page.

The Balanced Scorecard

For anyone not familiar with this concept, the Balanced Scorecard was created by Robert Kaplan and David Norton in 1992. First published in the Harvard Business Review, Kaplan and Norton set out to create a management system, as opposed to a measurement system (which was more common at that time).

Kaplan and Norton argued that “the traditional financial performance measures worked well for the industrial era, but they are out of step with the skills and competencies companies are trying to master today.” They felt the information age would require a different approach, one that guided and evaluated the journey companies undertook. This would allow them to better create “future value through investment in customers, suppliers, employees, processes, technology, and innovation.”

The concept suggests that businesses be viewed through four distinct perspectives:

  • Innovation and learning – Can we continue to improve and create value?
  • Internal business – What must we excel at?
  • Customer – How do customers see us?
  • Financial – How do we look to shareholders?

Narrowing the focus to these four perspectives reduces information overload. “Companies rarely suffer from having too few measures,” wrote Kaplan and Norton. “More commonly, they keep adding new measures whenever an employee or a consultant makes a worthwhile suggestion.” By limiting the perspectives and associated measurements, management is forced to focus on only the most critical areas of the business.

The image below shows the relationships between the perspectives:

[Diagram: the four Balanced Scorecard perspectives and their guiding questions]

And now, with it filled out as an example:

[Diagram: an example completed Balanced Scorecard]

As you can see, this gives the company clear goals and corresponding measurements.

Kaplan and Norton found that companies solely driven by financial goals and departments were unable to implement the scorecard, because it required all teams and departments to work toward central visions — which often weren’t financial goals.

“The balanced scorecard, on the other hand, is well suited to the kind of organization many companies are trying to become… put[ting] strategy and vision, not control, at the center,” wrote Kaplan and Norton. This would inevitably bring teams together, helping management understand the connectivity within the organization. Ultimately, they felt that “this understanding can help managers transcend traditional notions about functional barriers and ultimately lead to improved decision-making and problem-solving.”

At this point, you’re probably wondering why this framework matters to a digital marketing consultant. While it’s more directly suited for evaluating companies from the inside, so much of this approach is really about breaking down the evaluation process into meaningful metrics with forward-looking goals. And this happens to be very similar to evaluating prospects.

Our digital version

As I mentioned before, evaluating prospective clients can be a very challenging task. It’s crucial to limit the areas of investigation during this process to avoid getting lost in the weeds, instead focusing only on the most critical data points.

Since our framework is built for evaluating clients in the digital world, we have appropriately named it the Balanced Digital Scorecard. Our scorecard also has main perspectives through which to view the client:

  1. Platform – Does their platform support publishing, discovery, and discoverability from a technical standpoint?
  2. Content – Are they publishing an appropriate blend of effective, informative, entertaining, and compelling content?
  3. Audience – Are they building visibility through owned, earned, and paid media?
  4. Conversions – Do they have a deep understanding of the needs of the market, and are they creating assets, resources, and journeys that drive profitable customer action?
  5. Measurement – Are they measuring all relevant aspects of their approach and their prospects’ activities to enable testing, improvement, and appropriate investment?

These perspectives make up the five areas of analysis to work through when evaluating most prospective clients.
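
One way to keep the evaluation honest is to treat the scorecard as a simple data structure: score each perspective, note the evidence, and let the lowest scores surface first. A minimal sketch, with an assumed 1-5 scale and invented example notes:

    from dataclasses import dataclass

    @dataclass
    class Perspective:
        name: str
        score: int       # assumed scale: 1 (major weakness) to 5 (strength)
        notes: str = ""

    scorecard = [
        Perspective("Platform", 4, "solid CMS, but slow mobile pages"),
        Perspective("Content", 2, "thin category copy vs. competitors"),
        Perspective("Audience", 3, "good organic reach, weak email"),
        Perspective("Conversions", 2, "no clear micro-conversions"),
        Perspective("Measurement", 5, "analytics and dashboards in place"),
    ]

    # Lowest-scoring perspectives become the first candidates for proposed work.
    for p in sorted(scorecard, key=lambda p: p.score):
        print(f"{p.name}: {p.score} ({p.notes})")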

1. Platform

Most consultants or SEO experts have a good understanding of the technical elements to review in a standard site audit. A great list of these can be found on our Technical Audit Checklist, created by my fellow Distiller, Ben Estes. The goal of reviewing these factors is of course to “ensure site implementation won’t hurt rankings” says Ben. While you should definitely evaluate these elements (at a high level), there is more to look into when using this framework.

Evaluating a prospect’s platform does include standard technical SEO factors but also more internal questions, like:

  • How effective and/or differentiated is their CMS?
  • How easy is it for them to publish content?
  • How differentiated are their template levels?
  • What elements are under the control of each team?

Additionally, you should look into areas like social sharing, overall mobile-friendliness, and site speed.

If you’re thinking this seems like quite the undertaking because technical audits take time and some prospects won’t be open with platform constraints, you’re right (to an extent). Take a high-level approach and look for massive weaknesses instead of every single limitation. This will give you enough information to understand where to prioritize this perspective in the pitch.

2. Content

Similar to the technical section, evaluating content looks similar to a lightweight version of a full content audit. What content do they have, which pieces are awesome and what is missing? Also look to competitors to understand who is creating content in the space and what level the bar is set at.

Beyond looking at these elements through a search lens, aim to understand what content is being shared and why. Is this taking place largely on social channels, or are publications picking these pieces up? Evaluating content on multiple levels helps to understand what they’ve created in the past and their audience’s response to it.

3. Audience

Looking into a prospect's audience can be challenging, depending on how much access they grant you during the pitch process. If you're able to get access to analytics, this task is much easier; without it, there are many tools you can leverage to get some of the same insights.

In this section, you’re looking at the traffic the site is receiving and from where. Are they building visibility through owned, earned, and paid media outlets? How effective are those efforts? Look at metrics like Search Visibility from SearchMetrics, social reach, and email stats.

A large amount of this research will depend on what information is available or accessible to you. As with previous perspectives, you’re just aiming to spot large weaknesses.

4. Conversion

Increased conversions are often a main goal stated by prospects, but without transparency from them, this can be very difficult to evaluate during a pitch. This means you're often left to speculate or use basic approaches. How difficult or simple is it to buy something, contact them, or complete a conversion in general? Are there good calls to action for micro-conversions, such as joining an email list? How different is the mobile experience of this process?

Look at the path to these conversions. Was there a clear funnel and did it make sense from a user’s perspective? Understanding the journey a user takes (which you can generally experience first-hand) can tell you a lot about expected conversion metrics.

Lastly, many companies’ financials are available to the public and offer a general idea of how the company is doing. If you can establish how much of their business takes place online, you can start to speculate about the success of their web presence.

5. Measurement

Evaluating a prospect’s measurement capabilities is (not surprisingly) vastly more accurate with analytics access. If you’re granted access, evaluate each platform not just for validity but also accessibility. Are there useful dashboards, management data, or other data sources that teams can use to monitor and make decisions?

Without access, you’re left to simply check and see the presence of analytics and if there is a data layer. While this doesn’t tell you much, you can often deduce from conversations how much data is a part of the internal team’s thought process. If people are monitoring, engaging, and interested in analytics data, changes and prioritization might be an easier undertaking.


Final thoughts

Working with prospective clients is something all agency consultants will have to do at some point in their career. This process is incredibly interesting — it challenges you to leverage a variety of skills and a range of knowledge to evaluate new clients and industries. It’s also a daunting task. Often your position outside the organization or unfamiliarity with a given industry can make it difficult to know where to start.

Frameworks like the original Balanced Scorecard created by Kaplan and Norton were designed to help a business evaluate itself from a more modern and holistic perspective. This approach turns the focus to future goals and action, not just evaluation of the past.

This notion is crucial for an agency that needs to establish the best path forward for prospective clients. We developed our own framework, the Balanced Digital Scorecard, to help our consultants do just that. By limiting the questions you’re looking to answer, you can work smarter and focus your attention on five perspectives to evaluate a given client. Once you’ve reviewed these, you’re able to identify which ones are lagging behind and prioritize proposed work accordingly.

Next time, we’ll cover the second part: how to use the Balanced Digital Scorecard to prioritize your work.

If you use a framework to evaluate prospects or have thoughts on the Balanced Digital Scorecard, I’d love to hear from you. I welcome any feedback and/or questions!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Tap Into The Twitterverse

Twitter can be such a powerful tool and community. When I started my career in digital marketing, Twitter created my first connections within the industry and, after a while, landed me my first agency job. I’m a daily Twitter user in love with the community. You can find most things on Twitter, from friends to your personal heroes. As a digital marketer, I find Twitter amazing because of one tiny little feature: the search functionality.

I guess that I don’t have to add any information about Twitter here. If this is the first time you’ve heard of it – welcome to the Internet, we’re glad to have you here.

What are people saying about your brand?

Brands must love Twitter search. With a simple search, you can find out what people are saying about a brand in real time. But it can be a bit clunky and needs some tweaking. We don’t all speak every language, so let me introduce you to the simple language filter: lang:en

Adding the language filter to a search makes all the difference. What are people saying about Mumford and Sons in Sweden? Just search “mumford and sons” lang:sv and tap into people’s opinions.

Holy cow, we’re segmenting tweets! How fun is that? But hold on, we’re not close to done yet. Everybody and their mum loves Mumford and Sons, which will cause our Twitter searches to pick up many retweets. We can filter those out with a simple -rt command. No retweets! Awesome.

Monitoring brand mentions should cover both the brand and the domain name (or names). For instance, I blog in Swedish at perpettersson.nu and in English at perpettersson.me. Using a simple OR command, I can set up a search that catches mentions of me or any of my domains. As simple as this: “per pettersson” OR perpettersson.nu OR perpettersson.me

Captain Obvious: You’re setting this up for your competitors right now, aren’t you? You should. There is nothing wrong with listening in on public conversations. You can even turn it up to stalker alert level 49 by using a FROM and TO command like this: from:per_p to:juliencoquet

So now you’re stalking me and Julien Coquet on Twitter. Expect smartness – or just pictures of food. I would put my money on food 99% of the times.

Positive and negative mentions of your brand

It’s all in how we construct our searches on Twitter; the query sets the limitations. Searching “mumford and sons” :( returns tweets with a sad emoticon. Sentiment search! Everybody high fives.

Twitter search for Mumford and Sons

We are all sad, Grass Cirby. We are all sad.

Captain Obvious: you could perform the opposite search – “mumford and sons” 🙂 for a positive Twitter search.

It’s all about location

You are in charge of PR for Mumford and Sons in Stockholm. Why not narrow your search with the help of location data (optional for users to activate) using a simple near:”Stockholm, Sweden” command?

Twitter search with location

Holy moly, that is a nifty little search filter we’ve not used before, right? Don’t be shy.

Let’s pause for a Twitter a-ha moment: By now you can set up a positive brand mention search in Swedish, in Stockholm – Sweden. That is pretty darn specific and at the same time, very cool. Remember the cost of this tool: zero dollars. Now let’s go deeper down the rabbit hole.

Doing content research on Twitter

Is your brain slowly melting, leaving you with no idea what to write about? There is no shame in doing some content research to see what people are talking about right now. Twitter search to the rescue: Javascript “how to”

Twitter "how to" search

Fresh content with interactions. Sweet stuff.

Real life experience with Twitter advanced search

Why do I feel the need to write this piece on how you should use Twitter search? To put everything in perspective: I’ve been employed via Twitter by talking about digital marketing with the right people (just engage with your industry peers, god damn it). It also brought new business to the agency I worked at, and a few speaking gigs.

Setting up industry-specific mention searches in smart, but still simple, ways has created prospects, leads, and customers.

Last, but not least, I found the love of my life on Twitter. That is pretty awesome, if you ask me.

Post from Per Pettersson


Google’s Bigger Plan For The Mobile Web: Firebase, Deep Linking & Instant Apps

What Firebase Analytics, Deep Linking and Instant Apps tell us about where the mobile web is heading.

With all the recent announcements from Google I/O 2016 regarding the app universe, could Google be hinting at its intentions for the future of the mobile web?

Deep Linking

When Google announced deep linking in 2013, their mission was clear: to provide users with a more seamless experience, which meant the ability to open search results in previously installed apps. They expanded their deep linking efforts in 2015, when they announced app deep linking for iOS apps as well.

One of the most exciting moments from Google’s annual I/O 2016 event last month was the announcement of Instant Apps. Android Instant Apps are the next step (or rather a leap) in the evolution of mobile apps, allowing users to load apps instantly (hence the name) without needing to install them beforehand.

This means that we, the users, get all the benefits of better UX and personalised features without the downsides of a slow-loading mobile site, or the need to manually find and install an app and clutter our phone’s precious storage.

As exciting as it is, this development hardly comes as a surprise; it’s really Google’s very logical next step in what could be a brilliant master plan to deep-link the whole mobile web and thus effectively merge mobile websites and mobile apps into a new medium.

With more searches being made from mobile devices than ever before, it’s quite obvious that the future of the web is mobile. And, as it seems, the future of the mobile web is apps rather than mobile websites.

With Android Instant Apps, where a user doesn’t need to have an app installed in order to open it, the line between an app and a mobile website feels non-existent.

How does it work?

Android Instant Apps are split into several modules. Every time a user clicks on a deep link that opens a certain screen in an app, their device downloads only as much of the code as is necessary to run that particular screen.

What does it mean for developers?

The good news is that developers don’t need to build a new app from scratch. They can simply upgrade their existing app, which takes much less time and resources.

Official documentation for Android Instant Apps is available only to selected developers so far, but if you are interested, you can ask for early access here.

As for users, Android Instant Apps are compatible with Android versions all the way back to Jelly Bean.

Why should we care?

Everyone who’s ever built and launched an app – Android or iOS – will agree with that it requires a lot of work and effort to get to the top positions in Google Play or App Store. Only a small percentage of apps are ever discovered or used by users. Android Instant Apps gives apps more discoverability – basically your app can benefit from your website’s existing rankings and possibly years of SEO efforts. Although users don’t have to download the app as a whole, consider it as a sampling – if users will love your app, maybe they will come back and install it in full.

Firebase Analytics: The App-First Analytics

Another big app-related announcement from Google I/O was the new Firebase Analytics platform.

Firebase was originally a tech startup founded by Andrew Lee and James Taplin. They built Firebase as a backend service that helps developers build realtime apps for iOS, Android and the web that can store and sync data instantly. Firebase was later acquired by Google in October 2014.

This May, Google announced a new feature built on top of the Firebase platform: Firebase Analytics.


The first question that inevitably comes to mind is: why would Google bother building a new analytics platform when there is a perfectly good Google-owned one already?

As opposed to Google Analytics for apps, Firebase Analytics was built primarily with apps in mind. It provides several key functions that are not available in Google Analytics for exactly that reason: tracking in GA focuses on pageviews, sessions, and impressions, while Firebase provides user-based data by default.

This is closely related to another new feature, audiences, which lets developers define groups of users with common attributes.

You can find out more about what Firebase Analytics is bringing here.

Who is it for?

While Firebase Analytics is completely free, you might not want to get rid of your existing Google Analytics account just yet. Google’s recommendation for companies that run both a website and an app is to use Google Analytics and Firebase Analytics in parallel. In fact, you can even connect Firebase to Google Analytics and see the Firebase Analytics reports within the Google Analytics interface.

Here is Google’s recommendation on what kind of analytics to use based on your needs:

  • App only: Firebase Analytics
  • App and website: Google Analytics and Firebase Analytics
  • Website only: Google Analytics

Where we’re heading with the mobile web

Of course, at the moment it’s unclear how fast or slow the adoption of all these new Google technologies will be, but one thing is quite obvious: Google’s vision of the future of the mobile web strongly favours apps. In the case of Android Instant Apps, it almost looks like we are witnessing a new medium: a hybrid of mobile websites and apps. Or maybe it’s just a new generation of apps?

It’s about time Brand’s change the way they look at apps. Apps are no longer a one component to your digital marketing strategy and are slowing becoming the centre and deserve attention accordingly.

Post from Pete Campbell


10 Illustrations of How Fresh Content May Influence Google Rankings (Updated)

Posted by Cyrus-Shepard

[Estimated read time: 11 minutes]

How fresh is this article?

Through patent filings over the years, Google has explored many ways that it might use “freshness” as a ranking signal. Back in 2011, we published a popular Moz Blog post about these “Freshness Factors” for SEO. Following our own advice, this is a brand new update of that article.

In 2003, Google engineers filed a patent named Information retrieval based on historical data that shook the SEO world. The patent not only offered insight into the mind of Google engineers at the time, but also seemingly provided a roadmap for Google’s algorithm for years to come.

In his series on the “10 most important search patents of all time,” Bill Slawski’s excellent writeup shows how this patent spawned an entire family of Google child patents, the latest from October 2011.

This post doesn’t attempt to describe all the ways that Google may determine freshness to rank web pages, but instead focuses on areas we may most likely influence through SEO.

Giant, great big caveat: Keep in mind that while multiple Google patent filings describe these techniques — often in great detail — we have no guarantee how Google uses them in its algorithm. While we can’t be 100% certain, evidence suggests that they use at least some, and possibly many, of these techniques to rank search results.

For another take on these factors, I highly recommend reading Justin Briggs’ excellent article Methods for Evaluating Freshness.

When “Queries Deserve Freshness”

Former Google Fellow Amit Singhal once explained how “Different searches have different freshness needs.”

The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query.

Singhal describes the types of keyword searches most likely to require fresh content:

  • Recent events or hot topics: “occupy oakland protest” “nba lockout”
  • Regularly recurring events: “NFL scores” “dancing with the stars” “exxon earnings”
  • Frequent updates: “best slr cameras” “subaru impreza reviews”

Google may determine exactly which queries require fresh content by monitoring the web and their own huge warehouse of data, including:

  1. Search volume: Are queries for a particular term spiking (i.e. “Earthquake Los Angeles”)?
  2. News and blog coverage: If a number of news organizations start writing about the same subject, it’s likely a hot topic.
  3. Social media: A spike in mentions of a particular topic may indicate the topic is “trending.”

While some queries need fresh content, other search queries may be better served by older content.

Fresh is often better, but not always. (More on this later.)

Below are ten ways Google may determine the freshness of your content. Images courtesy of my favorite graphic designer, Dawn Shepard.

1. Freshness by inception date

Initially, a web page can be given a “freshness” score based on its inception date, which decays over time. This freshness score may boost a piece of content for certain search queries, but degrades as the content becomes older.

The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.


“For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set.”
– All captions from US Patent Document Scoring Based on Document Content Update
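To make the idea concrete, here’s a purely illustrative sketch of an inception-date score that decays over time. The patents don’t disclose an actual decay function; the exponential curve and half-life below are my own stand-ins, chosen only to show the shape of the idea:

```python
# Purely illustrative: the patents don't disclose an actual decay function.
# This just shows the *shape* of a score tied to a document's inception date.
import math

def freshness_score(age_in_days, half_life_days=180):
    # Exponential decay: starts at 1.0 and halves every half_life_days.
    return math.exp(-math.log(2) * age_in_days / half_life_days)

for age in (0, 30, 180, 365, 730):
    print(f"{age:>4} days old -> freshness {freshness_score(age):.2f}")
```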

2. Amount of change influences freshness: How much

The age of a webpage or domain isn’t the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn’t change. In this case, the amount of change on your webpage plays a role.

For example, changing a single sentence won’t have as big of a freshness impact as a large change to the main body text.


“Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time.”

In fact, Google may choose to ignore small changes completely. That’s one reason why when I update a link on a page, I typically also update the text surrounding it. This way, Google may be less likely to ignore the change. Consider the following:

“In order to not update every link’s freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and a link’s freshness may be updated (or not updated) accordingly.”
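If you want a rough sense of whether an edit counts as “significant,” you can measure the proportion of text that changed between two versions. A sketch using Python’s standard difflib; the 20% threshold is an arbitrary example, not a number from any patent:

```python
# Gauge what fraction of a page's text changed between two versions, using
# the standard library's difflib. The 20% threshold is an arbitrary example.
from difflib import SequenceMatcher

def fraction_changed(old_text, new_text):
    return 1.0 - SequenceMatcher(None, old_text, new_text).ratio()

old = "Our guide to the best SLR cameras of 2015, updated monthly."
new = "Our guide to the best SLR cameras of 2016, fully rewritten for new models."
change = fraction_changed(old, new)
print(f"{change:.0%} changed -> {'significant' if change > 0.2 else 'minor'} update")
```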

3. Changes to core content matter more: How important

Changes made in “important” areas of a document will signal freshness differently than changes made in less important content.

Less important content includes:

  • JavaScript
  • Comments
  • Advertisements
  • Navigation
  • Boilerplate material
  • Date/time tags

Conversely, “important” content often means the main body text.

So simply changing out the links in your sidebar, or updating your footer copy, likely won’t be considered as a signal of freshness.


“…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA.”

This brings up the issue of timestamps on a page. Some webmasters like to update timestamps regularly — sometimes in an attempt to fake freshness — but there exists conflicting evidence on how well this works. Suffice to say, the freshness signals are likely much stronger when you keep the actual page content itself fresh and updated.

4. The rate of document change: How often

Content that changes more often is scored differently than content that only changes every few years.

For example, consider the homepage of the New York Times, which updates every day and has a high degree of change.


“For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time.”

Google may treat links from these pages differently as well (more on this below.) For example, a fresh “link of the day” from the Yahoo homepage may be assigned less significance than a link that remains more permanently.

5. New page creation

Instead of revising individual pages, fresh websites often add completely new pages over time. (This is the case with most blogs.) Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.


“UA may also be determined as a function of one or more factors, such as the number of ‘new’ or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document.”

Some webmasters advocate adding 20–30% new pages to your site every year. Personally, I don’t believe this is necessary as long as you send other freshness signals, including keeping your content up-to-date and regularly earning new links.
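The “ratio of new pages” idea from the patent language above is easy to approximate yourself by comparing two sitemap snapshots. A quick sketch with made-up URL lists:

```python
# Approximating the patent's "ratio of new pages" by comparing two sitemap
# snapshots. The URL sets are stand-ins for real sitemap exports.
last_year = {"/post-1", "/post-2", "/post-3", "/about"}
this_year = last_year | {"/post-4", "/post-5"}

new_pages = this_year - last_year
ratio = len(new_pages) / len(this_year)
print(f"{len(new_pages)} new pages; {ratio:.0%} of the current site is new")
```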

6. Rate of new link growth signals freshness

Not all freshness signals are restricted to the page itself. Many external signals can indicate freshness as well, oftentimes with powerful results.

If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you’re about to get married, your site could be deemed more relevant and fresh (as far as this current event goes.)


“…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document’s score.”

Be warned: an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Search engines are likely to devalue such behavior. Natural link growth over time is usually the best bet.

7. Links from fresh sites pass fresh value

Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.

For example, a link from an old, static site that hasn’t been updated in years may not pass the same level of freshness value as a link from a fresh page, e.g. the homepage of Wired. Justin Briggs coined the term FreshRank for this.


“Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh.”

8. Traffic and engagement metrics may signal freshness

When Google presents a list of search results to users, the results the users choose and how much time they spend on each one can be used as an indicator of freshness and relevance.

For example, if users consistently click a search result further down the list, and they spend much more time engaged with that page than the other results, this may mean the result is more fresh and relevant.


“If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively.”

You might interpret this to mean that click-through rate is a ranking factor, but that’s not necessarily the case. A more nuanced interpretation might say that the increased clicks tell Google there is a hot interest in the topic, and this page — and others like it — happen to match user intent.

For a more detailed explanation of this CTR phenomenon, I highly recommend reading Eric Enge’s excellent article about CTR as a ranking factor.

9. Changes in anchor text may devalue links

If the subject of a web page changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.

For example, if you buy a domain about racing cars, then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.

In this instance, Google might determine that your site has changed so much that the old anchor text is now stale (the opposite of fresh) and devalue those older links entirely.


“The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good.”

The lesson here is that if you update a page, don’t deviate too much from the original context or you may risk losing equity from your pre-existing links.

10. Older is often better

Google understands the newest result isn’t always the best. Consider a search query for “Magna Carta.” An older, authoritative result may be best here.

In this case, having a well-aged document may actually help you.

Google’s patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.


“For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set.”

A good way to determine this is to simply Google your search term, and gauge the average inception age of the pages returned in the results. If they all appear more than a few years old, a brand-new fresh page may have a hard time competing.
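The arithmetic here is simple enough to sanity-check by hand; a tiny sketch with made-up age estimates:

```python
# The arithmetic behind "gauge the average inception age of the results".
# Ages are made-up estimates (in years) for the top results of an
# older-skewing query like "magna carta".
ages = [9, 7, 8, 6, 10]
average_age = sum(ages) / len(ages)
print(f"Average result age: {average_age:.1f} years")
# A high average suggests the query favours well-aged documents, so a
# brand-new page may struggle to compete here.
```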

Freshness best practices

The goal here shouldn’t be to update your site simply for the sake of updating it and hoping for better ranking. If this is your practice, you’ll likely be frustrated with a lack of results.

Instead, your goal should be to update your site in a timely manner that benefits users, with an aim of increasing clicks, user engagement, and fresh links. These are the clearest signals you can pass to Google to show that your site is fresh and deserving of high rankings.

Aside from updating older content, other best practices include:

  1. Create new content regularly.
  2. When updating, focus on core content, and not unimportant boilerplate material.
  3. Keep in mind that small changes may be ignored. If you’re going to update a link, you may consider updating all the text around the link.
  4. Steady link growth is almost always better than spiky, inconsistent link growth.
  5. All other things being equal, links from fresher pages likely pass more value than links from stale pages.
  6. Engagement metrics are your friend. Work to increase clicks and user satisfaction.
  7. If you change the topic of a page too much, older links to the page may lose value.

Updating older content works amazingly well when you also earn fresh links to the content. A perfect example of this is when Geoff Kenyon updated his Technical Site Audit Checklist post on Moz. You can see the before and after results below:


Be fresh.

Be relevant.

Most important, be useful.



How human-robot collaboration is changing the face of digital marketing

Google is a great example of human-computer interaction: there is a human need for information, and the more successfully the searcher interacts with the computer (Google, in this case), the greater the outcome.

It is perfectly logical, then, that the next step from this symbiotic relationship between people and computers is the movement towards human-robot collaboration.

In this article my focus is on human-robot interaction in search and digital marketing as a topic, as opposed to specific debates such as whether digital assistants will replace search.

Humans vs robots?

As you might expect, early discussion of human-robot interaction in the workplace often focuses on which jobs robots will steal first, alongside the traditional attention-grabbing “humans vs robots” headlines. The reality of human-robot working is quite different, especially within data-driven industries like digital marketing.

The role of the robot in digital working

When it comes to search and digital marketing, the primary roles of robots are proactive (collaborating with the marketing specialist) and reactive (responding to machine learning and artificial intelligence signals).

Proactive human-robot interaction

There are many ways in which digital marketers can work effectively with robots and arguably the biggest opportunity lies with data.

In almost every digital job function data plays an important role. ‘Big data’ is a term almost everyone in the search marketing and digital sphere frequently uses (or is certainly familiar with), yet comparatively few specialists are leveraging the role of robots in making the most out of this wealth of data the industry has access to.

You can see the growth of big data using external trend tools, in this case Google Ngram Viewer, where you can view the frequency of “big data” references in books since the 1950s.


Personally, I believe the most effective collaborative working model for marketers and robots is one where the robots take over the heavy data lifting and the humans apply their specialisms, turning data-driven insights into meaningful actions. Let me explain this more practically.

A typical marketing manager may use anywhere from 15 to 30 different sets of data. For example: a tool for crawling a website, one providing header status information, another collating backlink data… the list is endless.

When you have all of this data (usually in Excel files or similar), the initial task involves trawling through thousands (if you are fortunate) or millions (if you are less fortunate) of rows, all of which need considerable work just to make the data sets comparable. All of this happens before a single moment of meaningful, expert analysis can take place. This is where robot collaboration comes into play.

Any action that is highly structured, follows a logical set of instructions, and requires processing information in a way that can be taught to a machine is one a robot will perform more effectively than a human.
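As a concrete example of that heavy data lifting, joining a crawl export with a backlink export is exactly the kind of structured, repeatable task a script absorbs well. A sketch using pandas; the file names and column names are hypothetical, so adjust them to whatever your tools actually export:

```python
# Join a crawl export with a backlink export on URL, then surface pages that
# have links pointing at them but don't resolve cleanly. File and column
# names here are hypothetical placeholders.
import pandas as pd

crawl = pd.read_csv("crawl_export.csv")          # e.g. URL, Status Code, Title
backlinks = pd.read_csv("backlink_export.csv")   # e.g. URL, Referring Domains

merged = crawl.merge(backlinks, on="URL", how="left")
merged["Referring Domains"] = merged["Referring Domains"].fillna(0)

problems = merged[(merged["Status Code"] != 200) & (merged["Referring Domains"] > 0)]
print(problems[["URL", "Status Code", "Referring Domains"]].head())
```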

This proactive approach to human-robot working provides the expert with freedom to focus on tasks requiring knowledge, experience, creativity and expertise. It helps to remove working inefficiencies, and speeds up the process from data to insight, and finally, to action and implementation.

Reactive human-robot interaction

In the context of this post, reactive human-robot interaction relates to search and digital tactics that help you make the most of the opportunities presented by Google’s (and other search engines’) robots; think of RankBrain as an example.


The great thing about machine learning and artificial intelligence is that they are based on logic and have a clearly identified purpose. The more you apply this understanding to your search marketing and wider digital approaches, the greater the opportunity to leverage value from this insight.

Here’s a few tips for leveraging the reactive human-robot interaction (in this context tied towards search engine gains and Google RankBrain and associated areas like ‘Hummingbird’ and more):

  • Ensure content has a clear topic, a structured approach to its presentation, and depth of supporting information
  • Think about keywords: breadth and variation of terminology used, debunking jargon, and answering pertinent questions specific to the topic
  • All devices and content types make an impact. Content needs to be fully accessible, supported by structured data (see the sketch after this list), quick to load, and supportive of as many search verticals and content-digesting preferences as feasible
  • Focus on quality and value (putting the user first), and support this by making the content easy for robots to understand
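For the structured data point above, here’s one minimal way to generate a schema.org Article block serialised as JSON-LD. The field values are placeholders to swap for your own page’s details:

```python
# A minimal schema.org Article block serialised as JSON-LD; the field values
# are placeholders, not a complete set of recommended properties.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How human-robot collaboration is changing digital marketing",
    "datePublished": "2016-06-01",
    "author": {"@type": "Person", "name": "Lee Wilson"},
}

print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```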

Will robots replace humans in digital marketing?

No.

The more we embrace human-robot interaction and integrated working, the greater the combined and individual value that humans and robots will bring to the industry. By leveraging the strengths of each for proactive working, and evolving the strategies and tactics we deploy to reflect the opportunity of reactive working, the challenge becomes much more exciting.

“90% of all the data in the world was created in the past two years” (IBM, ‘What is big data?’). The challenge digital experts now face is how to make the most of this data, turn information into insight, and embrace the role of the robots for a true competitive advantage.

Images:

Note – all images included have been sourced from https://www.pexels.com/discover/ – “All photos on Pexels are free for any personal and commercial purpose.” – see https://www.pexels.com/photo-license/:

  • https://www.pexels.com/photo/silhouette-man-person-stars-12567/
  • https://static.pexels.com/photos/16510/wood-coffee-camera-desk-16510.jpg

The only exception is the screenshot taken from Google’s publicly accessible Google Ngram Viewer.

Post from Lee Wilson


6 SEO Tools Which Make Life a Lot Easier

SEO is what it is because of the great tools available, giving us deeper insights, fresh ideas and the ability to back up everything we say with hard evidence. Here, I’ve listed a few of my very favourite tools, as well as the unique benefits that each brings to the table.

Whilst I’m sure that you’ll have come across a few of these tools before if you work in SEO, if this post inspires you to add another one or two to your repertoire, I’ll consider it a success!

Searchmetrics

Searchmetrics is one of the most popular SEO analysis platforms for a reason. With the largest dataset available from any SEO tool, it’s the perfect place to find a wealth of information about how a site is performing in search.

One of Searchmetrics’ most handy features is the “SEO Visibility” score, a nominal and proprietary metric which can give an overview of a domain’s performance overall, both on mobile or on desktop. This is derived from the search value of your existing rankings, and is useful particularly when assessing a site’s performance against a competitor.

The “Traffic Value” score is also useful, estimating the worth of your organic keywords if you were to pay for them through AdWords, as is the “Position Spread” graph within the “Rankings” tab. The latter shows where the bulk of a site’s most valuable rankings sit (in terms of position in the search results). By exporting the “long tail” keywords from this view, you can also find big potential wins: “nearly there” rankings that could generate big rises in traffic if bumped up successfully.

BuzzSumo

When it comes to content ideation and competitor analysis, there are few better or easier-to-use tools than BuzzSumo. The platform identifies the top-performing articles for a particular area, site, or topic in terms of social shares.

What makes the tool so great is the simple view it provides of social performance online, allowing you to find clues as to what people are finding interesting and what popular posts have in common. The tool is also useful for looking introspectively at your own content, letting you quickly assess what’s performing well and what may benefit from a rethink.

Screaming Frog

The Screaming Frog SEO Spider allows you to crawl an entire website, a list of URLs, or a search engine results page, and is invaluable for too many reasons to list here. One crucial use, though, is noting URL response codes in bulk. By first crawling a site, then exporting the results to Excel, you can simply and easily note all pages performing 301 or 302 redirects, as well as those returning 404 errors.

Looking at the volume of links with these response codes and altering them accordingly is an essential way to keep a site ticking over, and it benefits both users and search engine performance by creating a quicker, easier-to-navigate, less frustrating site. You can also crawl an XML sitemap in Screaming Frog, ensuring that it isn’t directing search engines to any unnecessary redirects or 404s.
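Once the crawl is exported to CSV, the response-code triage can be a few lines of pandas. The file and column names below are assumptions; check them against your actual export:

```python
# Triage a Screaming Frog CSV export by response code. The file and column
# names are assumptions; verify them against your actual export.
import pandas as pd

crawl = pd.read_csv("internal_all.csv")

redirects = crawl[crawl["Status Code"].isin([301, 302])]
broken = crawl[crawl["Status Code"] == 404]

print(f"{len(redirects)} redirecting URLs, {len(broken)} broken URLs")
redirects.to_csv("redirects_to_review.csv", index=False)
```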

Majestic SEO

In order to check the quality and quantity of links to a site, Majestic is the ideal tool. Giving a time-specific view of incoming links, it’s the perfect way to spot a dodgy backlink profile, as well as to see where the most valuable links are coming from.

Majestic lets you review the number of overall links to a site compared to the number of domains they’ve come from. What’s more, you can see the link anchor text and the backlink breakdown. This can give invaluable clues as to whether links may have been bought or manipulated in the past, and whether these are benefiting, or indeed harming, a site.

Majestic also gives a handy snapshot of a site’s link performance with its Citation Flow/Trust Flow graph. This helps you visualise the likely quality of links to a site, both in terms of link equity (essentially power) and likely trustworthiness (by virtue of their closeness to known, trusted pages).

Google Site: Search and Data Miner Chrome Extension

Firstly, by performing a simple site: search on Google, you can quickly assess how many pages of a site have been indexed. Comparing this to the number of pages listed in a sitemap, or found through a Screaming Frog crawl, can give a number of clues as to a site’s performance, or flag likely issues such as index wastage.

However, when the Data Miner Chrome extension comes into play, the real magic happens. The tool can scrape the contents of search engine results pages, meaning that while a site: search only gives you a number of pages, clever use of Data Miner can let you export all of the URLs indexed by Google. Whilst the search engine will no longer let you scrape all of these in one go, clever searching (for example, one category at a time) can yield virtually a full list of indexed URLs.

This helps you to spot site areas which are failing to be indexed, as well as those which have unintentionally been added to the index (such as private site areas, and certain user generated content or parameters).
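Once you have both lists, the comparison itself is a couple of set operations. A sketch assuming you’ve saved the indexed and crawled URLs to plain text files, one URL per line:

```python
# Compare the indexed URL list (scraped via Data Miner) with a crawl's URL
# list. Assumes plain text files with one URL per line.
indexed = set(open("indexed_urls.txt").read().split())
crawled = set(open("crawled_urls.txt").read().split())

not_indexed = crawled - indexed  # crawlable pages Google hasn't picked up
unexpected = indexed - crawled   # indexed URLs you didn't expect (index wastage)

print(f"{len(not_indexed)} crawlable pages missing from the index")
print(f"{len(unexpected)} indexed URLs not found in the crawl")
```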

SEMRush

If you’re looking for useful competitor keyword data, SEMRush is a great place to start. Whilst the tool’s data-set is slightly smaller than that of Searchmetrics (around 200 million keywords as opposed to 600 million), SEMRush will take a snapshot of all of the top 20 pages ranking for terms, and – seeing as so few users pass further than this in their Google searches – does give an accurate picture of keyword rankings.

For fresh insight into keyword rankings, and those shared between a site and its key competitors, select “Domain vs. Domain” under “Domain Analytics” in the main dashboard. Once you’ve input the sites you want to compare, click “enable graphs” for a handy Venn diagram of keyword overlap; you can also easily export the results to review both sites’ performance on valuable shared keywords.

I hope this post has given you a few new ideas for tools to use in your SEO strategy. For any questions on the tools covered here, or for more information on any of the processes, I’d be glad to help, so just drop a comment below.

Post from Jack Telford
