
The Reciprocal Consulting Blog

You are browsing the August 2014 archive:

Google has announced that it will begin using HTTPS as a ranking signal. This is a welcome move because security is an increasing concern for internet sites and the effort to make the internet a safer place is one we all should share. Customers are concerned about things like security on the sites they use to make purchases, and every news item about changing your passwords again because of a new security breach keeps the concerns alive.

Right now, the use of HTTPS is what Google calls “a very lightweight signal” that affects fewer than 1% of global queries. But it’s clear that this will be part of what affects your search rankings in the future, so it’s a good idea to pay attention to their warning.

To directly quote Google’s basic tips to get started:

  • Decide the kind of certificate you need: single, multi-domain, or wildcard certificate
  • Use 2048-bit key certificates
  • Use relative URLs for resources that reside on the same secure domain
  • Use protocol relative URLs for all other domains
  • Check out our Site move article for more guidelines on how to change your website’s address
  • Don’t block your HTTPS site from crawling using robots.txt
  • Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.

If your site already uses HTTPS and you want to make sure your security level and configuration is adequate, use the Qualys Lab tool to see how it fares.
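
If you’d like a quick spot-check from your own machine before (or after) running the full Qualys scan, the sketch below confirms that a host serves HTTPS with a certificate chain that actually validates, and prints the protocol version and expiry date. It’s a minimal illustration using only Python’s standard library; www.example.com is a placeholder for your own domain, and the Qualys report remains the authoritative word on configuration strength.

```python
# A minimal sketch, not a substitute for the Qualys report: it confirms the
# host serves HTTPS, the certificate chain validates (ssl raises an error
# otherwise), and shows the expiry date.
# "www.example.com" is a placeholder -- substitute your own domain.
import socket
import ssl

def check_https(host, port=443):
    context = ssl.create_default_context()  # verifies the chain by default
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            print("Protocol:", tls.version())
            print("Subject:", dict(pair[0] for pair in cert["subject"]))
            print("Expires:", cert["notAfter"])

check_https("www.example.com")
```

There’s also more information on web design and SEO at http://www.reciprocalconsulting.com/search-engine-optimization.php.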

In AdWords, your Quality Score is like a warning light, according to a quote in “Google: Stop Losing the Forest for the…Quality Score” at PPC Hero. It shows how healthy your ads and keywords are, but it isn’t anything more than an indication to look further if it’s low. The warning light is a good illustration: when it comes on, you aren’t supposed to examine the light itself but the system it’s connected to.

This look at Google’s whitepaper on Settling the (Quality) Score (pdf) includes a nice chart on things that matter, and things that don’t. These things make a difference in your Quality Score:

  • user device matters, so think about mobile targeting and landing page experience
  • performance on related keywords matters when launching new keywords, so invest in relevant searches
  • relevance to user intention matters, so make sure ads and landing pages match what they want

However, in terms of that Quality Score, keep this in mind:

  • It doesn’t matter how you structure your account, so do what works best for you
  • It doesn’t matter which networks you target, so feel free to test new networks
  • It doesn’t matter where the ad is on the page, so don’t bid up higher positions to get a higher score – think about user experience instead

The warning light is a valuable tool when it’s used the right way, as a signal that you need to look further into the system it monitors. With the Quality Score, Google is reminding us that it is a tool, not a grade.

You’ll find more insights on PPC Management at http://www.reciprocalconsulting.com/pay-per-click.php.

The different names Google has for its algorithms give a persona to an enigma. In “Feeding the Hummingbird: Structured Markup Isn’t The Only Way To Talk To Google,” Moz Blog contributor Cyrus Shepard says,

“Ever wonder why Google named certain algorithms after black and white animals (i.e. black hat vs. white hat?) Hummingbird is a broader algorithm altogether, and Hummingbirds can be any color of the rainbow.”

Panda and Penguin were going after webspam. Hummingbird is designed to optimize entity-based search. That means the Hummingbird algorithm is looking at what is said, how the keywords are placed, etc. Since Google uses over 500 algorithms and each one is going after different information, the exact “secret formula” for SEO is always going to be a secret. In fact, since those algorithms are constantly adjusted in an attempt to improve search, the secret formula keeps changing.

The nice thing the post points out, though, is that Hummingbird looks at more than SEO-savvy markup and can figure out relationships without it. That means natural search results instead of formulas.

Feeding the Hummingbird

Here’s a quick list of what is important to this algorithm:

  1. keywords (subject-predicate-object triples)
  2. tables & HTML elements
  3. entities & synonyms
  4. anchor text & links
  5. Google Local
  6. Google Structured Data Highlighter
  7. Plugins

All of these elements are balanced and weighted to work out Hummingbird’s part of the whole secret zoo at Google. Each one of the algorithms plays a role in where your data comes up on the page. Interesting, isn’t it? The thing to remember is that there’s a big difference between trying to play the system and trying to get quality content in front of your audience. Google is always going after the players because they want to stay relevant to the rest of us.
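
To make the subject-predicate-object idea in that list concrete, here’s a toy sketch of how facts can be stored as entity triples. It’s my own illustration of the general technique, not Google’s actual data model, and the entities and predicates are made up.

```python
# A toy illustration of entity triples -- not Google's actual data model.
# Each fact is a (subject, predicate, object) tuple: a structured version
# of "what is said" on a page.
triples = [
    ("Hummingbird", "is_a", "search algorithm"),
    ("Hummingbird", "optimizes_for", "entity-based search"),
    ("Panda", "targets", "webspam"),
    ("Penguin", "targets", "webspam"),
]

def facts_about(entity):
    """Return every stored fact whose subject is the given entity."""
    return [t for t in triples if t[0] == entity]

print(facts_about("Hummingbird"))
# [('Hummingbird', 'is_a', 'search algorithm'),
#  ('Hummingbird', 'optimizes_for', 'entity-based search')]
```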

For more information on optimizing your site for natural search results, visit http://www.reciprocalconsulting.com/search-engine-optimization.php.

One of the tasks a webmaster often faces with trepidation is moving content around without taking a hit from Google. So many have expressed this that the official Google webmaster blog has addressed it in the post on Making Site Moves Easier.

3 Kinds of Content Migration

There are really just two basic categories of site move, but since the second has two subdivisions, there may as well be three:

  • site moves without URL changes
  • site moves with URL changes
  • site moves to responsive web design

Each kind of move will mean following different instructions, and Google does a good job of explaining the steps.
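
For moves that change URLs, the heart of Google’s guidance is mapping every old URL to its new counterpart with a 301 redirect. As a rough post-move sanity check, a sketch like the one below can confirm that each old address answers with a single 301 hop to the page you planned. This is a minimal illustration using only Python’s standard library; the URL mapping is a placeholder for your real old-to-new list.

```python
# Sketch: confirm each old URL answers with a single 301 hop to its
# planned new URL. The mapping is a placeholder -- use your real list.
import urllib.error
import urllib.request

url_map = {
    "http://example.com/old-page": "https://example.com/new-page",
}

class StopRedirects(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow; we only want to inspect the first hop

opener = urllib.request.build_opener(StopRedirects)

for old, expected in url_map.items():
    try:
        opener.open(old, timeout=10)
        print(f"{old}: served directly -- no redirect in place")
    except urllib.error.HTTPError as err:  # raised because we don't follow
        location = err.headers.get("Location")
        verdict = "OK" if err.code == 301 and location == expected else "CHECK"
        print(f"{old} -> {location} ({err.code}) {verdict}")
```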

The Price of Doing a Site Move the Wrong Way

On the human level, it may seem like there’s no consequence to just shifting content around on your site. After all, if your site is logical to you, the changes will probably be logical to the customers who use it regularly. But the search engines are not reading your site on the human level, and that will affect where your site comes up in the rankings.

Humans tend to do a search and look at the first page or so. If you want new users, you will need to be found. Moving content around without keeping an eye on the Googlebot will change the way your site comes up on Google. That will change who sees your site.

It makes sense to follow Google’s instructions when you do a site move, because then you’re working the way their search engine operates, and that improves your chances of being where you want to be in the rankings.

For more insights on web design, visit reciprocalconsulting.com/web-design.php

Webmaster Academy is a school run by Google, designed to help new webmasters learn how to build an effective website and get it indexed in the world’s largest search engine. It’s a useful course, but don’t expect to learn everything there is to know about building a website.

The three newest modules in the academy are:

  • How to make a great site that is valuable to your audience
  • How Google sees your website
  • How to communicate with Google about your website

All of this is useful information to new webmasters, but there is more to building a website than making it look pretty and getting it indexed in the search engines (although, both of those are good places to start).

You also need to concern yourself with accuracy of content, navigational issues, whether or not your site is accessible to mobile phones, and the importance of keeping it updated. That’s just to name a few of the important details webmasters should concern themselves with on every new website.

It’s a lot harder to build an effective website than it used to be. There is a lot more to think about. I wouldn’t trust it to an amateur.

To learn more about how to build an effective website, visit http://www.reciprocalconsulting.com/web-design.php.

At Moz, Cyrus Shepard shares 12 powerful ways to optimize for Google traffic without building links. It’s a great list, but I’d like to focus on just 5 of those optimization tactics and show you how you can put them to good use in simple ways.

  1. In-depth articles – You don’t have to be a news publisher to take advantage of this markup, but you should take a cue from popular news websites like Huffington Post and the New York Times. Use pagination, Google Authorship, canonicalization, and paywalls more effectively.
  2. Rich snippets – Google is adding more and more rich snippets all the time. You can use them for reviews, videos, events, books, articles, and much more. This is advanced SEO.
  3. Google Authorship – Google Authorship means having your photo appear in search results, which gives you a higher authority rating and potentially more click-throughs for your content.
  4. Local SEO – Cyrus Shepard mentions internationalized SEO, which is great, but what about local SEO? If you’re a local business, then you want to drill down.
  5. Social annotations – Simply sharing your content on Google+ is enough to increase your SEO potential. Your content will show up in more search results, even among people who are not in your network, but it will definitely appear at the top of the search results for people who are in your Google+ network.

Implementing these tactics won’t necessarily improve your search engine results or get you more traffic, but not implementing them will definitely hold you back.

Get more information on the best SEO tactics at http://reciprocalconsulting.com.

Google Labs inside Webmaster Tools is an experimental section that allows Google to test new products before unleashing them on the public. Did you know you can check your Google Authorship stats inside Google Webmaster Tools?

Sign in to your Google Webmaster Tools account. On the left side of the page you’ll see a link labeled Labs. Click that and you’ll see a dropdown with Author Stats. Click that.

Inside the Author Stats section of Google Labs you can get a glimpse of the pages you have authored around the web, both those on your website and those that are off-site. You can see a limited number of stats on those pages, as well.

To begin with, the overview consists of the number of pages you have authored, the cumulative page impressions your pages have received in the last 30 days, and the number of clicks. When you scroll down you’ll get an overview of each page.

The stats you can check for each page you have authored include:

  • Number of page impressions
  • Number of clicks
  • Click-through rate
  • Average search engine position

This is all useful information, especially if you do a lot of guest blogging, which you should.

I hope this experiment continues and that Google rolls it out as a real product. I’d like to be able to track my authorship stats around the Web. Wouldn’t you?

Affiliate marketing icon Sugarrae posted a rant knocking Google and Matt Cutts off their conjoined high horse. Near the end of her post is this brilliant little gem:

From here on out, you work on generating traffic. From here on out, you work on generating branding. From here on out, you work on obtaining customers.

There’s more. You’ll have to excuse the profanity, but you should read the post. I’ll add this caveat:

This is really nothing new.

Your job has always been to build traffic and brand. That hasn’t really changed. The problem is, many online marketers got away from the real goal and started focusing on search engine rankings. Rankings are nice, but they’re not an end in themselves. They’re not the end goal. They are a means to an end.

With personalized search, Google+, and other recent algorithm changes, you can’t predict search rankings.

You might have a page ranked #1 for a search phrase in the morning, only to see it rank #10 for the same phrase later in the day. There are a number of reasons for this. One reason is that different searchers have different search profiles, and Google is tracking them. You can’t control that. That’s why you shouldn’t focus too heavily on ranking in Google.

Online marketers now have a lot of reasonable avenues for attracting new traffic to their websites. You have:

  • Facebook
  • YouTube
  • Twitter
  • Pinterest
  • Bing
  • Niche websites

And more!

Focus on building your brand and traffic through a variety of online promotional means. If you do that, rankings will take care of themselves – as long as you don’t get too spammy.

If you run frequent social media campaigns, you will undoubtedly use certain applications to assist you with posting messages. There are quite a few of them out there. The purpose of this blog post isn’t to discuss the merits of those applications or compare them. What we’d like to discuss today is whether or not it is prudent to pre-schedule your social media messages.

Some of the applications you can use allow you to pre-schedule your social media messages on the various social media sites.

Hootsuite, for instance, will allow you to pre-schedule messages on Facebook and Twitter, but you can’t pre-schedule on Google+. Do Share is a Google Chrome application that allows you to pre-schedule messages for Google+, but you have to be logged in for those messages to actually post.

Despite these drawbacks, there are benefits to pre-scheduling. First and foremost is time management. By pre-writing and pre-scheduling your messages, you can save time. Write your messages in advance and schedule them to post when you want them to.

I’d be careful not to rely on this method too much. You still want to interact with your audience and retweet and re-share posts on the various social media sites you participate on. You want your presence to be personal and approachable, if not spontaneous. Still, pre-scheduling some of your messages – those that are not necessarily timely, or that are easy to write and can be posted at any time – can benefit you in the long run.

Our recommendation: Pre-schedule certain posts that you can share at any time without detriment. More timely messages should be posted when prudent for your business and your audience.

Search Engine Journal explains really well why you might be losing traffic to Google if you fall into a certain website classification. But it’s been our experience that even new websites aren’t getting as much direct traffic from Google as they used to. And that includes websites where Google is not providing direct information.

Never mind why this is happening. The truth is, you can’t do anything about it. Except one thing: Seek alternative sources of traffic.

Now, more than ever, it is very important to seek website traffic from other sources. But what sources should you consider? Here are three specific sources I’d recommend for getting more website traffic besides Google search:

  1. Guest blogging – Much has been said about guest blogging. I won’t harp on the benefits. One thing is for sure, however. If you guest blog correctly, you’ll get more traffic to your website. Start with blogging on sites within your niche or that target the same audience you do.
  2. Social media – Google can’t control Facebook, Twitter, and other social media sites. If you’re more active on these sites, you’ll drive more traffic to your website. You may also appear in Google search more for your brand name, which is a huge benefit. By the way, Google+ is included in this category, and you should know that most analytics packages count Google+ as a referral channel separate from Google search.
  3. Paid advertising – Google wants your money. They want you to advertise with PPC. That’s why they’ve made certain changes like (keyword not provided). Don’t get upset about it. PPC is a good traffic generator. Use it wisely.

I know what you’re thinking. PPC costs money, and that’s true. If you want a less expensive alternative, spend some time on social media. It’s growing in its payoff benefits.

Google Trends is a fun way to find new opportunities for keywords and subjects to blog or write about. If you’re not using it in your research, you might want to give it a go. According to Google, they’re beginning to improve the tool with a beta.

In other words, they’re incorporating some changes to deal with ambiguity in searches and search comparisons.

For instance, to use their own example, if you want to compare search trends for Rice University and Harvard University, then you need to narrow your search beyond just “rice.” Otherwise, you might get skewed results, as Google will include trends for the tiny white food that some people say isn’t real food. That’s not what you want.

There are countless other examples where this kind of ambiguity can play out. Searching for celebrities or place names could pose a problem as previously Google Trends wouldn’t include misspellings. Now, it does.

Also, alternative search terms may be included in your findings when you use the search tool. That would be a useful feature too – if you could exclude the alternate search terms at will.

I think we should all spend about half an hour playing around with Google Trends this afternoon. Then, you can get back to work and produce more of your fantastic content based on your findings.

A few years ago, if you’d asked anyone doing any kind of Internet marketing what their No. 1 referrer was, the answer would have been overwhelmingly “Google.” In fact, Google accounted for about 90% of all website traffic at one time. Today, that number has dropped drastically.

If 60% of your traffic is coming from Google today, then you’re doing well. Chances are, however, that you’re getting the bulk of your website traffic from other sources.

But what are those other sources?

For many website owners, those sources include:

  • Facebook
  • Twitter
  • YouTube
  • Craigslist
  • Third-party niche websites
  • LinkedIn
  • Pinterest
  • Google+

See a trend?

For many website owners, social media has risen to be the No. 1 referrer of traffic. If you are active on several social media sites, then you may have noticed that too. But more often than not, it’s not just one social media website that is referring traffic. It’s several sites delivering a portion of the traffic each.

In that climate, Google may still be your No. 1 referrer, but it isn’t a majority referrer. In other words, they may refer more traffic to your site than any other website but not above 50% of your total traffic. If you do get more than 50% of your traffic from any one source, then you’ve got a gold mine.

This is important to note for several reasons. You should put your money where your traffic is, and where your conversions are.

In other words, if your No. 1 traffic referrer is Facebook, no matter what the percentage is, then focus on converting that traffic to sales. If Facebook is your No. 1 traffic source but most of your conversions come from Twitter, then spend a little more time on Twitter. But don’t neglect Facebook! Instead, try to figure out how to turn Facebook traffic into sales.
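
Here’s a worked example of that math. The numbers are invented purely for illustration, but the arithmetic is the point: the channel that sends the most visits is not always the channel that closes the most sales per visit.

```python
# Illustrative numbers only -- substitute your own analytics data.
channels = {
    "Facebook": {"visits": 5000, "sales": 50},
    "Twitter":  {"visits": 2000, "sales": 60},
    "Google":   {"visits": 4000, "sales": 40},
}

# Rank channels by conversion rate, not by raw traffic.
ranked = sorted(channels.items(),
                key=lambda kv: kv[1]["sales"] / kv[1]["visits"],
                reverse=True)

for name, data in ranked:
    rate = data["sales"] / data["visits"] * 100
    print(f"{name}: {data['visits']} visits, {data['sales']} sales, {rate:.1f}%")

# Twitter:  2000 visits, 60 sales, 3.0%  <- best closer, worth more attention
# Facebook: 5000 visits, 50 sales, 1.0%  <- biggest audience, weakest close
# Google:   4000 visits, 40 sales, 1.0%
```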

It’s an age-old strategy. Put your investment where your payoff is. Re-invest in your biggest moneymaker and you’ll see your ROI go up.

Google is getting more sophisticated in the way that they allow webmasters to track and measure website traffic. The new analytics is referred to as Universal Analytics.

Universal Analytics is centered around four specific and key areas of measurement:

  • Organic search traffic – Universal Analytics allows you to designate which search engines are more significant to your measurement goals. You can remove search engines from your list and prioritize those that are on your list.
  • Session and campaign timeout – The default is 30 minutes for sessions and 6 months for campaigns, but Universal Analytics allows you to change those parameters based on your cookies and website policies.
  • Referral exclusions – Referral traffic is an important metric for any website. Being able to exclude certain referral sources gives you a truer picture of your referral and session data.
  • Search term exclusions – You can exclude search terms that people use to find your website; when you do, Universal Analytics will count that traffic as direct traffic.

Universal Analytics gives you more control over how you measure traffic information related to your website, but it also means spending more time playing with the controls that measure these statistics.
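
To see why the session timeout setting matters, consider the underlying idea: hits separated by more than the inactivity window count as separate sessions. The sketch below is my own illustration of that logic in Python, not the Universal Analytics code itself.

```python
# Illustration of the session-timeout idea, not the analytics.js API:
# hits more than `timeout_minutes` apart start a new session.
from datetime import datetime, timedelta

def count_sessions(hit_times, timeout_minutes=30):
    """Count sessions the way an inactivity window would."""
    sessions = 0
    last_hit = None
    gap = timedelta(minutes=timeout_minutes)
    for hit in sorted(hit_times):
        if last_hit is None or hit - last_hit > gap:
            sessions += 1
        last_hit = hit
    return sessions

hits = [datetime(2014, 8, 1, 9, 0), datetime(2014, 8, 1, 9, 20),
        datetime(2014, 8, 1, 10, 30)]
print(count_sessions(hits))                      # 2 with the default window
print(count_sessions(hits, timeout_minutes=90))  # 1 with a longer window
```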

If you need help figuring out Universal Analytics, talk to a search engine marketing specialist about how to incorporate it into your business.

Everywhere I look now, there’s an article going up on some SEO website, in an e-mail newsletter, or on one of the dozen or so Internet marketing news websites I read each day about how you can get back in Google’s good graces following the fallout from all those bad links you built. My only question is this: why did you even start building those links in the first place?

For at least ten years, Google’s song and dance has been “focus on content quality and usability.” You ignored that advice and went with your SEO agency’s advice instead. That advice amounted to:

  • Paid links
  • Reciprocal links
  • Link wheels
  • Article directories
  • Link spam tactics

All the ways Google said not to do it, you did it anyway. Now you’re trying to figure out what happened.

In some cases, SEOs and online marketers thought they were following search engine guidelines. By the letter, they were. By the spirit, they weren’t even close. And now the owners of those websites are trying to figure out how to kill all their dead links and get back on top of the search engine listings.

Here’s a reality check: Even if you got rid of all of your bad links, there’s a good chance that you wouldn’t rise high enough in the search engines to recapture your old ranking. Sorry, but Google’s smarter than that. The latest algorithmic overhaul – Hummingbird, it’s called – is designed to give whole new ranking factors a greater prominence in the final results.

Instead of trying to game the system, why don’t you just focus on quality content instead?

Search Engine Journal comments on a video by Matt Cutts wherein he recommends three things specifically about metatag descriptions:

  • Write unique metatag descriptions for “pages that really matter”
  • Let Google auto-generate metatag descriptions for other pages
  • Absolutely DO NOT allow duplicate metatag descriptions for any of the pages on your website

This advice coincides perfectly with our own experience. We prefer writing metatag descriptions for most pages, but there are definitely times when you should let Google generate them.

For instance, when you have several web pages that are close to the same but not quite – an example would be an online dictionary of niche terms organized by alphabet where each letter of the alphabet has a separate page – then you might not want to write a metatag description. The last thing you want is 26 metatag descriptions that read something like

Glossary of terms for _____________, letter A.

where the only difference is the actual letter. In this case, you’d essentially have 26 duplicate metatag descriptions with one small variation. Even if you rewrite this description, there are only so many ways to say the same thing. Your best bet is to let Google generate the search snippet based on the user’s query.
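
If you’re not sure whether your site already has this problem, a quick crawl of your own pages will surface duplicates. Here’s a minimal sketch using only Python’s standard library; the PAGES list is a placeholder for URLs pulled from your own sitemap.

```python
# Sketch: flag duplicate meta descriptions across a list of your own pages.
# PAGES is a placeholder -- substitute real URLs from your sitemap.
import urllib.request
from collections import defaultdict
from html.parser import HTMLParser

PAGES = ["https://example.com/", "https://example.com/about"]

class MetaDescription(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

seen = defaultdict(list)  # description text -> pages that use it
for url in PAGES:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    parser = MetaDescription()
    parser.feed(html)
    if parser.description:
        seen[parser.description].append(url)

for desc, urls in seen.items():
    if len(urls) > 1:
        print(f"DUPLICATE ({len(urls)} pages): {desc!r}\n  " + "\n  ".join(urls))
```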

When it comes to long web pages with a lot of information on them, especially web pages where you might have several subheadings, you want to write your own metatag description.

Still, even if you write your own metatag description, there is a good chance that Google will replace it with a search snippet customized to a searcher’s query. There’s nothing wrong with that so don’t be alarmed if you see it. But if you are targeting your long-copy web page toward one or two keywords or phrases, then you can write a metatag description that targets those words or phrases. That can benefit you.

Here’s an idea. If you want a new way to appear in the search results without having to build a website, tweak your site with a few additional pages, or bombard your friends with social media messages, try writing and publishing a book. The long, drawn-out legal battle between Google Books and The Authors Guild, over whether it constitutes copyright infringement for Google Books to scan pages of published works, has resulted in a big win for Google.

This is actually good for searchers and authors alike. Consider this scenario.

You write a book about the mating habits of warthogs. A searcher interested in the topic of warthog sexual behavior conducts a Google search and one of the results is a passage from the Foreword of your book. That Foreword actually entices the searcher to head to the library and check your book out. After thoroughly reading the book and returning it, they decide it would make a great Christmas gift for Uncle Bob.

Congratulations! You just picked up two new readers of your book, and it was all because you found a new way to be included in search results.

Authors should consider this a good thing. Google has been saying all along that the practice of scanning pages from books acts as a digital card catalog. They’re not scanning entire books, just a few passages, a few pages. A judge considered it fair use. I think we can expect The Authors Guild to appeal, but will they win?

I still run into people trying to do SEO like it’s 2005. Bill Slawski has an excellent post at SEO By The Sea regarding a Google patent that may help the search engine identify link spam.

There are several aspects of this blog post that we could discuss. I’d like to focus on one point: Anchor text spam.

Here’s what Bill says about it.

Anchor Text Spamming – This involves acquiring links from a large number of pages linking to a particular page using the same anchor text, to get that page to rank highly for that text in search results.

I can think of two instances where this could be a problem for regular people trying to increase their search engine rankings with outdated strategies that could get them into trouble. One is bloggers who use their blogs to build internal links, using the same anchor text phrases over and over and associating those phrases with a particular page on their website. The other instance is guest blogging.

If you do a lot of guest blogging and you have a single bio that you use for every guest blog post, then you should pay attention to this. It’s possible that your bio could be considered anchor text spam if you use the same anchor text phrase to link to your website every time.

I’m not saying you should stop guest blogging. I am saying you might consider varying your anchor text in your bio.
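
If you’re curious how repetitive your anchor text has become, tallying it takes only a few lines. A toy sketch follows; the anchors list is made-up sample data standing in for whatever your backlink report exports, and the 40% threshold is an arbitrary illustration, not a known Google cutoff.

```python
# Sketch: measure anchor-text concentration from a backlink export.
# The list below is made-up sample data -- feed in your real report.
from collections import Counter

anchors = [
    "best widget shop", "best widget shop", "best widget shop",
    "Acme Widgets", "acmewidgets.com", "best widget shop",
]

counts = Counter(anchors)
total = sum(counts.values())
for text, n in counts.most_common():
    share = n / total * 100
    flag = "  <- heavily repeated" if share > 40 else ""  # arbitrary threshold
    print(f"{share:5.1f}%  {text}{flag}")
```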

This isn’t to say that Google is definitely flagging your blog posts as anchor text spam, but if you keep doing the same thing and you aren’t getting results, then maybe you should try something different.

If you haven’t figured out that site speed is important, then you should consider why Google might introduce the Page Speed Suggestions Report inside Google Analytics.

This is a report that truly looks helpful.

When you’re inside your Google Analytics account, click on the Content – Site Speed section. Next, click on PageSpeed Suggestions. You’ll get a PageSpeed Insights page, which should help you see how to improve the pages that need it.

Your PageSpeed score will be a number between 0 and 100, and the closer to 100, the better the tested page is doing. It’s important to understand, however, that the tool doesn’t measure page speed itself. It measures the extent to which you can improve the speed of the page. A lower score means there is a lot you can improve.

By analyzing the speed of your web pages, you can determine if you have too many graphics on a page, too much script, or a lot of videos. Too many ads, for instance, can result in a slower page speed.
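
The suggestions report scores optimization headroom rather than raw speed, but if you also want a crude raw-speed number of your own, a rough fetch timing is easy to sketch. This is my own illustration, and the URL is a placeholder; it measures simple download time from wherever you run it and is not the PageSpeed score.

```python
# Sketch: crude fetch timing -- a rough local proxy, NOT the PageSpeed
# score (which measures optimization headroom, not speed).
import time
import urllib.request

def rough_timing(url):
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        first_byte = time.perf_counter() - start  # headers received
        body = resp.read()
        total = time.perf_counter() - start
    print(f"{url}: first byte {first_byte:.2f}s, "
          f"{len(body) / 1024:.0f} KB total in {total:.2f}s")

rough_timing("https://www.example.com/")  # placeholder URL
```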

As the Internet gets faster and faster, page speed will likely be a bigger issue for websites. That illustrates the importance of updating your web pages from time to time to take advantage of the latest web design practices. You should test all the pages on your website periodically to see how they rate on page speed.

Bill Slawski has an excellent post this morning on Hummingbird and Authorship. What it boils down to is short text, or social messages.

If you’ve developed a habit of sharing links on social media without any context – no helpful commentary that lets your fans and followers understand why the link matters – then you probably aren’t doing yourself any favors. You should start adding more to your social messages.

I’m not saying you should write a book. Twitter only gives you 140 characters, but those 140 characters are very important.

In short, they add context to your links. And that’s true of your messages on Facebook, LinkedIn, and Google+ too. What you say about the links you post can reveal an awful lot about what you think of that link. In the case of Google+, it could also determine your authority on the topics you post about. Google knows what those topics are based on your social messages – or short text.

Here’s an example:

Let’s say you post a link to a how-to on changing the oil in a Mercedes. If you are a Mercedes auto mechanic, then that’s a link that is right in line with your expertise. But how will Google know that if all you post is a link? One paragraph of text explaining that the article is a must-read for anyone who owns a Mercedes helps Google associate the keyword “Mercedes” with your name and reputation. Do that enough times and Google will learn to associate your name with “Mercedes” all the time.

One post here and there isn’t much, but long term, a habit of turning your links into short commentary will give you a boost in authority.

It appears that Google rolled out its biggest update since 2010’s Caffeine a month ago. Did you notice? That’s OK. Most of us didn’t.

But it’s being talked about all over the Web.

The announcement seems to be in honor of Google’s 15th birthday; it was made yesterday in a private meeting, evidently with some of the world’s top journalists. But what does this new algorithm update mean for us content marketers?

Google’s Inside Search blog gives us a clue.

Hummingbird is designed to make extensive use of Google’s Knowledge Graph. That’s great. I was wondering when they’d get around to actually doing something with that. Remember, the Knowledge Graph was introduced just last year.

So the idea is this … you want to know something. Instead of typing in a keyword phrase to get information on a particular topic, you simply ask a question. One example Amit Singhal gives is, “How much saturated fat is in butter versus olive oil?” Just ask Google to compare them. Instant answer.

I have a feeling that this is in its primitive form and nowhere near perfect, but try it out for yourself with a few more examples.

Obviously, it doesn’t work for every search, but how can search marketers use this information to create better content? Start by ensuring that your content is designed to answer a single query. Write intelligent natural language content rather than keyword-based content searchers can find anywhere.

Lead your niche in high quality content that answers searchers’ questions and you’ll have a leg up.