Page Speed For and Beyond Google Mobile Ranking

Page Speed Is Not Just for SEO, Yet It's More Important than Ever for SEO

In nearly every audit I perform, I find weaknesses in page speed: twenty, thirty, or forty second average load times.

Even when I audit a site and find the “30 day average” is under five seconds for most pages, the moment I dig in to look at individual page trends over thirty or sixty days, I often find some “mostly fast” pages land in the ten, fifteen, or twenty-five second range one or a few days a week or month.

Intermittent Page Speed Issues Revealed in Google Analytics

Scale that out. If thirty percent of your site’s pages are slow a few days a month, and those days involve a few hundred or several thousand visitors, how many of them are getting those slow speeds?

SEO is far from the only thing that matters. Every visitor, from every referral source, or even direct, who encounters a slow page is potentially going to abandon the site.

And yes, even intermittent speed problems matter if you care about success.

While you can use a tool like Google PageSpeed Insights or Chrome DevTools to find ways to improve speeds, those tools are far from complete in identifying speed bottlenecks, and far from complete in their recommendations.

WebPageTest, GTmetrix and other tools can help, of course. Yet NONE of them cover some issues that need to be addressed if you want to properly resolve speed problems.

Sure, you can see “compress images” or “eliminate render-blocking resources”, among other things. And that’s all important. Yet there’s much more that can be done.

_______________________________________________
Page Speed Improvements Beyond The Typical Tools

Rather than going through the Google PageSpeed Insights list, I’m going beyond it.

Here then, are the top factors I find consistently over time, for page speed improvements not directly listed or mentioned in most speed tools:

_______________________________________________
Hosting Provider Forced Delays

1. OCSP Lookup

A fairly recent phenomenon: a number of hosting configurations force OCSP certificate verification lookups through an off-server network. Just this week I saw one go through GoDaddy servers, and one go through Symantec servers.

Neither of the sites where I found this happening is hosted on EITHER of those platforms.

And before the first byte is even downloaded, one and a half to two or more seconds have been lost to that process. It’s outrageous.

First Process in page load is a hosting system SSL certificate check, delaying page load.

A good hosting environment doesn’t force that process. So run a test through WebPageTest.org, then go to the “Details” report. Look at the first process or two for your home page. Do they look odd? Out of place? Those could very well be a host certificate verification process being forced into the sequence.

If your own home page HTML retrieval in that test isn’t listed as the very first process, that’s a problem that needs to be resolved.
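If you manage your own server stack (or your host is willing to adjust theirs), the usual remedy is OCSP stapling, where the server fetches and caches the certificate status itself so each visitor’s browser doesn’t have to. A minimal nginx sketch; the file path and resolver address here are placeholders, and this belongs inside your TLS server configuration:

    ssl_stapling on;                # staple the cached OCSP response to the TLS handshake
    ssl_stapling_verify on;         # verify the stapled response against the CA chain
    ssl_trusted_certificate /etc/nginx/ssl/chain.pem;  # CA chain used for verification (placeholder path)
    resolver 8.8.8.8 valid=300s;    # resolver nginx uses to reach the OCSP responder

Whether this is available to you depends entirely on your hosting environment; on fully managed hosts, the fix is a support ticket, not a config file.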

2. PHP Version

While we’re talking about hosting — check the PHP version on whatever host you’re using. Many hosts are still stuck on PHP 5.6 instead of 7, even though PHP 7 brought major speed improvements.

Be careful though — in some edge-case scenarios, upgrading to PHP 7 can cause other problems, so be sure that won’t happen before you switch. And if it will cause problems, figure those out and resolve them so you can switch.
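If you’re not sure which version your host runs, running php -v at the command line will tell you; or, assuming you can place a temporary file in the web root (and delete it immediately after), a one-line check:

    <?php
    // temporary version-check file - remove it immediately after use
    echo 'PHP version: ' . phpversion();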

_______________________________________________
Excessive File Weight and Process Delays

Strip out every single excess process that can’t be justified as necessary for revenue. I see sites with hundreds of individual elements at the code level for a single page.

1. Font Excess

There is no justification for five or eight font files, and wherever possible, store those fonts locally. Run away from Google’s own GStatic font server.

And do you really need that one font whose file weighs 150 kilobytes? Honestly — aesthetic opinions often kill page speed needlessly. So look for similar fonts that weigh much less. Or strip out fonts entirely and go with system fonts. Or serve them from your own server.
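Hosting a font yourself is a small change. A minimal sketch, where the font name and file path are placeholders for whatever you actually use:

    /* self-hosted font: no third-party DNS lookup or connection needed */
    @font-face {
      font-family: "BodyFont";                           /* placeholder name */
      src: url("/fonts/bodyfont.woff2") format("woff2"); /* placeholder path */
      font-display: swap; /* show fallback text immediately while the font loads */
    }
    body { font-family: "BodyFont", Georgia, serif; }    /* system fallback stack */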

2. Asset Hosting Location Matters

In fact, host locally (or on your own CDN setup) as many assets as possible — fonts, images, scripts, whatever. Too many sites pull assets from other networks, and the delay in processing adds up significantly.

3. Tracking Pixel Insanity

No, a site doesn’t need 27 tracking pixels (see above where you need to justify every single asset being used or called).

4. Gravatar Nonsense

While we’re at it, for cryin’ out loud, turn off Gravatar on WordPress blog systems (Settings → Discussion → uncheck “Show Avatars”). Forcing a page process to run back and forth to Gravatar for every single comment is nonsense, and I’ve seen pages grind to a snail’s pace while all that is going on. And the more comments a blog post has, the more time is wasted retrieving Gravatar “default image” images.

5. Unused Script Code

What? The site has JavaScript files that by themselves are 250kb? Yeah, I bet half the code in those scripts isn’t even being used on that site. (The Coverage panel in Chrome DevTools will show you how much of each script actually runs.)

6. Abusive Design Assets

No, a site doesn’t need fifteen CSS files.

7. Image Sizes — Pixel Dimensions Matter

Are all of the images on the site already sized dimensionally for their specific space needs? Or are you squeezing an image that’s 1000 pixels wide into a 200 pixel wide space?

So make sure those images are sized properly for dimensions before you compress them for weight.
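A minimal sketch of matching an image file to its display slot; the file names and dimensions are placeholders:

    <!-- a 200-pixel-wide slot gets a 200-pixel-wide file,
         plus a 2x variant for high-density screens -->
    <img src="/images/product-200.jpg"
         srcset="/images/product-200.jpg 1x, /images/product-400.jpg 2x"
         width="200" height="150" alt="Product photo">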

_______________________________________________
Going Beyond Google Mobile Speed Ranking Considerations

I’m going to wrap up with a mini-rant. I don’t care if improving speed isn’t going to help the overwhelming majority of sites from an SEO perspective.

Let’s talk about perspective. Google reps claim the upcoming mobile speed update will only impact a small percentage of searches.

According to InternetLiveStats.com, there are 1.2 trillion searches a year. Some estimates around the web put it at over two trillion at this point.

Let’s say the “small percentage” John Mueller likes to cite (the share of searches potentially impacted by the Google mobile speed ranking factor) is “just” one percent.

That means twelve billion searches will be impacted by that speed update each year.

TWELVE BILLION SEARCHES.

Let’s be generous though. Let’s say it’s only one tenth of one percent.

That’s still 1.2 billion searches.

ONE POINT TWO BILLION SEARCHES. EACH YEAR.

Anybody who ignores speed because “it’s a small percentage of searches” is not considering what that really means.

_______________________________________________
Mobile Speed — Way More than SEO

According to Google, if your mobile page load time is nine seconds, they estimate you will lose 29% of your visitors.

Let’s move beyond formulaic SEO. Formulaic SEO is never what matters most. User Experience matters most.

Formulaic SEO attempts to emulate that. Except formulaic SEO has to go with averages and generalized cases. Which means an entire sector of sites, and an entire sector of society, gets left out of that process.

User Experience on web sites isn’t just for those coming from Google organic results. It’s about visitors from everywhere.

Studies out there have shown that even a site that loads in 10 seconds (well under the old 20 second threshold for SEO on desktop) is likely causing a large number of visitors to abandon the site.

If you go to Google’s newest mobile speed testing tool (powered by WebPageTest.org, by the way, my go-to testing tool), you can see how many missed opportunities a site could very well be experiencing because it isn’t lightning fast. Going from 10 seconds (not critical for SEO) down to six seconds, or four, can make an impact.

Yet there’s a reason Google pushes “one to three seconds” as the ideal range. It’s not all about SEO or Google resources. It’s because they have enough data to know the value for businesses, when site speeds are lightning fast.

While I don’t expect every site developer, manager or owner can expend the time to eke out every last drop of speed improvement, I think it’s critical to go further than most people are willing to, and this will only become more important as we move forward.

At least until every human being is connected to the web on lightning fast connections, every site is on a lightning fast server, and you can get away with fifteen megabyte page weights.

So if you want to maximize the quality of your site experience from a speed perspective, or at least take logical, fiscally responsible steps toward it, the effort is worth it.

Whether it’s because of Google’s new mobile speed ranking consideration or for all site visitors, I highly recommend you take the steps necessary to clean up the excess junk in your site’s code.  Even if you don’t see immediate increases in sales or goal conversions, it’s about sustainable quality.

Indexing and Crawling — Hints vs. Directives

A word or 300 about using “noindex,follow” and other indexing signal factors…
 
Cartoon about how many things in SEO require a response of “it depends”

Questions have been asked by people in the SEO community steadily over the years about using “noindex,follow”, or canonical tags pointing to other pages, as ways to get Google to act a certain way. It’s a subject that’s come up from time to time recently in a group I’m an admin of over on Facebook — “Dumb SEO Questions”. In providing some insight there today, I thought it was a strong enough topic to deserve a blog post of its own.
 
Note this is my perspective and experience regarding those lines of thinking. It doesn’t mean what I convey here is absolute, set in stone, or applicable to every site and every situation. With all things SEO, there are edge-case scenarios where something else may be true. So take what I offer here and do with it what you will.

Google — The All-Knowing Decider of Indexing

Google, being the all-wise cataloger of the web (in their view), does not truly respect robots.txt, meta robots tags, x‑robots tags or canonical tags, in spite of each of those having the purpose of being a directive. Google’s programmers, in their view of the world, consider each such signal only as a “hint” as to what site owners intend.
_______________________________________

Conflicts Within Site Signals Muddy The Process

Of course, many sites inject conflicts across that range of signals, which is the ostensibly altruistic reason Google decided long ago to treat them only as hints. Except that means the imperfect algorithmic process quite often makes a poor determination as to what the system “should” do, rather than what those signals actually convey.
 
Because of all that, even listing a URL pattern in robots.txt, canonicalizing to a different URL, or setting a meta robots noindex status can sometimes fail to prevent Google from crawling some URLs, and even from indexing them.
 _______________________________________

Crawled Not Indexed

“Most” of the time (a relative concept), if a URL pattern is listed in the robots.txt file, Google may still show those URLs in search results when it discovers them; yet they will at least have honored the spirit of the robots.txt file by not indexing the content of those URLs. It’s pretty insane sometimes.
 _______________________________________

The “Noindex,Follow” Way to Wealth, Fame and Lost Value

 As for “noindex,follow”, there’s never a valid reason to use that combination. Sure, Google will pass value THROUGH those pages, initially. Except if the noindex aspect remains long enough, Google will eventually stop even passing the value through them — they’ll end up being removed from the Google process entirely. 
John Mueller confirmed this in a Webmaster Hangout — Barry Schwartz shared the video over on Search Engine Roundtable.
 
And from the perspective of consistency of signals, if a page deserves “noindex” status, it’s best to use “nofollow” as well, so as to more readily convey what you do and don’t want indexed and crawled. For larger sites this is even more important because of crawl budget considerations, where such a factor becomes integral to getting signals correctly understood in formulaic processing.
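In markup, that consistent pairing is a single meta robots tag in the page head:

    <!-- keep the page out of the index, and don't ask crawlers to follow its links -->
    <meta name="robots" content="noindex,nofollow">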
 _______________________________________

But Wait — There’s More to It!

Then there’s the notion that other signals get factored into all of this as well. If enough pages are deemed, by Google’s systems, to be unworthy of indexing for other reasons (too much duplication, not enough unique value, not enough trust, among others), those pages won’t always be indexed in spite of other signals. Or they may be indexed, yet not helpful. In fact, Google will sometimes index pages that don’t ultimately deserve to be indexed, and that alone can weaken the value of pages that do deserve it.
 _______________________________________

URL Parameters, Sitemap XML Files, and Inbound Links

Whenever discussing the crawl and indexation decision process, it’s important to also mention that URL parameters, when set to “Representative URL” or “Let Googlebot Decide”, can also muck up that decision process. The same is true for inclusion in XML sitemap files, and when enough high-value inbound links exist. All of these can influence, to varying degrees, how and what Google crawls and ultimately indexes, in spite of robots settings or canonical tags.

_______________________________________

The SEO Indexing Bottom Line — Consistency

Okay, that wasn’t the bottom line. It was just the last section label in this post. The best recommendation I have is one I repeat often in my audit work: never leave it to Google to “figure it all out” when you have the ability to control, through consistency of signals, what you want their systems to do and how you want them to behave regarding your site.

 

How I Got Here — This SEO Audit Life

Jacob Punton asked me this over on Facebook today:

“Alan, do you just work on audits? Or do you do implementation as well? I would love for you to set up a course on how you gain so many clients for audits 🙂 would be happy to pay too”

His questions arose after I posted on Facebook last night that I had delivered yet one more audit, and have two more to deliver this week.

Another audit delivered - two more to be completed this week

I went to reply to his question with a short comment. Twenty minutes later, I had this long answer and realized it might be fun to post it to my site instead.

I started out in web development in 1995. I didn’t get involved with SEO until around 2000. After seven years of doing the work of SEO, many of them a mix of web development, web content writing, web project management, PPC and SEO, I started offering site audits because one client in particular – a law firm – had a very difficult situation where only an audit would suffice.
The success we got from that base allowed me to start doing audits on all the sites I had anything to do with regarding SEO – either hands-on or managing others in the work.  I did that for a couple years, while I built up my audit business.

Then, the audit work got to the point where it was so steady I went to semi-retired status five years ago this month.

The view from my home office.

So – how is it that my audit business has grown so much over the years?  To the point where I grossed over $170k in the first six months of 2017, working part-time, from home, living at the beach?

I don’t know if this is going to help anyone in particular — Jacob or otherwise, however it’s my story. It’s how I got here.

____________________________________

1) I built my reputation in overall web marketing, web development and web project management over many years.

2) I became highly skilled in SEO by learning core SEO concepts and merging them with my knowledge of overall web marketing and development, then doing the work starting with small sites and building from there.

3) I shaped my personal brand around all of this, including my web site.

4) I have continually along the way looked for ways to be helpful to others not just in the paid work, but also in being a consistently serious participant in the online community, and speaking at conferences, shifting my energy to different places over the course of time, to increase my visibility as a truly helpful contributor.

5) In that effort, I have always sought to contribute more and gone above and beyond in that helpfulness where possible and reasonable.

6) I have built a strong list of agencies — web design, web development, search marketing, and PR agencies who outsource to me whenever they need to, and I offer them a 20% discount off my rates, where they can choose to charge less, the same, or more, or not at all, as they see fit, for that work I provide.

7) I’ve never been afraid to seek help and knowledge from others who have experience or more experience than I do for various situations.

8) I have built a reputation for being brutally honest in my audit work which is not what most site owners and managers typically get when they pay for SEO.  Clients appreciate that.

No Nonsense SEO
Email from a recent client

 

9) Because of my much broader experience and understanding of web in general, and business beyond marketing, I am able to help clients fit their SEO strategy into that bigger business reality.

10) Because of that experience, I am able to speak intelligently to and speak the same language as developers, designers, and content specialists, CEOs, CFOs, and CMOs, and can often cut through their potential initial misunderstanding, bias or resistance.

11) I have a prospect intake process that’s effortless. Somebody reaches out to me asking about an audit. I take a quick look around the site and at some data from one of the more than forty-five tools I rely upon in my audit work.

I send them an email with two or three important things I find in that five or ten minute poking-around effort, explaining what I found, and why it’s important.

I then have a boilerplate set of audit process, pricing, expectation and caveat terms I paste below that customized response.

I send every prospect a list of several (not just three) previous client references.

My pricing is straightforward, based on site scale. I offer two payment options – 50% up front and the remainder on delivery, or all up front for a 10% discount. I do NOT offer or allow ANY other price / payment arrangements.

12) Most prospects, after getting that, immediately ask for a formal contract proposal. I have a few different proposal templates based on the type of contract (one site, multiple sites, or a “live site / rebuild / re-launch” scenario). That proposal is SIMPLE – just a few pages. Yet crafted and refined over many years, with lawyer input for protection of the client and my business.

Most clients who get those sign right away, though some take months to respond because they have other things going on.  Yet once signed, and the contract is returned along with a payment (I accept check, wire transfer and credit card via PayPal), I send an intake form with several questions, and then the work begins.  It’s a methodical process I have done over and over for years.

13) A few times a year, I refine that process even more, or I refine the audit process even more.  I’ve gotten highly efficient in the full life cycle of audit work.

14) When I do an audit, I focus on what matters most, in a prioritized manner, and don’t bother throwing the kitchen sink of minor issues at them so my audits aren’t 500 pages long — they cut to the heart of the issues.

15) After audit delivery, I include hours for follow-up consulting to help guide clients and their teams during the implementation process, at no additional charge. They know they can count on me based on my availability, to be there to help them through that work and beyond.

16) I rely heavily on my part-time assistant, Sharon, who gathers much of the core data I need in my audits, and puts it into spreadsheet tabs – it’s what I consider the most tedious aspect of the work, yet she loves doing it.

17) I pay her the same rate I get – which varies by audit – around $300 to $500 an hour on average even though she’s “only” doing data gathering and organizing. Why? Because if she wasn’t doing it, I would. And since that’s how much I get per hour, why would I NOT pay her that rate?  I make so much money doing what I love that it makes sense as part of my “working smart” model.  She is so happy for that, and so loyal to participating in the business, it’s a no-brainer for me.

18) My audits educate and guide. The more I can help client teams learn as they go, the more empowered they are to not need someone at my level in the future.

19) ANY time I end up being too late in my deliverables due to the reality of business, I ALWAYS seek to compensate my clients without hesitation by offering partial refunds. This happens once or twice a year, and it goes a long way to showing I respect the relationship.

20) Fate, God, Luck, Intuition — whatever you call it, all along the way, over the entire course of my career, I’ve been guided to know where to shift, when to shift, and to what degree.

21) I pick and choose the projects I work on; each year, I’ve focused more on what I want to be doing and less on what I don’t.

22) I do my best to refer out work to others I trust and respect as often as possible.

23) I have a strict no-compensation policy whether I refer work out to others, or others refer work to me, to keep the integrity line clear.

24) I know when to cut ties with a client who turns out to be overly needy. Yet even then, I do my best to hand them off to someone else where the relationship might be better aligned.

25) I never take ANY of this for granted. The last time I got truly arrogant in my life’s work, I lost it all – ended up relapsing after many years drug/alcohol free.  It took me many years after that to rebuild my life – I’ve been drug free since 2004 and I don’t plan on, nor do I have any desire to jeopardize that again. Even with that, life is delicate, and happens in ways we can’t always control, so my gratitude is insanely off the scale big.  So every single day, I express gratitude within myself, with God, and to others.  Every. Single. Day.  Did so long before I had anything to “show” for it.  Long before I was making 1/10th what I make now.  Long before living at the beach.  Gratitude is so important.

26) (Bonus factor) Many clients appreciate my work so much, they refer me to others. Once in a while, clients like David Sinick, owner of PaleoHacks, turn out to be a gold mine for referrals, without me asking. Since doing my first few audits for David last fall, he’s sent me no fewer than a dozen other site owners who have also hired me for audits.

Well there you have it – a “brief” overview of how I got to this point.

Advanced SEO — Reducing Category & Faceted Navigation Clutter

While the actual numbers are different on each site, all too often I see sites where the volume of indexed pages becomes many times greater than the total number of products offered. For ecommerce sites, it gets out of control and becomes a crisis situation.

URL Parameters Clutter Crawlability and Uniqueness Understanding

Actual SKU Count: 9,000

Indexed Pages: 45,000

 

Actual SKU Count: 25,000

Indexed Pages: 350,000

 

Actual SKU Count: 750,000

Indexed Pages: 9,000,000

By understanding how this happens, why it happens, and what to do about it, you can, when proceeding with caution, dramatically reduce the indexed page count while simultaneously improving overall visibility in search engines for the pages you care most about. That in turn, can significantly increase revenue.

A Word of Caution

Warning – every site is unique. Each situation is different. While the concepts presented here are based on real world experience repeated many times across several different sites in different industries, there are countless reasons this is not a guaranteed outcome.

Inability to implement technical changes is often at the top of the list of reasons this sometimes can’t be achieved. The difficulty and complexity of product/SKU structures and customer-base expectations can be another major barrier.

Improved Revenue through Duplicate Content Reduction

Another critical consideration is that I do not believe in isolated changes to a site from an SEO perspective on scale. If a site has many weaknesses, making any one significant change may help, yet it may not be enough to score the big win. There’s no simple answer in where that threshold of “we did enough, across enough areas” exists for any single site.

When the wins happen though, they can be big. Very big!

In this example, from an audit I performed this spring, where the work took weeks to implement and then a couple of months to stabilize, we see a dramatic reduction in total pages indexed based on my recommendations for cleaning up categorization and faceted navigation, along with a dramatic increase (over 20%) in revenue.

How do I fix Duplicate Content?

Note: Several changes were made in late spring, and the big “index cleanup” steps in this example took place in June of 2016, while additional improvements were made after that as well.

 A Quick Primer

Before I dive into the “how do we get there”, it’s important that people just getting started in SEO, or people not yet beyond an intermediate understanding, get a short primer in core concepts.

If you’re already deeply advanced, read this anyway. We can never become complacent in our understanding.

 Core Principle:

Search Engines Attempt to Emulate User Experience through Formulaic Methods

Crawl Efficiency Red Flag:

If a human being ends up having to spend too much time trying to find what they want or need, it is most likely that search engine crawl budget limits will be overburdened, and your most important content will become lost in that process. End result — SEO suffers.

Sensation Overload Red Flag:

If a human being is confused or overwhelmed, it is most likely that search engine algorithms will be formulaically overwhelmed as well.

Too many sites I audit have content organizational overload that becomes toxic on scale. End result – SEO suffers.

Let’s Dive In – Proper Planning Prevents Poor Performance

If you don’t take the time to plan things out, the results you end up with could be a death knell to your business.
The concepts I present here will hopefully give you enough understanding to evaluate your own site situation and plan the cleanup accordingly.

The Ecommerce Organizational Example

9,000 products – different organizational scenarios:

If you break it out into 30 categories, with no subcategories and 300 products in each category, that is “going too wide and not deep enough” for both human experience and SEO.

If you have 3 categories, each with 300 subcategories, and 10 products in each of those, that is equally problematic, because it goes too wide at the subcategory level and too thin within each of those.

If you have 10 categories, and 10 subcategories in each, with 90 products in each subcategory, that is one example of a reasonable site structure.

Exceptions to the Rule

If you have some categories that contain more subcategories than others, that can be perfectly valid. If some subcategories have more products than others, that too, can be perfectly valid.

Sometimes you really do have a need for an extremely large volume of main categories or within any of those, a large volume of subcategories. Or within a single subcategory, a large volume of products.

Narrowing Opportunities Can Wreak Havoc

Another usability “feature” that can get out of control is “faceted” navigation – the ability for visitors to sort, group or refine any given category or subcategory even further.

Examples:

  • Sort by price (highest to lowest, lowest to highest)
  • Refine by price group (products under $x, products in a $x to $y price range, etc.)
  • Refine by color / size / brand / ______
  • Newest Products
  • Best Sellers / Most Popular

These are all valid ways, when it makes sense to offer them, to help visitors look for a narrow set of products to help meet their unique situation, goal or needs.

Except you can take that too far as well, and overwhelm the senses, end up causing duplicate content confusion, and severely weaken crawl efficiency.

Human Users vs. Search Engines as Users

One of the most crucial considerations in this process is whether a particular refinement of product grouping may be beneficial to potential customers, yet overwhelms search engines and thus isn’t helpful when indexed.

Sometimes people, once they arrive at a site, want to do things that, from an SEO perspective, don’t warrant indexation, so remembering that as you make decisions, is critical.

Product Grouping & Filter Narrowing Organization Decision Process

In order to determine exactly what silos should exist, and in what combination, it will be vital for you to take the time, up front, to evaluate organizational criteria based on:

  • Total number of products at each level
  • Overall search volume
  • Overall Revenue Value
  • Refined Profit Margin Value

 

If this step is taken, it will help you make decisions about whether some subcategories might not be worth keeping in the new system. Some might not have enough products on their own (very thin pages).

Others might not have enough search volume to bother with (low value based on inactivity), and others still may be loss-leaders or have too little profit margin to be worth keeping as individual pages, where consolidating those products or subcategories into other, more valuable pages can help bolster those.

 Product Group Filter / Refinement Functionality

When you present visitors with the ability to narrow a particular category or subcategory result by various criteria, it can help those visitors bypass the need to look through all of your products when they want a particular highly refined sub-set.

As mentioned earlier in this article, by giving visitors the ability to sort by price, new products, popularity, color or other “facets”, you can help them find what they are looking for sooner. Yet that can become toxic as well.

Too Many Filters End Up Becoming Toxic to Visitors & SEO

If I go to a page on a site and I am overwhelmed by the choices presented, I can become confused and frustrated. If search engine algorithms have to “figure it all out” with so many choices, formulaic evaluation decisions can break down.

Research Matters

Is this narrowing function helpful to visitors?

Is there enough search volume to warrant allowing this narrowing function result to become indexed?

Is there enough search volume to make the duplicate content concern worth addressing in other ways?

Example:

Women’s Dresses

Summer Dresses

Blue Floral Print Summer Dresses

Blue Floral Print Summer Dresses Under $100

At what point is there enough search volume to justify any of these being indexable?

At what point is there not enough search volume to justify the duplicate result concern?

Other Channel Data

Once you’ve determined whether there’s enough search volume, it’s time to look at existing and previous sales data. At this point, don’t just rely on organic-search-originated sales. Especially if a particular combination of features or “facets” wasn’t visible previously due to poor SEO.

So look at overall sales data from all channels. Direct, PPC, email marketing – all of these can help inform your decision as to whether a particular combination of features or facets may be more valuable than the very narrow data specific to organic listings.

Profit Margins Matter!

Just because you have data that may show “we sold a lot of this combination”, it doesn’t mean that combination deserves to be indexed even if there’s enough organic search volume overall to seemingly justify indexation.

If a particular group of products has a high sales volume, yet the profit margin on that group is very low, by preventing that from being indexed in search engines, you can reduce some of the duplicate content confusion, and that in turn, can help put more focus on higher profit margin product groups.

Stop Already! It’s Too Much to Figure Out!

One concept I drive repeatedly in my audit work is “don’t leave it to Google to figure it all out”. That’s because the more you leave to Google to have to figure out, when you can help their crawlers and algorithms, the more likely that automated, formulaic process is going to make poor choices, give more value to things you care less about, and give less visibility to things you care more about.

The same concept can be applied to you and to site decision makers with all of these steps needed to get to highly refined indexation choices.

This is especially true when you have tens of thousands or hundreds of thousands of products or SKUs.

If that’s the case for you, one way to deal with it, at least in the short to mid-term, is to block ALL of the various filter / feature / facet options on your site from indexation.

If you do that, you can then come back in a future phase, and slowly, methodically reintegrate a limited number of filters / features / facets to indexation, one at a time. Yet only if you’ve taken the time to determine which of those deserve indexation based on the criteria I’ve communicated above.

Content Blocking Tactics

Once you’ve mapped out what to keep in the index and what to block, there are multiple ways you can go about the blocking process.

Robots.txt Silo Blocking Method

You can use the robots.txt file to block entire silos from indexation, if groups of products you want to block are contained in their own hierarchical URL silo.

MyDomain.com/

MyDomain.com/womens-dresses/

MyDomain.com/womens-dresses/summer/

MyDomain.com/womens-dresses/summer/under-100-dollars/

With a single robots.txt entry, you can block the entire “under-100-dollars” group. The flaw in that method comes if those “under 100 dollars” products aren’t accessible outside of that silo level. You don’t want to go too far in blocking, so proceed with caution.

If you do list all of your summer women’s dresses at the /summer/ level, then sure – you can block the /under-100-dollars/ group. That is a valid use of the robots.txt file, as long as the individual product URLs are not buried beneath the blocked level.

MyDomain.com/womens-dresses/summer/under-100-dollars/liz-claiborne-powder-blue-dress-8930w2/ is an example of having the individual product URL hierarchically situated in a silo level where blocking the /under-100-dollars/ level would be toxic to SEO.

If, instead, you have MyDomain.com/womens-dresses/summer/liz-claiborne-powder-blue-dress-8930w2/, and the /under-100-dollars/ level only exists to help show that subset of products for site visitors, you can safely block that /under-100-dollars/ level from indexation in the robots.txt file.
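A minimal robots.txt sketch of that safe scenario, using the example paths above:

    User-agent: *
    # the filtered "under $100" view only exists for visitors;
    # the products themselves remain crawlable at /womens-dresses/summer/
    Disallow: /womens-dresses/summer/under-100-dollars/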

Meta Robots and URL Parameter Blocking

If you pass filters or facets in your URLs, you can use a custom parameter to designate all the various combinations of filters and facets you want blocked with one easy method.

MyDomain.com/womens-dresses/summer/

MyDomain.com/womens-dresses/summer/?price=0-100

MyDomain.com/womens-dresses/summer/?price=0-100&length=floor-length

MyDomain.com/womens-dresses/summer/?length=floor-length

MyDomain.com/womens-dresses/summer/?sequence=oldest-to-newest

MyDomain.com/womens-dresses/summer/?price=0-100&length=floor-length&sequence=newest-first

All of these are examples of how passing along filter or facet options in the URL can be done, and they’re great examples of how out of control filter and facet refinement can become.

Many clients I work with have Google Search Console set up to deal with URL parameters. Except that’s rarely done properly.

Not only is it dangerous to assume “Googlebot can figure it all out”, it leaves the site vulnerable to incorrect indexation results. And it’s not helpful to Bing or any other search engine.

So for URL parameter cases, if there are any filter- or facet-specific parameter combinations you want to block from indexation, you can append a custom “noidx=1” parameter to those URL strings. Then you can have an entry in your robots.txt file that looks for the “noidx=1” parameter and blocks those URLs out.
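A minimal sketch of that robots.txt entry. Note the “noidx” parameter is the custom flag described above, not any standard; and while the * wildcard isn’t part of the original robots.txt specification, Google and Bing both support it:

    User-agent: *
    # matches any URL whose query string carries the custom noidx=1 flag
    Disallow: /*noidx=1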

Sure, if you do that, you can also change the Google Search Console setting to “No URLs”, however it’s best to take full control on-site as your primary method of communication.

Meta Robots Blocking Methods

If you need more granular control over what you block, you can programmatically set all of the pages you want blocked to a meta robots “noindex,nofollow” state in the head of those pages.

The Meta Robots “noindex,follow” myth

While there are always exceptions to the rule, most of the time, it is not valid or appropriate to use noindex and follow together in the meta robots tag. Why? Because of the following:

  • If you want a page indexed, it should, whenever possible, be accessible from a silo path where each page in that path is indexed.
  • Since a “noindex” state causes a page to have zero added SEO value*, a noindex,follow state sends zero SEO value to the target pages.
  • Forcing search crawlers to navigate through noindex pages for content discovery is very inefficient, and aggravates crawl budget limitation concerns.

*When I say a noindex page has zero SEO value, I mean there is no new value added from that page to pass along to pages linked from it. PageRank is a pass-through state on noindex pages, not an “add value” state. If you have 1 million links, the lost crawl efficiency does more harm than the pass-through is worth. Maximized link distribution on scale is more efficient when each page crawled both passes through and adds value.

So while you CAN have noindex,follow, and it is possible that ranking won’t be harmed, from a crawl efficiency perspective it is not a best practice. And on scale, the larger the site, the more harm can come from crawl inefficiency than allowing noindex,follow is worth.

Example: 1 million pages, 200,000 of them noindex,follow. What if you also have intermittent server issues, and Google abandons the crawl some of the time? When is it ever acceptable to allow the possibility that tens of thousands of noindex,follow URLs get crawled, while other pages that you want crawled for indexability get abandoned? This is an extreme example (though I’ve seen much bigger sites), however the point is that in a maximized-efficiency scenario, noindex,follow is not helpful.

Meta Robots / Canonicalization Conflicts

Another critical flaw I find in sites that attempt to control what gets indexed and what doesn’t is conflict between meta robots and canonical tags.

Example:

Robots Canonical Conflicts

In the above example, the meta robots tag communicates “don’t index this page”, while the canonical tag communicates “index this URL”.
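Reconstructed in markup (the canonical URL here is a placeholder), the conflicting pair looks like this:

    <!-- the meta robots tag says "don't index this page"... -->
    <meta name="robots" content="noindex,nofollow">
    <!-- ...while the canonical tag says "index this URL" -->
    <link rel="canonical" href="https://MyDomain.com/womens-dresses/summer/">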

A Word About Navigation and Hierarchical Grouping

In the examples I used early on in this post, I used a single case example – 9,000 products. In many of the audits I perform, an individual site can easily reach into the tens of thousands or hundreds of thousands of products.

When that happens, it’s tempting to have hundreds of categories, or in any one or more categories, hundreds of subcategories. And just as tempting to end up with a single subcategory containing tens of thousands of products.

There is no one right mix or mathematical formula for how many URLs and links you have at any single level. The most important concept is to always be looking at it from the perspective of your ideal client or customer market.

If a site visitor becomes overwhelmed with choices at any single point, that’s going to be a red flag that search crawlers and algorithms will also become overwhelmed.

When it comes to displaying navigational options to visitors, that too needs to be filtered through human experience. If I see a list of 500 choices on the navigation bar or sidebar (and yes, I’ve seen worse), not only does that overwhelm the senses, it dilutes the refined topical focus vital to maximized SEO.

General Navigation Volume Guidelines

As a general rule, it’s wise to not have more than eight to twelve main categories, and not to have more than eight to twelve subcategories in any single category.

I shouldn’t be assaulted, as a user (human or search engine) with links to “all the things” in navigation. As a result, the more products you have, the more need there will be to go deeper in hierarchical silos.

When I, as a user, go into a single category, I should only see subcategory links that point to content within that category.

A Word About Flat Architecture

One last concept I need to convey in this post has to do with flat architecture. The SEO myth states “the closer to the root domain a page exists, the more important that page is”.

While the very basic concept is sound, at this point in Internet history, flat architecture is almost always toxic.

The scale of individual sites and niche markets dictates that creating a flat hierarchical URL structure sends invalid, topically weakening signals.

Mydomain.com

Mydomain.com/primarycategory/

Mydomain.com/primarycategory-subcategory/

Mydomain.com/productdetailspage/

In this example, you’re communicating “each of these pages is equally important”. That is not only not true, it dilutes the importance of those pages that are truly more important, and those that have a larger topical reach. It’s forcing search engine algorithms to have to “figure it all out”, which I have said previously, is dangerous and likely very harmful to maximized SEO.
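For contrast, a hierarchical version of those same URLs conveys relative importance and topical depth (example paths only):

Mydomain.com/

Mydomain.com/primarycategory/

Mydomain.com/primarycategory/subcategory/

Mydomain.com/primarycategory/subcategory/productdetailspage/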

“But my URLs will become too long”

Generally speaking, it’s not only acceptable to end up with longer URLs the deeper you go in navigation, it’s perfectly valid for human needs as well. How many site visitors actually read the entire URL of a given page deep within a site?

Obviously, you may need to alter how you generate product detail page URLs – instead of using the entire 20 word product title, you’d probably be better off crafting a shorter version of those for this situational need.

A Final Word

I’ve done my best in this article to bring as much clarity as I can to the challenges of product category and faceted navigation, and the solution options for SEO. Even so, I can’t emphasize enough that each situation is unique, and every site has not only its own needs for SEO, but also any number of potential limitations regarding the implementation of fixes.

Like anything else in SEO though, the closer you can get to a healthier state from an existing mess, the better your site will be overall. For humans and search engines as users.

And there’s always more to do. Always.

Multimedia and the Thin Content Issue

One issue that comes up frequently in my forensic audit work (especially on news and informational sites) is how multimedia pages all too often have little to no content in the form of descriptive, crawlable HTML text. Examples of such pages most often include videos, infographics, photo galleries, and interactive charts.

Publishers all too often think, “this is the content – it’s not text, because we know people like visual content”.

And for many site visitors, that thinking isn’t wrong. Unfortunately that can’t be said about all visitors, and all consumers of your content.

Different Mind Models

The first, and most important consideration, is human visitors – actual consumers of your content. Not all people visiting sites are visual in the way they process information most effectively. Or for certain types of content, they prefer reading rather than watching, even if they might otherwise, in different circumstances, enjoy video content.

I most often like to use myself as an example – when I visit a news site to consume information, I almost never go directly to that section of the site where video files are the primary content. I prefer to read words on a page. It’s just how I prefer to process information. And if you force a video file on me, with auto-play on, and auto-sound on, it actually annoys me.

This isn’t to say I’m atypical though. I’ve found over the years that many other people feel the same way.

It’s Not Just Me — Public Opinion on the Issue

Imgurians speak up against multimedia as clickbait without readable text options

To illustrate this, I recently came upon this gem on imgur, a highly popular sharing site where you get the full range of humorous, insightful, and educational content and personal commentary from a wide range of people in all different age groups and from all different socio-economic backgrounds.

Note that imgur is visual in nature, which you may find ironic here. However, it’s a site built ENTIRELY on visual content and text commentary – it’s entirely UGC (user generated content), and as such, I go to the site expecting visual content AND text commentary, often on a massive scale.

As I was browsing through the site this week, I came upon the above post — that speaks to this very issue. (Warning – foul, crass, and often very disturbing commentary can be found on that site – as a UGC driven site, with a mostly “free-for-all” posting policy, this is reality).

Note how this meme speaks directly to news sites, and the expectation of content presentation type.

Over 1,000 people upvoted this in less than 24 hours – moving this content to the front page – a big deal on imgur – 99% of content NEVER makes it to the front page.

And here are some of the comments on that post:

imgur comment confirming readable text content matters

At the time of my writing, this is the top comment, with 185 points (upvotes) for this one comment. Site abandonment, all because some people prefer not to be forced to watch a video.

Almost all of the comments that follow are more of the same in their displeasure with video-only content on news sites, each with its own reasoning, or even other annoyances about why video content on news sites is not liked by some people.

more people confirming reading text articles is preferred over watching videos on news sites.

Sometimes, it’s also a matter of circumstance that makes video-based content not even viable.

An inability to have sound on, or a desire to read through a transcript, can each be the issue.

Sometimes, it’s specifically related to how the brain works in some people as far as information absorption is concerned.

some people prefer tutorials to be written / readable text they can follow at their own pace

And the number of comments supporting the frustration in this post is important to note. Some are quite adamant about how they feel.

Many other people agree - give us actual text content, for several different reasons

Note – not ALL who commented agree. Some people did express a different opinion.

Note how the comment below offers a reason why they disagree.

Some people of course, do prefer video content

That opinion, at least on this post, is one of only two comments on the entire thread (out of fifty-seven comments posted) in opposition to video-only content on news sites.

And let’s not forget – imgur is a VISUAL content site, with commentary below it. The vast majority of value for visitors IS the visual content, however even then, the commentary is often hilarious, enjoyable, or otherwise educational (if you can get past the fact that some comments are vulgar).

________________________________

Data Usage and Page Speeds

Two additional considerations here are data usage and page speeds. As the world moves to a more mobile-centric media consumption model, data usage sensitivity becomes more important.

In this instance, it doesn’t mean that you should abandon video entirely – we can have an entirely different full-day discussion about that notion. However it is a consideration and needs to be weighed as part of the overall process when we talk about large video files, or interactive charts or infographics.

So, at the very least, remember to consider data usage when deciding “should we use multimedia assets as the primary content for this specific piece of content?” – and weigh that with the ease with which some of those multimedia assets can be created or shared.

Speed is another matter entirely, one that all too often requires much more serious consideration. When I audit a news or informational site, I often find that multimedia assets contribute significantly to slow page loads. When they are the primary content, even asynchronous loading isn’t always going to help you.

So that too is another consideration that needs to be made, especially in this day and age when code and asset bloat from multiple ad networks and shiny object widgets further degrade performance (and even data usage cap considerations).

________________________________

The Impact of Failing to Accommodate Different Mind Models

Let’s, for this discussion, focus more on cases where you have decided that multimedia assets are the primary content.

When at least some site visitors come to a site expecting written content and they don’t get it, those visitors are more likely to abandon the page entirely, and abandon the site. Some of them will come back again, hoping for a better experience. Others will not.

Some of those visitors will tell others and rant about it online. Not always anonymously – often by mentioning brands in their rant.

Some people who came to the site via a search engine, will go back to the search engine in an attempt to find another source for a topic.

None of that helps overall site quality, authority or trust signals – even when looking at those from a purely human / social perspective.

________________________________

Accessibility

That then brings up the fact that visually impaired or hearing-impaired visitors are also likely to have problems getting any value, or complete value from your multimedia assets.

As a result, an entire sector of society is left out of the equation. Hearing-impaired people may be able to use a device or a video setting to get transcripts, however have you even considered that when posting multimedia?

________________________________

Search Engines as Users

Then we have search engines. A statement I have been making for several years is that, like it or not, search engines ARE users – they DO consume your content. And if you want the traffic that comes from search engines, you need to acknowledge this reality.

And no matter how sophisticated search crawlers and algorithms have become, those systems are just not capable of fully translating multimedia content into raw data for algorithmic evaluation purposes. And when they can do that to some extent, all context is lost regarding headlines, bold or bullet-pointed call-outs, and more.

This then causes search algorithms to fail to fully understand content volume, quality, uniqueness, or topical focus.

Multimedia content is always a challenge for SEO. As much as we don’t like to accept it, search engines are users of content. So while the common mantra is “create content for users”, if we want that content given maximized value by those “algorithmic users” (search engines), we need to find ways to accommodate that reality.

 ________________________________

Accommodating All Users

Here are the most common suggestions I make to clients in my audits when it comes to ensuring multimedia content pages can be better understood by search engine crawlers and algorithms, and where that also helps address many human users and their needs.

The best approach here is to take the visual content and create a story around it. Written-word stories that complement the visuals. Many visitors will only care about the visual content. Others will appreciate the deeper descriptive story.

While it pains me to reference CNN (they do a LOT of things counter to top tier SEO because they CAN – they’re CNN), this is a case where I can point to an example that is relevant.

With CNN, for example, even though that site has a vast volume of “video only” content, a significant portion of content has a video at the top of the page and a written story beneath it. Often, links to that content aren’t even labeled as being video-based.

One thing I don’t like about CNN is how many of the articles they have about a given topic are written content, yet they stick a “pseudo-related” video (auto-play, sound on) above it. I came there to read about a current news event, not to watch some outdated archival video.

Don’t use those kind of video placements as an example of “how to do it right” though. Please!

________________________________

Combining as an Option

Another technique is to combine multiple pieces into fewer individual pages. That brings up an additional set of challenges (such as page bloat and speed problems). However it is one technique to be considered.

________________________________

Change the Layout

With infographics – especially very big single-image infographics, a good workaround is to take the initial infographic, offer a thumbnail link to it on the main indexable content page, and write an editorial opinion piece on the contents of it. Or if you own the infographic, you can slice it up into smaller pieces, and write content and commentary around each of those smaller pieces.

Sure, it’s easier to just slap an infographic up, and maybe link to it using the code the creator of that infographic provides. Except THEIR site is the one actually benefiting, since they’re the creator. So if you don’t add your OWN content, what’s the sticky value for YOUR site?

And if you are the original content creator, do you just want your exact same content replicated across half the web if there’s no differentiation?

________________________________

The Goal for Users and SEO

The goal in all of this is to help meet the needs of more user types (human and search engine originated). So it’s critical that whatever content you create not just be slapped up onto the site artificially.

That may very well mean fewer content pieces being created. Yet when it’s high quality, helpful, educational, informative or emotionally impactful content, the value is worth the effort.

  • You’re much more likely to get more people to stay on-page longer.
  • You’re much more likely to get more people willing to share your content.
  • You’re much more likely to get search engines to formulaically assign higher value scores to that content.

When you do that work with enough individual pieces of content, as a result of those value-gains, you’re much more likely to see a given section of the site, and in turn, the site overall, improve in search engine ranking.

 

 

Conversion Rates Matter More than Visits

Year over Year Organic Visit Improvements from SEO

Sure, it’s often helpful to get increases in site visits through better SEO. (And this post doesn’t advocate NOT wanting that...) More visits are what everyone initially thinks SEO is about, once they first get past higher rankings...

What if I told you it’s worthless in many instances, if increased visits don’t lead to increased sales or revenue?

Whether that revenue is from “here’s display ads, we need you to click on these to keep our site going” or “here’s stuff for sale, we need you to buy this stuff” or “here’s our services, we need you to fill out the contact form or call us, to hire us”, that’s what having a business site is all about, after all.

Unless you have a business built on page views for advertisers. In that case, you need a lot more help than this post can offer. Because that 20th century business model is on the way out the door and into the “this was never a real, sustainable business model anyhow, but now the publishing industry is finally forced to admit it” realm…

[ Note — Before we continue, Moosa Hemani just went live a few days ago with an excellent post on improving your conversion rates on an eCommerce site.  It’s a great write-up and I encourage you to read that because it covers critical aspects of CRO beyond what I’m covering here… ]

Moar Ads Moar Pop-Ups Moar Moar Moar…

And no, the answer to increasing revenue is also not “More ads, bigger ads, more obnoxious ads”.  That’s just downright visitor abuse.  It breaks the unwritten contract between publishers and visitors covering the concept of respect, trust and value.

So what if I then told you that more visitors isn’t even necessarily worth the effort?

Instead, what if it’s “better”, “more qualified” visitors?  And what if I told you that once those visitors arrive, it’s “better User Experience”?

If User Experience is strong, getting fewer, more accurate visitors can increase ecommerce conversion rates

As a forensic SEO consultant, a good amount of my work involves conversion rate optimization, because it’s not good enough to just get more visitors to a web site if you can’t also increase conversions. So where do I begin?

Start with Existing SERPs

When a new client comes to me for an audit, one of the data points I look at is click-through rate from organic search. It’s a quick way to see if there might be weakness in page Titles and Meta descriptions. If there is, the site is likely missing out on tremendous existing opportunities. Opportunities that, with the proper effort, can be tapped with a one-time change, and sustained over time.

Search Results with low click-through rates can be improved to increase conversion rates
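[ A quick worked example, with purely hypothetical numbers: if a phrase generates 20,000 impressions a month and your listing’s CTR is 1%, that’s 200 visits. Lift the CTR to 3% with a stronger Title and description and, rankings unchanged, the same listing yields 600 visits a month. One edit, paying off every month. ]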

Some people talk about “quick wins” or “low hanging fruit” in our industry.  Except many focus on other concepts around that — lower volume search phrases that might have less competition, for example.  That’s okay — those can be beneficial as well.  Here however, I’m talking about not being afraid to focus on even high value phrases.  Heavy competition phrases.

[ Side Note — in my audit work, I do my best to encourage clients to avoid the pitfall of only focusing on the lower competition scenario.  Sure, sometimes, given very limited resources or extremely heavy competition, that makes sense as a primary need.  Yet quite often, I find that with higher emphasis on the 5 Super Signals of SEO — QUART — smaller sites can often compete in the big leagues.  So where it makes sense, I encourage clients to think bigger! ]

Start with Most Important SERPs

Since my work is strategic in nature, not tactical, and needs to cover a vast range of SEO rather than any one aspect, I don’t just look at “all the phrases”, and I don’t look at 2nd or 3rd tier phrases (phrases that relate to secondary or support-type content).  Instead, I focus on a sampling of the phrases the site is already showing up for somewhere in search results, where those phrases are most aligned with conversion point value.

Looking at those results, where are there opportunities for increased clicks that better align with conversion rate goals? Not just “move up in rankings” — more than that: improve existing rankings right where they are, as its own effort.

Conversion Point Value

When I talk about “conversion point value”, I’m referring to the notion that people psychologically and emotionally go through phases in a purchase decision process.

If you can help someone get past the notion (good luck!) that “we have to rank for all the things”, or “but all we care about is showing up for our name” or “we need more visitors!”, you can work on or guide the process of improving revenue in ways that have the most impact based on the most efficient use of resources.

To evaluate this, I will often start with conversion improvements by looking at existing organic search results.  What is a site showing up for in organic results?  Within that, what are the phrases people are using where those phrases are most relevant to the products a site offers?

Looking at those, when we examine the page Title and meta description that show up for a given phrase, it’s important to ask several questions:

“Is this the best page on the site for this phrase?”

Right SERP, Wrong Page

Sometimes a blog post shows up in a result set that’s more of the “ready to buy” type in regard to searcher intent.  If that’s the case, the 1st opportunity is to figure out why a more direct sales type page isn’t ranking on your site.  Work to correct that.

Do you even have a proper “sales” specific page (product category or product purchase details page) set up on the site for that topic?

If you do, consider what you can change to reorient the focus from the blog post or FAQ page or information page over to that sales funnel specific page.

[ NOTE — if you have a sales page that’s more appropriate for the ‘buy now’ searcher need, a stop-gap task to consider, as you work on moving the ranking signals over, is adding a paragraph at the top of the main content area of that non-sales-specific page or blog post that links to the better destination.

By doing this, people who are expecting a sales oriented page won’t be as likely to completely abandon the site when they don’t immediately get to the “right” page for their current goals. ]
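As a rough sketch, that stop-gap paragraph might look something like this in the page template. The class name, URL and copy here are hypothetical placeholders, not a prescription:

```html
<!-- Hypothetical stop-gap notice for the top of the main content area of a
     blog post that ranks where a sales page should. The class name, URL and
     copy are placeholders; adapt them to your site. -->
<p class="buyer-shortcut">
  Ready to buy right now? See our
  <a href="/shop/widgets/">widget category page</a>
  for current inventory, pricing and ordering.
</p>
```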

The Buyer Mindset Multi-Step Life Cycle

Focus on understanding that people looking to make a purchase go through a multi-step psychological/intellectual process. Sometimes that’s all in one day, and sometimes that’s over an extended period.

Primary points in that process include:

  • “I’m thinking about buying, just not sure, so let me explore”
  • “I’m looking to buy, just not sure who or where to buy from”
  • “I’m looking to buy, and have narrowed down some likely sources, just not sure what the exact thing is that I want or need”
  • “I want to buy today”

Match Target Pages To Buying Life Cycle Needs

Crafting content and user experience within a site & across social channels to align with the right point in the decision process is essential to strong conversion rates. Don’t try to get all of that done on one page. Don’t attempt to rank for “all the stages” at once, for “all the things”.  Take it in manageable steps. Ask the important questions at each step.

“Is the Title of that page in this search result one that reinforces relevance properly, while reinforcing trust? For that stage in the buying life cycle?”

“Is the Meta description that shows up in the search result strong in further reinforcing relevance specific to searcher intent, and does it convey authority and trust?”

Weak / Irrelevant / Confusing Page Titles & Descriptions

Quite often I see cases where the page Title is too generic and confuses relevance, and where meta descriptions are keyword-stuffed or fail to motivate a click-through.

In many cases, Google will “decide”, through algorithmic processes, that the provided description is not ideal for that search result.

When that happens, you can end up getting junk for a description, though sometimes you will see something that’s at least “somewhat” more helpful for invoking relevance, authority or trust signals.

Example:

Search Phrase: Mazda Tail Light Replacement Parts

Current Page Title: OEM Replacement Auto Parts — Mazda, Mitsubishi, & Toyota

Current Meta Description: Mazda — Miata, 626, Mazda3, Mazda 6 — 2009, 2010, 2011, 2012, 2013, 2014, 2015! engine components, transmission parts, body repair…

Even if the page that’s linked to is an actual “Mazda Tail Light Replacement Parts” page, that page Title does nothing to communicate “this is the page you’re looking for”. And the description is trying to “be all things to all the people!” and in no way conveys “this site has what you’re looking for right here” or “we are a highly trusted supplier” or “we’re the best site with the best prices” or anything else that matches the psychology of buying decisions…
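For contrast, here is a sketch of what a more focused Title and description might look like for a dedicated Mazda tail light parts page. The store name and the trust claims are hypothetical placeholders; the point is single-topic relevance plus a trust signal, not a formula to copy:

```html
<!-- Hypothetical rewrite: one page, one topic, with relevance and trust
     signals. "ExampleAutoParts" and the claims below are placeholders;
     use your site's real differentiators. -->
<title>Mazda Tail Light Replacement Parts | OEM Quality | ExampleAutoParts</title>
<meta name="description"
      content="OEM-quality Mazda tail light replacement parts for Miata,
               Mazda3 and Mazda6. In stock, with fast shipping and easy
               returns from a trusted parts supplier.">
```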

So that’s a great starting point – working with people who are already finding the site in search results, and finding where you can get more clicks from people ready to buy.

That can help turn very low click-through rates around: not just more visits to the site, but more of the visits that lead to conversions.

A/B Testing Titles & Descriptions

While you can use many different methods to figure out how to improve page Titles and Meta descriptions, one way is to rely on existing data you already have if you invest in AdWords or other pay-per-click advertising.

If you do, and if the person/team responsible for that channel knows what they’re doing, they may very well have done extensive A/B testing on those.  In that case, leverage that data and knowledge. It can go a long way to providing real world value to better organic Title and description writing.

Competitor Evaluations

Another way to come up with improvements, when done properly, is to take a sampling of phrases you show up for, and look at what competitors are doing where those competitors consistently outrank you.  How are they seeding page Titles and Meta descriptions?

Within that, is there anything you can do in writing your Descriptions to show a value-add, or a competitive strength? Don’t just copy competitors — look for something that sets YOUR brand, YOUR site, YOUR pricing, YOUR return policies, YOUR inventory selection apart from the competition.

Matching On-Site Signals to SERP Signals

From there, it’s working on the site – and especially on those pages – to reinforce the five super-signals of SEO all around, and more specifically, confidence in site visitors’ minds that “this is exactly what I want” or, alternately, “this site makes it effortless for me to refine exactly what I am looking for”. Focus on that, and you go a long way toward strong conversions.

So using the example above where a “right page” matters, use the QUART concept to evaluate what’s going on:

  • Is the page this SERP links to the best page for the search queries people click through to come to the site?
  • Is the site designed aesthetically to convey a high level of quality overall, and trust?
  • Is the section this page resides within supportive of reinforcing additional value?
  • Is this page set up to quickly confirm the visitor came to the right place?
  • Is the information provided easy to read, evaluate and understand?
  • Does the page overall make it effortless for the visitor to achieve the goal they came here for?

If the answer to ANY of those questions is “sort of” or “not really”, you have work to do!

Remember — it’s not about YOUR opinion from YOUR perspective. It’s about the VISITOR point of view.

Mobile is The Same, Mobile is Different!

When you’re working on these issues, realize that mobile SEO involves a lot of the same considerations I’ve described already. What applies to desktop/laptop search and on-site work applies to mobile.

Except it’s not 100% parity.  Mobile searches involve different kinds of search words quite often — different language. Especially as we now move more into a world of asking Siri or Google (or yes, even Amazon Echo — oh wow…).  And mobile user experience is different, by leaps and bounds.

Don't ignore mobile site visitors as a separate evaluation point.

[Note: sometimes mobile visitors will come to a site in an initial phase of the buying decision process, and then come back to complete the purchase from a desktop or laptop computer.  Because of this, at least some transaction rates / conversion rates may legitimately be lower for mobile visits. However, when the mobile transaction rate is dramatically off compared to desktop/laptop rates, it’s critical to confirm whether that cross-device behavior really explains it, or to discover where it doesn’t…]

More More More…

Everything I’ve covered here is valid and based on sustainable success in SEO. There’s always more to do, more to consider, both for overall SEO and for conversion rate optimization. However these are strong starting points. So I encourage you to get to work with these.  You can always circle around later to go deeper, or build on this work.

Follow-Up Resources

If you think you need help in figuring all of this out, I highly encourage you to consider hiring us for a proper, forensic level audit. Because these issues can become nuanced, and they’re never the only thing to consider — you don’t want to do all this work only to find out you have sixteen other problems that are equally or even more important for SEO.

Check out “Why Accessibility Will Matter More in 2016 & Beyond” by Kim Krause Berg to help better understand how and why User Experience is critical from that perspective.

Want to understand more about influencing consumers?  Check out “Brainfluence: 100 Ways to Persuade and Convince Consumers with Neuromarketing” — by Roger Dooley — one of the world’s top experts in this “advanced” aspect of marketing…

_________________________________________

Thoughts? Opinions? Additional Recommendations?

If you’ve made it this far and you want to add anything, please leave a comment — I greatly appreciate the collective mind value in our industry and on these topics! Don’t be afraid to challenge what I share, either! If you have a different opinion, let me know!

Google Penguin Update! Stahp Already!


Here we are, another day, another client email asking my thoughts on whether Google might be rolling out Penguin…

Why did this, otherwise highly intelligent, quite capable client ask?

Because sites he generally trusts for information about SEO have once again recently posted nonsense about Google being ready or likely to be rolling out Penguin.

This is beyond pathetic now.  It’s come to a point where eventually SOMEBODY is going to have posted a “get ready” post and Penguin WILL roll out.  And that person/those people are going to be hoisted up on a pedestal for “having vision” or “knowing the insider scoop”, or some nonsense.

I mean, just look at a PARTIAL list of the various headlines to have come from our industry about Penguin since last October, a full year after the last actual, known Penguin rollout.

October 1, 2015
“Google Confirms – Real Time Penguin Coming Soon” — SearchEngineLand

October 19, 2015
“Penguin RealTime Confirmed by Google” — LinkResearchTools

October 29, 2015
“Google’s Next Penguin Update Should Be Within The Next Two Months” — SearchEngineLand

November 17, 2015
“How to Prep for the Pending Penguin Update” — SearchEngineLand

December 3, 2015
“New Penguin Update Not Happening Until Next Year” — SearchEngineLand

December 11, 2015
“Confident Penguin 4.0 is Good Enough for a January Release” — SearchEngine Roundtable

December 15, 2015
“Is Penguin 4.0 Coming in March 2016?” — SearchEngineWatch

January 10, 2016
“Massive Google Fluctuations – Is it Penguin?” — SearchEngine Roundtable

January 20, 2016
“Next Penguin Update to Happen Within Weeks” — SearchEngineJournal

January 20, 2016
“News: Google Penguin Update Coming Soon” — HotClickMarkting

January 25, 2016
“SEJ WrapUp – Google Penguin Coming Soon” — SearchEngineJournal

February 1, 2016
“Googles Newest Penguin Update Coming Soon” — LinkedIn

March 21, 2016
“5 Reasons to Believe Penguin is Just Around the Corner” — V9SEO

March 29, 2016
“Signs of Update – Not Sure if Penguin” — SearchEngine Roundtable

April 6, 2016
“Is Google Testing the Penguin 4.0 Algorithm?” — SearchEngine Roundtable

April 15, 2016
“Google: When Penguin Begins Rolling Out, We Will Post an Announcement” — SearchEngine Roundtable

__________
Do you notice a pattern here?  

Here are a few you may have observed:
1. Nobody has ANY actual knowledge on this.

Those people at Google who many in the industry have turned to for “insider” knowledge have NO CLUE. They do NOT work on the Penguin algorithm, and are also at the mercy of other people’s willingness, ability or capacity to provide knowledge.

They have done their best to help the industry when asked, except it got stupid fast with this one. (Note that at least one Google rep reached a point not long ago where he was TIRED OF BEING ASKED.)

2. Everybody is beyond anxious to finally see a Penguin update.

We ALL want to see Penguin updated. Heck, in a recent interview, I was asked what I thought was among the most detrimental changes in our industry over the years, and I said, without hesitation, Penguin.  For all the good it has done (and it has done a LOT of good), the collateral damage has been too severe, too vast.  (And I believe Penguin is broken — thus the long delay in refreshing it.)

3. People want to sound like they know what they’re talking about.

Hey, we’re marketers. We all suffer from a desire to sound like we’re experts, like we have the inside scoop or forethought. Or, at the very least, we want to offer our readers some sense of hope, which is a genuine desire.

At this point though, it’s insane.

4. People who report, want to believe people who claim to know what they’re talking about.

We’re desperate for real information from people who know what they’re talking about. I often say that while SEO is not rocket science, it IS search science.  So if we, as an industry, can get insight from people who actually work AT Google, we tend to feel a bit more confident in the chaos.

Except Matt Cutts is no longer our go-to source at Google on Penguin or ANY algorithm.  And most information comes from two people who the industry collectively hoisted onto the pedestal when Matt Cutts moved on to his next amazing life adventure.  Even though neither has direct knowledge and neither one is on the spam or quality algorithm teams.

So for all their desire to help, much of the time they are either getting half-information from others in Google and just passing that along, or they’re guessing. Or they’re offering their opinion.

Except half the industry takes their word as gold.  When we’re no longer on the Matt Cutts gold standard for insider knowledge.  We’ve moved to a paper currency with paper faces “representing” the gold that we no longer have access to.

5. It makes the search industry look pathetic.

Seriously. Some of you already hate me for being so harsh.  Some of you will, as you usually do when I rant about the vital need for disclaimers and caveats in “Google said this” stuff, just find me annoying.

I don’t care anymore. Honest.  Site owners / readers / clients deserve better than the rumor mill content.  They need professional help and guidance.

So please stop the bullshit.

I know many who write this stuff won’t stop.  However now I’ve said it.  And even if Penguin updates tomorrow or next summer, the next time some other nonsense reporting comes out, I can just point to this post and say — see my thoughts on that subject, over here…

_____

Penguin clipart courtesy GoGraph

Industry Interview — Jeremy Knauff — Spartan Media

If you don’t know Jeremy Knauff, or his company Spartan Media, I thought it would be a good thing to interview him and share that here.  Jeremy is someone I admire both because he really cares about the work he and his agency do, and because of his experience in the Marines.  He is a true American patriot and veteran.  The fact that he ended up in this industry was something I wanted to know more about, so here, for your reading pleasure, I present my interview with Jeremy…

_______________________

Jeremy - Spartan Media

1.  So – how does a U.S. Marine, who (thank you very much!) was willing to (and did) put his own life on the line in the service of this country, end up in the marketing industry?

The short version of that story is that during the last year in the Marine Corps, I spent all of my spare time (and in the infantry, we don’t have much spare time) putting the pieces together for a company I wanted to start with a partner. Everything fell apart before the company even got off the ground, but through that process, I taught myself graphic design.

After I got out and applied for a few design jobs, I realized I had become pretty damn good. When I explained my thought process behind some of my designs during an interview, the founder of a particular design firm told me that I knew things most people he had interviewed with an MFA didn’t know.

As time went on, I added to my skill set by learning web design, which eventually led to search engine optimization and PHP programming. Each new skill was self-taught by poring through gigantic books—the 4″-thick kind that you’re old enough to remember—online tutorials, which weren’t anywhere near as plentiful or easy to find as they are today, and lots of old-fashioned trial and error.

Along the way, I had some great mentors who generously helped me in their areas of expertise when I got stuck. With that in mind, I’d like to thank Donna Cavalier (Fontenot), John Carcutt, and Gillian Muessig for taking the time to help me get to the next level.
_______________________
2.  Your company, Spartan Media, offers a range of services – from web design to SEO, social media, and beyond.  Do you find these days that clients want the entire range, or are they more likely to want/need just one or a couple of services?

Ha! Yes, most clients want everything, but it’s a lot like those real estate shows where their wish list almost never seems to align with their budget.

I don’t think most clients need everything, especially all at once. I usually recommend that they start on the smaller side of a realistic budget so that they can afford to maintain it until it starts to deliver the kind of results that make it sustainable. It’s almost always a good idea to take one aspect of the bigger online marketing picture, and invest the time and money to do it right.

In most cases, the website should come first since that’s like a home base where you can drive customers to from other sources like social media or PPC, as well as using it to build your mailing list. (You have started a mailing list, right?)

Since the website and list go hand in hand, you should generally tackle them together, then move on to one social network and build up a solid, engaged following before moving on to another. SEO, PPC, online advertising—each element should be launched one at a time.

It sounds counter-intuitive, but if you spend your time jumping around, you’ll never get the traction you can by focusing. Think of it like compounding interest; eventually your money invested in one account will generate so much interest that you’ll have a much greater ability to invest in additional funds, but if you spread it around from the beginning, it takes much longer.

_______________________
3.  If there was one thing (yeah, there’s never just ONE) you think business owners need to understand most when hiring an agency to provide their online marketing services, something they might not understand before you work with them, what would it be?

That they are hiring an agency for a reason.

I don’t tell my doctor how to do her job, but way too often, new clients try to tell agencies how to do theirs. Fortunately, data proves us right pretty quickly.

We had a client who insisted on having us develop content about how they were the best, they were the only company who offered their service, they were the biggest, they were “nationally accredited,” whatever that means—none of which were true. I don’t have to explain to you that the only person interested in that kind of content was the owner.

After we showed him the traffic and share stats of the ego-fluff he wanted us to write compared to the content that we recommended from the beginning, the difference was mind-blowing. In fact, for about a week, one of the articles we developed drew more traffic per day than the entire site usually received.

The bottom line is that if you’re going to invest your money to hire a professional, you need to let them do their job—otherwise you’re wasting your money.

_______________________
4.  What do you think the biggest challenge is to running an agency these days?

I think there are three big challenges:

  • Keeping employees inspired and happy so they will do the best job possible for clients. It’s not just about the money—work environment goes a long way!
  • Setting realistic expectations for clients and communicating effectively so they feel cared for.
  • Staying up to date on industry trends and technology.

_______________________

5.  How does your experience in the Marines translate in regard to running an agency or serving clients?

Improvise, adapt, and overcome.

I wrote an entire article on the subject, titled 14 Things the Marine Corps Taught Me About Running a Business.

Distilled down to a single sentence: have a plan, have a plan for your first plan to fall apart, work hard, always improve, never give up, and take care of your team along the way.

_______________________
6.  I know from interacting with you on social media that you’re a family man – tell us a little, if you would, about your family – where and how you met your wife, how many children you have…

We’re just your typical family trying to build a better life than we had for our kids.

My wife hates when I tell people this, but we met online. I don’t know why… I think it’s pretty common today. In any case, it worked out well for us and led to two of the most beautiful kids and more than I could have ever asked for. One boy and one girl, polar opposites from each other.

_______________________
7.  One final, yet important question.  You care about helping and supporting our brother and sister veterans – I know that care is genuine, I’ve gotten to know you enough as a human to recognize that.  What is your message for others who might not know about the needs our veterans have these days – the biggest issue(s), for example. And what can my readers do to help?

There are a lot of organizations out there, so if you want to donate money to help, please do your homework first. There are some great ones, but there are also some really bad ones, and it can be hard to tell the difference on the surface. One that I can personally vouch for is 22 Until None.

It’s a totally volunteer organization founded by a Marine, and they’re doing big things for veterans all over the country, fighting the veteran suicide epidemic.

I think an awesome approach is as simple as picking up the phone.

Everyone knows at least one veteran, and a lot of them are struggling—often with invisible issues. These are the kind of men and women who signed that blank check and put their life on the line for every American out there. They are warriors and protectors who aren’t used to asking for help no matter how badly they may need it, and that’s why we’re currently losing 22 veterans every single day to suicide.

Pick up the phone and check in on them from time to time. Sometimes, all it takes to stop them from making that fatal choice is knowing that someone gives a damn.

_______________________

Well there you go — Jeremy went from ground-pounding, life-on-the-line selfless service to running a marketing agency — not unlike others in our industry, yet unique in what I find to be a fascinating way.  And his understanding of client thinking, as well as how to help clients achieve their goals, is something I appreciate.  So I encourage you to follow Spartan Media over on Twitter, and if you think you or someone you know could use their services, visit their site and reach out.