Page Speed Is Not Just for SEO, Yet It's More Important than Ever for SEO
In nearly every audit I perform, I find weak page speeds: twenty, thirty, or forty second average load times.
Even when I audit a site and find the “30 day average” is under five seconds for most pages, the moment I dig into individual page trends over 30 or 60 days, I often find some “mostly fast” pages landing in the ten, fifteen, or twenty-five second range one or a few days a week or month.
Scale that out. If thirty percent of your site’s pages are slow a few days a month, and those days involve a few hundred or several thousand visitors, how many of them are getting those slow speeds?
SEO is far from the only thing that matters. Every visitor who encounters a slow page, from every referrer source or even direct, is potentially going to abandon the site.
And yes, even intermittent speed problems matter if you care about success.
While you can use a tool like Google PageSpeed Insights or Chrome DevTools to find ways to improve speeds, those tools are far from complete in identifying speed bottlenecks, and far from complete in their recommendations.
WebPageTest, GTMetrix and other tools can help of course. Yet NONE of them cover some issues that need to be addressed if you want to properly resolve speed issues.
Sure, you’ll see “compress images” or “eliminate render-blocking resources”, among other things. And that’s all important. Yet there’s much more that can be done.
_______________________________________________ Page Speed Improvements Beyond The Typical Tools
Rather than walking through the Google PageSpeed Insights list, I’m going beyond it.
Here, then, are the top factors I consistently find for page speed improvements that are not directly listed or mentioned in most speed tools:
1. Forced OCSP Certificate Verification
A fairly recent phenomenon: a number of hosting configurations force OCSP certificate verification through an off-server network. Just this week I saw one go through GoDaddy servers, and one go through Symantec servers.
Neither site I found this happening on is hosted on EITHER of those platforms.
And before the first byte is even downloaded, one and a half to two or more seconds have been lost to that process. It’s outrageous.
A good hosting environment doesn’t force that process. So go run a test through WebPageTest.org, then go to the “Details” report and look at the first process or two for your home page. Do they look odd? Out of place? They could very well be a host certificate verification process forced into the sequence.
If your home page HTML retrieval isn’t listed as the very first process in that test, that’s a problem that needs to be resolved.
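One way to check from the command line is to see whether your server staples its OCSP response (stapling lets the server deliver the verification itself, so clients don't have to make that off-server lookup). This is a sketch; `example.com` is a placeholder for your own domain:

```shell
# Look for "OCSP Response Status: successful" in the output. If no OCSP
# response is stapled, clients may be performing their own OCSP lookups.
openssl s_client -connect example.com:443 -servername example.com -status < /dev/null
```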
2. PHP Version
While we’re talking about hosting: check the PHP version on whatever host you’re using. Many hosts are still stuck on PHP 5.6 instead of 7, and PHP 7 brought major speed improvements.
Be careful though: in some edge-case scenarios, upgrading to PHP 7 can cause other problems, so be sure that won’t happen before you switch. And if it will cause problems, figure them out and resolve them so you can switch.
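As a quick sanity check, here's a hedged sketch that parses the version string a host reports (for example in an `X-Powered-By: PHP/5.6.40` response header, when hosts expose it) and flags anything older than PHP 7. The string format is an assumption based on typical output:

```python
def needs_php_upgrade(version_string):
    """True if a reported PHP version like 'PHP/5.6.40' or '7.2.1' predates 7.0."""
    bare = version_string.split("/")[-1]   # strip a leading "PHP/" if present
    major = int(bare.split(".")[0])        # compare only the major version
    return major < 7

print(needs_php_upgrade("PHP/5.6.40"))  # True
print(needs_php_upgrade("7.2.1"))       # False
```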
_______________________________________________ Excessive File Weight and Process Delays
Strip out every single excess process that can’t be justified as necessary for revenue. I see sites with hundreds of individual elements at the code level for a single page.
1. Font Excess
There is no justification for five or eight font files, and whatever fonts you do keep should be stored locally. Run away from Google’s own GStatic font server.
And do you really need that one font whose file weighs 150 kilobytes? Honestly, aesthetic opinions often kill page speed needlessly. So look for similar fonts that weigh much less. Or just strip out fonts entirely and go with system fonts. Or at least put them on your own site server.
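A minimal sketch of local font hosting, assuming a hypothetical file path and family name; `font-display: swap` lets text render in a fallback font instead of blocking on the font download:

```css
/* Hypothetical locally hosted font; path and family name are placeholders. */
@font-face {
  font-family: "BodyFont";
  src: url("/fonts/bodyfont.woff2") format("woff2");
  font-display: swap; /* render fallback text immediately, swap in the font when loaded */
}
```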
2. Asset Hosting Location Matters
In fact, locally host (or host on your own CDN setup) as many assets as possible — fonts, images, scripts, whatever — too many sites pull assets from other networks and the delay in processing adds up significantly.
3. Tracking Pixel Insanity
No, a site doesn’t need 27 tracking pixels (see above where you need to justify every single asset being used or called).
4. Gravatar Nonsense.
While we’re at it, for cryin’ out loud, turn off Gravatar on WP blog systems. Forcing a page to run back and forth to Gravatar for every single comment is nonsense, and I’ve seen pages grind to a snail’s pace while all that is going on. And the more comments a blog post has, the more time is wasted retrieving Gravatar “default image” images.
5. Unused Script Code
If a script isn’t actually used on a page, don’t load it on that page. Audit every JavaScript file for whether it earns its place.
6. Abusive Design Assets
No, a site doesn’t need fifteen CSS files.
7. Image Sizes — Pixel Dimensions Matter
Are all of the images on the site already sized dimensionally for their specific display space? Or are you squeezing an image that’s 1000 pixels wide into a 200 pixel wide space?
So make sure those images are sized to their display dimensions before you compress them for weight.
_______________________________________________ Going Beyond Google Mobile Speed Ranking Considerations
I’m going to wrap up with a mini-rant. I don’t care if improving speed isn’t going to help the overwhelming majority of sites from an SEO perspective.
Let’s talk about perspective. Google reps claim the upcoming mobile speed impact update will only impact a small percentage of searches.
According to InternetLiveStats.com, there are 1.2 trillion searches a year. Some estimates around the web put it at over two trillion at this point.
Let’s say the “small percentage” John Mueller refers to, when describing what’s potentially going to be impacted by the Google mobile ranking factor, is “just” one percent.
That means twelve billion searches would be impacted by that speed update each year.
TWELVE BILLION SEARCHES.
Let’s be generous though. Let’s say it’s only one tenth of one percent.
That’s still 1.2 billion searches.
ONE POINT TWO BILLION SEARCHES. EACH YEAR.
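For those who like the arithmetic spelled out:

```python
# The arithmetic behind the numbers above.
annual_searches = 1_200_000_000_000   # ~1.2 trillion searches/year (InternetLiveStats)

one_percent = annual_searches // 100          # the "small percentage" read as 1%
tenth_of_a_percent = annual_searches // 1000  # the generous reading, 0.1%

print(f"{one_percent:,} searches impacted per year at 1%")
print(f"{tenth_of_a_percent:,} searches impacted per year at 0.1%")
```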
Anybody who ignores speed because “it’s a small percentage of searches” is not considering what that really means.
_______________________________________________ Mobile Speed — Way More than SEO
Let’s move beyond formulaic SEO. Formulaic SEO is never what matters most. User Experience matters most.
Formulaic SEO attempts to emulate that. Except formulaic SEO needs to go with averages and generalized cases. Which means an entire sector of sites, and an entire sector of society, are going to be left out of that process.
User Experience on web sites isn’t just for those coming from Google organic results. It’s about visitors from everywhere.
Studies out there have shown that even a site that loads in 10 seconds (far below the old 20 second threshold for SEO on desktop), is likely causing a large number of visitors to abandon the site.
If you go to Google’s newest mobile speed testing tool (powered by WebPageTest.org, by the way, my go-to testing tool), you can see how many missed opportunities a site could very well be experiencing because that site isn’t lightning fast. Going from 10 seconds (not critical for SEO) down to six or four seconds can make an impact.
Yet there’s a reason Google pushes “one to three seconds” as the ideal range. It’s not all about SEO or Google resources. It’s because they have enough data to know the value for businesses, when site speeds are lightning fast.
While I don’t expect every site developer, manager, or owner can spend the time to eke out every last drop of speed, I think it’s critical to go further than most people are willing to, and that will only become more important as we move forward.
At least until every human being is connected to the web on lightning fast connections, every site is on a lightning fast server, and you can get away with fifteen megabyte page weights.
So if you want to maximize, or at least take logical, fiscally responsible steps to improve the quality of your site experience from a speed perspective, it’s worth the effort.
Whether it’s because of Google’s new mobile speed ranking consideration or for all site visitors, I highly recommend you take the steps necessary to clean up the excess junk in your site’s code. Even if you don’t see immediate increases in sales or goal conversions, it’s about sustainable quality.
A word or 300 about using “noindex,follow” and other indexing signal factors…
Questions have come up steadily over the years in the SEO community about using “noindex,follow”, or canonical tags pointing to other pages, as ways to get Google to act a certain way. It’s a subject that’s come up from time to time recently in a group I’m an admin of over on Facebook, “Dumb SEO Questions”.
After providing some insight there today, I thought the topic was strong enough to deserve a blog post of its own.
Note this is my perspective and experience to those lines of thinking. It doesn’t mean what I convey here is absolute, set in stone, or applicable to every site and every situation. With all things SEO, there are edge case scenarios where something else may be true. So take what I offer here and do with it what you will.
Google — The All-Knowing Decider of Indexing
Google, being the all-wise cataloger of the web (in their view), does not truly respect robots.txt, meta robots tags, x‑robots tags, or canonical tags, in spite of each having the purpose of being a directive. Google’s programmers, in their view of the world, consider each such signal only a “hint” as to what site owners intend.
Conflicts Within Site Signals Muddy The Process
Of course, many sites inject conflicts across that range of signals, which is the ostensibly well-meaning reason Google decided long ago to treat them only as hints. Except that means the imperfect algorithmic process quite often makes a poor determination about what the system “should” do versus what those signals actually convey.
Because of all that, including a URL pattern in robots.txt, canonicalizing to a different URL, or setting a meta robots noindex status can sometimes fail to prevent Google from crawling some URLs, and even from indexing them.
Crawled Not Indexed
“Most” of the time (a relative concept), if a URL pattern is listed in the robots.txt file, Google may still list those URLs in search results, yet they will at least have honored the spirit of the robots.txt file by not indexing the content of those URLs. It’s pretty insane sometimes.
The “Noindex,Follow” Way to Wealth, Fame and Lost Value
As for “noindex,follow”, there’s never a valid reason to use that combination. Sure, Google will pass value THROUGH those pages, initially. Except if the noindex state remains long enough, Google will eventually stop passing value through them; they’ll end up being removed from the Google process entirely.
John Mueller confirmed this in a Webmaster Hangout; Barry Schwartz shared the video over on SEO Roundtable.
And from the perspective of signal consistency, if a page deserves “noindex” status, it’s best to use “nofollow” as well, so you more readily convey what you do want indexed and crawled. For larger sites this matters even more because of crawl budget, where such a factor becomes integral to getting signals correctly understood in formulaic processing.
But Wait — There’s More to It!
Then there’s the notion that other signals get factored into all of this as well. If enough pages are deemed by Google’s systems to be unworthy of indexing for other reasons (too much duplication, not enough unique value, not enough trust, among others), those pages won’t always be indexed in spite of other signals. Or they may be indexed, yet not be helpful. In fact, Google will sometimes index pages that don’t ultimately deserve it, and that alone can weaken the value of pages that do deserve to be indexed.
URL Parameters, Sitemap XML Files, and Inbound Links
Whenever discussing the crawl and indexation decision process, it’s important to also mention that URL parameters, when set to “Representative URL” or “Let Googlebot Decide”, can also muck up that decision process. The same is true for inclusion in XML sitemap files and for enough high-value inbound links. All of these can influence, to varying degrees, how and what Google crawls and ultimately indexes, in spite of robots settings or canonical tags.
The SEO Indexing Bottom Line — Consistency
Okay, that wasn’t actually the bottom line; it was just the last section label in this post. The best recommendation I have is one I repeat often in my audit work: never leave it to Google to “figure it all out” when you have the ability to control, through consistency of signals, what you want their systems to do and how you want their systems to behave regarding your site.
While the actual numbers are different on each site, all too often I see sites where the volume of indexed pages is many times greater than the total number of products offered. For ecommerce sites, it gets out of control and becomes a crisis situation.
Actual SKU count: 9,000 / Indexed pages: 45,000
Actual SKU count: 25,000 / Indexed pages: 350,000
Actual SKU count: 750,000 / Indexed pages: 9,000,000
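The severity is easier to see as a ratio of indexed pages per actual SKU. The "investigate above a few x" reading is my framing of the numbers above, not a fixed rule:

```python
# Index bloat ratio: indexed pages per actual SKU. Some multiple above 1x is
# normal (category pages, content pages); the figures above are far beyond that.

def index_bloat(sku_count, indexed_pages):
    return indexed_pages / sku_count

for skus, indexed in [(9_000, 45_000), (25_000, 350_000), (750_000, 9_000_000)]:
    print(f"{skus:>9,} SKUs -> {indexed:>9,} indexed pages ({index_bloat(skus, indexed):.0f}x)")
```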
By understanding how this happens, why it happens, and what to do about it, you can, when proceeding with caution, dramatically reduce the indexed page count while simultaneously improving overall visibility in search engines for the pages you care most about. That in turn, can significantly increase revenue.
A Word of Caution
Warning – every site is unique. Each situation is different. While the concepts presented here are based on real world experience repeated many times across several different sites in different industries, there are countless reasons this is not a guaranteed outcome.
Inability to implement technical changes is often at the top of the list of reasons this sometimes can’t be achieved. The difficulty and complexity of product/SKU structures and customer base expectations can be another major barrier.
Another critical consideration is that I do not believe in isolated changes to a site from an SEO perspective on scale. If a site has many weaknesses, making any one significant change may help, yet it may not be enough to score the big win. There’s no simple answer in where that threshold of “we did enough, across enough areas” exists for any single site.
When the wins happen though, they can be big. Very big!
In one example, from an audit I performed this spring, where the work took weeks to implement and then a couple of months to stabilize, there was a dramatic reduction in total pages indexed, based on my recommendations for cleaning up categorization and faceted navigation, along with a dramatic increase (over 20%) in revenue.
Several changes were made in late spring; the big “index cleanup” steps in this example took place in June of 2016, with additional improvements made after that as well.
A Quick Primer
Before I dive into the “how to get there”, it’s important that people just getting started in SEO, or people not yet past an intermediate understanding, get a short primer in core concepts.
If you’re already deeply advanced, read this anyway. We can never become complacent in our understanding.
Search Engines Attempt to Emulate User Experience through Formulaic Methods
Crawl Efficiency Red Flag:
If a human being ends up spending too much time trying to find what they want or need, search engine crawl budgets will most likely be overburdened and your most important content will become lost in that process. End result: SEO suffers.
Sensation Overload Red Flag:
If a human being is confused or overwhelmed, it is most likely search engine algorithms will be formulaically overwhelmed.
Too many sites I audit have content organizational overload that becomes toxic on scale. End result – SEO suffers.
Let’s Dive In – Proper Planning Prevents Poor Performance
If you don’t take the time to plan things out, the results you end up with could be a death knell to your business. The concepts I present here will hopefully give you enough understanding for you to be able to evaluate your own site situation and then plan for the cleanup, accordingly.
The Ecommerce Organizational Example
9,000 products – different organizational scenarios:
If you break it out into 30 categories, with no subcategories, and 300 products in each category, that is “going too wide and not deep enough” for both human experience and SEO.
If you have 3 categories, each with 300 subcategories and 10 products in each of those, that is equally problematic, because it goes too wide at the subcategory level and too thin within each of those.
If you have 10 categories, and 10 subcategories in each, with 90 products in each subcategory, that is one example of a reasonable site structure.
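The three structures above can be sketched as (categories, subcategories per category, products per leaf). All describe the same 9,000 products; only the shape differs:

```python
# Each tuple: (categories, subcategories per category, products per leaf).
structures = {
    "too wide, no depth":            (30, 1, 300),
    "too wide / too thin at leaves": (3, 300, 10),
    "reasonable balance":            (10, 10, 90),
}

for name, (cats, subs, prods) in structures.items():
    print(f"{name}: {cats} x {subs} x {prods} = {cats * subs * prods:,} products")
```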
Exceptions to the Rule
If you have some categories that contain more subcategories than others, that can be perfectly valid. If some subcategories have more products than others, that too, can be perfectly valid.
Sometimes you really do have a need for an extremely large volume of main categories or within any of those, a large volume of subcategories. Or within a single subcategory, a large volume of products.
Narrowing Opportunities Can Wreak Havoc
Another usability “feature” that can get out of control is “faceted” navigation: the ability for visitors to further sort, group, or refine any given category or subcategory.
Sort by price (highest to lowest, lowest to highest)
Refine by price group (products under $x, products in a $x to $y price range, etc.)
Refine by color / size / brand / ______
Best Sellers / Most Popular
These are all valid ways, when it makes sense to offer them, to help visitors look for a narrow set of products to help meet their unique situation, goal or needs.
Except you can take that too far as well, and overwhelm the senses, end up causing duplicate content confusion, and severely weaken crawl efficiency.
Human Users vs. Search Engines as Users
One of the most crucial considerations in this process is whether a particular refinement of product grouping may be beneficial to potential customers, yet overwhelms search engines, and thus isn’t helpful when indexed.
Sometimes people, once they arrive at a site, want to do things that don’t warrant indexation from an SEO perspective, so remembering that as you make decisions is critical.
Product Grouping & Filter Narrowing Organization Decision Process
In order to determine exactly what silos should exist, and in what combination, it will be vital for you to take the time, up front, to evaluate organizational criteria based on:
Total number of products at each level
Overall search volume
Overall Revenue Value
Refined Profit Margin Value
If this step is taken, it will help you make decisions about whether some subcategories might not be worth keeping in the new system. Some might not have enough products on their own (very thin pages).
Others might not have enough search volume to bother with (low value based on inactivity), and still others may be loss-leaders or carry too little profit margin to be worth keeping as individual pages, where consolidating those products or subcategories into other, more valuable pages can help bolster those pages.
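The evaluation criteria above can be sketched as a simple keep-or-consolidate check. The threshold values and field names here are illustrative assumptions, not fixed rules; every site will set its own bar:

```python
MIN_PRODUCTS = 5           # fewer than this: too thin for a standalone page
MIN_MONTHLY_SEARCHES = 50  # below this: not enough demand to bother
MIN_MARGIN = 0.10          # below this: loss-leader territory

def keep_as_indexed_page(subcat):
    """True if a subcategory earns its own indexed page on all three criteria."""
    return (subcat["products"] >= MIN_PRODUCTS
            and subcat["monthly_searches"] >= MIN_MONTHLY_SEARCHES
            and subcat["margin"] >= MIN_MARGIN)

print(keep_as_indexed_page({"products": 40, "monthly_searches": 900, "margin": 0.35}))  # True
print(keep_as_indexed_page({"products": 3,  "monthly_searches": 900, "margin": 0.35}))  # False
```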
Product Group Filter / Refinement Functionality
When you present visitors with the ability to narrow a particular category or subcategory result by various criteria, it can help those visitors bypass the need to look through all of your products when they want a particular highly refined sub-set.
As mentioned earlier in this article, by giving visitors the ability to sort by price, new products, popularity, color, or other “facets”, you can help them find what they are looking for sooner. Yet that can become toxic as well.
Too Many Filters End Up Becoming Toxic to Visitors & SEO
If I go to a page on a site and I am overwhelmed by the choices presented, I can become confused and frustrated. If search engine algorithms have to “figure it all out” with so many choices, formulaic evaluation decisions can break down.
Is this narrowing function helpful to visitors?
Is there enough search volume to warrant allowing this narrowing function result to become indexed?
Is there enough search volume to make the duplicate content concern worth addressing in other ways?
Blue Floral Print Summer Dresses
Blue Floral Print Summer Dresses Under $100
At what point is there enough search volume to justify any of these being indexable?
At what point is there not enough search volume to justify the duplicate result concern?
Other Channel Data
Once you’ve determined whether there’s enough search volume, it’s time to look at existing and previous sales data. At this point, don’t rely only on sales that originated from organic search, especially if a particular combination of features or “facets” wasn’t visible previously due to poor SEO.
So look at overall sales data from all channels. Direct, PPC, email marketing – all of these can help inform your decision as to whether a particular combination of features or facets may be more valuable than the very narrow data specific to organic listings.
Profit Margins Matter!
Just because you have data showing “we sold a lot of this combination” doesn’t mean that combination deserves to be indexed, even if there’s enough organic search volume overall to seemingly justify indexation.
If a particular group of products has a high sales volume, yet the profit margin on that group is very low, by preventing that from being indexed in search engines, you can reduce some of the duplicate content confusion, and that in turn, can help put more focus on higher profit margin product groups.
Stop Already! It’s Too Much to Figure Out!
One concept I drive repeatedly in my audit work is “don’t leave it to Google to figure it all out”. That’s because the more you leave to Google to have to figure out, when you can help their crawlers and algorithms, the more likely that automated, formulaic process is going to make poor choices, give more value to things you care less about, and give less visibility to things you care more about.
The same concept can be applied to you and to site decision makers with all of these steps needed to get to highly refined indexation choices.
This is especially true when you have tens of thousands or hundreds of thousands of products or SKUs.
If that’s the case for you, one way to deal with it, at least in the short to mid-term, is to block ALL of the various filter / feature / facet options on your site from indexation.
If you do that, you can then come back in a future phase, and slowly, methodically reintegrate a limited number of filters / features / facets to indexation, one at a time. Yet only if you’ve taken the time to determine which of those deserve indexation based on the criteria I’ve communicated above.
Content Blocking Tactics
Once you’ve mapped out what to keep in the index and what to block, there are multiple ways you can go about the blocking process.
Robots.txt Silo Blocking Method
You can use the robots.txt file to block entire silos from indexation, if groups of products you want to block are contained in their own hierarchical URL silo.
With a single robots.txt entry, you can block the entire “under-100-dollars” group. The flaw in that method is if those “under 100 dollars” products aren’t accessible outside of that silo level. You don’t want to go too far in blocking, so it’s not always advisable.
If you do list all of your summer women’s dresses at the /summer/ level, then sure, you can block the /under-100-dollars/ group. That is a valid use of the robots.txt file, as long as the individual product URLs are not buried beneath that level.
MyDomain.com/womens-dresses/summer/under-100-dollars/liz-claiborne-powder-blue-dress-8930w2/ is an example of having the individual product URL hierarchically situated in a silo level where blocking the /under-100-dollars/ level would be toxic to SEO.
If, instead, you have MyDomain.com/womens-dresses/summer/liz-claiborne-powder-blue-dress-8930w2/, and the /under-100-dollars/ level only exists to help show that subset of products for site visitors, you can safely block that /under-100-dollars/ level from indexation in the robots.txt file.
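Following the dress example, a sketch of what that silo block might look like (strictly speaking, robots.txt blocks crawling rather than indexing, which is part of why consistency across your other signals matters):

```
User-agent: *
Disallow: /womens-dresses/summer/under-100-dollars/
```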
Meta Robots and URL Parameter Blocking
If you pass filters or facets in your URLs, you can use a custom parameter to designate all the various combinations of filters and facets you want blocked with one easy method.
URL strings like MyDomain.com/womens-dresses/summer/?color=blue&size=8&sort=price-asc (a hypothetical example) show how filter and facet options get passed along in the URL, and how out of control filter and facet refinement can become.
Many clients I work with have Google Search Console set up to deal with URL parameters. Except that’s rarely done properly.
Not only is it dangerous to assume “Googlebot can figure it all out”, it leaves the site vulnerable to incorrect indexation results. And it’s not helpful to Bing or any other search engine.
So for URL parameter cases, if there are any filter or facet specific parameter combinations you want to block from indexation, you can add “noidx=1” to the end of those URL strings. Then, you can have an entry in your robots.txt file to look for the “noidx=1” parameter / state combination, and block those out.
Sure, if you do that, you can also change the Google Search Console setting to “No URLs”; however, it’s best to take full control on-site as your primary method of communication.
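A sketch of that robots.txt entry, using the "noidx=1" parameter convention described above (major crawlers support the "*" wildcard in Disallow patterns):

```
User-agent: *
Disallow: /*noidx=1
```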
Meta Robots Blocking Methods
If you need more granular control as to what you block, you can programmatically set all of the pages you want to be blocked through a meta robots “noindex,nofollow” state in the header of those pages.
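The tag itself is simple; it belongs in the head of each page you want fully blocked:

```html
<meta name="robots" content="noindex,nofollow">
```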
The Meta Robots “noindex,follow” myth
While there are always exceptions to the rule, most of the time, it is not valid or appropriate to use noindex and follow together in the meta robots tag. Why? Because of the following:
If you want a page indexed, it should, whenever possible, be accessible from a silo path where each page in that path is indexed.
Since a “noindex” state causes a page to have zero added SEO value*, a noindex,follow state sends zero SEO value to the target pages.
Forcing search crawlers to navigate through noindex pages for content discovery is very inefficient, and aggravates crawl budget concerns.
*When I say a noindex page has zero SEO value, I mean there is no new value added from that page to pass along to pages linked from it. PageRank is a pass-through state on noindex pages, not an “add value” state. If you have 1 million links, the lost crawl efficiency does more harm than the pass-through is worth. Maximized link distribution at scale is more efficient when each page crawled both passes through and adds value.
So while you CAN use noindex,follow, and it’s possible rankings won’t be harmed, from a crawl efficiency perspective it is not a best practice. And the larger the site, the more harm crawl inefficiency can cause relative to whatever noindex,follow is worth.
Example: 1 million pages, 200,000 of them noindex,follow. What if you also have intermittent server issues, and Google abandons the crawl some of the time? When is it ever acceptable to allow tens of thousands of noindex,follow URLs to get crawled while other pages you want crawled for indexability get abandoned? This is an extreme example (though I’ve seen much bigger sites), yet it makes the point: in a maximized-efficiency scenario, noindex,follow is not helpful.
Meta Robots / Canonicalization Conflicts
Another critical flaw I find on sites that attempt to control what gets indexed is a conflict between the meta robots and canonical tags.
In such cases, the meta robots tag communicates “don’t index this page”, while the canonical tag simultaneously communicates “this is the URL to index”. Search engines are left to guess which signal to honor.
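A minimal illustration of that conflict, with a hypothetical URL:

```html
<head>
  <!-- Meta robots says: do not index this page... -->
  <meta name="robots" content="noindex,nofollow">
  <!-- ...while the canonical tag says: this is the URL to index. -->
  <link rel="canonical" href="https://MyDomain.com/womens-dresses/summer/">
</head>
```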
A Word About Navigation and Hierarchical Grouping
In the examples I used early on in this post, I used a single case example – 9,000 products. In many of the audits I perform, an individual site can easily reach into the tens of thousands or hundreds of thousands of products.
When that happens, it’s tempting to have hundreds of categories, or in any one or more categories, hundreds of subcategories. And just as tempting to end up with a single subcategory containing tens of thousands of products.
There is no one right mix or mathematical formula for how many URLs and links you have at any single level. The most important concept is to always be looking at it from the perspective of your ideal client or customer market.
If a site visitor becomes overwhelmed with choices at any single point, that’s going to be a red flag that search crawlers and algorithms will also become overwhelmed.
When it comes to displaying navigational options to visitors, that too needs to be filtered through human experience. If I see a list of 500 choices on the navigation bar or sidebar (and yes, I’ve seen worse), not only does that overwhelm the senses, it dilutes the refined topical focus vital to maximized SEO.
General Navigation Volume Guidelines
As a general rule, it’s wise to not have more than eight to twelve main categories, and not to have more than eight to twelve subcategories in any single category.
I shouldn’t be assaulted, as a user (human or search engine), with links to “all the things” in navigation. As a result, the more products you have, the deeper your hierarchical silos will need to go.
When I, as a user, go into a single category, I should only see subcategory links that point to content within that category.
A Word About Flat Architecture
One last concept I need to convey in this post has to do with flat architecture. The SEO myth states “the closer to the root domain a page exists, the more important that page is”.
While the very basic concept is sound, at this point in Internet history, flat architecture is almost always toxic.
The scale of today’s sites and niche markets dictates that a flat hierarchical URL structure sends invalid, topically weakening signals.
With a flat structure, you’re communicating “each of these pages is equally important”. That is not only untrue, it dilutes the importance of the pages that truly are more important, and of those with a larger topical reach. It forces search engine algorithms to “figure it all out”, which, as I’ve said previously, is dangerous and likely very harmful to maximized SEO.
“But my URLs will become too long”
Generally speaking, it’s not only acceptable to end up with longer URLs the deeper you go in navigation, it’s perfectly valid for human needs as well. How many site visitors actually read the entire URL of a given page deep within a site?
Obviously, you may need to alter how you generate product detail page URLs. Instead of using the entire 20-word product title, you’d probably be better off crafting a shorter version for this situational need.
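A hypothetical helper for that shortening step; the six-word cutoff and the sample title are arbitrary illustrative choices:

```python
import re

def short_slug(title, max_words=6):
    """Build a shortened URL slug from a long product title."""
    # Lowercase, strip everything but letters, digits, spaces, and hyphens.
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

title = "Liz Claiborne Women's Powder Blue Sleeveless Summer Dress with Floral Print Detail"
print(short_slug(title))  # liz-claiborne-womens-powder-blue-sleeveless
```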
A Final Word
I’ve done my best in this article to bring clarity to the challenges of product category and faceted navigation, and to the solution options for SEO. Even so, I can’t emphasize enough that each situation is unique, and every site has not only its own SEO needs but also any number of potential limitations on implementing fixes.
Like anything else in SEO though, the closer you can get to a healthier state from an existing mess, the better your site will be overall. For humans and search engines as users.
One issue that comes up frequently in my forensic audit work (especially on news and informational sites) is how multimedia pages all too often have little to no content in the form of descriptive, crawlable HTML text. Examples of such pages most often include videos, infographics, photo galleries, and interactive charts.
Publishers all too often think, “this is the content; it’s not text because we know people like visual content”.
And for many site visitors, that thinking isn’t wrong. Unfortunately that can’t be said about all visitors, and all consumers of your content.
Different Mind Models
The first, and most important consideration, is human visitors – actual consumers of your content. Not all people visiting sites are visual in the way they process information most effectively. Or for certain types of content, they prefer reading rather than watching, even if they might otherwise, in different circumstances, enjoy video content.
I most often like to use myself as an example – when I visit a news site to consume information, I almost never go directly to that section of the site where video files are the primary content. I prefer to read words on a page. It’s just how I prefer to process information. And if you force a video file on me, with auto-play on, and auto-sound on, it actually annoys me.
Not that I’m atypical, though. I’ve found over the years that many other people feel the same way.
It’s Not Just Me — Public Opinion on the Issue
To illustrate this, I recently came upon this gem on imgur, a highly popular sharing site where you get the full range of humorous, insightful, and educational content and personal commentary from a wide range of people in all different age groups and from all different socio-economic backgrounds.
Note that imgur is visual in nature, which you may find ironic here. However, it’s a site built ENTIRELY on visual content and text commentary – it’s entirely UGC (user generated content) – and as such, I go to the site expecting visual content AND text commentary, often on a massive scale.
As I was browsing through the site this week, I came upon the above post — that speaks to this very issue. (Warning – foul, crass, and often very disturbing commentary can be found on that site – as a UGC driven site, with a mostly “free-for-all” posting policy, this is reality).
Note how this meme speaks directly to news sites, and the expectation of content presentation type.
Over 1,000 people upvoted this in less than 24 hours – moving this content to the front page – a big deal on imgur – 99% of content NEVER makes it to the front page.
And here are some of the comments on that post:
At the time of my writing, this is the top comment, with 185 points (upvotes) for this one comment. Site abandonment, all because some people prefer not to be forced to watch a video.
Almost all of the comments that follow express the same displeasure with video-only content on news sites, each with its own reasoning, or even other annoyances about why video content on news sites is not liked by some people.
Sometimes, it’s also a matter of circumstance why video based content is not even viable.
Sometimes, it’s specifically related to how the brain works in some people as far as information absorption is concerned.
And the number of comments supporting the frustration in this post is important to note. Some are quite adamant about how they feel.
Note – not ALL who commented agree. Some people did express a different opinion.
Note how the comment below offers a reason why they disagree.
That opinion is one of only two comments on the entire thread (out of fifty-seven comments posted) in opposition to video-only content on news sites.
And let’s not forget – imgur is a VISUAL content site, with commentary below it. The vast majority of value for visitors IS the visual content, however even then, the commentary is often hilarious, enjoyable, or otherwise educational (if you can get past the fact that some comments are vulgar).
Data Usage and Page Speeds
Two additional considerations here are data usage and page speeds. As the world moves to a more mobile-centric media consumption model, data usage sensitivity becomes more important.
In this instance, it doesn’t mean that you should abandon video entirely – we can have an entirely different full-day discussion about that notion. However it is a consideration and needs to be weighed as part of the overall process when we talk about large video files, or interactive charts or infographics.
So, at the very least, remember to consider data usage when deciding “should we use multimedia assets as the primary content for this specific piece of content?” – and weigh that with the ease with which some of those multimedia assets can be created or shared.
Speed is another matter entirely, one that all too often requires much more serious consideration. When I audit a news or informational site, I often find that multimedia assets contribute significantly to slow page loads. When they are the primary content, even asynchronous loading isn’t always going to help you.
So that too is another consideration that needs to be made, especially in this day and age when code and asset bloat from multiple ad networks and shiny object widgets further degrade performance (and even data usage cap considerations).
The Impact of Failing to Accommodate Different Mind Models
Let’s, for this discussion, focus more on cases where you have decided that multimedia assets are the primary content.
When at least some site visitors come to a site expecting written content and they don’t get it, those visitors are more likely to abandon the page entirely, and abandon the site. Some of them will come back again, hoping for a better experience. Others will not.
Some of those visitors will tell others and rant about it online. Not always anonymously – often by mentioning brands in their rant.
Some people who came to the site via a search engine, will go back to the search engine in an attempt to find another source for a topic.
None of that helps overall site quality, authority or trust signals – even when looking at those from a purely human / social perspective.
That then brings up the fact that visually impaired or hearing-impaired visitors are also likely to have problems getting any value, or complete value from your multimedia assets.
As a result, an entire sector of society is left out of the equation. Hearing-impaired people may be able to use a device or a video setting to get transcripts, however have you even considered that when posting multimedia?
Search Engines as Users
Then we have search engines. A statement I have been making for several years is that, like it or not, search engines ARE users – they DO consume your content. And if you want the traffic that comes from search engines, you need to acknowledge this reality.
And no matter how sophisticated search crawlers and algorithms have become, those systems are just not capable of fully translating multimedia content into raw data for algorithmic evaluation purposes. And when they can do that to some extent, all context is lost regarding headlines, bold or bullet-pointed call-outs, and more.
This then causes search algorithms to fail to fully understand content volume, quality, uniqueness, or topical focus.
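As a rough illustration of how little crawlable text a multimedia-first page can actually expose, here's a small Python sketch using only the standard library's `html.parser`. It counts visible text characters versus media elements; the tag list and sample page are my own illustrative assumptions, not a crawler's real heuristics.

```python
from html.parser import HTMLParser

class TextAndMediaCounter(HTMLParser):
    """Count visible text characters and media elements in an HTML page."""
    MEDIA_TAGS = {"video", "img", "iframe", "audio", "canvas"}

    def __init__(self):
        super().__init__()
        self.text_chars = 0
        self.media_count = 0
        self._skip = 0  # depth inside <script>/<style>, whose text isn't visible

    def handle_starttag(self, tag, attrs):
        if tag in self.MEDIA_TAGS:
            self.media_count += 1
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.text_chars += len(data.strip())

# A hypothetical video-only news page: one video, one short caption
page = "<html><body><video src='clip.mp4'></video><p>Short caption.</p></body></html>"
counter = TextAndMediaCounter()
counter.feed(page)
print(counter.text_chars, counter.media_count)
```

A ratio like that (a handful of text characters against the page's primary asset) is the algorithmic version of the problem described above: there's almost nothing for the crawler to evaluate.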
Media type content is always a challenge for SEO. As much as we don’t like to accept it, search engines are users of content. So while the common mantra is “create content for users”, if we want that content given maximized value by those “algorithmic users” (search engines), we need to find ways to accommodate that reality.
Accommodating All Users
Here are the most common suggestions I make to clients in my audits when it comes to ensuring multimedia content pages can be better understood by search engine crawlers and algorithms, and where that also helps address many human users and their needs.
The best approach here is to take the visual content and create a story around it – written-word stories that complement the visuals. Many visitors will only care about the visual content. Others will appreciate the deeper descriptive story.
While it pains me to reference CNN (they do a LOT of things counter to top tier SEO because they CAN – they’re CNN), this is a case where I can point to an example that is relevant.
With CNN, for example, even though that site has a vast volume of “video only” content, a significant portion of content has a video at the top of the page and a written story beneath it. Links to that content, though, are often not labeled as being video based.
One issue I don’t like about CNN is how much of the articles they have about a given topic are written content, and yet they stick a “pseudo-related” video (auto-play, sound on) above it. I came there to read about a current news event, not to watch some archival outdated video.
Don’t use those kind of video placements as an example of “how to do it right” though. Please!
Combining as an Option
Another technique is to combine multiple pieces into fewer individual pages. That brings up an additional set of challenges (such as page bloat and speed problems). However it is one technique to be considered.
Change the Layout
With infographics – especially very big single-image infographics, a good workaround is to take the initial infographic, offer a thumbnail link to it on the main indexable content page, and write an editorial opinion piece on the contents of it. Or if you own the infographic, you can slice it up into smaller pieces, and write content and commentary around each of those smaller pieces.
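If you go the slicing route, the arithmetic is straightforward. Here's a minimal Python sketch that computes the crop boxes for splitting a tall infographic into horizontal segments; the dimensions are made up for illustration, and each box would then be handed to whatever image library you use (Pillow's `Image.crop` takes exactly this tuple form).

```python
def slice_boxes(width, height, slices):
    """Return (left, upper, right, lower) crop boxes that split a tall
    image into roughly equal horizontal segments, top to bottom."""
    boxes = []
    base = height // slices
    top = 0
    for i in range(slices):
        # The final slice absorbs any leftover pixels from integer division
        bottom = height if i == slices - 1 else top + base
        boxes.append((0, top, width, bottom))
        top = bottom
    return boxes

# Hypothetical 800x4100 infographic split into four segments
for box in slice_boxes(800, 4100, 4):
    print(box)
```

Each resulting segment then gets its own written commentary, which is where the actual SEO value comes from.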
Sure, it’s just easier to slap an infographic up, and maybe link to it using the code the creator of that infographic provides. Except THEIR site is the one actually benefiting, since they’re the creator. So if you don’t add your OWN content, what’s the sticky value for YOUR site?
And if you are the original content creator, do you just want your exact same content replicated across half the web if there’s no differentiation?
The Goal for Users and SEO
The goal in all of this is to help meet the needs of more user types (human and search engine originated). So it’s critical that whatever content you create not just be slapped up onto the site artificially.
That may very well mean fewer content pieces being created. Yet when it’s high quality, helpful, educational, informative or emotionally impactful content, the value is worth the effort.
You’re much more likely to get more people to stay on-page longer.
You’re much more likely to get more people willing to share your content.
You’re much more likely to get search engines to formulaically assign higher value scores to that content.
When you do that work with enough individual pieces of content, as a result of those value-gains, you’re much more likely to see a given section of the site, and in turn, the site overall, improve in search engine ranking.
Sure, it’s quite often helpful to increase site visits through better SEO. (And this post doesn’t advocate NOT wanting that.) More visits are what most people initially think SEO is about, once they get past higher rankings…
What if I told you it’s worthless in many instances, if increased visits don’t lead to increased sales or revenue?
Whether that revenue is from “here’s display ads, we need you to click on these to keep our site going” or “here’s stuff for sale, we need you to buy this stuff” or “here’s our services, we need you to fill out the contact form or call us, to hire us”, that’s what having a business site is all about, after all.
Unless you have a business built on page views for advertisers. In that case, you need a lot more help than this post can offer. Because that 20th century business model is on the way out the door and on its way into “this was never a real, sustainable business model anyhow, but now the publishing industry is finally forced to admit it” realm…
[ Note — Before we continue, Moosa Hemani just went live a few days ago with an excellent post on improving your conversion rates on an eCommerce site. It’s a great write-up and I encourage you to read that because it covers critical aspects of CRO beyond what I’m covering here… ]
Moar Ads Moar Pop-Ups Moar Moar Moar…
And no, the answer to increasing revenue is also not “More ads, bigger ads, more obnoxious ads”. That’s just downright visitor abuse. It breaks the unwritten contract between publishers and visitors covering the concept of respect, trust and value.
So what if I then told you that more visitors isn’t even necessarily worth the effort?
Instead, what if it’s “better”, “more qualified” visitors? And what if I told you that once those visitors arrive, it’s “better User Experience”?
As a forensic SEO consultant, a good amount of my work involves conversion rate optimization, because it’s not good enough to just get more visitors to a web site if you can’t also increase conversions. So where do I begin?
Start with Existing SERPs
When a new client comes to me for an audit, one of the data points I look at is click-through rates from organic search. It’s a quick way to see if there might be weakness in page Titles and Meta descriptions. If there is, then the site is likely missing out on tremendous existing opportunities. Opportunities that, with the proper effort, can be tapped with a one-time, sustainable change.
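As a rough sketch of what that first look can involve, here's a minimal Python example that flags query/page pairs with plenty of impressions but weak click-through. The sample rows, thresholds, and field order are all illustrative assumptions, not a standard export format.

```python
# Hypothetical rows as exported from a search analytics source:
# (query, page, impressions, clicks) -- the values are made up for illustration.
rows = [
    ("mazda tail light replacement", "/parts/mazda/lights", 12000, 120),
    ("oem mazda parts", "/parts/mazda", 8000, 560),
    ("replacement tail light bulbs", "/blog/bulb-guide", 5000, 45),
]

def low_ctr_pages(rows, min_impressions=1000, ctr_threshold=0.02):
    """Flag query/page pairs with enough impressions but weak click-through."""
    flagged = []
    for query, page, impressions, clicks in rows:
        if impressions < min_impressions:
            continue  # too little data to judge fairly
        ctr = clicks / impressions
        if ctr < ctr_threshold:
            flagged.append((query, page, round(ctr, 4)))
    return flagged

for item in low_ctr_pages(rows):
    print(item)
```

Pages that surface here are the ones where a rewritten Title and Meta description can pay off without any change in ranking position.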
Some people talk about “quick wins” or “low hanging fruit” in our industry. Except many focus on other concepts around that — lower volume search phrases that might have less competition, for example. That’s okay — those can be beneficial as well. Here however, I’m talking about not being afraid to focus on even high value phrases. Heavy competition phrases.
[ Side Note — in my audit work, I do my best to encourage clients to avoid the pitfall of only focusing on the lower competition scenario. Sure, sometimes given very limited resources, or extremely heavy competition, that makes sense as a primary need. Yet quite often, I find that with higher emphasis on the 5 Super Signals of SEO (QUART), smaller sites can often compete in the big leagues. So where it makes sense, I encourage clients to think bigger! ]
Start with Most Important SERPs
Since my work is strategic in nature, and not tactical, and needs to cover a vast range of SEO rather than any one aspect, I don’t just look at “all the phrases”, and I don’t look at 2nd tier or 3rd tier phrases (phrases that relate to secondary or support type content). Instead, I focus on a sampling of those phrases the site is already showing up for somewhere in search results where those phrases are most aligned with conversion point value.
Looking at those results, where are there opportunities for increased clicks that better align with conversion rate goals? Not just “move up in rankings”; more than that, improve existing rankings right where they are, as its own effort.
Conversion Point Value
When I talk about “conversion point value”, I’m referring to the notion that people psychologically and emotionally go through phases in a purchase decision process.
If you can help someone get past the notion (good luck!) that “we have to rank for all the things”, or “but all we care about is showing up for our name” or “we need more visitors!”, you can work on or guide the process of improving revenue in ways that have the most impact based on the most efficient use of resources.
To evaluate this, I will often start with conversion improvements by looking at existing organic search results. What is a site showing up for in organic results? Within that, what are the phrases people are using where those phrases are most relevant to the products a site offers?
Looking at those, when we examine the page Title and meta description that show up for a given phrase, it’s important to ask several questions:
“Is this the best page on the site for this phrase?”
Right SERP, Wrong Page
Sometimes a blog post shows up in a result set that’s more of the “ready to buy” type in regard to searcher intent. If that’s the case, the first opportunity is to figure out why a more direct sales type page isn’t ranking on your site. Work to correct that.
Do you even have a proper “sales” specific page (product category or product purchase details page) set up on the site for that topic?
If you do, consider what you can change to reorient the focus from the blog post or FAQ page or information page over to that sales funnel specific page.
[ NOTE — if you have a sales page that’s more appropriate for the ‘buy now’ searcher need, a stop-gap task to consider as you work on moving the ranking signals over, is adding a paragraph at the top of the main content area of that non-sales-specific page or blog post that links to the better destination.
By doing this, people who are expecting a sales oriented page won’t be as likely to completely abandon the site when they don’t immediately get to the “right” page for their current goals. ]
The Buyer Mindset Multi-Step Life Cycle
Focus on understanding that people looking to make a purchase go through a multi-step psychological/intellectual process. Sometimes that’s all in one day, and sometimes that’s over an extended period.
Primary points in that process include:
“I’m thinking about buying, just not sure, so let me explore”
“I’m looking to buy, just not sure who or where to buy from”
“I’m looking to buy, and have narrowed down some likely sources, just not sure what the exact thing is that I want or need”
“I want to buy today”
Match Target Pages To Buying Life Cycle Needs
Crafting content and user experience within a site & across social channels to align with the right point in the decision process is essential to strong conversion rates. Don’t try and get all of that done on one page. Don’t attempt to rank for “all the stages” at once, for “all the things”. Take it in manageable steps. Ask the important questions at each step.
“Is the Title of that page in this search result one that reinforces relevance properly, while reinforcing trust? For that stage in the buying life cycle?”
“Is the Meta description that shows up in the search result strong in further reinforcing relevance specific to searcher intent, and does it convey authority and trust?”
Quite often I see cases where the page Title is too generic and confuses relevance, and where Meta descriptions are keyword stuffed or fail to motivate click-through.
In many cases, Google will “decide” through algorithmic processes, that the provided description is not ideal for this search result.
When that happens, you can end up getting junk for a description, though sometimes you will see something that’s at least “somewhat” more helpful for invoking relevance, authority or trust signals.
Search Phrase: Mazda Tail Light Replacement Parts
Current Page Title: OEM Replacement Auto Parts — Mazda, Mitsubishi, & Toyota
Current Meta Description: Mazda — Miata, 626, Mazda3, Mazda 6 — 2009, 2010, 2011, 2012, 2013, 2014, 2015! engine components, transmission parts, body repair…
Even if the page that’s linked to is an actual “Mazda Tail Light Replacement Parts” page, that page Title is not helpful to communicate “this is a page you’re looking for”. And the description is trying to “be all things to all the people!” and in no way conveys “this site has what you’re looking for right here” or “we are a highly trusted supplier” or “we’re the best site with the best prices” or anything else that matches to the psychology of buying decisions…
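As a quick illustration of how you might sanity-check a Title and Meta description against a target phrase, here's a hedged Python sketch run against the example above. The length cut-offs are rough rules of thumb, not exact SERP display limits, and a phrase-containment check is obviously far cruder than real relevance evaluation.

```python
def snippet_warnings(title, description, target_phrase):
    """Rough sanity checks on a page Title and Meta description.
    Length cut-offs are rules of thumb, not exact SERP limits."""
    warnings = []
    if len(title) > 60:
        warnings.append("title may be truncated in results")
    if target_phrase.lower() not in title.lower():
        warnings.append("title does not reference the target phrase")
    if len(description) > 160:
        warnings.append("description may be truncated in results")
    if target_phrase.lower() not in description.lower():
        warnings.append("description does not reference the target phrase")
    return warnings

# The example Title/description from above, checked against the search phrase
for w in snippet_warnings(
    "OEM Replacement Auto Parts - Mazda, Mitsubishi, & Toyota",
    "Mazda - Miata, 626, Mazda3, Mazda 6 - 2009, 2010, 2011, 2012...",
    "Mazda tail light replacement parts",
):
    print(w)
```

A check like this won't tell you whether a Title conveys trust or matches buying psychology, but it will quickly surface the snippets that can't possibly be doing their job.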
So that’s a great starting point – working with people who are already finding the site in search results, yet where you can get more clicks from people ready to buy.
That can take very low click-through rates and not just increase visits to the site, but increase the kind of visits that lead to higher conversions.
A/B Testing Titles & Descriptions
While you can use many different methods to figure out how to improve page Titles and Meta descriptions, one way is to rely on existing data you already have if you invest in AdWords or other pay per click advertising.
If you do, and if the person/team responsible for that channel knows what they’re doing, they may very well have done extensive A/B testing on those. In that case, leverage that data and knowledge. It can go a long way to providing real world value to better organic Title and description writing.
Another way to come up with improvements, when done properly, is to take a sampling of phrases you show up for, and look at what competitors are doing where those competitors consistently outrank you. How are they seeding page Titles and Meta descriptions?
Within that, is there anything you can do in writing your Descriptions to show a value-add, or a competitive strength? Don’t just copy competitors — look for something that sets YOUR brand, YOUR site, YOUR pricing, YOUR return policies, YOUR inventory selection apart from the competition.
Matching On-Site Signals to SERP Signals
From there, it’s about working on the site – and especially on those pages – to reinforce the five super-signals of SEO all around, and more specifically, to build confidence in site visitors’ minds that “this is exactly what I want” or, alternately, “this site makes it effortless for me to refine exactly what it is I am looking for”. Focus on that, and you go a long way toward strong conversions.
So using the example above where a “right page” matters, use the QUART concept to evaluate what’s going on:
Is the page this SERP links to the best page for the search queries people click through to come to the site?
Is the site designed aesthetically to convey a high level of quality overall, and trust?
Is the section this page resides within supportive of reinforcing additional value?
Is this page set up to quickly confirm the visitor came to the right place?
Is the information provided easy to read, evaluate and understand?
Does the page overall make it effortless for the visitor to achieve the goal they came here for?
If the answer to ANY of those questions is “sort of” or “not really”, you have work to do!
Remember — it’s not about YOUR opinion from YOUR perspective. It’s about the VISITOR point of view.
Mobile is The Same, Mobile is Different!
When you’re working on these issues, realize that mobile SEO involves a lot of the same considerations I’ve described already. What applies to desktop/laptop search and on-site, applies to mobile.
Except it’s not 100% parity. Mobile searches involve different kinds of search words quite often — different language. Especially as we now move more into a world of asking Siri or Google (or yes, even Amazon Echo — oh wow…). And mobile user experience is different, by leaps and bounds.
[Note: sometimes mobile visitors will come to a site on an initial phase of the buying decision process, and then come back to complete the purchase from a desktop or laptop computer. Because of this, at least some transaction rates / conversion rates may legitimately be lower for mobile visits. However when the transaction % is so dramatically off compared to desktop/laptop rates, it’s critical to confirm that is the case, or discover where that’s not the case…]
More More More…
Everything I’ve covered here is valid and based on sustainable success in SEO. There’s always more to do, more to consider, both for overall SEO and for conversion rate optimization. However these are strong starting points. So I encourage you to get to work with these. You can always circle around later to go deeper, or build on this work.
If you think you need help in figuring all of this out, I highly encourage you to consider hiring us for a proper, forensic level audit. Because these issues can become nuanced, and they’re never the only thing to consider — you don’t want to do all this work only to find out you have sixteen other problems that are equally or even more important for SEO.
If you’ve made it this far and you want to add anything, please leave a comment. I greatly appreciate the collective mind value in our industry and on these topics! Don’t be afraid to challenge what I share, either! If you have a different opinion, let me know!