It’s an SEO’s worst nightmare. You’ve got a great new client, but their website is a pain to work with. Worse still, they’ve just bought it, so you’re stuck with it long term. If only all web designers were SEOs … wait, let’s not go there. If only all web designers consulted SEOs, we’d be much better off.
What does the goog say?
It’s Google’s modus operandi to serve results that it won’t be embarrassed by. All of its algorithms and complexities are essentially attempts to teach its bots to imitate or predict the response of a person: the judgements the robots make are supposed to line up with the judgements a person would make. A result that doesn’t match the intent behind the search frustrates people, so the robots try to predict whether a result is likely to cause that reaction. If they decide it is, the site is less likely to be served to Google’s customers. If the site Google serves is an ugly trainwreck, a person clicks away — and Googlebot aims to identify exactly that.
Google tries to serve the result that is going to make the person the most happy. The most recent algorithm update is an extension of this philosophy: Google Hummingbird is a response to technologies like voice search, wearable tech and reality integration, and it aims to clean up the “long tail” by identifying the specific intent behind phrasings. “Conversational search” is the term thrown about. Google would like to think that as long as you make a good website that people will like, you’ll live happily ever after and achieve amazing SEO success.
Well why are we even having this conversation!?
So as long as your site is good for users, it should be good for the bots, right? No, of course not. If the robots were that good, we’d already have Google Androids walking the streets acting as the human race’s personal reality concierges.
Let’s be intelligent about this. Instead of turning this conversation into an A vs. B debate, let’s realise the truth of the matter: you really have to do both. I like the concept of Mobile First Design, so let’s call this Robots First Design.
In my experience, it’s far more difficult to undo issues that make a website hard for robots to analyse and index than it is to make the site good to look at. SEO-friendly web design needs to be a focus at every point of the design and development stages of the website. Once you’ve ensured that the following concepts are well established in the creation of the website, you can go ahead and put a fancy dress on it.
3 Key Robots First Design Concepts
1. Landing Pages
Long gone are the days of stuffing your page title and web page with five different keyphrases and getting it ranking for all five. On-page relevance is a big deal now, and you have to ensure that you have specific pages for specific things.
If your site is strong enough for your home page to rank for certain phrases based solely on where your links and citations are coming from, then I can guarantee that a good internal landing page on the subject will rank even stronger.
Take a plumber, for instance. Say you rank well for blocked drains because your citations mention it as a service you’re good at, or you have links from blog posts about ways to fix them, and the page that ranks is your home page. An internal page dedicated to blocked drains is going to rank even better than the home page.
Landing pages aren’t just for campaigns, but there’s one important thing that you need to consider …
2. Internal Linking
What’s the point of having that landing page for E-Commerce Design if Googlebot can never find it? Internal linking has always been an important element of onsite SEO, and it’s more important than ever. Linking to your pages as often as possible ensures that Googlebot can find all your deeper content and serve it when it’s the more relevant result.
I have a hypothesis that is in the early stages of testing. I’ve begun analysing web pages to see where you’d end up if you just kept clicking the first link on each page. My hypothesis is that Googlebot essentially does this, so you have to make sure you don’t drive it straight off your website. The results have been interesting. On one very strong website, where that walk reaches a link to another site within three clicks, the only internal pages that rank are those with links pointing to them from outside the website; the home page outranks all the internal landing pages for their target phrases, and those landing pages are linked only from the footer menu of the home page.
A different website, with the internal landing pages linked from a products drop-down in the main menu, has every single landing page indexed and ranking above the home page.
Alternatively, the results could suggest that the footer credit link devaluation affects internal links as well as outbound ones. Regardless, a lot of emphasis should be placed on your internal linking protocols.
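The first-link walk described above can be sketched in a few lines of Python. This is a minimal illustration against hypothetical static pages (the URLs and page contents are invented for the example), not a real crawler: it repeatedly takes the first `<a href>` on the current page and stops when the walk loops, dead-ends, or leaves the site.

```python
from html.parser import HTMLParser

class FirstLinkParser(HTMLParser):
    """Records the href of the first <a> tag encountered on a page."""
    def __init__(self):
        super().__init__()
        self.first_href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a" and self.first_href is None:
            href = dict(attrs).get("href")
            if href:
                self.first_href = href

def first_link_walk(pages, start, max_hops=10):
    """Follow the first link on each page; stop on a loop, a dead end,
    or when the href is not a page we host (i.e. the walk leaves the site)."""
    path, current = [start], start
    for _ in range(max_hops):
        parser = FirstLinkParser()
        parser.feed(pages[current])
        nxt = parser.first_href
        if nxt is None or nxt in path:
            break  # dead end or loop
        path.append(nxt)
        if nxt not in pages:
            break  # walked off the site
        current = nxt
    return path

# Hypothetical three-page site whose home page's FIRST link is external.
pages = {
    "/": '<a href="http://other-site.example/">Partner</a>'
         '<a href="/services/">Services</a>',
    "/services/": '<a href="/services/drains/">Blocked drains</a>',
    "/services/drains/": '<a href="/">Home</a>',
}
print(first_link_walk(pages, "/"))
# → ['/', 'http://other-site.example/']
```

Under the hypothesis, this site loses the first-link crawl after a single hop — the internal landing pages are never reached — whereas moving the Services link ahead of the partner link would keep the walk on-site.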
3. Schema Mark-up
Helping Googlebot understand what your page is about at a deeper level will help with conversational search. Schema.org provides a mark-up vocabulary to denote whether pages are about products, locations, categories and so much more. And the vocabulary is ever-growing.
Google is aiming to understand why you’re looking for something, not just what you’re looking for. By applying Schema mark-up to your content, you’ll be more likely to appear in search when the result you’re offering matches the intent of the user — meaning a higher conversion rate, a lower bounce rate and a happier digital marketing division.
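To make this concrete, here is a small sketch that builds JSON-LD mark-up for the plumber example from earlier. The business name and service are invented for illustration; the `@type` and property names (`Plumber`, `makesOffer`, `itemOffered`, `areaServed`) come from the schema.org vocabulary.

```python
import json

# Hypothetical structured data for the blocked-drains landing page.
markup = {
    "@context": "https://schema.org",
    "@type": "Plumber",                      # schema.org LocalBusiness subtype
    "name": "Example Plumbing Co",           # invented business name
    "areaServed": "Springfield",             # invented service area
    "makesOffer": {
        "@type": "Offer",
        "itemOffered": {
            "@type": "Service",
            "name": "Blocked drain clearing",
        },
    },
}

# The serialised result would be embedded in the page head as:
# <script type="application/ld+json"> ... </script>
print(json.dumps(markup, indent=2))
```

The mark-up lives alongside the visible content, so adding it changes nothing about the design — it just tells the robots explicitly what the page is about.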
So, to summarise: Robots First Design is a simple principle you can live by to ensure that your website ranks to its full potential. The best part is that it really doesn’t limit design at all. You can have the best of both worlds.