“How to make friends. And influence robots” - Martin Belam at BrightonSEO

 by Martin Belam, 14 September 2012

This is a version of the talk I gave at BrightonSEO today.

My links to search engines and SEO go back a long way. In fact, one of the first ways I remember learning about how websites were put together was using Hotbot to find sites where it had crawled “Index of” pages, which led me to servers that hadn’t locked down their directory view. It meant I could explore all the code and assets that showed how they were put together.

My first proper digital job was at the BBC. As “Registration co-ordinator” I was responsible for trying to increase search engine traffic to the BBC, often by literally registering new URLs with the search engines of the day - a hotchpotch of names like Excite, Altavista and Lycos that I suspect most modern-day SEO and social media gurus have never heard of. In some ways it was an editorial role, and I went on a sub-editing course to learn how to hone the description of any BBC site or service down to the 25-word maximum allowed in the Yahoo! directory.

I also started going to SEO conferences, where, amusingly, I’d run into lots of people on the “black hat” or “naughty SEO” side of the fence. Saying you were a goody-two-shoes from the BBC would usually clam those conversations up pretty quickly. Within the BBC, with my colleague Anne Scott, I toured web production centres, teaching teams about “Search engine success”.

Over the years I’ve gradually moved away from SEO, through product management and information architecture, into the user experience work I do now. But I’ve always kept my eye on the SEO industry - and my fascination with search has never dimmed. After all, typing words into a box and pressing “Search” is one of the most verbose interactions a user can make with your site or the web as a whole.

Here are five things I think are important to make sure that your SEO efforts are part of making a broader, better, user experience.

Headlines for news

I’ve spent a lot of time working in news organisations, where SEO often has a bad reputation. I once blogged a talk by Guardian SEO editor Chris Moran about how he approached it. Chris said that “A lot of the opinions people have about SEO are based upon prejudice and a lack of understanding about how search engines work.” He observed that a lot of the words we use about readers who come to news sites via search engines are the kind of words we associate with sexual infidelity, like “promiscuous”.

Chris argues that it is easy to forget that every search query starts with a human being at a computer asking a question. Nearly every site, regardless of whether it is informational or transactional, has a news component - new products, new press releases, new reviews - and these need headlines. These have to be not just optimised for search queries, but designed to capture the interest of that human being and their question. It is possible to craft a headline that is optimised for search yet still inventive and witty.

One of the most important things to remember as a publisher is that you no longer control the context in which your headline appears. You need to think about how this nugget is optimised around search keywords, yet still conveys meaning and grabs attention when it is in the context of a search results page, or a tweet, or shared on Facebook. The classic mistake is to forget to refer to a key component that is obvious on the page - because there is a great big picture of the thing - but which becomes invisible when shared. Newspapers are particularly bad at doing this with big interview features, where the name of the interviewee or a notable interviewer is omitted.

Whenever I wireframe or prototype I always try to use real content wherever possible. Move an Observer headline like “He just wants to be loved” from a trailer where it appears alongside a picture of Ricky Gervais into a Facebook status update that says “Martin Belam read ‘He just wants to be loved’ on The Guardian”, with no contextual picture of Ricky, and the headline becomes uninformative and uninteresting.

Navigation for people

Navigation is important for people. Spiderability is important for search engines. The two are not incompatible - but that doesn’t mean publishing one flat page of a site map. Or organising the navigation on your website based on who sits on what floor in the office. It means thinking carefully about how users flow through your site. Use card-sorting or tree testing exercises to help model your content around how your target audience would organise it, not around how your business is organised.

And if you are having to do something clever with redirects, cloaking, JavaScript or no-JavaScript fallbacks, or dodging Flash in order to make sure that spiders can find their way around your site - you’ve probably broken it for humans too.

Using “hot topics” to build up link authority is a great idea, but don’t forget that navigation sets a tone for your site. There is a reason why the Guardian has “Culture” in its primary navigation, and the Daily Star has “Celebs” and “Babes”. It sets user expectations around content. So if you are trying to optimise for slightly off-tone or off-brand terms, make sure that doesn’t leak into your site’s navigation.

And if you are cramming the lower decks of your web pages with lots of links in tiny type, think about what that is conveying to your users when they hit the bottom of the page. Are you putting clear and helpful calls to action for what the user might do next, or are you just thinking about anchor text?

Forget the crazy antics

I really question the value of what I would call “crazy antics” for most sites. I think the epitome of this for me was the craze for using “nofollow” on internal links to “sculpt” PageRank for your site. The logic ran that each page only has a finite amount of “link juice” to distribute to the rest of your site, so why waste it on links to your privacy policy, or this unpopular product page, or that unimportant page?
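To see the logic the sculptors were relying on, here is a minimal sketch of the simplified, textbook PageRank calculation - not Google’s actual algorithm - run on an invented three-page site. Each page divides its rank equally among its outbound links, which is the “finite juice” intuition the whole craze rested on.

```python
# A toy illustration of the "link juice" model behind PageRank
# sculpting. This is the simplified textbook formulation, not
# Google's real ranking system; the three-page site is invented.

DAMPING = 0.85
ITERATIONS = 50

# page -> pages it links to
links = {
    "home":    ["product", "privacy"],
    "product": ["home"],
    "privacy": ["home"],
}

ranks = {page: 1.0 / len(links) for page in links}

for _ in range(ITERATIONS):
    new_ranks = {}
    for page in links:
        # A page's rank is shared equally among its outbound links,
        # so every extra footer link dilutes what the rest receive.
        incoming = sum(
            ranks[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_ranks[page] = (1 - DAMPING) / len(links) + DAMPING * incoming
    ranks = new_ranks

for page, rank in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {rank:.3f}")
```

In this model, dropping the link from “home” to “privacy” does push more of home’s rank at “product” - which is exactly the fiddling I’m about to question.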

Two things struck me about this.

Firstly, if the pages are so unimportant that you don’t want people to be able to find them - why have them at all? I mean it. Sure, a lawyer might argue that you need 73 pages of legalese on your site. But really, why? If you don’t want the pages found, find a way to kill them, or to merge them all into one unwanted page, or to avoid linking to them from every single page. Don’t mess about having 17 links to pages you think are unimportant on every page - simplify your pages for your users, and for the search engine robots.

Secondly, I envy anybody who has so little else to do to improve their site that fiddling around with which links in the footer have “nofollow” on them is a key task. Writing better copy, creating new content, doing usability testing on the existing flows - any of those things seems likely to reap a greater, longer-lasting benefit.

OK, I’m probably picking on PageRank sculptors unfairly, but my point is this. Every time you disappear down the rabbit-hole of some very nerdy SEO activity that is effectively chasing the algorithm, ask yourself about the opportunity cost of not improving something else on your site for humans instead.

Site performance is key

One of my former bosses gave his name to “Loosemore’s Law”. Tom Loosemore pointed out that our collective patience with things loading on the internet decreases as bandwidth improves. So, for example, whereas I used to happily sit all day with my computer connected to Napster over dial-up trying to peer-to-peer download obscure Radiohead bootlegs, 4G is being advertised with the idea that finally you’ll be able to download an HD movie in two minutes, instead of the twelve-minute eternity it currently takes over 3G. Bloody hell humans, you can download a movie onto a movable computer in your pocket over the thin air and you are quibbling about it taking a few extra minutes? What are you like?

Well, human beings make decisions fast. Lightning fast. And the speed with which your page loads is a key determinant of trust. In almost the blink of an eye, the speed of your site will influence the user’s perception of it.

The user experience of your site isn’t just “the bit where they are on your site” - it includes the bit before. So “Search for ‘thing x’...click link...wait for page to load...wait...wait...wait...” isn’t a good user experience, however well you’ve ranked. It makes users much more likely to “pogo” back from your site to the search results page to click another link. All your efforts to rank highly for that keyword have been for nothing, because you didn’t pay enough attention to server performance.
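Server performance is also easy to start measuring. Here is a minimal sketch using only the Python standard library that times how long a server takes to return each page; the URLs are placeholders, and real users also wait on rendering, images, scripts and ads, which a check like this doesn’t capture.

```python
# Rough server response timing - a minimal sketch, not a full
# performance audit. Substitute your own landing pages for the
# placeholder URLs below.
import time
import urllib.request

URLS = [
    "https://example.com/",
    "https://example.com/some-landing-page",
]

for url in URLS:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()  # fetch the whole body, not just the headers
    elapsed = time.perf_counter() - start
    print(f"{url}: {elapsed:.2f}s")
```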

The law of unintended consequences

Don’t underestimate the power of micro-copy and small layout details in influencing how humans behave when faced with digital content. The keywords that people search for may not necessarily be the best keywords to drive conversion once you have attracted them to your site.

A company like Facebook obsesses over the small details on the page, using data to verify its decisions. The precise shade of green and the wording on the “Sign up” button are based on observing exactly what is most effective. I’ve seen BA talk about how multivariate testing of one page in their sales funnel helped increase conversion, based on very small changes to the button and layout.

On-page optimisation with A/B testing is also very useful. At the Guardian we found we could vary the number of people likely to leave a comment underneath an article with slight changes to the way we worded the invitation to join the discussion. When testing variations of our sharing buttons, we discovered we could get a 25% uplift in sharing by placing the buttons in a specific location.
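The mechanism behind this kind of copy test is simple: assign each visitor deterministically to a variant, show them that wording, and compare the resulting action rates. A minimal sketch - the variant wordings are invented, and this is not the Guardian’s actual setup:

```python
# Deterministic A/B bucketing - a sketch of the mechanism behind
# copy tests like the ones described above. Variant wordings are
# invented for illustration.
import hashlib

VARIANTS = {
    "a": "Join the discussion",
    "b": "Have your say - post a comment below",
}

def bucket(user_id: str, experiment: str) -> str:
    """Hash the user and experiment together so a given user always
    sees the same variant for a given test."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    keys = sorted(VARIANTS)
    return keys[int(digest, 16) % len(keys)]

# Every page view by the same user lands in the same bucket.
for user in ["alice", "bob", "carol"]:
    variant = bucket(user, "comment-invite-wording")
    print(user, variant, "->", VARIANTS[variant])
```

Log which variant each user saw alongside whether they went on to comment or share, and comparing the rates is straightforward.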

The lesson here isn’t that you should be doing A/B testing instead of SEO, but that you should be aware that if you are reconfiguring the layout or copy of your pages with only an eye on the SEO implications, you may be missing a trick, and unintentionally harming the effectiveness of the page that the user lands on.
