Note: Google rolled out Penguin 2.0 a few days ago, which affected 2.3% of queries. This post is not written to discuss any possible future update by Google – you’ll see no hypothetical Zebras here, no bigger Penguins, no deadlier Pandas and, in fact, nobody from the zoo.
Now this means the title is a little misleading, right? Wrong.
Let me ask you a couple of questions:
- For how long will we pray at night that Google doesn’t release an algorithm update tomorrow?
- For how long will we do things that can trigger a manual unnatural link warning from Google or even get us penalized?
I argue that it makes little sense to worry every other day about what Google’s next update will be (though it’s always intriguing). It’s high time we made our sites future-friendly (or let’s call it Google-update-friendly): not based on manipulation, delivering a great user experience and leveraging the latest SEO best practices.
Since we’re talking about keeping both Google and users happy, let’s dive into the things you should and shouldn’t do.
Let’s Start With The Things You Shouldn’t Do
Many of us have unfortunately spent our careers doing the job in the worst possible ways: article syndication. Advertorials for do-follow links. Comment spam. Hacking. Keyword stuffing. Over-optimized anchor text. Excessive internal linking. Thin content. Scraped content. Fake likes, +1’s and tweets…
I think most of you will agree with me on this: the dark SEO practices – though useful to some extent back when Google’s algorithms weren’t so intelligent – have tainted our name badly. It’s time to do away with them once and for all, so that people learn to differentiate between SEOs and spammers.
Then there are areas that damage a site simply because we don’t set them up or maintain them properly – areas such as website speed and robots.txt.
Let me explain.
Fat websites have become a rather disturbing trend in an age when more and more people use mobile devices to access the Web. We have all endured mobile network bottlenecks for, well, forever, so we know how much a slow website irritates users, especially those on low bandwidth. Study after study has shown that a website’s speed has a great impact on user experience and, of course, conversions. Google even uses site speed as a ranking factor.
What can you do to improve the speed of your website, you ask? Start small: set a Web page size limit (50KB max). Then go on to minimize HTTP requests and optimize images using lazy loading; leverage conditional loading if you have a responsive website. Google has laid down some very useful Web performance tips.
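As a small, hedged sketch of the lazy-loading idea (the file names below are hypothetical; loading="lazy" is the native browser attribute, which older browsers simply ignore):

```html
<!-- Load the above-the-fold image immediately; hero.jpg and
     gallery-1.jpg are hypothetical file names -->
<img src="hero.jpg" alt="Product hero" width="600" height="300">

<!-- loading="lazy" asks the browser to fetch this image only when it
     nears the viewport, cutting initial page weight -->
<img src="gallery-1.jpg" alt="Gallery photo" loading="lazy" width="600" height="300">
```

Setting explicit width and height also lets the browser reserve space before the image arrives, avoiding layout shifts while the page loads.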
Here’re some other great resources:
- How To Make Your Websites Faster On Mobile Devices
- How To Lose Wait On Your Website By Increasing Page Load Speeds
- 15 Simple Yet Effective Techniques to Speed Up Your WordPress Site
We all know that robots.txt helps us keep crawlers away from the less significant pages of a website. But a page blocked in robots.txt can still be indexed (because of the links pointing toward it) even though Google can’t crawl it – so it may show up in search results a little unusually (i.e. without a meta description). If you want a page kept out of the index entirely, use the robots meta tag (noindex) instead, and leave the page crawlable so Google can actually see the directive.
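A minimal sketch of the robots meta tag in question (it goes in the page’s head, and the page must not also be blocked in robots.txt):

```html
<!-- Keep this page out of Google's index while still letting its
     links be followed; Google must be able to crawl the page -->
<meta name="robots" content="noindex, follow">
```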
The most common robots.txt mistake is accidentally blocking an entire site (Disallow: /). Since this can remove a site from Google’s results altogether, double-check your directives.
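To illustrate the difference, a hedged robots.txt sketch (the paths are hypothetical):

```
# DANGEROUS: these two lines block the entire site from crawling
# User-agent: *
# Disallow: /

# Safer: block only the sections you genuinely want kept away
User-agent: *
Disallow: /admin/
Disallow: /search-results/
```

The dangerous version is shown commented out; a single stray slash is all it takes, which is why this file deserves a review on every deploy.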
Things You Should Do
Deviate From Your Typical Tasks A Bit
As SEOs, our responsibilities are manifold. We do SEO audits, keyword research, competitive analysis, link analysis, on-page optimization, link building and more – and we often do it all to help our clients get those much-coveted page #1 rankings. What if we change that a bit? What if we start giving a shit about things such as content, social, conversion rate optimization and UX? This will not just help us build their brands and get them better rankings and more customers – it’s also a great way to create happy clients.
Stop Creating Bullshit
I loved Brad Frost’s presentation Death to Bullshit, which inspired me to write a post of my own – If SEO undermines user experience, it’s bullshit. As responsible SEOs, we need to understand that bullshit links, bullshit content and bullshit usability might help us fleece our clients of their money, but it’s all ultimately futile. Those clients are never coming back. That money won’t help us grow. It’s like getting paid to litter a shitload of bullshit all over the Web – our Web.
Since we as SEOs play a crucial role in the development of a website and in shaping its place on the Web, it’s our responsibility to embrace awesomeness and help others embrace it, too.
Never Forget SEO Best Practices
It’s good to make the most of evergreen SEO practices such as creating compelling content and using the alt attribute on images. But it’s even better to take care of the things your competitors might be missing. Let’s walk through some of them:
Use Google Plus authorship to get those cool, more clickable search results and to enjoy more traffic.
Since Google is likely to give more weight to social signals and author trust in the future, it’s better to make the most of authorship sooner rather than later. Not convinced? Recall what has lately become one of the most quoted statements: “Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.” – Eric Schmidt. Still not convinced? Then go through Jeff Sauer’s awesome post on Google Plus.
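Authorship is typically wired up with a rel="author" link pointing at the author’s Google Plus profile. A minimal sketch – the profile ID below is a made-up example:

```html
<!-- In the <head>, or on the byline link itself: connect the page
     to its author's Google+ profile (hypothetical profile ID) -->
<link rel="author" href="https://plus.google.com/109876543210987654321">
```

The profile, in turn, should list the site under its “Contributor to” section so the connection is verified from both ends.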
- Facebook’s Open Graph protocol and Twitter Cards help give a wonderful experience around content shared socially. If the metadata needs some editing, edit it; if it isn’t available, don’t use it at all. It’s better to stick with the defaults than to give a poor experience.
- Schema.org markup is a great way to annotate your HTML in a way that search engines love.
- Rich snippets: ecommerce sites that have been using them for a while now (for product information, reviews and ratings) know their true importance. But rich snippets aren’t limited to ecommerce; they’re used for people, businesses, media, events and recipes, too.
- It’s a real must to cater to on-the-go users nowadays. Though Google recommends responsive design, it isn’t against dedicated mobile sites in any way (unfortunately, many have badly misunderstood Google’s mobile recommendations). It all boils down to business needs and budget.
- Use hreflang="x" if your business is spread around the globe. It helps Google serve the correct language version in its SERPs.
- Use rel="canonical" to set the preferred version of a Web page and keep duplicate content issues at bay.
- Similarly, use a "view all" page to make paginated content prominent for search engines.
I know it will take some time before we can truly stop worrying about Google’s future updates, and I’m hoping it won’t be long. But it’s something we all want: a life that’s not haunted by Google’s ever-evolving algorithm.