Monday, 29 February 2016

The Penguin in the room: what to do until Google rolls out its latest update

Google’s Penguin 3.0 update affected less than 1% of U.S./English queries in 2014. Granted, Google processes over 40,000 search queries every second, which translates to a staggering 1.2 trillion searches per year worldwide, so even at less than 1%, Penguin 3.0 still hit up to 12 billion search queries.

What’s scary, though, is that Penguin 3.0 wasn’t even that bad by Penguin standards. Penguin 1.0 hit 3.1% of U.S./English queries, or 37.2 billion search queries. That quasi-cataclysmic update changed the topography of SEO, leaving digital agencies forever scarred by the memory.

Now, Google is supposedly going to roll out Penguin 4.0 imminently. Everyone expected the monolithic tech company to launch the update in 2015, but the holidays delayed it to 2016. Then everyone expected it to drop sometime in Q1 2016.

However, the SEO world still waits with bated breath.

Why is everyone so afraid of the Big Bad Penguin?

penguins marching to war

Google first launched the Penguin Update in April 2012 to catch sites spamming its search results, specifically the ones who used link schemes to manipulate search rankings. In other words, it hunted down inorganic links, the ones bought or placed solely for the sake of improving search rankings.

By the time Penguin 2.0 and 3.0 came out, digital agencies had wised up. They heard the message loud and clear: when a new Penguin update arrives, they have to take action to get rid of bad links.

Google targets links that come from poor quality sites, have little to no relevancy to the backlinked site, have overly optimized anchor text, are paid for, and/or are keyword rich.

However, what makes Penguin truly terrifying isn’t just the impact it can have on a site’s ranking, but the toll it can take on an honest marketing campaign.

Earning backlinks is tough. That’s why some stoop to paying for them or working with shady link networks. The most tried-and-true way to earn backlinks is guest blogging, which is not only difficult, but time consuming, as well.

Although Google usually ignores backlinks earned by guest blogging, that’s not to say they’re completely Penguin-proof. In an unlikely but entirely possible scenario, your guest blogging backlinks may have become toxic.

In other words, people are so afraid of Penguin because it can ruin a lot of the hard work you’ve put into a campaign.

How can you slay the fearsome Penguin?

Luckily, there are a number of preventative measures you can take to avoid Penguin’s wrath.

The first thing you’ll want to do is look at your backlink profile using Open Site Explorer, Majestic SEO, or Ahrefs. Look at the total number of links, the number of unique domains, the ratio of linking domains to total links, anchor text usage and variance, page performance, and link quality.
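
For a rough sense of the arithmetic involved, here’s a minimal Python sketch of the kind of summary those tools produce. The link data and the function name are hypothetical; real tools add spam scores, page authority and much more.

```python
from collections import Counter
from urllib.parse import urlparse

def profile_summary(backlinks):
    """Summarise a backlink profile from a list of
    (source_url, anchor_text) pairs, e.g. an export from
    Open Site Explorer, Majestic or Ahrefs."""
    domains = Counter(urlparse(url).netloc for url, _ in backlinks)
    anchors = Counter(anchor.lower() for _, anchor in backlinks)
    return {
        "total_links": len(backlinks),
        "unique_domains": len(domains),
        # A low ratio means many links come from just a few domains.
        "domain_diversity": len(domains) / max(len(backlinks), 1),
        # One anchor dominating the profile suggests over-optimised
        # (and therefore Penguin-prone) anchor text.
        "top_anchor_share": (anchors.most_common(1)[0][1] / len(backlinks)
                             if backlinks else 0.0),
    }
```

For example, three links where two share the exact anchor “best widgets” would report a `top_anchor_share` of about 0.67 – a tiny sample, but the same signal scales to a full profile.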

If this sounds like too much work, there are tools that will automate the analysis and apply decision rules for a fee, such as HubShout and Link Detox.

If you find a bunch of toxic links – the backlinks that came from link networks, unrelated domains, sites with malware warnings, spammy sites, and sites with a large number of external links – you need to take action before the Penguin strikes.

Your next step is to remove the links manually. Contact the site’s owner and request that the links be removed. Failing that, you can always disavow them, which tells Google not to count the links when it determines PageRank and search rankings.
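
The disavow file Google accepts is plain text: one full URL per line to disavow a single link, or a `domain:` entry to disavow every link from a domain, with `#` marking comment lines. Here’s a minimal Python sketch that assembles one; the toxic domains and URL are made up for illustration.

```python
def build_disavow_file(toxic_urls, toxic_domains):
    """Assemble the contents of a disavow file for Google's
    disavow tool: '#' comments, 'domain:' entries, and full URLs."""
    lines = ["# Disavow file generated after a backlink audit",
             "# Site owners were contacted first, with no response"]
    lines += [f"domain:{d}" for d in toxic_domains]
    lines += list(toxic_urls)
    return "\n".join(lines) + "\n"

# Hypothetical link-network domains and a spammy page:
content = build_disavow_file(
    toxic_urls=["http://spam.example/blogroll.html"],
    toxic_domains=["link-network.example", "malware-warning.example"],
)
with open("disavow.txt", "w") as f:
    f.write(content)
```

The resulting disavow.txt is what you upload in Search Console; Google then treats those links as though they carry no weight.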

How can you recover after a Penguin attack?

penguin diving

If Penguin 4.0 does wind up pecking your campaign to the verge of death, don’t worry. You can recover.

Analyzing your backlink profile and removing toxic links – what you should do to prevent a Penguin issue – are also the steps you need to take to recover.

However, the tricky thing about disavowing a link is that it may actually hurt your campaign. No one but the Google hivemind really knows whether a link helps or hurts; you can only make an educated guess. Despite this risk, you still need to disavow any links that appear to be toxic.

The next logical step after purging your backlink profile is to build it up again. Although you should never stop trying to earn backlinks, it’s a smart idea to redouble your efforts after a Penguin attack.

Guest blogging isn’t the only way to earn backlinks, either. Entrepreneur offers a great list of creative ways to get people to link to your site, such as:

  • Broken-Link Building: Check a site for broken links and compile them into a list. Then take that list to the webmaster and suggest replacement sites for each link – one of which is yours.
  • Infographics: The thing about infographics is that they’re more shareable than blog posts; research shows that 40% of people respond better to visual information than to plain text. The idea is the same as content marketing in general: you create a great piece of content – an infographic, in this case – and people share it. Other sites and blogs may repost it. Success isn’t guaranteed, but this method can work.
  • Roundups: Similar to guest blogging, reaching out to bloggers and sites that run weekly or monthly roundups is a great way to get some backlinks. Search your keyword and “roundup,” and limit the results to the past week or month. Once you’ve found a few, send the webmaster a link to one of your guides, tutorials, or other pieces of content (like, say, a new infographic). Sites that run roundups are constantly looking for content, so there’s a good chance they’ll include your work in their next edition.
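
The first half of broken-link building – finding the broken links – is easy to automate. Here’s a minimal Python sketch using only the standard library; the helper names are my own, and a real crawl would also need rate limiting and robots.txt checks.

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def is_broken(url, timeout=10):
    """True if the URL errors out or answers with a 4xx/5xx status."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status >= 400
    except (HTTPError, URLError, ValueError):
        return True
```

Running `extract_links` over a page’s HTML and filtering with `is_broken` produces the list you take to the webmaster.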

what makes a good infographic

What’s next?

So long as you take these precautionary steps, you’ll be fine whenever Penguin does rear its beaked head.

The article The Penguin in the room: what to do until Google rolls out its latest update was first seen from

What is HTTP2 and how does it affect us?

The web is about to get faster, with the introduction of the latest version of the HTTP protocol: HTTP/2.

It’s been 17 years since the last update, and a lot has changed in almost two decades: technology has created more demanding users, sites have only got heavier, and speed matters to most of us while browsing.

As servers have already started adapting to HTTP/2, it’s time to learn more about it and understand what this significant change to the web means. How does it affect us?

What is HTTP/2?

HTTP/2 is an updated version of HTTP (Hypertext Transfer Protocol) and it is based on Google’s SPDY protocol, which was developed to improve the speed and the performance of the browsing experience.

The history of HTTP

The Hypertext Transfer Protocol (HTTP), or what most of us know as the ‘http://’ in a web address, is the protocol that governs the connection between a user’s browser and the server hosting a site.

HTTP was defined back in 1991, while its current version, HTTP/1.1, was introduced in 1999, which means an update was only a matter of time. Last February the Internet Engineering Task Force (IETF) formally approved the HTTP/2 draft, setting the standardisation process in motion.

HTTP/2 graphic (source: Akamai)


Why should I care?

If you are using the web, then you should probably care. You don’t have to be a developer to be interested in this exciting change, as it promises a faster and more functional browsing experience for everyone.

Sites have changed significantly since the last HTTP protocol update almost 20 years ago, and it’s time to face the fact that modern sites consist of far more images and data, which affects page loading time.

According to Daniel Stenberg,

“When looking at the trend for some of the most popular sites on the web today and what it takes to download their front pages, a clear pattern emerges. Over the years the amount of data that needs to be retrieved has gradually risen up to and above 1.9MB”

HTTP/2 promises to adapt to the needs of our time by helping everyone access any site as fast as possible, even without a high-speed internet connection.

http2 usage


What’s changing?

We don’t need to dive into technical details to discover the most important changes that HTTP/2 brings, so this is a simplified overview:

Multiplexing

Multiple messages can be sent at the same time over a single TCP (Transmission Control Protocol) connection. This reduces the time needed to process the requests being sent and received, speeding up loading times and improving the user experience.

Until now, HTTP/1.1 allowed only one request to be handled at a time per connection, which meant requests queued up behind each other and a page load required several connections. HTTP/2 solves both problems with multiplexed streams over just one connection while a site is open.

The result is a cleaner, faster connection with lower latency – something users are sure to appreciate.
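
A back-of-the-envelope model makes the difference concrete. This Python sketch is not the protocol itself, just toy arithmetic: it ignores bandwidth, treats each asset as taking a fixed fetch time, and assumes a browser-style pool of six HTTP/1.1 connections versus one fully multiplexed HTTP/2 connection.

```python
def http1_time(asset_times, connections=6):
    """Approximate HTTP/1.1 page load: each connection serves one
    request at a time, so assets queue across a small pool
    (browsers typically open about six connections per host)."""
    lanes = [0.0] * connections
    for t in sorted(asset_times, reverse=True):
        # Each asset is fetched on whichever connection frees up first.
        lanes[lanes.index(min(lanes))] += t
    return max(lanes)

def http2_time(asset_times):
    """Approximate HTTP/2 page load: every request is multiplexed on
    one connection, so (bandwidth aside) the page is gated by the
    slowest single response rather than by queues of requests."""
    return max(asset_times, default=0.0)

# Twenty hypothetical assets of 100 ms each: http1_time(assets)
# models the queuing delay that http2_time(assets) avoids.
assets = [0.1] * 20
```

The model exaggerates in HTTP/2’s favour, but it captures why head-of-line blocking, not raw bandwidth, dominated HTTP/1.1 page loads.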

multiplexing (source cloudflare)

source: Cloudflare

Server Push

Server push is about saving time: the server anticipates the client’s next requests and sends additional resources before they are even asked for.

There’s no need to wait for the HTML to load before the browser requests the JavaScript, images and other assets, as the HTTP/2 protocol allows the server to speed up data transmission by sending ‘push’ responses alongside the initial page.

No more delays, time for proactively pushed responses!

Prioritization

Prioritization is about understanding the importance of each element and transferring the most important requests first. The browser suggests which data should be prioritised, but the final decision is made by the server.

http2 (source google)

source: Google

Binary protocol

HTTP/2 also boosts loading speed by transmitting data in a binary format, the computer’s native language. This removes the unnecessary step of translating text-based messages into a binary protocol, making processing more efficient.

Header Compression

HTTP/2 allows headers to be compressed, reducing their size along with the number of round trips needed for each request. This matters even more in mobile browsing, where a page’s asset count and the network’s latency are bigger challenges.
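
A toy model of the saving, loosely inspired by HPACK (HTTP/2’s actual header compression scheme): repeated header fields are replaced by a one-byte reference to a shared table. Real HPACK also Huffman-codes strings and ships a static table, so this Python sketch understates the real mechanism.

```python
def toy_header_cost(requests):
    """Compare header bytes sent over many requests: HTTP/1.1
    resends every header in full, while the modelled HTTP/2 side
    sends a field in full once and a one-byte table reference for
    every repeat."""
    table = set()
    h1_bytes = h2_bytes = 0
    for headers in requests:
        for field in headers:
            name, value = field
            full = len(name) + len(value) + 4   # "name: value\r\n"
            h1_bytes += full                    # HTTP/1.1 resends everything
            if field in table:
                h2_bytes += 1                   # table reference
            else:
                h2_bytes += full
                table.add(field)
    return h1_bytes, h2_bytes
```

With a 100-byte cookie and a 60-byte user-agent repeated across ten requests, the HTTP/1.1 column pays the full cost every time while the modelled HTTP/2 column pays it once – exactly the kind of saving that matters most on high-latency mobile connections.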



Is HTTP/2 currently in use?

HTTP/2 may not be the standard protocol yet, but interest in it is growing month by month, with 6.6% of all websites currently using it. In fact, the figure rises to 13.5% for websites that rank in the top 1,000.

http2 usage


According to Can I Use, it is supported by 71.14% of browsers globally, with Chrome, Firefox and Opera supporting it only over encrypted connections (HTTPS).
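
That negotiation happens inside the TLS handshake via ALPN, and it’s easy to observe yourself. A minimal Python sketch using the standard library; note that it opens a real network connection, and whether you get ‘h2’ back depends entirely on the server you pick.

```python
import socket
import ssl

def negotiated_protocol(host, port=443, timeout=5):
    """Offer h2 and http/1.1 via ALPN and report which protocol the
    server selects during the TLS handshake ('h2' means HTTP/2).
    This is the same negotiation browsers perform, which is why
    Chrome, Firefox and Opera only speak HTTP/2 over HTTPS."""
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol() or "http/1.1"

# e.g. negotiated_protocol("www.cloudflare.com") should answer "h2"
# on an HTTP/2-enabled server, "http/1.1" otherwise.
```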

It is promising that several top sites and servers are starting to embrace HTTP/2, with CloudFlare and WordPress having supported it for several months now. Beta support is also available from Akamai, Google, and Twitter, while Microsoft and Apple plan to support it in future releases.



In case you’re wondering whether it’s still early for HTTP/2, Mark Nottingham is clear about it:

“It’s just important to remember that HTTP/2 is an infrastructure upgrade for the web, and as such it’s going to take time to see the full benefit. That said, there’s still considerable benefit in adopting them now.”



What should I do?

There’s no need to do anything from a user’s point of view, as the change has already begun on several sites. Because HTTP/2 is backwards compatible with HTTP/1.1, a user won’t notice any difference except the speed. As more and more servers and browsers adapt to it, we will all enjoy a faster browsing experience.

If you’re curious to see the actual performance of HTTP/2, Akamai has created a test site where you can compare the latency of each protocol.

There is a clear difference in loading time, and according to the initial stats we can generally expect a speed boost of 20–30%.


Friday, 26 February 2016

Six of the most interesting SEM news stories of the week

Welcome to our weekly round-up of all the latest news and research from around the world of search marketing and beyond.

Oh you’ve been away all week? Right, okay. Well sit down. We have some news.

Google’s been making some changes…

Google kills its Right Hand Side Ads

The inescapable news this week is of course Google AdWords removing all the ads from the right hand of its SERPs.

Now instead of seeing PPC listings, you’ll either see an odd blank space, or Product Listing Ads, or the standard Knowledge Graph for Ryan Gosling (or insert your current crush here – mine’s still Ryan Gosling).

london hotel Google Search with right hand side ads

The ramifications of the change are myriad, but the biggest difference for users and marketers alike is the increase in PPC ads at the top of the SERP from three possible links to four.

new york flights Google Search

And speaking of Google…

Google launches Accelerated Mobile Pages

Although we had been expecting the launch of AMP – Google’s open source initiative which aims to improve the performance of the mobile web – around now anyway, Google began rolling out AMP at the beginning of this week (Tues 23 Feb).

As expected, pages enabled with AMP carry a symbol in their mobile search results to tell users they are faster-loading pages. And it’s a fair bet that this will become a positive ranking factor.

amp pages symbol

There’s lots more information on Google’s AMP project here.

And speaking of Google AGAIN…

Google to close its financial comparison service

For some people this may be the biggest (positive) change of the week. Google is to shut down Google Compare from March 23 in both the UK and US.

Google compare serps

As Graham Charlton reports, “On the face of it, this news will have competitor comparison sites jumping for joy, as Google Compare constituted a major threat to their own business models.”

A spokesperson from Google stated it has decided to focus more intently on AdWords and other future innovations, which will “enable us to provide fresh, comprehensive answers to Google users, and to provide our financial services partners with the best return on investment.”

Facebook rolls out new Like buttons

In non-Google news, Facebook has added to your arsenal of social interactions with a variety of facial expressions and emojis.

They’re called Reactions.


I still don’t have them yet and I’ve got a LOT of anger built up about it and when I eventually get them EVERY SINGLE ONE OF MY FRIENDS is getting the furious face.

Outgoing links probably good for your site’s SEO

Reboot recently carried out a study to determine somewhat definitively whether the strength of a site’s outgoing links has an effect on ranking.

The good news is that yes it does.

Reboot created ten new websites, each targeting the same keyword, only half of which included links to high-authority sites. After five months it concluded that, “Outgoing relevant links to authoritative sites are considered in the algorithms and do have a positive impact on rankings.”


For a complete guide to the research and lots more well-explained graphs such as the one above, visit the study and be safe in the knowledge that linking to bigger sites will definitely not do you any harm in search.

Call your mum

Finally, Bing has released a few search insights in time for Mother’s Day, revealing that more than half of Mother’s Day retail searches are set to be made from a mobile device.

  • Over 60% of searches are expected to be made on the move via mobile, with search volumes set to increase by five times between 7am-9am on the day itself.
  • Women take the lead in searching for gifts, making up over two thirds (67%) of all searches.
  • Searches will increase by up to four times in the 48 hours leading up to Mother’s Day.

I’m merely including this to remind you that Mother’s Day is 6th March, so you have more than a week to buy a card. You’re welcome.


Thursday, 25 February 2016

Why accessibility is key for search and visibility

If you’re involved with SEO, you’ve no doubt thought about all sorts of ways and means to boost your site in the search rankings. But if your site isn’t web accessible, your efforts will be in vain for as much as a fifth of your potential visitors.

Web accessibility is the name given to making websites and online materials usable to people with disabilities, removing barriers to the way they experience the internet.

From physical disabilities like loss of mobility, blindness and deafness to learning difficulties like dyslexia, a wide range of disabilities affect the way that someone accesses the internet.

People with disabilities make up a significant portion of the population, yet far too little is done to cater towards them online, representing a huge missed opportunity in terms of traffic and visibility for any website or brand.

The need for accessibility

The Global Economics of Disability Annual Report 2014 estimated the global population of people with disabilities at 1.3 billion – nearly 18% of the world’s population, or almost one in five people.

These numbers are likely to climb as Generation X ages, making catering to age-related disabilities even more important to anyone targeting the baby boomer generation.

In no other context would it make sense to overlook such a significant demographic, yet making websites and digital materials accessible is far too often seen as a tedious and pointless exercise. But anyone who pays attention to SEO and optimises their website is already part of the way there.

Making your website more accessible to users with disabilities also happens to overlap nicely with improving the all-round user experience, and with boosting your site that much higher up the search rankings.

A simple diagram illustrating six out of seven of Peter Morville's user experience elements. On the inside are the "internal" elements, illustrated in red: desirable, valuable, credible. On the outside are "external" elements, illustrated in blue: findable, accessible, usable.

Accessibility and good user experience go hand-in-hand with search.
Image by Paul Veugen on Flickr; some rights reserved

Why accessibility works for search

A good rule of thumb for accessibility is making sure that all information is delivered to the user in more than one way.

For example, you shouldn’t rely only on the ability to see colour to distinguish the important parts of a web form, or the ability to use a mouse to navigate a website.

Images, audio and video should all have text alternatives available in the form of alt text descriptions, closed captions and transcripts. This makes the content accessible to users with visual or hearing impairments. It also provides more information to search engines, which rely on text to find out about a site.

In a past piece on the SEO benefits of web accessibility, Mark Jackson explained that text browsers, which ignore graphic content, are often used to review how a website appears to a search engine. A website will appear in the same way to users of a screen reader, which can interpret the web for those who are blind, visually impaired, illiterate or learning disabled.

In other words, making the text-based ‘version’ of your website as comprehensive as possible has benefits for both accessibility and search.
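
A quick way to audit this is to scan your markup for images with no alternative text. Here’s a minimal Python sketch using the standard library’s html.parser; note that it flags empty alt attributes too, even though an empty `alt=""` is legitimate for purely decorative images.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Flag <img> tags with no alt attribute (or an empty one),
    which are invisible both to screen readers and to the
    text-only view a search engine effectively sees."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_startendtag(self, tag, attrs):
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            d = dict(attrs)
            if not d.get("alt"):
                self.missing.append(d.get("src", "(no src)"))

def images_missing_alt(html):
    audit = AltAudit()
    audit.feed(html)
    return audit.missing
```

The returned list of `src` values is your to-do list: each one is an image that neither a screen reader nor a crawler can describe.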

A circular diagram illustrating non-text web content and its alternatives, grading them on a scale of A to AAA in terms of accessibility. In the innermost circle are types of non-web content: charts and graphs, pre-recorded audio-video content, live audio-video content, live audio-only content and pre-recorded audio-only content. In the next circle out (minimum or A grade accessibility) are long descriptions (for charts and graphs), pre-recorded captions, text transcripts and descriptive labels. In the AA or enhanced accessibility circle are live captions. Finally in the outermost ring, AAA or additional accessibility enhancements, are pre-recorded sign language and live captions.

The overlap between increased accessibility, SEO and a better user experience can be seen in all sorts of areas.

Providing a site map, for instance, gives a handy point of reference for all users, but particularly those with a screen reader. It also allows search engine robots to quickly crawl a website.

Writing for your website in simple, jargon-free language can benefit users with a learning difficulty or cognitive disability, but it’s also a helpful practice for everyone, especially those for whom English isn’t their first language.

People are more likely to search the web for simple words than for industry jargon, too, so writing for your website simply will help ordinary users to find it in search.

There are also a few search engines, such as Net Guide, specifically designed to promote and rank websites which are highly accessible. Google used to have one such search outlet, but it is no longer supported.

The extra traffic from these search engines might not be huge, but they are worth bearing in mind as another source of visibility for an accessible site.

Putting it into practice

The theory of how accessibility can boost your site’s visibility and traffic is all very well, but how can you carry it out in practice?

I mentioned a few practical steps in the previous section, such as providing text alternatives to visual content and checking out how your website looks in a text browser.

Not everything that makes a site more accessible will align directly with optimising for search, and a lot of it will require extra effort and some new ways of thinking. But it’s a worthwhile exercise, and there are a lot of free tools and resources available to help anyone who sets out to design for accessibility.

A good place to start is Deque System’s guide on Designing for Website Accessibility, part of a series on accessible marketing produced in association with the Whole Brain Group. They divide website design into five key areas for accessibility – complete with a handy checklist you can download.

The W3C’s Web Accessibility Initiative also has a set of tips for getting started with web accessibility, with visual reference points. To check how accessible your website already is, you can run it through WebAIM’s web accessibility evaluation tool.

A screenshot of the web accessibility check for the website Search Engine Watch, using WebAIM's WAVE tool. The screenshot shows a total of 33 errors, 126 alerts, 27 features, 53 structural elements, 5 HTML5 and ARIA and 27 contrast errors. These are flagged up across the homepage with various yellow, red and green icons.

The accessibility check for Search Engine Watch shows some work still to be done…

A major current development in search technology, voice search, also has its roots in something that is highly beneficial to users with disabilities.

While ways of catering to voice search technology are not yet as refined as other techniques in SEO, there are still ways you can adapt your site to capitalise on voice search, as laid out in depth by Asim Ahmed.

Altogether, adaptations to make your site accessible are well worth adding alongside your usual optimisation for search in order to stay ahead of the curve and ahead of your competitors – and your visitors will thank you for it.


Say goodbye to Google: 14 alternative search engines

Well it’s been a big week for search, I think we can all agree.

If you’re a regular Google user (65% of you globally) then you’ll have noticed some changes, both good and bad.

I won’t debate the merits of these improvements; we’ve done that already here: Google kills Right Hand Side Ads and here: Google launches Accelerated Mobile Pages. But there’s a definite feeling of vexation that appears to be coming to a head.

As the paid search space increases in ‘top-heaviness’, as organic results get pushed further off the first SERP, as the Knowledge Graph scrapes more and more publisher content and continues to make it pointless to click through to a website, and as our longstanding feelings of unfairness over Google’s monopoly and tax balance become more acute, now more than ever we feel that there should be another, viable search engine alternative.

There was a point not that long ago when you could easily divide people between those that used Google, Yahoo, Ask Jeeves and AltaVista. Now it’s got to the point where if you’re not using Google, you’re not really using the internet properly.

Remember when The Amazing Spider-Man reboot came out in 2012 and the most unbelievable thing in the movie – which involves a teenager with superhuman spider powers crawling up walls and swinging through New York in lycra – is that Peter Parker uses Bing?

Right now though maybe we should be paying more attention to the alternatives. Maybe our daily lives and, for some of us, careers shouldn’t need to balance on the fickle algorithm changes of the world’s most valuable company.

Let’s see what else is out there in the non-Google world. It’s not that scary, I promise.

Please note: this is an update of an article published on SEW in May 2014. We felt it needed sprucing up, especially as many of the listed engines (Blekko, Topsy) are no longer with us.

Bing

Microsoft’s search engine is the second most popular search engine in the world, with 15.8% of the search market.

Bing homepage

But why should you use Bing? Lifehacker has some great articles where they try to convince themselves as much as anyone else why Bing is a serious contender to Google. Plus points include:

  • Bing’s video search is significantly better than Google’s, giving you a grid of large thumbnails that you can click on to play or preview if you hover over them.
  • Bing often gives twice as many autocomplete suggestions as Google does.
  • Bing can predict when airfares are about to go up or down if you’re searching for flights.
  • Bing also has a feature where if you type linkfromdomain:[site name] it will highlight the best ranked outgoing links from that site, helping you figure out which other sites your chosen site links to the most.

Also note that Bing powers Yahoo’s search engine.

DuckDuckGo

The key feature of DuckDuckGo is that it doesn’t retain its users’ data, so it won’t track you or manipulate results based on your behaviour. So if you’re particularly spooked by Google’s all-seeing, all-knowing eye, this might be the one for you.

DuckDuckGo homepage

There’s lots more info on DuckDuckGo’s performance here.

Quora

However good Google gets at answering complicated questions, it will never be able to match the personal touch available with Quora.


Ask any question and its erudite community will offer their replies. Or you can choose from any similar queries previously asked.

Dogpile

Dogpile may look like a search engine you cobbled together with clip-art, but that’s rather the point as it pulls in and ‘curates’ results from various different engines including Google, Yandex and Yahoo, but removes all the ads.

Dogpile Web Search

Vimeo

Of course if you’re going to give up Google, then you’ll also have to give up YouTube, which can be a terrifying prospect. But there is an alternative. And a pretty good one at that… Vimeo. The professional’s choice of video-sharing site, which has lots of HD video and no ads.

otis the cat reviews in videos on Vimeo

Yandex

This is a Russian portal, offering many similar products and services as Google, and it’s the dominant search engine in Russia.

As you can see it offers results in a nice logical format, replete with favicons so you can clearly see the various channels for your branded queries.

search engine watch on Yandex

Boardreader

If you want to get into the nitty-gritty of a subject with a variety of different points of view away from the major publications, Boardreader surfaces results purely from forums, message boards and, of course, Reddit.

Boardreader Forum Search Engine

WolframAlpha

WolframAlpha is a ‘computational knowledge engine’, or a super-clever nerd to you and me. Ask it to calculate any data or ask it about any fact and it will give you the answer. Plus it does this awesome ‘computing’ thing while it thinks about your answer (which can take a short while).

what really killed the dinosaurs Wolfram Alpha

It’s not always successful; you have to practise to get the best from it. But at least it’s aware of the terrible 90s television show The Dinosaurs.

IxQuick

Another search engine that puts its users’ privacy at the forefront: with IxQuick none of your details are stored and no cookies are used. A user can set preferences, but they will be deleted after 90 days of inactivity.

Ixquick Search Engine

Ask

Oh look… Ask Jeeves is still around. Also he’s no longer a Wodehousian butler, but a computer generated bank manager. Weird.

Ask Jeeves

It’s still a slightly mediocre search engine pretending to be a question and answer site, but the ‘Popular Q&A’ results found on the right hand side are very handy if Jeeves himself can’t satisfy your query. And what a good use of the right-hand side space, huh Google.

SlideShare

SlideShare is a really handy place to source information from presentations, slide decks, webinars and whatever else you may have missed from not attending a conference.

You’ll also be surprised what information you can find there.

hamburgers on SlideShare

Addictomatic

“Inhale the web” with this friendly looking hoover guy by creating your own topic page, which you can bookmark and see results from a huge number of channels in that one page (including Google, Bing News, Twitter, YouTube, Flickr).

Addictomatic Inhale the Web

Creative Commons Search

This is particularly handy if you need to find copyright free images for your website (as discussed in this post on image optimisation for SEO). Just type your query in then click on your chosen site you want to search.

CC Search

Giphy

Because really, when it comes down to it, we could imagine a worse dystopian future than one in which we all communicate entirely in GIFs.

GIPHY homepage


Wednesday, 24 February 2016

Why companies create content – part two: to gauge public opinion

Following on from Part One of this series, where the topic of influencing brand perception was discussed, this instalment looks at how content can help you tap into the mindset of the people you’re trying to sell to.

Part two: to gauge public opinion

Pretty much any content related book, article or conference talk you come across will at some point mention the term ‘audience-focused content’. As a phrase and a concept, it’s a simple one – create stuff that people are going to want – but the cogs that sit behind it can be both complicated and costly.

Rather than researching what people want then creating content to reflect that, there’s an argument to say your content could actually be your research method. By publishing then assessing how people react to what you put out there, the data can be used to create something bigger and better, or to inform other business actions.

Pre-internet (a scary thought) I recall being involved in various focus groups asking me what I thought of this pair of trainers, chocolate bar or gadget. For a handful of spotty-faced opinions, that would have involved the rigmarole of contracting a market research company, finding some willing participants, hiring a venue, recording the proceedings and collating the findings. 

These days you can just put up a vote on your Twitter feed.

Which of these packaging designs do you prefer? Should we serve a sour cream dip or salsa with our wrap? Which of these do you think should be our new TV ad? What should our new album be called?

People are used to being asked questions by brands, and companies are using these valuable contributions (along with other forms of data) to validate their marketing and product development efforts; “Our fans prefer the red one” is a perfectly valid rationale to bring up in a board meeting.

Ignore this opportunity at your peril.

Sticky Toffees

Everton Football Club have a proud tradition – despite not winning a league title since the mid-1980s they’re one of the best-supported clubs in England and social ad agency RadiumOne found in 2013 that if fan interactions formed the basis of the league table, they’d be the team taking home the silverware.

With that in mind, it seems all the more strange that they had such a catastrophic fail when it came to a rebranding exercise prior to the 2013/14 season.

A new club crest (below, in the middle) was announced to overwhelming online derision – there are few things that people feel more passionately about than their football club, and the fact that fans hadn’t been consulted caused uproar.

three everton fc logos

23,000 petition signatures later, the club had their hand forced; they had to do something. What followed was a series of apologies, consultations with fan and community groups and eventually an online vote which was won by the logo that most closely resembled the one they already had.

Rebranding twice in as many years is not a good thing for any business to be doing, let alone one which reaches billions of people across the world every week. Wouldn’t this all have been much simpler if they had put a few options to a vote of their millions of social fans straight away?

That’s what New York City FC did.

Made in NY

This is a club that doesn’t have such an illustrious past – in fact they’ve only been in existence since early 2013, which may be why they seem to have a better grasp of the importance of crowdsourcing opinions to influence business decisions.

The public were asked to pick between these:

two new york fc logos

CBO Tim Pernetti said:

“Our supporters will always have a voice in our Club at New York City FC. We are truly excited about this opportunity to partner with them on this decision and we are counting on all New Yorkers and fans beyond the city to get involved, cast a vote and make New York City FC history.”

A fellow member of the City Football Group, Manchester City have taken lessons from this process for their own re-badging exercise.

manc city badges

In a statement that has obviously come from the same press office, Chief Executive Officer Ferran Soriano commented:

“We are looking for our fans to share their views as to what they consider to be the most authentic symbols of the Club. The views of our Cityzens are essential to the process; they will have a real say on the future of our badge.”

A host of online questionnaires, lectures on the history of the badge and articles in local newspapers, detailing everything from the shape of the badge to the importance of each featured symbol, were created, all building up to the big reveal.

What they were essentially doing was building a dossier of evidence to support their launch, using content to give people ample opportunity to offer feedback.

As long as you can keep out online mischief makers looking to derail the voting process (4chan famously managed to get Miley Cyrus to the top of Time Magazine’s 2013 Person of the Year poll), there’s no excuse not to involve those who have actively shown an interest in your business in decisions around the products you want them to buy.

From views to dwell time to shares to comments, every piece of content you publish gives you an idea of how people feel about what you’re saying, which can in turn inform the approach the rest of the business takes.

As an aside to this seven part series, check out Ayima’s free DIY Content Marketing Strategy ecourse, designed to help you improve the ROI of your content.

The article Why companies create content – part two: to gauge public opinion was first seen from

Stop thinking about long-tail keywords and start focusing on searcher intent

Over the years, the usefulness of certain types of keywords has been debated, analyzed, celebrated, and even disparaged.

Long-tail keywords – those specific phrases of low-volume but perhaps higher-quality queries from searchers who are closer to taking action on procuring the product or service they seek – have certainly received a heck of a lot of recognition for their value to marketers.

However, I am here to declare the demise of these keywords that we held in such high regard only a few short years ago.

Please let me explain…

As the use of search has evolved and search engine optimization has become commonplace, businesses have succeeded in increasing their visibility in search results, making adjustments to be most visible for the queries they care about most.

This, by itself, would be fine; a positive and helpful thing actually, if the end effect was search results pages all containing exactly what the user was searching for. However, the issue we’ve seen is that queries on many broad keywords no longer provide the relevant results that a searcher wants.

A search engine user looking for information now often uses one of these two methods to arrive at the search results they need:

  1. They start with a broad search and continue to refine that search until they get to appropriately relevant results.
  2. They mentally refine their search, knowing the broad results will not bring what they want. So they begin with a more specific search and refine fewer times.

Certainly, longer search queries are becoming the norm. Part of the issue here is that Google has populated broad queries with many different universal result offerings… News, Images, Videos, Knowledge Graph. This moves those specific, relevant pages that many searchers are actually looking for further down the page – or possibly onto the next page.

Then, on top of those universal results, we have results like Wikipedia and educational or governmental pages that don’t exactly fit the intent of the user’s search either.

Now that searchers find it necessary to refine their queries further and further to get to the precise results they want, I suggest that what we once called long-tail queries are now simply queries. The keywords that users commonly rely upon are becoming so lengthy and diffuse that the long-tail distinction now serves little function.

Thus, the term ‘long-tail’, to me, no longer exists. The once novel concept of paying attention to long-tail keyword queries is now so commonplace that it can go without being said. Long-tail keywords are now just the queries we all use to actually find what we need. Our ability to identify specific combinations of words that lead to our desired results will continue to evolve.

This is why the SEO community is moving away from the targeting of specific keywords or queries to instead thinking about themes and the searcher’s intent.

If we, as marketers, ditch our focus on query length and instead drive the focus toward the theme of the content, we can then start to adjust that content to make sure that what we are truly providing is the search destination that our potential clients and customers have in mind.

Ready to incorporate this mindset? Start with asking these three questions of your business’s website:

  1. Are we satisfying the searchers’/customers’ journey?
  2. Are we giving them what they need and in the way they need it?
  3. What do our customers need from us that we aren’t currently providing, but should?

Kevin Gamache is Senior Search Strategist at Wire Stone, an independent digital marketing agency for global Fortune 1000 brands.

The article Stop thinking about long-tail keywords and start focusing on searcher intent was first seen from

Tuesday, 23 February 2016

Google has launched Accelerated Mobile Pages

Welcome to a speedier mobile web.

It’s been a massive week for the Google SERPs, and it’s only Tuesday. As well as Google killing Right Hand Side Ads and shutting down its own comparison service, it seems that Google has also launched its Accelerated Mobile Pages project.

Although we had been expecting it around next weekend, as Google stated it would be here “late February”, it seems some people have been seeing results with a little AMP icon today (Tues 23 Feb).

The AMP project is an open source initiative which aims to improve the performance of the mobile web. As our own Rebecca Sentance explained in her article on Google’s AMP only yesterday, AMP pages are a “stripped-down version of the mobile web which runs on a reinvented version of HTML.”

Google has stated that a page created with AMP HTML can load anywhere from 15% to 85% faster than the non-AMP version of that page.
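For readers curious what sits under the hood, the skeleton of an AMP page looks roughly like the sketch below. Note this is a hedged outline rather than a complete, validating document: the URLs are placeholders, and the mandatory amp-boilerplate style block (which must be copied verbatim from the AMP project documentation) is indicated by a comment rather than reproduced in full.

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- Canonical link pointing back to the regular (non-AMP) version of the page -->
  <link rel="canonical" href="https://example.com/article.html">
  <!-- The mandatory amp-boilerplate <style> block goes here, copied verbatim from the AMP spec -->
  <!-- The AMP runtime, loaded asynchronously so it never blocks rendering -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Hello, AMP</h1>
  <!-- Plain <img> tags are replaced by the amp-img component, which declares
       its dimensions up front so the page never reflows while loading -->
  <amp-img src="hero.jpg" width="800" height="400" alt="Hero image"></amp-img>
</body>
</html>
```

Much of the speed gain comes from these constraints: author-written JavaScript is disallowed, resources declare their size in advance, and the AMP runtime decides when each resource loads.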

What this means for users is a much faster mobile web, and for publishers using AMP a likely boost in search rankings, as site speed and mobile friendliness are both vital for user experience.

It was informally speculated during the announcement of AMP back in December that these pages created using AMP might receive a ‘fast’ label (similar to the ‘mobile friendly’ label), but it seems that Google is going with AMP and a little lightning bolt…

amp pages symbol

For us within the search industry it may be obvious what this label means, but is that the case for regular users?

From asking around the non-SEW parts of the office, nobody guessed that the symbol meant faster-loading pages. One person, who will remain anonymous, replied: “louder pages.”

Fair enough to be honest.

The point is, there perhaps needs to be a lot more education from Google in order to tell people what this symbol means, if they’re not going to use something more obvious. Especially if AMP pages are only meant to benefit users and are Google’s preferred way of improving the mobile web.

The article Google has launched Accelerated Mobile Pages was first seen from

Evolution of call tracking in a mobile first world

*Sponsored content in collaboration with Marchex. Views expressed in this article are those of the guest author and do not necessarily reflect Search Engine Watch’s opinions.

If you answer your phone or want your phone to ring as a result of your marketing, in particular search engine marketing, then you’ll want to read the following interview.

I had the privilege of sitting down with Adarsh Nair, Senior Director, Search Product and Engineering at Marchex. Adarsh filled me in on the latest advances in call tracking and marketing automation that have been baked into their Call Analytics platform and specifically Marchex Search Analytics.

Kevin Lee: Why is tracking and 100% attribution at the keyword level important?

Adarsh: We hear a lot about big data and the power of data. Search is a very sophisticated media channel, with hundreds of millions of keywords to bid on, leading to trillions of dollars in sales. This is a big data problem. If you break it down to the basics, each keyword is a market of customers searching for something specific. And marketers pay a cost to get their ad to show up in front of those customers.

It is our strong belief that for the best ROI optimization, search marketers need to have the most granular data (keyword level), complete data (online and offline sale attribution) and the tools to make sense of data.

Keyword level attribution for online and offline sales is just the beginning to building a great search optimization strategy.

Marchex click to call stats

Why do some marketers prefer to take orders over the phone?

For industry verticals such as auto, financial services and travel, the product being sold is sometimes complex and expensive. These verticals also see a fair share of companies competing for the same customer. Getting a consumer on the phone and/or in a store increases the opportunity to convert a prospect into a customer through excellent customer service. The human connection in many cases makes the difference.

Separately, it is also important to note that we are seeing marketers responding to customer choice.

Consumers are choosing to call businesses from the web using their mobile phones. This has pushed Google, Facebook and other publishers to respond with mobile advertising formats that incorporate a click to call.

I get it. For many products, particularly those with lots of options or service offerings with many service levels it’s better to have a rep talk to a potential customer because the conversion rate to sale is higher AND the average sale price is higher. Plus you get fewer returns or customer service issues when the buyer gets the right product or service.

For many marketers, being able to attribute call conversions to search media at the keyword level results in ROI being properly stated and the PPC bid reserve price going up. Can you explain why this is important?

Having the ability to raise one’s keyword bid while maintaining a high measured ROI can facilitate higher positions for the keyword, if the competition doesn’t escalate bids.

Anyone who wants site visitors to call should be tracking phone conversions, but which industry categories are having the greatest success with your platform?

We see two key customer journey paths when it comes to calls from search. First, customers are choosing to click to call from the search ad itself. Second, customers click through to the landing page or the site, and call a number on the site. The key industry verticals for Marchex, where calls make up more than 20% of the overall conversions, are Auto, Financial services, Travel, Home Services, Telco and Cable services.

Can you provide specific examples that might shock readers as to just how big a difference call tracking can make?

One of our customers in financial services faced a challenge. Mobile CPCs were going up, leading to higher cost per acquisition (CPA), and around 40% of their total conversion came through calls. The customer had full visibility into online conversions driven from keywords, but did not have visibility into call conversions.

The lack of a complete picture that showed total conversion from a keyword caused optimization challenges and cost per acquisition remained high. Marchex Search Analytics helped the customer develop a complete picture of total conversions (online and offline) from each keyword.

With call conversion added in, the customer saw the top performing keywords change. CPAs of specific keywords dropped by more than 50%. The customer used the data to revamp their search strategy and in a three month timeframe reduced overall CPA by ~10% and drove ~15% more conversions. And this is for a company with a massive paid search budget.

In addition to simply understanding phone conversions and improving one’s PPC media optimization what other insights can one extract from calls using your platform?

In addition to calls, duration and conversions, there are three categories of data/insights that Marchex Search Analytics provides:

1) Deep call insights through machine learning: CallDNA is a Marchex technology the platform uses to provide deep insights into what happened on the call. Search marketers find a variety of uses for this data.

As an example, one of our customers in the Auto industry looks for a specific phrase in calls as it’s an indicator for future sales. Getting a sense of keywords that drive calls with the specific phrase helps our customer invest in the right keywords to drive demand. Another example is where a customer in the Telco/Cable vertical uses our platform to understand which keyword drives conversation vs. hangups/misdials. This helps the customer invest in keywords that drive the right kind of calls into their call centers.

2) Consumer touch points along the call flow: The call consumer journey is very similar to the website consumer journey. Just as a website’s ease of use and responsiveness determine the consumer experience, the success of the call journey is determined by how intuitive the IVR is and how responsive the agent on the phone is.

Marchex Search Analytics is able to surface the IVR input from the consumer back to the search marketer at the keyword level. Many search marketers use this feature to determine how many new customers are being driven through the call flow by tapping the ‘new customer’ IVR input at the keyword level.

3) Enhanced conversion data: We also support advanced conversion data. Many platforms talk about bringing in conversion counts or using a proxy for conversion, like ‘calls above a certain duration’. Marchex Search Analytics partners with our customers to bring in conversion data at the keyword level that includes total sales transactions, revenue driven from those transactions and the product SKUs that drove the conversion.

Are there any important things marketers should know between click to call originated calls vs. in-ad phone numbers being dialed vs. landing pages with a phone call to action?

In an increasingly mobile world, customers in industry verticals where calls are important are seeing more than 60% of their calls come from in-ad phone numbers. Customers should do the due diligence to understand if the call tracking solution they use can seamlessly provide call conversion data at the keyword level for both in-ad phone numbers (also referred to as call extensions or call only campaigns) and landing page based phone numbers.

For keyword level tracking of calls from in-ad phone numbers, we advise against hacks such as mapping ad groups to single keywords. Such strategies are expensive, do not scale for enterprise customers and could have a negative impact on Quality Score. Marchex Search Analytics works with the existing search campaign structure of our enterprise customers, and we are able to provide a seamless experience by pushing the new keyword level click to call data to bid optimization platforms automatically.

Finally, we are also seeing that keywords that drive calls from in-ad phone numbers typically differ from the keywords that drive calls from landing pages. Having granular attribution for calls from in-ad phone numbers and landing pages will be critical for best in class optimization strategies.

Many marketers want to know the system they pick has been around a while to be robust and stable. How long has Marchex been in the call tracking sector? What are the other reasons that marketers know and trust your system?

Marchex has been in the call tracking space for close to 10 years. Marchex is the trusted partner for Fortune 1000 enterprise brands and we are the largest call analytics provider with more than 300 million calls flowing through our systems every year. Finally, Marchex invests in product innovation and our enterprise customers choose Marchex due to cutting edge innovations such as search, display, video and site analytics for call conversions.

What else should marketers know when they evaluate call tracking solutions?

Call conversions are driven from a variety of media channels, including search, display, mobile video and email. While direct attribution of calls from search is important, marketers are now beginning to realize that display and mobile video are influencing call conversions in a big way. Marketers should consider a call analytics platform that has the capability to track view-through call conversions from display and mobile video and can provide cross channel attribution for call conversions.

I’m pleased to have learned a lot about call tracking from Adarsh, and I hope you, the reader, learned something too. Of course, you can learn more here: Marchex Search Analytics.


The article Evolution of call tracking in a mobile first world was first seen from

Google kills Right Hand Side Ads: what does this mean for marketers and users?

As we reported over the weekend, Google has removed all PPC ads from the right-hand side of the search engine results page with immediate global effect. 

There’s been a great deal of speculation on what this means for businesses, advertisers and users alike, with many postulating that the top-of-the-page paid search is going to become even more cutthroat (and expensive), organic listings will be pushed even further off the first SERP (Google will start to show four ads at the top instead of three for “highly commercial” search terms) and that Product Listing Ads will gradually take over the SERP (PLAs are still allowed on the right-hand side).

The change has already happened.

Here’s a search for ‘london hotel’ carried out two days ago…

london hotel Google Search with right hand side ads

And here is the same search today…

london hotel Google Search

There are now four paid search results at the top, with nothing on the right. It looks oddly blank now and, worryingly, the entire above-the-fold space is filled with ads.

However, there is one slightly positive change: there are more organic results below the fold. In fact there are nine blue links and two news stories, which is an improvement. But this is still probably a case of ‘too little, too low-down’.

Google’s decision appears to be entirely commercially driven (it would be naive to think otherwise), but has Google gone too far in sacrificing the searcher’s experience?

Or will we eventually get to the point where the entire first SERP is filled with ads and we instinctively click straight to the second page, in the same way we skip past YouTube pre-rolls?

We asked some experts from the search community what they thought of the matter.

Thank you to Julia Logan (SEO consultant), Kevin Gibbons (Managing Director at BlueGlass), Sam Silverwood-Cope (CMO at Pi Datametrics) and Larry Kim (Founder of WordStream) for answering the following questions…

Why has Google decided to drop ads on the right hand side of search results? Is this a way to extract more revenue from top ads?

[Julia Logan] I would suppose so. Given the eye tracking studies, and reports of typical non-technical users hardly distinguishing between ads and organic results, this step blurs the line for such users even more – after all, sidebar ads stood out clearly as ads.

However, I was trying to look into the history of sidebar ads and found this article proving this is not their first attempt to ditch sidebar ads, although the previous one did not involve increasing the number of ads above the organic results.

[Kevin Gibbons] The obvious answer is revenue and I’m sure that is a big factor of course. But I think it’s likely to be a balance between this, and a more modern, perhaps centered, search experience which reflects mobile vs. desktop and tablet results. Ultimately changes like this have to be beneficial to the search experience, otherwise Google ends up chasing short-term revenue instead of long-term market share.

[Sam Silverwood-Cope] My Dad asked me the other day, “How come Google is free?” Well Dad, this is how it makes money. If some people don’t realise the top advert spots are actually advertising (like my dad), I think most are aware that the right-hand side are paid positions. Most people do not click on PPC ads for general searches.

So, not content with the existing two or three adverts, plus the Google Shopping results, plus any other self-promoting comparison widget it puts up, Google in its wisdom has decided to expand the real estate of PPC in the main bulk of the SERPs, at the cost of an organic spot.

new york flights Google Search

What does this do for organic search? What should site owners and SEOs do in response?

[Julia Logan] We could of course panic and bemoan the death of above-the-fold organic SERPs but this may not necessarily be the case. With the rise of adblockers, whatever anybody is doing with their ads can potentially become irrelevant.

Assuming the worst case scenario, site owners and SEOs should do what they have always been doing – compete against paid ads. If you rank for a commercially meaningful keyword, make sure you do everything in your power to make your organic listing stand out – metatag optimisation (yes I do realise this is 2016 now), Schema and other options suitable for your particular site. Ads will evolve, becoming more interactive and visually attractive – this means you should not be left behind.
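To illustrate the Schema suggestion, here is a minimal sketch of schema.org markup in JSON-LD form, of the kind that can help an organic listing stand out. The organisation name and URLs below are placeholder values, and the properties you mark up should of course reflect your own site.

```html
<!-- schema.org Organization markup, embedded in the page <head> as JSON-LD -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Widgets Ltd",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://twitter.com/examplewidgets",
    "https://www.facebook.com/examplewidgets"
  ]
}
</script>
```

Google’s testing tools can validate markup like this, and richer types (Product, Review, BreadcrumbList and so on) feed the enhanced snippets that help a listing compete visually with the ads above it.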

[Kevin Gibbons] My advice is to aim high. We’ve definitely seen a significant shift in first page clickthrough rates over the last couple of years, especially in organic search. Ranking on page one is often not good enough anymore and every term is different, but I’d recommend that you really should be aiming for the top three now, otherwise there’s likely to be a big drop-off in clickthrough rates.

Also, become the brand that people think of before they even get to typing a query into Google. Whether it’s paid listings, competitors, vertical search or anything else that may get in the way of potential customers visiting your site, try to make sure they get to you first and then remember who you are, so that they come straight back the next time.

[Sam Silverwood-Cope] Despite my moaning, and hankering for the good old times, I think it makes things quite interesting for SEO. The additional PPC spot is supposed to be for premium terms (for now). These terms are highly expensive per click, so it’s up to the vendor to decide whether the top spot is worth it, or whether it would be an interesting bidding war to lose and then to vie for the organic spot above the fold.

A good strategy would be to push organic and take a lower PPC position. With the right tracking tool, alerts can be used on organic positions to react accordingly for the bidding. This blended search approach will be won by the most competent, well equipped digital teams.

Does this improve or harm the user experience?

[Julia Logan] If the user’s goal is to find whatever they are looking for, the answer will largely depend on whether the AdWords algorithm is better than the organic algorithm, and also whether businesses spend money on ads thoughtlessly and run ads with poor targeting.

[Kevin Gibbons] The jury’s out on this one; the negative could be that searchers want to see the natural listings rather than too many ads at the top, and the positive could be a cleaner layout and improved experience. I have to try and look at this from a non-SEO perspective, and as much as I’d like to see the organic results as high as possible, if I’m honest I think the new layout might improve the experience.

I would add that this isn’t an overnight change and I don’t expect this to be the last experiment we see either.

[Sam Silverwood-Cope] Like a young rock band committed to making quality music and “never selling out” then chasing the mainstream buck with the third album, Google doesn’t seem to prioritise its legacy of organic quality any more.

“In Google we Trust” meant we used this superb search engine over the basic or ad-burdened competition. Too many adverts, and especially poor adverts, will eventually turn the user off. But this will only happen when there is a competent, viable competitor.

And finally let’s hear from Larry Kim, who offers the following optimistic advice to Paid Search marketers…

I had a good chuckle reading some of the doomsday predictions this morning.

We did some actual analysis here and what I can tell you is that side ads and bottom ads account for 14.6% of total click volume (this is looking across thousands of accounts). Keep in mind that bottom-of-page ads aren’t going away. So, for starters, we’re talking less than 14.6% of clicks impacted by the change.


Now, those “lost” impressions and clicks can be more than made up for by A) the addition of the new fourth ad spot, B) the fact that 78% of SERPs have fewer than four ads above the organic results (there’s plenty of room for that to go up) and C) the addition of up to four ads below the organic search results. It’s as if we just reorganized the naming of ad positions.

As a result, I see no impact on AdWords auction dynamics (clicks, impressions, CPCs, etc.). The only ‘loser’ is organic search, which is completely gone from the above-the-fold space on desktop for any commercial query.

There are also incremental benefits to paid search from the change, for example, now all ads can use call-out extensions, sitelink extensions, location extensions, etc., which were previously only a benefit of top-of-page ads. And the ads appear ‘more native’ which may have additional benefits.

In quantifying the impact of this, I should also add that the change is for desktop only, which accounts for less than half of searches. So we’re talking 14.6%/2 = 7.3% of queries impacted.

Basically, keep calm. This is a net positive for paid desktop search.

The article Google kills Right Hand Side Ads: what does this mean for marketers and users? was first seen from