The Rise and Fall of Authorship - Travel Content Marketing

What does the rise & fall of Google Authorship mean for content marketing?

Google giveth and Google taketh away (usually to the sound of a million complaining SEOs) with the latest upset being the once hyped, now unceremoniously dumped Google Authorship.


Remember this?

Authorship was, in a nutshell, a mechanism to connect a piece of content with its author. Its most visible features were the thumbnail images and Google+ links you saw with increasing frequency alongside search results.

Authorship was the subject of much interest and speculation over recent years (including by yours truly) as it seemed to signify a major innovation to search and SEO: that individual authors might have a quantifiable authority which could help determine the credibility, and therefore search rankings, for content they had created – think AuthorRank in contrast to the long-established PageRank.

But recent developments hinted at a reversal of these ambitious plans. Back in December SEOs noticed a sudden drop in the number of Authorship photos shown in the results; in June the photos disappeared altogether, leaving only the author’s Google+ byline, which has itself now vanished without a trace. In an August 29 post, senior Googler John Mueller confirmed that Authorship was dead.

The initial reaction from the SEO community was mixed, ranging from anger to weary resignation to more than a few wry smiles among the savvier crowd.


The fact is that Authorship was a flawed product, for several reasons. It was based on clunky, unfriendly markup that only the more technically minded bloggers ever bothered with, while major media outlets and other publishers of objectively high-authority content rarely made the effort.

Search queries on SEO or online marketing were sure to yield ten results perfectly optimised for Authorship photos, but I doubt that many particle physicists bothered fiddling around with their Google+ profiles and coding the required rel=author markup onto their research papers.
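For context, the markup in question was a two-way handshake: a rel=author link from the page to the writer’s Google+ profile, matched by a “Contributor to” link from that profile back to the publishing site. A minimal sketch of the on-page half (the profile ID and author name below are placeholders):

```html
<!-- On the article page: link the byline to the author's Google+ profile.
     The numeric profile ID is a placeholder. -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Jane Doe</a>

<!-- Alternative: a site-wide link element placed in the page head. -->
<link rel="author" href="https://plus.google.com/112345678901234567890">
```

The other half of the verification lived on the Google+ profile itself, where the author had to list the publishing site under “Contributor to”. Getting both ends right was exactly the kind of fiddly, opt-in process that kept adoption low.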

It was suggested that the profile images were nothing more than an incentive to get people using Google+, and a great many users ended up just chasing the profile picture rather than thinking about what might actually enhance their authority or credibility – “Real Author Stuff” as Joel Klettke brilliantly called it, shortly before the feature was pulled.

Neither did it seem to work out too well for Google. Apparently its usefulness to users didn’t justify the development time and computing resources it consumed and, as the more astute among us have suggested, higher click rates on organic results mean lower click rates on paid Adwords listings, so feel free to connect those particular dots…

In short, Authorship was broken and was a prime candidate for the Google glue factory. But despite the inevitable upset from SEOs who had spent many hours getting their clients’ sites optimised, is Authorship’s demise really as significant as it seems?

Babies and bathwater

It’s important to make the distinction between Authorship and the concept of AuthorRank itself. The AgentRank patent predated Authorship by many years and although those little profile pictures have now vanished, the emphasis on authority remains: you can bet that Google is still deeply interested in understanding the credibility of individual authors.  As Danny Sullivan points out here, Google continues to talk about identifying subject-specific authorities, and is already promoting authority content creators in certain instances.

Of course killing off Authorship doesn’t mean that Google is chucking the AuthorRank baby out with the bathwater. It’s much more likely that they’ve developed better technologies to parse authors and algorithmically determine their authority in a particular field without relying on messy user opt-in and inconvenient markups.

Meanwhile all the original principles stand: credibility and authority remain absolutely central to SEO longevity. For content marketing this means a relentless focus on publishing material that is demonstrably high value: original, engaging and authentic. Not many brands have the internal resources to achieve this on their own, in which case build relationships with, and commission work from, the writers, bloggers and other content creators who have the legitimacy you’re seeking to project.

This goes way beyond the relatively narrow (and still largely hypothetical) subject of AuthorRank and into general best practices for effective content marketing: less focus on fads and gimmicks, more attention on providing authoritative content that audiences need and want.

Smarter search retargeting to support your content efforts

Even if you’ve never heard the phrase “retargeting,” chances are you’ve seen it in action. Ever notice how after you visit a site, the web suddenly seems to be filled with ads for that same site?

That’s retargeting (“remarketing” to Google) in action, and usually, if you’ve noticed it happening, it means the advertiser is doing it wrong. Done right, it’s a subtle but effective way of keeping people engaged with your brand once they’ve visited your site, holding your product at the back of their minds, and helping ensure an eventual conversion, lead or booking.

It can take a travel consumer up to 46 website visits[ref]Travel Content Journey: Expedia Media Solutions, November 2013[/ref] over weeks of research, review searching and price comparison before they make a booking, and retargeting gives brands an opportunity to stay relevant throughout that customer journey.

This is especially powerful in the content marketing context: retargeting gives you the ability to tailor specific ads to individual lists of customers. For instance, people who have read or downloaded a certain piece of content can be shown ads that gently encourage them to revisit your website at a later stage and make a booking.

The problem is that this process of list building (segmenting customer groups and determining which ads to show them, how frequently, and until what cut-off point) has tended to be laborious and tricky to get right. Showing ads too infrequently is a waste of time, while overdoing it risks upsetting some of your most valuable prospects by giving them that creepy feeling of being followed and watched.

Fortunately “Smart Lists,” a new feature launched last week, makes it possible to automate list building and use Google’s technology to optimise ad display and rotation for each list.

Or, in Google’s own words:

Smart Lists are built using machine learning across the millions of Google Analytics websites which have opted in to share anonymized conversion data, using dozens of signals like visit duration, page depth, location, device, referrer, and browser to predict which of your users are most likely to convert during a later visit.

Based on their on-site actions, Analytics is able to calibrate your remarketing campaigns to align with each user’s value.

It’s likely that this feature will be of most use to advertisers running smaller paid search and retargeting campaigns who don’t have the resources or skills to optimise them in-house. It’s also important to note that Google is using transaction data (i.e. ecommerce sales, not regular conversions) from millions of other Analytics profiles to optimise lists and campaigns, which will likely skew the feature’s benefits in favour of ecommerce sites.

That caveat notwithstanding it’s still worth investigating for any advertiser that is investing time and money on traffic generation, especially with content campaigns. The opportunity to close the loop on otherwise “lost” traffic and bring people back to your site should not be ignored, particularly when Google is making it even easier to implement.

Give us a call if you want to learn more about how retargeting can support your content marketing efforts.

Further reading: 

Smarter remarketing with Google Analytics: Google Analytics Blog, 9 April 2014

Google Analytics Adds “Smart Lists” To Automate Remarketing List Optimization: Search Engine Land, 11 April 2014

Screw tightened on guest posting, vocal complaints but no big surprises

Guest posting as an SEO tactic is back under the spotlight, with Google making big waves in the industry and leaving the more unimaginative marketers among us wondering whether they’ll be left with any viable link building techniques in the near future.

The most recent flare-up came last month with the announcement that Matt Cutts and Google’s search quality team had imposed a manual penalty on My Blog Guest, a well-known guest posting service. The site has been wiped from the search results, as has any other site that used the service to publish guest posts. According to reports, the site had 73,000 users in 2013, with an average of 256 articles posted per day. That’s a lot of manual penalties and a lot of lost rankings.

Cue frenetic and very vocal complaints from across the SEO world, while certain sager voices suggested that fair warning had been given back in January and that it shouldn’t come as much surprise that any mechanism for the mass distribution of content in exchange for links would eventually come under scrutiny.

There has been a lot of nitpicking over no-follows and the definition of a “guest posting network vs. a community”, but the fundamental point is that guest blogging (giving a piece of content to a third-party website in exchange for a link to your own site) has rapidly become the most recent SEO technique to be abused and devalued by the industry itself. As soon as a critical mass of practitioners attempts to scale any link building technique, it’s only a matter of time before Google has to crack down to prevent manipulation of its search results and a worsening of search quality for its users.

MBG and others were a manifestation of that process: a mechanism for thousands of people to publish free content and build cheap links. The fact that the content was probably (we don’t know for sure, never having used the platform) average-to-low quality, and that MBG required do-follow (equity-passing) links, only worsened their case.

All in all the only surprise here is how surprised everyone acted when the whole thing blew up. The writing has been on the wall for months and it’s yet another vindication for those of us who eschew any attempt at mass produced link building.

The real issue is not whether this is “fair” or what it means for “link building” (the short answer is to stop trying to “build” links and earn them instead.) The interesting point is how Google went about this most recent action.

Spammy link building tactics were firmly on the radar long before the first Penguin update in April 2012. That was an algorithmic development to identify and devalue shoddy links across the entire index whereas these latest moves have been manual actions against specific sites, and high profile ones at that. The problem for Google is that they can’t algorithmically identify a “guest post” as such, which means they’re forced to manually take out the most visible content syndicators and guest post networks instead, possibly even just to send a message to the industry.

This should come as some relief for the significant number of SEO practitioners (and brands) that still rely on similar techniques for legitimate purposes.  The principle of publishing a piece of content on another site is still perfectly viable: it all depends on how you’re doing it, and for what purpose.  If your exclusive, or even primary, goal is simply to build links then you’re going to wind up in trouble.  If you’re trying to connect with audiences and build your brand in appropriate venues with appropriate content, then you have nothing to worry about.

Further reading: 

What You Should and Should NOT Be Using Guest Blogging for: SEO Theory, 19 March 2014

Building Relationships, Not Links: Why Guest Blogging Will Never Die: Clickz.com, 2 April 2014

 

The Facebook free ride is over (but did it ever really exist?)

ValleyWag’s Sam Biddle reported this week that Facebook intends to tighten the screw even further on brands that use the platform for promotional purposes, effectively killing the notion that the social media site can be used for free to reach fans and followers.

If Biddle’s (anonymous) source is correct this will probably be the final nail in the coffin for free Facebook promotion, the moment that the social media behemoth finally declares that if brands want to reach their followers, they’re going to have to pay for the privilege. Biddle writes:

“…The social network is “in the process of” slashing “organic page reach” down to 1 or 2 percent. This would affect “all brands”—meaning an advertising giant like Nike, which has spent a great deal of internet effort collecting over 16 million Facebook likes, would only be able to affect around 160,000 of them when it pushes out a post. Companies like Gawker, too, rely on gratis Facebook propagation for a huge amount of their audience. Companies on Facebook will have to pay or be pointless.”

This isn’t a particularly new issue. There has been a steady drip of stories over recent months on the changing relationship between brands and their Facebook followers, most notably the December announcement of an algorithm change that smashed brand visibility in the News Feed with an average 44% decline in reach for (unpaid) promotional posts.

There was some respite for brands last month with announcements of yet another tweak that would increase the visibility of brand posts that also tagged other relevant brands, personalities or entities. This certainly has some value for curators but can hardly reverse the long-term trend of reducing free exposure in favour of pushing brands towards paid promotion and ads.

So is the free ride over? Well yes, but a better question would be: did the free ride ever exist? The idea that a brand with x number of followers could reach all (or even a significant chunk) of those people with a post was entirely fictitious from the start, and it probably went a long way toward convincing everyone from small business owners to C-level executives of the need to divert their precious time and money into developing and maintaining a Facebook “presence” in the first place.

The fact is that Facebook’s “Edge Rank” algorithm has always stacked the odds against promotional and branded messages, because they’ve known (rightly) that users don’t want to be bombarded with ads when they’re squandering their downtime with their virtual friends. Only the most successful of Facebook entities have been able to surf the Edge Rank wave with any significant degree of reach, and doing so demands serious resources in content creation and creativity.  To use Biddle’s example, with collateral like this Nike probably did ok. Your local Italian restaurant with its 400 followers was always posting in the dark to virtually no-one.  This was long before the recent changes that have upset so many corporate social media types.

There’s Always a But…

On the other hand, perhaps there’s a real silver lining to this largely fictitious cloud. A lot of the impetus behind these moves has been a drive towards promoting the kinds of quality content that users presumably want to see in their News Feeds (ok, the extra ad revenue doesn’t hurt either.)


“Quality” is a recurring theme across all digital channels these days and it seems like Facebook is also jumping on the bandwagon.

Facebook intends to cut down on News Feed clutter such as (unpaid) brand promotions, click-bait memes and low quality articles. In their place they’re promoting in-depth and authority articles, news content, as well as links to related content elsewhere.

This all fits with the overarching desire for Facebook to become the main “hub” from which its users discover and share the content that matters to them. Fluff, junk and too many promotional messages clearly do not fit within this strategic objective.

Brands that want to achieve real reach on Facebook now face a choice.  They can continue to push their promotional messages out to their fans and be willing to pay good money to make sure they’re seen.  Advertisers are already given a reasonable suite of analytics tools and you can expect these to improve over time. (Although there are VERY ominous questions being raised about the quality and veracity of paid clicks on the site.)

The alternative is to consider how brands might reach their intended audiences organically, i.e. by users sharing their content to other users without overly relying on ad spend.

This route takes a lot more thought, creativity and resources to pump-prime the kinds of content and engagement that work (see the comments on Edge Rank above). But companies that are truly able to identify and fulfil the needs of their target audience might find that this slow but inexorable march towards quality, depth and authority will start to pay off.

Will small travel sites benefit from a “softer” Panda?

News that Google is planning on “softening” its Panda algorithm has been met with cautious optimism that some of the intense pressure on small travel sites in the SEO battle may soon be alleviated.

Matt Cutts, Google’s chief anti-spam engineer, confirmed at the SMX conference last week that his team is looking at changes that would help smaller businesses and websites in the race for rankings.

Since its release in early 2011, the Panda algorithm and its subsequent updates have ruthlessly targeted sites perceived as publishing “low quality” content. The initial updates were successful in removing objectively low quality sites such as content farms and spam directories from the results but a growing criticism has been the way that smaller and lower authority yet still legitimate sites have also been airbrushed from the results in favour of much bigger and more established brands.

In the travel space this has meant that search queries for hotel, package, tour and activity keywords, which used to return fairly diverse results from a variety of local, independent and other smaller sites, are now thoroughly dominated by a handful of the big beasts: TripAdvisor, Expedia, Orbitz and other major brands.

In this 2013 survey on Tnooz, more than a dozen independent websites reported between 20% and 70% drops in traffic from Google, over a time period roughly corresponding with the Panda updates.

Although speculation abounds that this wasn’t accidental (pressure on organic results drives Google’s Adwords revenue), it’s also possible that smaller sites are simply more likely to fall short on the factors Panda rewards: user experience, reader engagement, site stickiness, well-organised site hierarchies and architecture, and rock-solid, high-value content. Getting this right takes investment, time and expertise, putting smaller sites at an immediate disadvantage.

That many of the bigger brands have consolidated their positions without paying much heed to these factors is an example of the underlying unfairness of the SEO space.

But Google doesn’t care about fair, it cares about delivering the best search experience for its users. And it turns out that consumers might actually value some diversity and independence in their search results as opposed to a monoculture of big brands, hence the announcement of a new, cuddlier, softer Panda.

So far no details have been offered on the specifics of the changes or the timing of their release. However in a series of tweets last year, Cutts requested “concrete examples of small sites/mom-n-pop sites” that were being unfairly excluded from the rankings.  The feedback form doesn’t give much away but Cutts’ language might: “mom-n-pop” suggests they’re looking at the extreme long tail of sites, most likely services that should be targeting local search queries to local consumers.

It remains to be seen what benefits, if any, this would bring to sites in the various travel niches, and with increasing pressure on organic search across the board this could all be too little too late. But the simple fact that the issue of diversity is being addressed from a user experience perspective could still be a promising sign.

Further Reading:

After Google Panda, Standout Content is Essential: Search Engine Watch, July 1, 2013 (A very good analysis on how the Panda algorithm evaluates content quality and makes decisions on “good” or “bad” sites.)