Changing User Intents
Google’s search quality rater document highlights how the intent of searches can change over time for a specific keyword.
A generic search for [iPhone] is likely to be related to the most recent model. A search for [President Bush] likely referred to the 41st president until his son was elected, after which it most likely referred to the 43rd.
Faster Ranking Shifts
About 17 years ago, when Google was young, they did monthly updates in which most ranking signal shifts were folded into the rankings all at once. The web today moves much faster in terms of the rate of change, the amount of news consumption, increasing political polarization, social media channels that amplify outrage & how quickly any cultural snippet can be taken out of context.
Yesterday President Trump had some interesting things to say about bleach. In spite of there being an anime series of the same name, news coverage of the presser has driven great interest in the topic.
And that interest is already folded into the organic search results through Google News insertion, Twitter tweet insertion, and the query deserves freshness (QDF) algorithm driving insertion of news stories in other organic search ranking slots.
If a lot of people are searching for something and many trusted news organizations are publishing information about a topic then there is little risk in folding fresh information into the result set.
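The "query deserves freshness" idea described above can be sketched as a toy scoring blend: when query volume spikes well above its baseline, weight fresh documents more heavily. All thresholds, weights & the function itself are illustrative assumptions, not Google's actual algorithm.

```python
def blended_score(static_score, freshness_score, query_volume, baseline_volume,
                  spike_threshold=3.0, fresh_weight=0.6):
    """Toy QDF-style blend: mix a document's static relevance score with a
    freshness signal, leaning on freshness only when the query is spiking.
    All parameter values are made-up illustrative assumptions."""
    spiking = query_volume >= spike_threshold * baseline_volume
    # Small freshness weight normally; much larger weight during a spike.
    w = fresh_weight if spiking else 0.1
    return (1 - w) * static_score + w * freshness_score

# A stale-but-authoritative page vs. a fresh news story, during a spike:
spike = blended_score(static_score=0.8, freshness_score=0.9,
                      query_volume=100, baseline_volume=10)
calm = blended_score(static_score=0.8, freshness_score=0.9,
                     query_volume=10, baseline_volume=10)
```

Under this toy model the same fresh document scores higher while the query is spiking than during normal volume, which is the behavior the paragraph above describes.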
Temporary Versus Permanent Change
When the intent of a keyword changes sometimes the change is transitory & sometimes it is not.
One of the most common ad-driven business models online is to take something that was once paid, make it free, and then layer ads or some other premium features on top to monetize a different part of the value chain. TripAdvisor democratized hotel reviews. Zillow made foreclosure information easily accessible for free, etc.
The success of remote working & communication services like Skype, Zoom, Basecamp, Slack, Trello, and the ongoing remote work experiment the world is going through will permanently change some consumer behaviors & how businesses operate.
A Pew survey found 43% of Americans said someone in their household had recently lost a job, had their hours reduced, and/or taken a pay cut. Hundreds of thousands of people are applying to work in Amazon’s grueling fulfillment centers.
To many of these people a lone wolf online job would be a dream come true.
If you had a two hour daily commute and were just as efficient working at home most days would you be in a rush to head back to the office?
How many former full-time employees are going to become freelancers building their own small businesses they work on directly, while augmenting them with platform work on services like Uber, Lyft, DoorDash, Upwork, Fiverr, 99 Designs, or even influencer platforms like Intellifluence?
If big publishers are getting disintermediated by monopoly platforms & ad networks are offering crumbs of crumbs, there’s no harm in selling custom ads directly, or in having your early publishing efforts subsidized through custom side deals as you build market awareness and invest in building other products and services to sell.
As technology improves, we spend more time online, more activities happen online, and more work becomes remote. All this leads to the distinction between online and offline losing meaning other than perhaps in terms of cost structure & likelihood of bankruptcy.
Before Panda / After Panda
Before the Panda update, each additional page created was another lotto ticket and a chance to win. If users had a crappy experience on a page or site maybe you didn’t make the sale, but if the goal of the page was content so crappy that the ads looked appealing by comparison, that could lead to fantastic monetization while it lasted.
That strategy worked well for eHow, fueling the pump-n-dump Demand Media IPO.
Demand Media had to analyze eHow and pay to delete over a million articles which they deemed to have a negative economic value in the post-Panda world.
After the Panda update, having many thin pages lying around and creating more thin pages was layering risk on top of risk. It made sense to shift to a smaller, tighter, deeper & more differentiated publishing model.
Entropy & Decay
The web goes through a constant state of reinvention.
Old YouTube Flash embeds break.
HTTP content calls in sites that were upgraded to HTTPS break.
Software which is not updated has security exploits.
If you have a large website & do not regularly audit where you are linking, your site is almost certainly linking to porn and malware sites somewhere.
As users shifted to mobile, websites that ignored mobile interfaces became relatively less appealing.
Changing web browser behaviors can break website logins and how data is shared across websites dependent on third party services.
Ads eat a growing share of real estate on dominant platforms while organic reach slides.
Everything on the web is constantly dying as competition improves, technology changes and language gets redefined.
Even if a change in user intent is transitory, in some cases it can make sense to re-work a page to address a sudden surge of interest to improve time on site & user engagement metrics, and to make the content on your page more citation-worthy. If news writers are still chasing a trend, an in-depth background piece gives them something they may want to link at.
Since the COVID-19 implosion of the global economy, I’ve seen two different clients get a sudden surge in traffic that would make little to no sense unless one considered currently spreading news stories.
News coverage creates interest in topics, shapes perspectives of topics, and creates demand for solutions.
If you read the right people on Twitter sometimes you can be days, weeks or even months ahead of the broader news narrative. Some people are great at spotting the second, third and fourth order effects of changes. You can spot stories bubbling up and participate in the trends.
An Accelerating Rate of Change
When the web was slower & easier you could find an affiliate niche and succeed in it, sometimes for years, before solid competition arrived. One of the things that most floored me this year from a marketing perspective was how quickly spammers ramped up a full-court press amplifying the fear the news media was pitching. I get something like a hundred spam emails a day pitching facemasks and other COVID-19 solutions. I probably see 50+ other daily ads from services like Outbrain & similar.
The web moves so much faster that the SEC is already taking COVID-19 related actions against dozens of companies. Google banned advertising protective masks and recently announced they are rolling out advertiser ID verification to increase transparency.
If Google is viewing their advertisers with greater suspicion even heading into an economic downturn, when Expedia is pulling $4 billion from its ad budget, Amazon is cutting back on its Google ad spend & Google itself decides to freeze hiring, then it makes far more sense to keep reinvesting in improving any page which is getting a solid stream of organic search traffic.
After Amazon cut their Google ad budget in March, Google decided to expand Google Shopping to include free listings. When any of the platforms is losing badly in a category, they can afford to subsidize that area and operate it at a loss to try to gain market share while making the dominant player in that category look more extreme.
When a player is dominant in a category they can squeeze down on partners. Amazon once again cut affiliate payouts and the Wall Street Journal published an article citing 20 current and former Amazon insiders who stated Amazon uses third party merchant sales data to determine which products to clone:
Amazon employees accessed documents and data about a bestselling car-trunk organizer sold by a third-party vendor. The information included total sales, how much the vendor paid Amazon for marketing and shipping, and how much Amazon made on each sale. Amazon’s private-label arm later introduced its own car-trunk organizers. … Amazon’s private-label business encompasses more than 45 brands with some 243,000 products, from AmazonBasics batteries to Stone & Beam furniture. Amazon says those brands account for 1% of its $158 billion in annual retail sales, not counting Amazon’s devices such as its Echo speakers, Kindle e-readers and Ring doorbell cameras.
Amazon does not even need to sell their private label products to shift their economics. As Amazon clones products, they force companies into branded ad buys just to show up for their own branded terms, taking another bite out of the partner: “Fortem spends as much as $60,000 a month on Amazon advertisements for its items to come up at the top of searches, said Mr. Maslakou.”
Amazon has grown so dominant they’ve not only cut their affiliate & search advertising while hiring hundreds of thousands of employees, but they’ve also dramatically slowed down shipping times while pulling back on their on-site “people also purchase” promotions to get users to order less.
Multiple Ways to Improve
If a page which ranks gets a sudden spike in traffic, it makes a lot of sense to look at current news & ask whether the intent of the searcher has changed. If it has, address it as best you can in the most relevant way possible, even if the change is temporary, then consider switching back to the old version of the page or reorganizing your content if/when/as the trend passes.
One of the pages mentioned above was a pre-Panda “me too” type page which was suddenly flooded with thousands of visitors. A quality inbound link can easily cost $100 to multiples of that. If a page is already getting thousands of visitors, why not invest a couple hundred dollars into dramatically improving it, knowing that some of those drive-by users will likely eventually share it? Make the page an in-depth guide with great graphics and some of those tens of thousands of visitors will eventually link to it, as they were already interested in the topic, the page already gets a great stream of traffic, and the content quality is solid.
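The back-of-envelope case for upgrading an already-trafficked page can be made explicit. Every number below is an illustrative assumption (visitor count, link rate, link cost, upgrade cost), not data from any real campaign.

```python
# Toy ROI sketch for improving a page that already gets a traffic surge.
# All figures are made-up illustrative assumptions.
visitors = 10_000       # assumed surge visitors hitting the page
link_rate = 0.001       # assumed share of visitors who eventually link to it
link_value = 100        # assumed market cost of one quality inbound link ($)
upgrade_cost = 200      # assumed one-time cost to overhaul the page ($)

expected_links = visitors * link_rate          # links the upgrade might earn
expected_value = expected_links * link_value   # their equivalent market cost
roi = expected_value / upgrade_cost            # value returned per $ spent
```

Even with a tiny assumed link rate, the implied value of earned links dwarfs the one-time content investment, which is the intuition behind improving pages that already see traffic rather than buying links.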
Last week a client had a big spike from a news topic that changed the intent of a keyword. Their time on site from those visitors was under a minute. After the page was re-created to reflect changing consumer intent their time on site jumped to over 3 minutes for users entering that page. Those users had a far lower bounce rate, a far better user experience, are going to be more likely to trust the site enough to seek it out again, and this sends a signal to Google that the site is still maintained & relevant to the modern search market.
There are many ways to chase the traffic stream:
- create new content on new pages
- gut the old page & publish entirely new content
- re-arrange the old page while publishing new relevant breaking news at the top
In general I think the third option is often the best approach because you are aligning the page which already sees the traffic stream with the content they are looking for, while also ensuring any users from the prior intent can still access what they are looking for.
If the trend is huge, or the change in intent is permanent, then you could also move the old content to an archived legacy URL while making the high-traffic page focus on the spiking news topic.
The above advice applies to pages which rank for keywords that change in intent, but it can also apply to any web page which has a strong flow of user traffic. Keep improving the things people see most because improvements there have the biggest returns. How can you make a page deeper, better, more differentiated from the rest of the web?
Does Usage Data Matter?
Objectively, if people visit your website and do not find what they were looking for they are going to click the back button and be done with you.
Outdated content that has become irrelevant due to changing user tastes is only marginally better than outright spam.
While Google suggests they largely do not use bounce rate or user data in their rankings, they have also claimed end user data was the best way they could determine if the user was satisfied with a particular search result. Five years ago Bill Slawski wrote a blog post about long clicks which quoted Steven Levy’s In The Plex book:
“On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the “Long Click” — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query.”
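The "long click" idea in the quoted passage can be illustrated with a toy classifier that labels a search click by dwell time and whether the user bounced back to the results page. The function, labels & the 60-second cutoff are all arbitrary illustrative assumptions, not a description of Google's actual signal.

```python
def click_type(dwell_seconds, returned_to_results):
    """Toy 'long click' classifier. Thresholds and labels are made-up
    illustrative assumptions, not Google's real logic."""
    if not returned_to_results:
        return "long"    # user never came back: query likely satisfied
    if dwell_seconds < 60:
        return "short"   # quick bounce back: result likely unsatisfying
    return "medium"      # stayed a while, but kept searching

# A fast bounce, a satisfied visit, and something in between:
examples = [click_type(5, True), click_type(300, False), click_type(120, True)]
```

Under this sketch, the client result described above (time on page rising from under one minute to over three) would move those visits from the "short" bucket toward the satisfied end of the scale.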
Think of how many people use the Chrome web browser or have Android tracking devices on them all hours of the day. There is no way Google would be able to track those billions of users every single day without finding a whole lot of signal in the noise.