If you’re looking at a decrease in traffic since the Medic update, you may have heard that E-A-T is the fix to focus on. However, as Dan Leibson proves today, sometimes all you need to ‘E-A-T’ is a sandwich.
There has been a lot going around about Google’s recent “Medic” update, and how it affects local search and local SEOs.
One of the main theories has been that the update was centered around E-A-T (expertise, authoritativeness, trustworthiness). While those are indeed all things that Google strives to rank, I want to direct everyone to this excellent piece by AJ Kohn, who points out that those are squishy concepts that machines have a hard time understanding (if they even can at all).
At the end of the day, Google is still a search engine run by machines that work on math, data and things that exist on websites right now. And as much as Googlers seem to want it to run on the hopes and dreams of the future of search, that isn’t a reality today.
So, what does all of this look like in the context of client work? I have gotten the opportunity to look at a few different sites hit by Medic and help to provide recovery services. I would bet on basic technical SEO and content strategy long before I would bet on machine learning, artificial intelligence and human rater guidelines for these sites.
That’s not to say that things like Natural Language Processing (NLP), machine learning/AI and Google’s Rater Guidelines aren’t important, just that they don’t fully explain what I have seen on these sites. They also aren’t the most logical explanations in these instances.
I should caveat the case studies below by saying I obviously can’t speak for every site on the internet that was affected by this, and would never presume to. They also don’t speak for the subset of sites that were impacted by Medic and then had that impact rolled back. So instead of more SEO theory, let’s dive into some examples.
Client One: Multi-location Medical Practice in California
Weak E-A-T, you say? The core practitioners regularly publish work in peer-reviewed journals.
How about their Better Business Bureau profile? Well, not really anything to see there (literally):
What they did have, though, was a number of typical SEO problems that have nothing to do with being experts at anything (other than perhaps SEO).
Some of the top issues affecting this site were:
- Multiple, conflicting robot declarations per page
- Heavily conflicting content targeting, particularly with legacy pages
- Poor canonical deployment
- Issues with Google’s ability to render page content
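To make the first issue concrete: a page can end up declaring both `index` and `noindex` (say, one directive hard-coded in the template and another injected by a plugin), leaving Google to pick a winner. Here is a minimal sketch, using only Python’s standard library, of how you might flag that during an audit — the example page and helper names are hypothetical, not from the client’s actual stack:

```python
from html.parser import HTMLParser


class RobotsMetaAudit(HTMLParser):
    """Collect every robots directive declared in <meta> tags on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Split "noindex, follow" into individual directives
            self.directives += [d.strip().lower()
                                for d in attrs.get("content", "").split(",")]


def conflicting_robots(html: str) -> bool:
    """True when a page declares both index/noindex or follow/nofollow."""
    parser = RobotsMetaAudit()
    parser.feed(html)
    seen = set(parser.directives)
    return ("index" in seen and "noindex" in seen) or \
           ("follow" in seen and "nofollow" in seen)


# Hypothetical page with directives from two different sources
page = """
<head>
  <meta name="robots" content="index, follow">
  <meta name="robots" content="noindex">  <!-- e.g. injected by a plugin -->
</head>
"""
print(conflicting_robots(page))  # True
```

Note this only covers meta tags; a full audit would also check the `X-Robots-Tag` HTTP header and robots.txt, since conflicts can span all three.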
These are all things that, during the course of a penalty recovery engagement, you would likely look at and surface as immediate, glaring problems.
So far, we have only been able to get this client to implement one low-level recommendation on internal links, and lo and behold, that subsection of pages (the blog) is starting to recover some traffic in a week where they didn’t do any publishing:
Client Two: Insurance Agent
Again, ouch! They’re probably really weak at the intersection of machine learning, natural language processing and the expansion of the link graph, right?
Nope. More like a toxic link profile…
This is one of the sites with the worst links we’ve seen, and the YoY traffic comparison pre- and post-Medic shows it.
This is what a huge chunk of their backlinks look like online (the link itself is the period after ‘illnesses’).
Another large chunk of the link profile has commercial anchor text (sorry, no screenshots of those).
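A crude way to spot this pattern is to tally the anchor-text distribution of a backlink export and see how much of it is commercial or junk. A minimal sketch — the backlink data below is made up for illustration, not the client’s real profile:

```python
from collections import Counter

# Hypothetical export: (anchor_text, source_url) pairs from a backlink tool
backlinks = [
    ("cheap car insurance", "http://spam-blog-1.example/post"),
    ("cheap car insurance", "http://spam-blog-2.example/post"),
    ("cheap car insurance", "http://spam-blog-3.example/post"),
    ("Acme Insurance", "https://local-news.example/story"),
    (".", "http://article-farm.example/illnesses"),  # punctuation-only anchor
]


def anchor_distribution(links):
    """Share of total backlinks held by each distinct anchor text."""
    counts = Counter(anchor for anchor, _ in links)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.most_common()}


for anchor, share in anchor_distribution(backlinks).items():
    print(f"{share:.0%}  {anchor!r}")
```

A natural link profile skews heavily toward branded and URL anchors; when one exact-match commercial phrase dominates the way it does here, that’s the kind of red flag you’d surface to the client.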
In addition, they also have several technical SEO/content issues:
- Blank CMS pages getting indexed because they are being linked to across the site
- Duplicate copy used on numerous pages
- And so on and so on
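The duplicate-copy issue in particular is easy to check for at scale: fingerprint the extracted body copy of each page and group URLs that collide. A minimal sketch, assuming you already have URL-to-text pairs from a crawl (the pages below are invented for illustration):

```python
import hashlib
from collections import defaultdict


def find_duplicate_copy(pages):
    """Group URLs whose normalized body copy is identical.

    `pages` maps URL -> extracted main-content text; a real audit
    would feed this from a crawler's rendered output.
    """
    groups = defaultdict(list)
    for url, text in pages.items():
        # Normalize whitespace and case so trivial differences don't hide dupes
        normalized = " ".join(text.lower().split())
        fingerprint = hashlib.sha256(normalized.encode()).hexdigest()
        groups[fingerprint].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]


pages = {
    "/auto-insurance": "Get a free quote today. We cover all major carriers.",
    "/home-insurance": "Get a free quote today.  we cover all major carriers.",
    "/contact": "Call us for a consultation.",
}
print(find_duplicate_copy(pages))  # [['/auto-insurance', '/home-insurance']]
```

Exact hashing only catches verbatim duplication; near-duplicate copy would need shingling or similar, but for templated CMS pages the exact check alone usually surfaces plenty.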
Again, after an algo update and during a penalty recovery engagement, this is all stuff that you would surface to the client as huge potential problems.
Dan, stop breaking me down. Build me back up!
No problem, that’s actually my goal, and pretty easy!
None of this stuff is new. It’s Google turning the same screws related to the link graph and content that they have been turning for years. Whether it’s Panda, Penguin, a core quality update, Fred, Medic or one of the hundreds of other updates they’ve rolled out in the past couple of years is kinda irrelevant; it’s the same pressures, spun out in new and exciting ways.
Google wants expert content.
Google wants authoritative and trustworthy sites.
And they want that in a way that can be understood by a machine.
But if you’ve been hit by Medic, before you panic about E-A-T, make sure you have the following bases covered:
- Have good technical SEO
- Make sure your information architecture is crawlable and indexable
- Make sure your pages render for GoogleBot (and users)
- Make sure you have a good link profile (whatever that means!)
- And if you really want to, work on making your content friendly to things like Natural Language Processing and the new realities of mathematical understandings of language. (But honestly, not even the top 1% of the internet is doing that right now.)
Keep Calm, and Local SEO Guide On!