Archive for Search Engine Optimization

Sep 18

How Often Should You Perform Technical Website Crawls for SEO? via @lefelstein

How Often Should You Perform Technical Website Crawls for SEO?

Any seasoned SEO professional will tell you how important website crawls are for maintaining strong technical SEO.

But here lie the better questions: how frequently should you perform website crawls?

And how often are SEO pros actually performing them?

In this post, we’ll discuss what SEO publications suggest as a “best practice” web crawling cadence and the actual rate SEO pros are performing them.

Then, I’ll demonstrate the benefits of a ramped-up web crawling cadence by sharing a case study from FOX.com.

What’s a Website Crawl, Anyway?

Using specialized tools such as Screaming Frog or DeepCrawl, you can take a look “under the hood” of a website – much like a mechanic does when inspecting cars.

But instead of inspecting the mechanical parts of a car, you are inspecting the optimizable elements of a website – including the quality of its metadata, XML sitemaps, response codes, and more.
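To make that analogy concrete, here’s a minimal sketch in Python (using the requests and BeautifulSoup libraries) of the kind of checks a dedicated crawler automates – fetching pages, recording response codes, and flagging missing metadata. The start URL and the specific checks are illustrative placeholders, not a replacement for a purpose-built tool like Screaming Frog or DeepCrawl.

```python
# A minimal sketch of the kind of checks a desktop crawler automates.
# The start URL and the checks themselves are illustrative placeholders.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START_URL = "https://www.example.com/"  # placeholder domain

def crawl(start_url, max_pages=50):
    seen, queue, issues = set(), [start_url], []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        resp = requests.get(url, timeout=10)
        # Flag non-200 responses (404s, 5xx, etc.).
        if resp.status_code != 200:
            issues.append((url, f"status {resp.status_code}"))
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        # Flag missing or empty <title> and meta description.
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        if not title:
            issues.append((url, "missing <title>"))
        if not soup.find("meta", attrs={"name": "description"}):
            issues.append((url, "missing meta description"))
        # Queue same-site links only.
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == urlparse(start_url).netloc:
                queue.append(link)
    return issues

if __name__ == "__main__":
    for url, problem in crawl(START_URL):
        print(url, "->", problem)
```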

When something isn’t working as expected in SEO, it’s up to you to diagnose the problem and find the solution to fix it.

Example of the Screaming Frog SEO Spider UI.

What Are the Industry Experts Saying About Website Crawl Cadence ‘Best Practice’?

Industry publications seem to be in agreement that “mini” technical audits should be conducted on a monthly basis and “in-depth” technical audits should be conducted on a quarterly or semi-quarterly basis.

However, there is little chatter specifically discussing the “optimal frequency” for performing website crawls:

  • “Part of your ongoing SEO strategy includes regular audits to allow you to find and fix issues quickly (we recommend quarterly).” – Erika Varagouli, SEMrush
  • “It’s good practice to do an automated scan once a month. This will be often enough to bring up major issues, like any on-page errors in the form of broken links, page titles and meta-data or duplicate content.” – Digital Marketing Institute
  • “I perform an SEO audit for my clients the first month, monthly (mini-audit), and quarterly (in-depth audit).” – Anna Crowe, Search Engine Journal

A website crawl and technical audit are not the same thing (thank you for the clear separation of the two, Barry Adams).

However, it’s fair to assume these publications would recommend running a website crawl at least as often as the monthly mini-audits they advocate.

What I Learned About SEO Crawl Cadence in ‘Actual Practice’

Best practice is one thing.

But how often do SEO pros run website crawls for client sites in actual practice?

To get an idea, I took to Twitter. (Yes, Twitter polls do have their obvious limitations – but it’s one of the simplest means to get some tangible data.)

Three days, and nearly 2,000 votes later, the results were in:

 

Approximately 57% of SEO pros who participated in my poll fell into the “monthly or longer” bucket, while 43% fell into the “weekly” or shorter bucket.

In other words, we were all over the map.

You may not be too surprised by these poll results.

After all, both ends of the spectrum could make complete sense – depending on the type and size of the websites you are managing.

That said, I had an experience at FOX two months ago that made me thankful that we run weekly website crawls across our major domains.

I’d like to share it with you all here – in case it encourages you to increase the cadence of your technical website crawls.

How Weekly Website Crawls Helped FOX Take Swift Action on an allRoutes.json Bug

In late July 2020, the FOX SEO team ran a routine weekly website crawl of FOX.com and discovered that 100% of our TV episode pages were unexpectedly serving error status codes (due to a bug with the allRoutes.json file).

Although the pages displayed fine for users, they were throwing 404s to Googlebot – making them ineligible to appear in Google search results.
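One way to catch this class of problem without waiting on a scheduled crawl is to request the affected page template with different user-agent strings and compare what comes back. Below is a hedged sketch of that check; the URLs are placeholders rather than FOX’s actual pages, and note that sites which verify the real Googlebot by IP address may not show the discrepancy to a spoofed user agent at all.

```python
# Sketch: compare status codes served to a browser-like user agent vs. a
# Googlebot user agent. URLs below are placeholders, not FOX's actual pages.
import requests

URLS = [
    "https://www.example.com/watch/some-episode/",      # placeholder
    "https://www.example.com/watch/another-episode/",   # placeholder
]

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for url in URLS:
    codes = {}
    for label, ua in USER_AGENTS.items():
        resp = requests.get(url, headers={"User-Agent": ua}, timeout=10, allow_redirects=True)
        codes[label] = resp.status_code
    flag = "  <-- mismatch" if len(set(codes.values())) > 1 else ""
    print(url, codes, flag)
```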

Not only is “watch time” a major KPI for the site, but these pages also generate significant ad revenue for the company.

Needless to say, this crawl guided us towards a serious problem.

Users were able to navigate and interact with /watch/ pages fine. However, the pages returned 404 codes to Googlebot.

Thanks to the crawl, we diagnosed the problem quickly.

But the solution was complex.

Over the course of three weeks (July 23 – August 13), we homed in on this specific problem and confirmed a steep decline in SEO clicks and impressions from this set of pages.

The bug was fixed by mid-August, and we saw clicks and impressions trend back up toward normal levels.

From July 23 – August 13, FOX.com experienced a steep decline in SEO clicks and impressions from /watch/ pages – due to a bug with the allRoutes.json file.

Then, I had a thought:

If we had waited several weeks to a month before running the crawl (which is largely considered best practice), the improper response codes would have done incremental damage to the site’s SEO traffic and ad revenue.

Not only from the problem itself, but the added time to create and execute a solution.

While it would have been possible to diagnose this bug without a full website crawl, doing so would have made the task unnecessarily difficult.

The screenshot below illustrates that, without the right Google Search Console (GSC) filters in place, this particular issue hid within the Search Results report.

The GSC Coverage report was also 10 days slow to identify errors on our watch pages and only offered examples of affected URLs.
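For completeness, that page-path filter can also be applied programmatically. The sketch below pulls daily clicks and impressions for /watch/ URLs through the Search Console API via google-api-python-client; the property URL and date range are placeholders, and the OAuth credential setup is omitted.

```python
# Sketch: pull clicks/impressions for /watch/ pages by date via the
# Search Console API, so a drop is visible without manual UI filters.
# Credential setup is omitted; siteUrl and dates are placeholders.
from googleapiclient.discovery import build

def watch_page_trend(credentials, site_url="https://www.example.com/"):
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": "2020-07-01",
        "endDate": "2020-08-31",
        "dimensions": ["date"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "contains",
                "expression": "/watch/",   # restrict to the affected template
            }]
        }],
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    for row in response.get("rows", []):
        print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])
```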

via How Often Should You Perform Technical Website Crawls for SEO? via @lefelstein

 

Aug 29

Managing Algorithmic Volatility


Following the recently announced Google update, I’ve seen some people tweet things like:

  • if you are afraid of algorithm updates, you must be a crappy SEO
  • if you are technically perfect in your SEO, updates will only help you

I read those sorts of lines and cringe.

Here’s why…

Fragility

Different businesses, business models, and business structures have varying degrees of fragility.

If your business is almost entirely based on serving clients then no matter what you do there is going to be a diverse range of outcomes for clients on any major update.

Let’s say 40% of your clients are utterly unaffected by an update & of those who saw any noticeable impact there was a 2:1 ratio in your favor, with twice as many clients improving as falling.

Is that a good update? Does that work well for you?

If you do nothing other than client services as your entire business model, then that update will likely suck for you even though the net client impact was positive.

Why?

Many businesses are hurting after the Covid-19 crisis. Entire categories have been gutted & many people are looking for any reason possible to pull back on budget. Some of the clients who won big on the update might end up cutting their SEO budget figuring they had already won big and that problem was already sorted.

Some of the clients that fell hard are also likely to either cut their budget or call endlessly asking for updates and stressing the hell out of your team.

Capacity Utilization Impacts Profit Margins

Your capacity utilization depends on how high you can keep your steady-state load relative to what your load looks like at peaks. When there are big updates, management or founders can decide to work double shifts and do other things to temporarily deal with increased loads at the peak, but that can still be stressful as hell & eat away at your mental and physical health as sleep and exercise are curtailed while diet gets worse. The stress can be immense if clients want results almost immediately & the next big algorithm update which reflects your current work may not happen for another quarter.

How many clients want to be told that their investments went sour but the problem was they needed to double their investment while cashflow is tight and wait a season or two while holding on to hope?

Category-based Fragility

Businesses which appear to be diversified often are not.

  • Everything in hospitality was clipped by Covid-19.
  • 40% of small businesses across the United States have stopped making rent payments.
  • When restaurants massively close that’s going to hit Yelp’s business hard.
  • Auto sales are off sharply.

Likewise there can be other commonalities in sites which get hit during an update. Not only could it include business category, but it could also be business size, promotional strategies, etc.

Sustained profits either come from brand strength, creative differentiation, or systemization. Many prospective clients do not have the budget to build a strong brand nor the willingness to create something that is truly differentiated. That leaves systemization. Systemization can leave footprints which act as statistical outliers that can be easily neutralized.

Sharp changes can happen at any point in time.

For years Google was funding absolute garbage like Mahalo autogenerated spam and eHow with each month being a new record. It is very hard to say “we are doing it wrong” or “we need to change everything” when it works month after month after month.

Then an update happens and poof.

  • Was eHow decent back in the first Internet bubble? Sure. But it lost money.
  • Was it decent after it got bought out for a song and had the paywall dropped in favor of using the new Google AdSense program? Sure.
  • Was it decent the day Demand Media acquired it? Sure.
  • Was it decent on the day of the Demand Media IPO? Almost certainly not. But there was a lag between that day and getting penalized.

Panda Trivia

The first Panda update missed eHow because journalists were so outraged by the narrative associated with the pump-n-dump IPO. They feared their jobs going away and being displaced by that low-level garbage, particularly as the market cap of Demand Media eclipsed that of the New York Times.

Journalist coverage of the pump-n-dump IPO added credence to it from an algorithmic perspective. By constantly writing hate about eHow they made eHow look like a popular brand, generating algorithmic signals that carried the site until Google created an extension which allowed journalists and other webmasters to vote against the site they had been voting for through all their outrage coverage.

Algorithms & the Very Visible Hand

And all algorithmic channels like organic search, the Facebook news feed, or Amazon’s product pages go through large shifts across time. If they don’t, they get gamed, repetitive, and lose relevance as consumer tastes change and upstarts like TikTok emerge.

Consolidation by the Attention Merchants

Frequent product updates, cloning of upstarts, or outright acquisitions are required to maintain control of distribution:

“The startups of the Rebellion benefited tremendously from 2009 to 2012. But from 2013 on, the spoils of smartphone growth went to an entirely different group: the Empire. … A network effect to engage your users, AND preferred distribution channels to grow, AND the best resources to build products? Oh my! It’s no wonder why the Empire has captured so much smartphone value and created a dark time for the Rebellion. … Now startups are fighting for only 5% of the top spots as the Top Free Apps list is dominated by incumbents. Facebook (4 apps), Google (6 apps), and Amazon (4 apps) EACH have as many apps in the Top 100 list as all the new startups combined.”

Apple & Amazon

Emojis are popular, so those features got copied, those apps got blocked & then apps using the official emojis also got blocked from distribution. The same thing happens with products on Amazon.com in terms of getting undercut by a house brand which was funded by using the vendor’s sales data. Re-buy your brand or else.

Facebook

Before the Facebook IPO some thought buying Zynga shares was a backdoor way to invest into Facebook because gaming was such a large part of the ecosystem. That turned out to be a dumb thesis and horrible trade. At times other things trended including quizzes, videos, live videos, news, self hosted Instant Articles, etc.

Over time the general trend was edge rank of professional publishers fell as a greater share of inventory went to content from friends & advertisers. The metrics associated with the ads often overstated their contribution to sales due to bogus math and selection bias.

Internet-first publishers like CollegeHumor struggled to keep up with the changes & influencers waiting for a Facebook deal had to monetize using third parties:

“I did 1.8 billion views last year,” [Ryan Hamilton] said. “I made no money from Facebook. Not even a dollar.” … “While waiting for Facebook to invite them into a revenue-sharing program, some influencers struck deals with viral publishers such as Diply and LittleThings, which paid the creators to share links on their pages. Those publishers paid top influencers around $500 per link, often with multiple links being posted per day, according to a person who reached such deals.”

YouTube

YouTube had a Panda-like update back in 2012 to favor watch time over raw view counts. They also adjust the ranking algorithms on breaking news topics to favor large & trusted channels over conspiracy theorist content, alternative health advice, hate speech & ridiculous memes like the Tide pod challenge.

All unproven channels need to start somewhat open to gain usage, feedback & marketshare. Once they become real businesses they clamp down. Some of the clampdown can be editorial, forced by regulators, or simply anticompetitive monopolistic abuse.

Kid videos were a huge area on YouTube (perhaps still are) but that area got cleaned up after autogenerated junk videos were covered & the FTC clipped YouTube for delivering targeted ads on channels which primarily catered to children.

Dominant channels can enforce tying & bundling to wipe out competitors:

“Google’s response to the threat from AppNexus was that of a classic monopolist. They announced that YouTube would no longer allow third-party advertising technology. This was a devastating move for AppNexus and other independent ad technology companies. YouTube was (and is) the largest ad-supported video publisher, with more than 50% market share in most major markets. … Over the next few months, Google’s ad technology team went to each of our clients and told them that, regardless of how much they liked working with AppNexus, they would have to also use Google’s ad technology products to continue buying YouTube. This is the definition of bundling, and we had no recourse. Even WPP, our largest customer and largest investors, had no choice but to start using Google’s technology. AppNexus growth slowed, and we were forced to lay off 100 employees in 2016.”

Everyone Else

Every moderately large platform like eBay, Etsy, Zillow, TripAdvisor or the above sorts of companies runs into these sorts of issues with changing distribution & how they charge for distribution.

Building Anti-fragility Into Your Business Model

Growing as fast as you can until the economy craters or an algorithm clips you almost guarantees a hard fall along with an inability to deal with it.

Markets ebb and flow. And that would be true even if the above algorithmic platforms did not make large, sudden shifts.

Build Optionality Into Your Business Model

If your business primarily relies on publishing your own websites or you have a mix of a few clients and your own sites then you have a bit more optionality to your approach in dealing with updates.

Even if you only have one site and your business goes to crap maybe you at least temporarily take on a few more consulting clients or do other gig work to make ends meet.

Focus on What is Working

If you have a number of websites you can pour more resources into whatever sites reacted positively to the update while (at least temporarily) ignoring any site that was burned to a crisp.

Ignore the Dead Projects

The holding cost of many websites is close to zero unless they use proprietary and complex content management systems. Waiting out a penalty until you run out of obvious improvements on your winning sites is not a bad strategy. Plus, if you think the burned site is going to be perpetually burned to a crisp (alternative health anyone?) then you could sell links off it or generate other alternative revenue streams not directly reliant on search rankings.

Build a Cushion

If you have cash savings, maybe you go out and buy some websites or domain names from other people who are scared of the volatility or got clipped for issues you think you could easily fix.

When the tide goes out debt leverage limits your optionality. Savings gives you optionality. Having slack in your schedule also gives you optionality.

The person with a lot of experience & savings would love to see highly volatile search markets because those will wash out some of the competition, curtail investments from existing players, and make other potential competitors more hesitant to enter the market.

via Managing Algorithmic Volatility

 


Why SEO & Machine Learning Are Joining Forces

The global datasphere will grow from 33 zettabytes (each equal to a trillion gigabytes) in 2018 to 175 ZB by 2025.

In marketing, our role as the stewards of much of this data is growing, as well.

As of last year, more data is being stored in the enterprise core than in all the world’s existing endpoints, according to a report by IDC.

The great challenge for marketers and SEO professionals is activating and using that data.

In 2025, each connected person will have at least one data interaction every 18 seconds and nearly 30% of the world’s data will need real-time processing.

There’s no way human marketers can handle this processing on our own.

And more and more, as our machine-learning-enabled tools process and analyze search data, they’re learning and improving their understanding of it as they go.

Machine Learning in Search

Perhaps the best-known use of machine learning in search is Google’s own RankBrain, an algorithm that helps the search engine better understand the relevance and context of – and the relationship between – words.

Machine learning enables Google to understand the idea behind the query.

Machine learning allows the algorithm to continuously expand that understanding as new words and queries are introduced.

And as algorithms get better at determining which content best meets the needs of each searcher, we are being challenged to create content that meets those needs – and to optimize it so that relevance is clear.
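Purely as an illustration of that shift – not a description of how RankBrain actually works – the sketch below scores candidate passages against a query by embedding similarity instead of exact keyword overlap, using the sentence-transformers library. The model choice and example texts are assumptions.

```python
# Illustrative only: a small embedding-based similarity check, in the spirit
# of "matching the idea behind the query" rather than exact keywords.
# This is NOT how RankBrain works internally; model choice is an assumption.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "how do I stop my laptop overheating"
candidates = [
    "Ways to keep a notebook computer cool under heavy load",
    "Best recipes for summer barbecues",
    "Cleaning dust from laptop fans to reduce heat",
]

query_vec = model.encode(query, convert_to_tensor=True)
cand_vecs = model.encode(candidates, convert_to_tensor=True)
scores = util.cos_sim(query_vec, cand_vecs)[0]

# Passages about cooling a laptop score higher despite sharing few exact words.
for text, score in sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {text}")
```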

It’s no coincidence that as we’re experiencing this explosion in data, interest in SEO is growing, as well.

SEO & Data Science

SEO has grown to be a viable, respectable mainstream marketing career.

As I write this, there are 823,000 people on LinkedIn with “SEO” in their profile and 8,600 who specifically categorize their core service offerings as SEO.

Looking worldwide, those figures balloon to 3.2 million and 25,000, respectively.

But this is just a small sampling of the SEO industry.

There are those in SEO who identify as content marketers, digital marketing strategists or practitioners, site developers, analytics pros, consultants, advisors, and more.

Our industry is massive in size and scope, as SEO now touches nearly every aspect of the business.

So much more is being asked of SEO professionals now, thanks to that massive increase in data we have to deal with.

Yet according to our research at BrightEdge, only 31.5% of organizations have a data scientist at their company.

Working alongside machine learning offers tech-savvy SEO professionals a number of important advantages.

1. Enhanced Performance in Your Field of Specialization

Employers and clients alike are driven by results.

Do you know how to use the machine-learning-powered tools in your area of specialization?

Whether in paid search, technical SEO, content creation and optimization, link building or some other facet of SEO, those who can prove superior performance through the use of machine-learning-enabled SEO tools are increasing their own value.

2. Start Ahead & Stay Ahead

Search is a live auction. If you’re waiting to see what customers think and only then getting ready to respond, you’re already behind.

Machine-learning-powered tools enable marketers to activate real-time insights, to personalize and optimize content in the moment for each user’s individual needs.

3. Economies of Scale

You are exponentially more valuable as an SEO practitioner and leader if you can demonstrate the ability to scale your efforts.

The real power of machine learning is in its ability to convert more data than we know what to do with into actionable insights and automated actions that marketers can use to really move the needle.

To do that is hard.

For example, to build BrightEdge Autopilot we had to process over 345 petabytes of data over the course of many years to help fine-tune machine learning and automated products.

Machines aren’t angling for a promotion; they don’t harbor preconceptions or care about past mistakes.

They are entirely objective, taking opinions, personalities, and other potential bottlenecks out of the process of data evaluation.

What marketers are left with are pure, accurate data outputs that can then be activated at scale to improve search visibility and interactions with customers.

4. Room to Grow

Mastering your SEO toolset gives you more room to grow in your profession, and as a person who just so happens to love the work you do.

Machine learning, in particular, empowers us to reap insights from larger datasets and gives us access to far more intelligence than when we could only learn from what we manually analyzed ourselves.

It is your specialized insight and industry knowledge that determines which outputs are useful and how they should be applied.

Machine learning can tell you very quickly how your audience’s behaviors have changed during a major market disruption, such as our recent experience with COVID-19.

But how you interpret and respond to those changes is still very much the domain of marketing and SEO professionals.

Machine learning can help you recognize patterns in visitor behavior that point to opportunities and areas in need of improvement.
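As a simple, hypothetical example of that kind of pattern-finding, the sketch below clusters a handful of search queries (the sort you might export from Search Console) with scikit-learn; the query list and cluster count are placeholders.

```python
# Sketch: cluster search queries (e.g., exported from Search Console) to
# surface recurring themes. Query list and cluster count are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

queries = [
    "running shoes for flat feet",
    "best trail running shoes",
    "how to clean white sneakers",
    "trail shoes waterproof",
    "sneaker cleaning kit",
    "flat feet arch support",
]

vectors = TfidfVectorizer().fit_transform(queries)
labels = KMeans(n_clusters=3, random_state=0, n_init=10).fit_predict(vectors)

for cluster in sorted(set(labels)):
    print(f"Cluster {cluster}:")
    for q, l in zip(queries, labels):
        if l == cluster:
            print("  ", q)
```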

What technology cannot do is replace the creative and analytical human thought process and experience that determines the best next steps to take in response to those insights.

The people of SEO cannot be replaced. In fact, they’re more important than ever.

The tools we use may be quite sophisticated; machine-learning-enabled tools can even make decisions and implement optimizations.

However, it is the people of SEO who drive the creative and analytical processes that machines simply cannot replace:

  • Creative analysts.
  • Data scientists (who control input into machines).
  • Analytics pros.
  • Content producers.
  • Culture builders and success evangelists.
  • Expert users who facilitate sales and help customers.
  • Strategic planners working across digital channels.

And there are agile marketers who may do any combination of the above.

They are key in facilitating collaboration with other digital departments to ensure a truly holistic SEO strategy.

In their HBR article Collaborative Intelligence: Humans and AI Are Joining Forces, H. James Wilson and Paul R. Daugherty explain the three key roles humans undertake in every interaction with machine-learning-powered technology:

  • Train: We need to teach the machine to perform certain tasks.
  • Explain: We must make sense of the outcome of the task, especially if it is unexpected or counterintuitive.
  • Sustain: It is up to us to ensure that the technology is used logically and responsibly.

Applying this lens to our SEO tech, we see these three tenets hold true.

We need to decide which SEO tasks to intelligently automate and give our tools the proper input.

We need to take the output and make sense of it, focusing only on those insights with business-building potential.

We are responsible for ensuring that searcher privacy is protected, that the value of the technology outweighs the cost, and that it is otherwise being made good use of.

You can build your value as an SEO and learn to work more effectively with machine-learning-powered tech by building these skills:

  • Data proficiency: According to Stanford researchers, the share of AI jobs grew from 0.3% in 2012 to 0.8% of total jobs posted in the U.S. in 2019. AI labor demand is growing, especially in high-tech services and the manufacturing sector.
  • Communication: As the arbiter of so much customer data, it is critical that we communicate key insights and value in ways other department heads and decision-makers can understand.
  • Agility: More than a trait or quality, agility is a skill developed through constant experimentation.

Embracing machine learning and automation means building synergy with human creativity and skills.

It can make us more creative and effective by uncovering SEO insights and patterns we would never have recognized otherwise.

It can help us discover new topics, identify content gaps, optimize for specific types of queries and results, and more.

What’s more, it can save vital time on tasks that are time-consuming, repetitive, and laborious, so we can scale performance.

And as that happens, we develop new skills and progress as part of a symbiotic relationship between people and technology.

via Why SEO & Machine Learning Are Joining Forces

 


Have you ever wondered, “Why doesn’t anyone visit my website or social media content?” Not being indexed in Google could be the main reason.

When people search for your content, it must be indexed in order to “show up”. There are two kinds of content on the web: content that just exists, and content that exists and is indexed. You might ask yourself: what is indexing, and how do I do it?

According to Wikipedia, indexing is:

Web indexing (or Internet indexing) refers to various methods for indexing the contents of a website or of the Internet as a whole. Individual websites or intranets may use a back-of-the-book index, while search engines usually use keywords and metadata to provide a more useful vocabulary for Internet or onsite searching.

Think of the indexing in a public library. Would you ever be able to find a specific book if it weren’t for the good ole Dewey Decimal Classification? The web works the same way: content must be indexed in order for Google to find it. It would be pretty hard to find a book on the library shelf if it wasn’t where it was supposed to be.

Your new weekly blog post has just been created – it took you all week!

You did all the hard work:

  • researched to make it extremely interesting
  • researched to make it informative
  • inserted graphics
  • embedded a video from YouTube
  • researched and inserted outbound links
  • added inbound links, including a great post you created before that was applicable
  • kept it keyword focused.

Plus you used your Facebook account to distribute it to your friends and followers. Why is it that no one seems to have read it? You thought it would show up in Google right away. It’s likely because it isn’t indexed.

How Can I Tell If My Site Is Indexed?

Open a new tab in your browser

Type in: site:www.YourWebsiteName.com (all one string, with no space after the colon) – just switch in your own website name. For Butterfly Networking, I would use my own domain the same way.

 

 

Check the results: does your site show up? If yes – great, you are definitely doing something right. If no, more work is required.

In the case of Google, it sends out Googlebots (AKA spiders) to crawl the web searching for content. All of the search engines send out these spiders; however, it appears Google sends out more, and they “crawl” deeper than the others. Yes, these spiders do crawl Facebook and other social media sites. However, that content has to be “crawled” just like other web pages before it can be indexed. Google has to crawl your content before it “decides” to index it. Not all content is indexed. Computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. https://www.google.com/search/howsearchworks/crawling-indexing/
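If you’d like a quick programmatic spot-check alongside the “site:” search, the sketch below (my example, not an official Google tool) looks for the most common reasons a page won’t be indexed even after it has been crawled: a non-200 status code, an X-Robots-Tag noindex header, or a meta robots noindex tag. It deliberately ignores robots.txt rules and canonical tags.

```python
# Sketch: a quick indexability check for a single URL. It looks only at the
# most common blockers (status code, noindex directives); robots.txt rules
# and canonical tags are out of scope here.
import requests
from bs4 import BeautifulSoup

def indexability_report(url):
    resp = requests.get(url, timeout=10)
    report = {"url": url, "status": resp.status_code, "blockers": []}
    if resp.status_code != 200:
        report["blockers"].append(f"non-200 status ({resp.status_code})")
    # X-Robots-Tag header can block indexing at the HTTP level.
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        report["blockers"].append("X-Robots-Tag: noindex")
    # <meta name="robots" content="noindex"> blocks indexing at the page level.
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    if robots_meta and "noindex" in robots_meta.get("content", "").lower():
        report["blockers"].append("meta robots noindex")
    return report

print(indexability_report("https://www.example.com/blog/my-new-post/"))  # placeholder URL
```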

How Do I Get my Site Indexed?

Within the Google suite of webmaster tools, you can submit your site or even an individual post to Google’s list to be crawled. When I first heard about this process I thought, OMG, that is so complicated. It’s actually not that bad. There are a number of steps, the first being signing into your Google account. If you don’t have a Google account, you can create one for free. Once signed in, you can sign up for Google Search Console; they have directions on how to complete this process.

  1. Go to Google Search Console (formerly Google Webmaster Tools).
  2. Sign into your Google account.
  3. Click the red button to add your website.
  4. Type your website URL in the box.
  5. Click the blue button to continue.
  6. Make sure you enter your complete URL.

If you want to make sure all of the pages within your website are indexed you can perform the same “site:” search as above.

Once complete repeat the same process for Bing and the other search engines.

Rather not do this over and over again?  I can do it for you if you would prefer.

If you have a WordPress blog you can also create a sitemap and submit that, but that’s a different blog post. 🙂
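As an aside, sitemap submission can also be scripted once your property is verified in Search Console. Here’s a hedged sketch using the Search Console API through google-api-python-client; the property URL, the sitemap location, and the credential setup are all placeholders.

```python
# Sketch: submit an XML sitemap through the Search Console API instead of
# clicking through the UI. The property URL, sitemap path, and credential
# setup are placeholders; the property must already be verified.
from googleapiclient.discovery import build

def submit_sitemap(credentials,
                   site_url="https://www.example.com/",
                   sitemap_url="https://www.example.com/sitemap.xml"):
    service = build("searchconsole", "v1", credentials=credentials)
    service.sitemaps().submit(siteUrl=site_url, feedpath=sitemap_url).execute()
    print("Submitted", sitemap_url, "for", site_url)
```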

