Archive for Google +

Google Webmaster Blogs Have Moved

As you know, Google renamed Google Webmaster Central to Google Search Central. Along with that, Google told us they were moving the blogs to a new location. That move happened yesterday (they actually did some weird things that broke Wednesday night), and it was completed on Thursday.

The new blog location is developers.google.com/search/blog. If you go to the old address, https://webmasters.googleblog.com/, it does some sort of weird redirect. When I checked, it was not sending a redirect status code in the response headers, but maybe that will be fixed by the time this is published.
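If you want to check this yourself, a quick sketch (using Python's requests library, not anything from Google) is to fetch the old URL without following redirects and look at the status code and Location header:

```python
import requests

# Fetch the old blog URL but don't follow redirects, so we can see
# exactly what the server sends back.
resp = requests.get("https://webmasters.googleblog.com/", allow_redirects=False)
print(resp.status_code)               # 301/302/307/308 would indicate a proper server-side redirect
print(resp.headers.get("Location"))   # where the redirect points, if any
```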

In any event, here is what the blog now looks like:

[screenshot: the new blog homepage]

I like the internal pages; the sidebar shows navigation to the old posts by date:

[screenshot: an internal post page with date-based archive navigation in the sidebar]

 

via Google Webmaster Blogs Moved Locations

 

Categories : Google +
Comments (0)

How to Read Google Algorithm Updates

Links = Rank

Old Google (pre-Panda) largely boiled down to the following: links = rank.

Once you had enough links to a site you could literally pour content into a site like water and have the domain’s aggregate link authority help anything on that site rank well quickly.

As hyped and important as PageRank was, having a diverse range of linking domains and keyword-focused anchor text mattered just as much.

Brand = Rank

After Vince and then Panda, a site’s brand awareness (or, rather, the ranking signals that might best simulate it) was folded into the ability to rank well.

Panda considered factors beyond links & when it first rolled out it would clip anything on a particular domain or subdomain. Some sites like HubPages shifted their content onto per-user subdomains. And some aggressive spammers would rotate their entire site onto different subdomains repeatedly each time a Panda update happened. That allowed those sites to immediately recover from the first couple of Panda updates, but eventually Google closed off that loophole.

Any signal which gets relied on eventually gets abused intentionally or unintentionally. And over time it leads to a “sameness” of the result set unless other signals are used:

Google is absolute garbage for searching anything related to a product. If I’m trying to learn something invariably I am required to search another source like Reddit through Google. For example, I became introduced to the concept of weighted blankets and was intrigued. So I Google “why use a weighted blanket” and “weighted blanket benefits”. Just by virtue of the word “weighted blanket” being in the search I got pages and pages of nothing but ads trying to sell them, and zero meaningful discourse on why I would use one

Getting More Granular

Over time, as Google refined Panda, broad-based sites outside of the news vertical often fell on tough times unless they were dedicated to some specific media format or had strong user engagement metrics like a social network site. That is a big part of why the New York Times sold About.com for less than they paid for it & after IAC bought it they broke it down into a variety of sites like: Verywell (health), the Spruce (home decor), the Balance (personal finance), Lifewire (technology), Tripsavvy (travel) and ThoughtCo (education & self-improvement).

Penguin further clipped aggressive anchor text built on low quality links. When the Penguin update rolled out Google also rolled out an on-page spam classifier to further obfuscate the update. And the Penguin update was sandwiched by Panda updates on either side, making it hard for people to reverse engineer any signal out of weekly winners and losers lists from services that aggregate massive amounts of keyword rank tracking data.

So much of the link graph has been decimated that Google reversed their stance on nofollow: as of March 1st of this year they treat it as a hint rather than a directive for ranking purposes. Many mainstream media websites were overusing nofollow or not citing sources at all, so this additional layer of obfuscation on Google’s part will allow them to find more signal in that noise.

May 4, 2020 Algo Update

On May 4th Google rolled out another major core update.

Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G — Google SearchLiaison (@searchliaison) May 4, 2020

I saw some sites which had their rankings suppressed for years see a big jump. But many things changed at once.

Wedge Issues

On some political search queries which are primarily classified as news related, Google is trying to limit political blowback by showing official sites and data scraped from official sites instead of putting news front & center.

“Google’s pretty much made it explicit that they’re not going to propagate news sites when it comes to election related queries and you scroll and you get a giant election widget in your phone and it shows you all the different data on the primary results and then you go down, you find Wikipedia, you find other like historical references, and before you even get to a single news article, it’s pretty crazy how Google’s changed the way that the SERP is intended.”

That change reflects the permanent change to the news media ecosystem brought on by the web.

The Internet commoditized the distribution of facts. The “news” media responded by pivoting wholesale into opinions and entertainment. — Naval (@naval) May 26, 2016

YMYL

A blog post by Lily Ray from Path Interactive used Sistrix data to show many of the sites which saw high volatility were in the healthcare vertical & other your money, your life (YMYL) categories.

Aggressive Monetization

One of the more interesting pieces of feedback on the update came from Rank Ranger, where they looked at particular pages that jumped or fell hard on the update. They noticed that sites which put ads or ad-like content front and center may have seen sharp falls on some of those big-money pages that were aggressively monetized:

Seeing this all but cements the notion (in my mind at least) that Google did not want content unrelated to the main purpose of the page to appear above the fold to the exclusion of the page’s main content! Now for the second wrinkle in my theory…. A lot of the pages being swapped out for new ones did not use the above-indicated format where a series of “navigation boxes” dominated the page above the fold.

The above shift had a big impact on some sites which are worth serious money. Intuit paid over $7 billion to acquire Credit Karma, but their credit card affiliate pages recently slid hard.

Credit Karma lost 40% traffic from May core update. That’s insane, they do major TV ads and likely pay millions in SEO expenses. Think about that folks. Your site isn’t safe. Google changes what they want radically with every update, while telling us nothing! — SEOwner (@tehseowner) May 14, 2020

The above sort of shift reflects Google getting more granular with their algorithms. Early Panda was all or nothing. Then it started to have different levels of impact throughout different portions of a site.

Brand was sort of a band aid or a rising tide that lifted all (branded) boats. Now we are seeing Google get more granular with their algorithms where a strong brand might not be enough if they view the monetization as being excessive. That same focus on page layout can have a more adverse impact on small niche websites.

One of my old legacy clients had a site which was primarily monetized by the Amazon affiliate program. About a month ago Amazon chopped affiliate commissions in half & then the aggressive ad placement caused search traffic to the site to get chopped in half when rankings slid on this update.

Their site had been trending down over the past couple of years, largely due to neglect, as it was always a small side project. They improved some of the content about a month ago and that led to a bit of a boost, but then this update came. As long as that ad placement doesn’t change, the declines are likely to continue.

They just recently removed that ad unit, but that meant another drop in income, as until there is another big algo update they’re likely to stay at around half their search traffic. So now they have a half of a half of a half. Good thing the site did not have any full-time employees or they’d be among the millions of newly unemployed. That experience really reflects how websites can be almost like debt-levered companies in terms of going under virtually overnight. Who can have revenue slide around 88% and then increase investment in the property using the remaining 12% while they wait a quarter of a year or more for the site to be rescored?
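To make the “half of a half of a half” arithmetic explicit, here is a tiny back-of-the-envelope calculation (my own illustration, modeling each event as a straight 50% cut):

```python
# Rough sanity check of the compounding declines described above (illustrative numbers only).
remaining = 1.0
for event in ("affiliate commissions halved", "search traffic halved", "ad unit removed"):
    remaining *= 0.5   # each event roughly halves what is left
print(f"{remaining:.1%} of the original revenue remains")  # 12.5%, i.e. roughly an 88% slide
```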

“If you have been negatively impacted by a core update, you (mostly) cannot see recovery from that until another core update. In addition, you will only see recovery if you significantly improve the site over the long-term. If you haven’t done enough to improve the site overall, you might have to wait several updates to see an increase as you keep improving the site. And since core updates are typically separated by 3-4 months, that means you might need to wait a while.”

Almost nobody can afford to do that unless the site is just a side project.

Google could choose to run major updates more frequently, allowing sites to recover more quickly, but they gain economic benefit in defunding SEO investments & adding opportunity cost to aggressive SEO strategies by ensuring ranking declines on major updates last a season or more.

Choosing a Strategy vs Letting Things Come at You

They probably should have lowered their ad density when they did those other upgrades. If they had they likely would have seen rankings at worst flat or likely up as some other competing sites fell. Instead they are rolling with a half of a half of a half on the revenue front. Glenn Gabe preaches the importance of fixing all the problems you can find rather than just fixing one or two things and hoping it is enough. If you have a site which is on the edge you sort of have to consider the trade offs between various approaches to monetization.

  • monetize it lightly and hope the site does well for many years
  • monetize it slightly aggressively while using the extra income to further improve the site elsewhere and ensure you have enough to get by any lean months
  • aggressively monetize the site shortly after a major ranking update if it was previously lightly monetized & then hope to sell it off a month or two later, before the next major algorithm update clips it again

Outcomes will depend partly on timing and luck, but consciously choosing a strategy is likely to yield better returns than doing a bit of mix-n-match while having your head buried in the sand.

Reading the Algo Updates

You can spend 50 or 100 hours reading blog posts about the update and learn precisely nothing in the process if you do not know which authors are bullshitting and which authors are writing about the correct signals.

But how do you know who knows what they are talking about?

It is more than a bit tricky as the people who know the most often do not have any economic advantage in writing specifics about the update. If you primarily monetize your own websites, then the ignorance of the broader market is a big part of your competitive advantage.

Making things even trickier, the less you know the more likely Google is to trust you with carrying their official messaging. If you syndicate their messaging without questioning it, you get a treat – more exclusives. If you question their messaging in a way that undermines their goals, you quickly become persona non grata – something CNET learned many years ago when they published Eric Schmidt’s address.

It would be unlikely you’d see the following sort of Tweet from say Blue Hat SEO or Fantomaster or such.

I asked Gary about E-A-T. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that’s good.

He recommended reading the sections in the QRG on E-A-T as it outlines things well. @methode #Pubcon — Marie Haynes (@Marie_Haynes) February 21, 2018

To be able to read the algorithms well you have to have some market sectors and keyword groups you know well. Passively collecting an archive of historical data makes the big changes stand out quickly.

Everyone who depends on SEO to make a living should subscribe to an online rank tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally it makes sense to use a set of web proxies and run the queries slowly through each so you don’t get blocked.
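For illustration only, here is a rough Python sketch of that kind of local tracking: rotate through a pool of proxies and pause between queries. The proxy addresses are placeholders, the result-link selector is an assumption (Google’s result markup changes often), and a real setup would also need parsing, storage, and error handling:

```python
import time
import requests
from bs4 import BeautifulSoup

KEYWORDS = ["weighted blanket benefits", "credit card offers"]   # keywords you care about
PROXIES = [
    {"https": "http://proxy1.example.com:8080"},   # placeholder proxies
    {"https": "http://proxy2.example.com:8080"},
]

for i, keyword in enumerate(KEYWORDS):
    proxy = PROXIES[i % len(PROXIES)]              # simple round-robin proxy rotation
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keyword, "num": 100},
        proxies=proxy,
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=30,
    )
    soup = BeautifulSoup(resp.text, "html.parser")
    # The selector below is an assumption; adjust it to whatever the result markup looks like today.
    urls = [a["href"] for a in soup.select("a[href^='http']")]
    print(keyword, urls[:10])
    time.sleep(30)                                 # run queries slowly so you don't get blocked
```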

You should track a diverse range of keywords to get a true sense of the algorithmic changes:

  • a couple different industries
  • a couple different geographic markets (or at least some local-intent vs national-intent terms within a country)
  • some head, midtail and longtail keywords
  • sites of different size, age & brand awareness within a particular market

Some tools make it easy to quickly add or remove graphing of anything which moved big and is in the top 50 or 100 results, which can help you quickly find outliers. And some tools also make it easy to compare their rankings over time. As updates develop you’ll often see multiple sites making big moves at the same time & if you know a lot about the keyword, the market & the sites you can get a good idea of what might have been likely to change to cause those shifts.
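As a rough sketch of the outlier-spotting those tools automate, assuming you export tracked rankings to a CSV (the file name and column names here are my own):

```python
import pandas as pd

# Assumed layout: one row per keyword+URL, with positions before and after the update.
df = pd.read_csv("rankings.csv")                      # columns: keyword, url, rank_before, rank_after
df["delta"] = df["rank_before"] - df["rank_after"]    # positive = moved up

# Flag anything now in the top 50 that moved 10+ positions either way.
movers = df[(df["rank_after"] <= 50) & (df["delta"].abs() >= 10)]
print(movers.sort_values("delta", ascending=False))
```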

Once you see someone mention outliers that most people miss, and those outliers align with what you see in your own data set, your level of confidence increases and you can spend more time trying to unravel what signals changed.

I’ve read influential industry writers mention that links were heavily discounted on this update. I have also read Tweets like this one which could potentially indicate the opposite.

Check out https://t.co/1GhD2U01ch. Up even more than Pinterest and ranking for some real freaky shit. — Paul Macnamara (@TheRealpmac) May 12, 2020

If I had little to no data, I wouldn’t be able to get any signal out of that range of opinions. I’d sort of be stuck at “who knows.”

By having my own data to track, I can quickly figure out which message is more in line with what I saw in my subset of data & form a more solid hypothesis.

No Single Smoking Gun

As Glenn Gabe is fond of saying, sites that tank usually have multiple major issues.

Google rolls out major updates infrequently enough that they can sandwich a couple different aspects into major updates at the same time in order to make it harder to reverse engineer updates. So it does help to read widely with an open mind and imagine what signal shifts could cause the sorts of ranking shifts you are seeing.

Sometimes site level data is more than enough to figure out what changed, but as the above Credit Karma example showed sometimes you need to get far more granular and look at page-level data to form a solid hypothesis.

As the World Changes, the Web Also Changes

About 15 years ago online dating was seen as a weird niche for recluses who perhaps typically repulsed real people in person. Now there are all sorts of niche specialty dating sites including a variety of DTF type apps. What was once weird & absurd had over time become normal.

The COVID-19 scare is going to cause lasting shifts in consumer behavior that accelerate the movement of commerce online. A decade of change will happen in a year or two across many markets.

Telemedicine will grow quickly. Facebook is adding commerce features directly onto their platform through a partnership with Shopify. Spotify is spending big money to buy exclusive rights to distribute widely followed podcasters like Joe Rogan. Uber recently offered to acquire GrubHub. Google and Apple will continue adding financing features to their mobile devices. Movie theaters have lost much of their appeal.

Tons of offline “value” businesses ended up having no value after months of revenue disappearing while large outstanding debts accumulated interest. There is a belief that some of those brands will have strong latent brand value that carries over online, but if they were weak even when offline stores acting like interactive billboards subsidized consumer awareness of their brands, then as those stores close, the consumer awareness & loyalty from in-person interactions will also dry up. A shell of a company rebuilt around the Toys R Us brand is unlikely to beat out Amazon’s parallel offering or a company which still runs stores offline.

Big box retailers like Target & Walmart are growing their online sales at hundreds of percent year over year.

There will be waves of bankruptcies, dramatic shifts in commercial real estate prices (already reflected in plunging REIT prices), and more people working remotely (shifting residential real estate demand from the urban core back out into suburbs).

People who work remote are easier to hire and easier to fire. Those who keep leveling up their skills will eventually get rewarded while those who don’t will rotate jobs every year or two. The lack of stability will increase demand for education, though much of that incremental demand will be around new technologies and specific sectors – certificates or informal training programs instead of degrees.

More and more activities will become normal online activities.

The University of California has about half a million students & in the fall semester they are going to try to hold most of their classes online. How much usage data does Google gain as thousands of institutions put more and more of their infrastructure and services online?

Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.

It’s like only being able to sell your competitor’s product for a year. — Naval (@naval) May 6, 2020

A lot of B & C level schools are going to go under as the like-vs-like comparison gets easier. Back when I ran a membership site here a college paid us to have students gain access to our membership area of the site. As online education gets normalized many unofficial trade-related sites will look more economically attractive on a relative basis.

If core institutions of the state deliver most of their services online, then other companies can be expected to follow. When big cities publish lists of crimes they will not respond to during economic downturns they are effectively subsidizing more crime. That in turn makes moving to somewhere a bit more rural & cheaper make sense, particularly when you no longer need to live near your employer.

via How to Read Google Algorithm Updates

 

Categories : Google +
Comments (0)

Why is it important to be “found by Google”?

Google is pretty much the king when it comes to search. Here’s an amazing fact for you: “61,109 Google searches in 1 second.” That really is an amazing stat, and you definitely want your website and/or blog to show up in some of those searches. Even if you captured just 1/10,000 of them, it would be a HUGE pile of traffic!
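As a quick back-of-the-envelope check of what that could mean (my own calculation, assuming the quoted per-second figure held around the clock):

```python
# Rough check of the "1/10,000 of all searches" idea using the figure quoted above.
searches_per_second = 61_109
daily_searches = searches_per_second * 60 * 60 * 24      # about 5.28 billion searches per day
one_in_ten_thousand = daily_searches / 10_000             # about 528,000 searches per day
print(f"{daily_searches:,} searches/day; 1/10,000 of that is {one_in_ten_thousand:,.0f}")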

Imagine having all those eyes on your content; not only would it help your website ranking, but your sales and bottom line too.

Google has extensive reach, and when people search for your product or service you definitely want to be found. Let everyone else’s site be one of those “billboards in the middle of the desert” that no one sees!

People are becoming more and more dependent on their computers, tablets, and/or phones. How many people do you know who don’t go anywhere, even to bed, without their phone? I’ll bet that is quite a few people. Many people today have their phone chargers right beside the bed or even sleep with the phone beside them.

Google focuses on the end user (the searcher); it’s important to Google that people find what they are looking for. As an end user I am very happy they do this. As a business owner with products and services to sell it definitely makes more work for me, however it is definitely worth it in the long run! Think about all of the free products that Google has made available; a great example for business owners is Google Hangouts. They do their very best to keep us searchers happy and coming back to Google again and again.

It takes time, money and effort to get Google to “find” your site and content. Being found on Google is a “know, like and trust” factor; people trust the results they receive from Google. When you are on the first page of the search results you are seen by many as the authoritative source. Wouldn’t it be awesome to be seen as the authority in your niche? There are a number of factors that influence where your website shows up in the search results; here are the main ones:

  • Quality content posted on a regular basis – this provides your visitors with value and when people find your content of value they will share it more often.
  • Include other sources using hyperlinks – this will increase your credibility and any forward links you include will be noticed and rewarded by Google.
  • Are you popular, are people linking to you? – Just asking people to link to your site likely won’t be enough, you have to give them a reason to link to you.
  • The way your website is set up – Be sure to have a business plan and a marketing strategy before you start. This will be extremely valuable for creating website traffic and converting visitors to customers.
  • Website name
  • Age of your website (even purchasing your hosting for more than one year makes a difference)

One of the super easy and quick ways to be found is to make sure you are using tags in all of the pages and posts on your website.  What are tags you may ask…

In information systems, a tag is a keyword or term assigned to a piece of information (such as an Internet bookmark, digital image, database record, or computer file). This kind of metadata helps describe an item and allows it to be found again by browsing or searching. Source Wikipedia

Categories are general; this is where you implement your long-term strategy. An example is “Content Creation Tools” vs a specific tool name (e.g. XYZ Best Content Finder). Content creation tools come and go. There are times when the creator realizes it will be too expensive to keep up with the latest Google changes and decides not to update their product. As a result, a category named “XYZ Best Content Finder” would become moot, dead wood on your website.

When creating your categories think about your long term strategy and what you want to be found for in the next five years. Use your big broad topics that are not likely to change in 5 years.

Tags are specific, and they play a huge role in helping your site rank. An example of a specific tag would be “XYZ Best Content Finder”.

The benefits of using tags;

  • fully indexable,
  • searchable
  • rankable
  • improve the visibility of your website, increase traffic and boost your brand’s revenue
  • improve usability for all the visitors who come to your website
  • easier for search engines to know what your post is all about
  • improves visitor experience (remember, this is what Google is looking for)
  • boosts your website ranking

However, you will have to create the right content, add images and videos & then build hundreds of backlinks to rank your site higher. Finding the right tags can be even more of a challenge!

I found this great tool that will help you tag your posts: a new WordPress plugin that finds all the right keywords and tags to add to your posts so you don’t waste hours trying to figure it out yourself. The work of tagging your blog posts can be done in seconds rather than hours.
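For readers who like to see the mechanics, here is a minimal sketch of assigning tags to a post through the standard WordPress REST API (this is not how the plugin above works, just the general mechanism; the site URL, credentials, post ID and tag names are placeholders):

```python
import requests

SITE = "https://example.com"                       # placeholder site URL
AUTH = ("admin", "application-password-here")      # placeholder WordPress application password

def tag_id(name):
    """Return the ID of a tag with this name, creating it if it doesn't exist yet."""
    found = requests.get(f"{SITE}/wp-json/wp/v2/tags",
                         params={"search": name}, auth=AUTH).json()
    for tag in found:
        if tag["name"].lower() == name.lower():
            return tag["id"]
    created = requests.post(f"{SITE}/wp-json/wp/v2/tags",
                            json={"name": name}, auth=AUTH)
    return created.json()["id"]

post_id = 123                                      # placeholder post ID
tag_ids = [tag_id(n) for n in ["content creation tools", "google search"]]

# Attach the tags to the post.
requests.post(f"{SITE}/wp-json/wp/v2/posts/{post_id}",
              json={"tags": tag_ids}, auth=AUTH)
```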

I have a bonus for you: buy WP Tag Machine today, forward your purchase receipt to Carla@ButterflyNetworking.com and I will have 10 of your blog posts indexed by Google in 48 hours or less. Using both of these tools will really help skyrocket your ability to be found by Google! 🙂

Categories : Google +
Comments (0)

Take Advantage of the Circles on Google+

Google+ has joined the ranks of social media. Have you had a look at it yet? I find that it fits in between Facebook and Twitter. Learn more about Google Plus circles.

“We hope you enjoyed this video and learned something new. Please join in the conversation by posting your comments right here on the blog.”

Google Plus is this great new medium where we can connect, interact, have fun, meet and greet. It’s an excellent platform for getting to know people and staying connected with them.

How can you get more from the circle concept that they created on Google+?

A circle, if you’re familiar with Facebook or Twitter, is like a Facebook friends list or perhaps one of your private Twitter lists. The concept is the same: you’re grouping like-minded people, or people who have something in common, into a circle, such as all your family, all your friends, all your business associates, or all the people you met at a certain conference. That is the basic way to use Google+ circles.

Add to Google+ circle


You can organize by category or relationship or both. Whatever works for you; it’s very flexible.

Categories : Articles, Google +
Comments (22)
