Author Archive

Augmented reality (AR) in Search lets you bring 3D objects and animals into your space so you can turn your living room into a virtual zoo, explore the Apollo 11 spacecraft up close, or take a picture with Santa. I love seeing how much fun families are having with this experience at home. AR in Search can also help you discover and explore new concepts. Here are a few new ways you can use AR (and a little imagination) to learn at home.

Take a virtual trip through the human body

It’s one thing to read about the human heart, and another to see one up close to understand how it pumps blood to provide oxygen. We’re partnering with BioDigital so that you can explore 11 human body systems with AR in Search on mobile. Search for circulatory system and tap “View in 3D” to see a heart up close, or look up skeletal system to trace the bones in the human body and see how they connect. Read labels on each body part to learn more about it, or view life-size images in AR to better understand its scale.



Get a magnified view of our microscopic world

Seeing is often understanding. But tiny organisms, like cells, are hard to visualize unless you can magnify them to understand what’s inside. We’ve partnered with Visible Body to create AR models of animal, plant and bacteria cells, including some of their key organelles. Search for animal cell and zoom into its nucleus to see how it stores DNA, or search for mitochondria to learn what’s inside them. With AR, you can bring a 3D cell into your space to rotate it, zoom in and view details about its different components.



Turn your home into a museum

Many museums may be closed right now, but with Google Arts & Culture and institutions like the Smithsonian National Air and Space Museum, you can turn your home into one using AR. Search for Apollo 11 on your phone to see its command module in 3D, look up Neil Armstrong to get a life-size view of his spacesuit, or step inside the Chauvet Cave to get an up-close look at some of the world’s oldest known cave paintings, which are usually closed off to the public.


Easily explore, record and share

To help you quickly explore related content, we’re rolling out a new carousel format on Android, as well as a recording option to share social-worthy AR videos with friends and family.


Explore content with the carousel format on Android

We hope that you enjoy exploring all of these 3D and AR experiences on Google. Tag us on social with #Google3D and let us know how you’re using AR to learn and explore new things in your home. We can’t wait to hear where your imagination takes you next!

via Make at-home learning more fun with 3D and AR in Search

Travel back in time with AR dinosaurs in Search

A behind-the-scenes look at how “Jurassic World” AR dinosaurs are made

Using technology from Ludia’s “Jurassic World Alive” game, these AR dinosaurs are some of the most realistic models out there. Check out this video to see how an AR Brachiosaurus is made, including 3D modeling, texturing and animation.

“To create the 3D dinosaurs, our concept artists first did preliminary research to discover information about each creature,” says Camilo Sanin, Ludia’s Lead on Character Creations. “Not only did we draw research from various forms of literature, our artists also worked with paleontologists and the ‘Jurassic World’ team to make the assets as accurate and realistic as possible. Even the smallest of details, such as irregularities of skin color and patterns, are important.” 

Unlike some of Google’s AR animals, like a dog or tiger, dinosaurs pose a new technical challenge: their massive size. The new auto-scale feature on Android can now automatically calculate the distance between your phone and a surface in your space and resize the dinosaur so it fits on your phone screen. If you tap “View actual size,” AR tracking technology automatically repositions the dinosaur in your space to make room for it.

via Travel back in time with AR dinosaurs in Search





How to Read Google Algorithm Updates

Links = Rank

Old Google (pre-Panda) was to some degree largely the following: links = rank.

Once you had enough links to a site you could literally pour content into a site like water and have the domain’s aggregate link authority help anything on that site rank well quickly.

As much as PageRank was hyped & important, having a diverse range of linking domains and keyword-focused anchor text was just as important.

Brand = Rank

After Vince and then Panda, a site’s brand awareness (or, rather, the ranking signals that might best simulate it) was folded into the ability to rank well.

Panda considered factors beyond links & when it first rolled out it would clip anything on a particular domain or subdomain. Some sites like HubPages shifted their user-generated content onto per-user subdomains. And some aggressive spammers would rotate their entire site onto different subdomains repeatedly each time a Panda update happened. That allowed those sites to immediately recover from the first couple of Panda updates, but eventually Google closed off that loophole.

Any signal which gets relied on eventually gets abused intentionally or unintentionally. And over time it leads to a “sameness” of the result set unless other signals are used:

Google is absolute garbage for searching anything related to a product. If I’m trying to learn something invariably I am required to search another source like Reddit through Google. For example, I became introduced to the concept of weighted blankets and was intrigued. So I Google “why use a weighted blanket” and “weighted blanket benefits”. Just by virtue of the word “weighted blanket” being in the search I got pages and pages of nothing but ads trying to sell them, and zero meaningful discourse on why I would use one

Getting More Granular

Over time, as Google got more refined with Panda, broad-based sites outside of the news vertical often fell on tough times unless they were dedicated to some specific media format or had a lot of user engagement, like a strong social network site. That is a big part of why the New York Times sold About.com for less than they paid for it & after IAC bought it they broke it down into a variety of niche sites: Verywell (health), The Spruce (home decor), The Balance (personal finance), Lifewire (technology), TripSavvy (travel) and ThoughtCo (education & self-improvement).

Penguin further clipped aggressive anchor text built on low quality links. When the Penguin update rolled out Google also rolled out an on-page spam classifier to further obfuscate the update. And the Penguin update was sandwiched by Panda updates on either side, making it hard for people to reverse engineer any signal out of weekly winners and losers lists from services that aggregate massive amounts of keyword rank tracking data.

So much of the link graph has been decimated that Google reversed their stance on nofollow: as of March 1st of this year they treat it as a hint rather than a directive for ranking purposes. Many mainstream media websites were overusing nofollow or not citing sources at all, so this additional layer of obfuscation on Google’s part will allow them to find more signal in that noise.

May 4, 2020 Algo Update

On May 4th Google rolled out another major core update.

Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that:— Google SearchLiaison (@searchliaison) May 4, 2020

I saw some sites which had their rankings suppressed for years see a big jump. But many things changed at once.

Wedge Issues

On some political search queries which are primarily classified as news-related, Google is trying to limit political blowback by showing official sites, and data scraped from official sites, instead of putting news front & center.

“Google’s pretty much made it explicit that they’re not going to propagate news sites when it comes to election related queries and you scroll and you get a giant election widget in your phone and it shows you all the different data on the primary results and then you go down, you find Wikipedia, you find other like historical references, and before you even get to a single news article, it’s pretty crazy how Google’s changed the way that the SERP is intended.”

That change reflects the permanent change to the news media ecosystem brought on by the web.

The Internet commoditized the distribution of facts. The “news” media responded by pivoting wholesale into opinions and entertainment.— Naval (@naval) May 26, 2016


A blog post by Lily Ray from Path Interactive used Sistrix data to show many of the sites which saw high volatility were in the healthcare vertical & other Your Money or Your Life (YMYL) categories.

Aggressive Monetization

One of the more interesting pieces of feedback on the update was from Rank Ranger, where they looked at particular pages that jumped or fell hard on the update. They noticed sites that put ads or ad-like content front and center may have seen sharp falls on some of those big money pages which were aggressively monetized:

Seeing this all but cements the notion (in my mind at least) that Google did not want content unrelated to the main purpose of the page to appear above the fold to the exclusion of the page’s main content! Now for the second wrinkle in my theory…. A lot of the pages being swapped out for new ones did not use the above-indicated format where a series of “navigation boxes” dominated the page above the fold.

The above shift had a big impact on some sites which are worth serious money. Intuit paid over $7 billion to acquire Credit Karma, but their credit card affiliate pages recently slid hard.

Credit Karma lost 40% traffic from May core update. That’s insane, they do major TV ads and likely pay millions in SEO expenses. Think about that folks. Your site isn’t safe. Google changes what they want radically with every update, while telling us nothing!— SEOwner (@tehseowner) May 14, 2020

The above sort of shift reflects Google getting more granular with their algorithms. Early Panda was all or nothing. Then it started to have different levels of impact throughout different portions of a site.

Brand was sort of a band-aid, or a rising tide that lifted all (branded) boats. Now we are seeing Google get more granular with their algorithms, where a strong brand might not be enough if they view the monetization as being excessive. That same focus on page layout can have an even more adverse impact on small niche websites.

One of my old legacy clients had a site which was primarily monetized by the Amazon affiliate program. About a month ago Amazon chopped affiliate commissions in half & then the aggressive ad placement caused search traffic to the site to get chopped in half when rankings slid on this update.

Their site has been trending down over the past couple years, largely due to neglect, as it was always a small side project. They improved some of the content about a month or so ago, and that led to a bit of a boost, but then this update came. As long as that ad placement doesn’t change, the declines are likely to continue.

They just recently removed that ad unit, but that meant another drop in income, as until there is another big algo update they’re likely to stay at around half their search traffic. So now they have a half of a half of a half. Good thing the site did not have any full-time employees, or they’d be among the millions of newly unemployed. That experience really reflects how websites can be almost like debt-levered companies in terms of going under virtually overnight. Who can have revenue slide around 88% and then increase investment in the property using the remaining 12% while they wait for the site to be rescored for a quarter of a year or more?

“If you have been negatively impacted by a core update, you (mostly) cannot see recovery from that until another core update. In addition, you will only see recovery if you significantly improve the site over the long-term. If you haven’t done enough to improve the site overall, you might have to wait several updates to see an increase as you keep improving the site. And since core updates are typically separated by 3-4 months, that means you might need to wait a while.”

Almost nobody can afford to do that unless the site is just a side project.

Google could choose to run major updates more frequently, allowing sites to recover more quickly, but they gain economic benefit in defunding SEO investments & adding opportunity cost to aggressive SEO strategies by ensuring ranking declines on major updates last a season or more.

Choosing a Strategy vs Letting Things Come at You

They probably should have lowered their ad density when they did those other upgrades. If they had, they likely would have seen rankings at worst stay flat, or more likely rise as some competing sites fell. Instead they are rolling with a half of a half of a half on the revenue front. Glenn Gabe preaches the importance of fixing all the problems you can find rather than just fixing one or two things and hoping it is enough. If you have a site which is on the edge, you have to consider the trade-offs between various approaches to monetization:

  • monetize it lightly and hope the site does well for many years
  • monetize it slightly aggressively while using the extra income to further improve the site elsewhere and ensure you have enough to get by any lean months
  • aggressively monetize it shortly after a major ranking update if it was previously lightly monetized, & then hope to sell it off a month or two later before the next major algorithm update clips it again

Outcomes will depend partly on timing and luck, but consciously choosing a strategy is likely to yield better returns than doing a bit of mix-n-match while having your head buried in the sand.

Reading the Algo Updates

You can spend 50 or 100 hours reading blog posts about the update and learn precisely nothing in the process if you do not know which authors are bullshitting and which authors are writing about the correct signals.

But how do you know who knows what they are talking about?

It is more than a bit tricky as the people who know the most often do not have any economic advantage in writing specifics about the update. If you primarily monetize your own websites, then the ignorance of the broader market is a big part of your competitive advantage.

Making things even trickier, the less you know, the more likely Google is to trust you with relaying their official messaging. If you syndicate their messaging without questioning it, you get a treat: more exclusives. If you question their messaging in a way that undermines their goals, you quickly become persona non grata – something CNET learned many years ago when they published Eric Schmidt’s address.

It would be unlikely you’d see the following sort of Tweet from, say, Blue Hat SEO or Fantomaster.

I asked Gary about E-A-T. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that’s good.

He recommended reading the sections in the QRG on E-A-T as it outlines things well.@methode #Pubcon— Marie Haynes (@Marie_Haynes) February 21, 2018

To be able to read the algorithms well you have to have some market sectors and keyword groups you know well. Passively collecting an archive of historical data makes the big changes stand out quickly.

Everyone who depends on SEO to make a living should subscribe to an online rank-tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally, it makes sense to use a set of web proxies and run the queries slowly through each one so you don’t get blocked.

You should track a diverse range of keywords to get a true sense of the algorithmic changes:

  • a couple different industries
  • a couple different geographic markets (or at least some local-intent vs national-intent terms within a country)
  • some head, midtail and longtail keywords
  • sites of different size, age & brand awareness within a particular market

Some tools make it easy to quickly add or remove graphing of anything which moved big and is in the top 50 or 100 results, which can help you quickly find outliers. And some tools also make it easy to compare their rankings over time. As updates develop you’ll often see multiple sites making big moves at the same time & if you know a lot about the keyword, the market & the sites you can get a good idea of what might have been likely to change to cause those shifts.

Once you see someone mention outliers that most people miss, and those outliers align with what you see in your own data set, your level of confidence increases and you can spend more time trying to unravel what signals changed.

I’ve read influential industry writers mention that links were heavily discounted on this update. I have also read Tweets like this one which could potentially indicate the opposite.

Check out . Up even more than Pinterest and ranking for some real freaky shit.— Paul Macnamara (@TheRealpmac) May 12, 2020

If I had little to no data, I wouldn’t be able to get any signal out of that range of opinions. I’d sort of be stuck at “who knows.”

By having my own data to track, I can quickly figure out which message is more in line with what I saw in my subset of data & form a more solid hypothesis.

No Single Smoking Gun

As Glenn Gabe is fond of saying, sites that tank usually have multiple major issues.

Google rolls out major updates infrequently enough that they can sandwich a couple different aspects into major updates at the same time in order to make it harder to reverse engineer updates. So it does help to read widely with an open mind and imagine what signal shifts could cause the sorts of ranking shifts you are seeing.

Sometimes site level data is more than enough to figure out what changed, but as the above Credit Karma example showed sometimes you need to get far more granular and look at page-level data to form a solid hypothesis.

As the World Changes, the Web Also Changes

About 15 years ago online dating was seen as a weird niche for recluses who perhaps typically repulsed real people in person. Now there are all sorts of niche specialty dating sites, including a variety of DTF-type apps. What was once weird & absurd has over time become normal.

The COVID-19 scare is going to cause lasting shifts in consumer behavior that accelerate the movement of commerce online. A decade of change will happen in a year or two across many markets.

Telemedicine will grow quickly. Facebook is adding commerce features directly onto its platform by partnering with Shopify. Spotify is spending big money to buy exclusive rights to distribute widely followed podcasters like Joe Rogan. Uber recently offered to acquire GrubHub. Google and Apple will continue adding financing features to their mobile devices. Movie theaters have lost much of their appeal.

Tons of offline “value” businesses ended up having no value after months of revenue disappearing while large outstanding debts accumulated interest. There is a belief that some of those brands will have strong latent brand value that carries over online, but if they were weak even when their offline stores acted like interactive billboards subsidizing consumer awareness, then as those stores close, the consumer awareness & loyalty from in-person interactions will also dry up. A shell of a company rebuilt around the Toys R’ Us brand is unlikely to beat out Amazon’s parallel offering or a company which still runs stores offline.

Big box retailers like Target & Walmart are growing their online sales at hundreds of percent year over year.

There will be waves of bankruptcies, dramatic shifts in commercial real estate prices (already reflected in plunging REIT prices), and more people working remotely (shifting residential real estate demand from the urban core back out into suburbs).

People who work remote are easier to hire and easier to fire. Those who keep leveling up their skills will eventually get rewarded while those who don’t will rotate jobs every year or two. The lack of stability will increase demand for education, though much of that incremental demand will be around new technologies and specific sectors – certificates or informal training programs instead of degrees.

More and more activities will become normal online activities.

The University of California has about a half-million students & in the fall semester they are going to try to have most of those classes happen online. How much usage data does Google gain as thousands of institutions put more and more of their infrastructure and service online?

Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.

It’s like only being able to sell your competitor’s product for a year.— Naval (@naval) May 6, 2020

A lot of B- and C-level schools are going to go under as the like-vs-like comparison gets easier. Back when I ran a membership site here, a college paid us to have students gain access to the membership area of the site. As online education gets normalized, many unofficial trade-related sites will look more economically attractive on a relative basis.

If core institutions of the state deliver most of their services online, then other companies can be expected to follow. When big cities publish lists of crimes they will not respond to during economic downturns, they are effectively subsidizing more crime. That in turn makes moving somewhere a bit more rural & cheaper make sense, particularly when you no longer need to live near your employer.

via How to Read Google Algorithm Updates



Treating Pancreatitis in Dogs


Everything You Need to Know

Ever since its launch, the AMP Project has been surrounded by controversy.

It promises fast page speeds and additional visibility on Google result pages but demands submission to a stripped-down form of HTML.

Essentially, putting your website on a diet to make it more attractive to users.

While there are glowing case studies, for many, the implementation was haphazard and the results confusing.

Leaving the marketing industry with this question:

Is AMP important for SEO?

Today, we delve into whether AMP is worth it by looking at:

  • What AMP is.
  • Its advantages and disadvantages for SEO.
  • How to optimize AMP pages.
  • Which sites should implement AMP.

While the AMP framework extends beyond AMP pages with Web Stories (a.k.a., AMP Stories), AMP Email, and AMP Ads, these are all in the early stages. I won’t be covering those formats in this article.

What Is AMP?

AMP, formerly known as Accelerated Mobile Pages, is a framework for building lightweight pages designed to give mobile users a lightning-fast, more engaging experience.



It’s “an open-source HTML framework that provides a straightforward way to create webpages that are fast, smooth-loading and prioritize the user experience above all else.”

Their words, not mine.

For most sites, it involves creating a stripped-down, mobile-optimized AMP copy of existing HTML5 page content.

When such an AMP alternative is available, the user is served the AMP version instead of the canonical page.

Not too dissimilar from Facebook Instant Articles or Apple News, which also have the stated goal of making mobile content faster and easier to consume.

The key difference between the formats?

AMP supports the distribution of content on the open web without going through a platform-specific app.
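
To make the “stripped-down copy” concrete, here is a rough sketch of a minimal AMP page. The URLs are placeholders and the required boilerplate styles are abridged for readability; the real boilerplate is published in the AMP documentation:

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width">
  <!-- The AMP runtime script replaces most custom JavaScript -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Every AMP copy must point back at its full HTML5 page -->
  <link rel="canonical" href="https://example.com/article/">
  <!-- Required AMP boilerplate styles (abridged here) -->
  <style amp-boilerplate>body{visibility:hidden}</style>
  <noscript><style amp-boilerplate>body{visibility:visible}</style></noscript>
</head>
<body>
  <h1>Article headline</h1>
  <!-- Images use the <amp-img> component rather than a plain <img> tag -->
  <amp-img src="hero.jpg" width="1200" height="675" layout="responsive"></amp-img>
</body>
</html>
```

The ⚡ (or `amp`) attribute on the `<html>` tag is what marks the document as AMP, and the validator will reject pages that deviate from this skeleton.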

And This Is Where It Gets Political

Let’s start with “open source.”

This is technically true.

The project is backed by WordPress, LinkedIn, Twitter, Pinterest, and Bing, just to name a few.

But Google is the key code contributor and main promoter. So much so that people often refer to it as “Google AMP.”



AMP “prioritizes the user experience” through the enforcement of restrictions on ads and user interface design.

  • Limiting CSS to 75KB.
  • Limiting JavaScript to 150KB.
  • Moving all the fluff out of the critical rendering path.

While these restrictions are already enough to create “webpages that are fast,” they are not the secret sauce that makes AMP pages instant.

Here’s the Rather Technical Part

To achieve the lightning load speed, AMP pages are cached and served from Google’s servers.

This allows Google to cache, preload, and prerender AMP content before a user clicks the link in the search results.

When users click AMP content in Google, it may be displayed in one of two ways.

  • Google AMP Viewer: The content publisher’s source is displayed at the top, but the URL remains a Google domain.
  • Signed exchange (SXG): Allows the browser to treat the page as if it belongs to your domain. Signed AMP content is delivered in addition to, rather than instead of, regular AMP HTML. Out of the two, Google will prioritize linking to the signed content, but only for browsers that support it (currently only Chrome) and only for standard results, not the top stories carousel. This makes the scope of SXG rather limited.

Overall, by “forking” HTML, pre-rendering AMP content, and giving preferential treatment to AMP pages, Google can influence how websites are built and monetized to shape the internet in its favor.

So it’s unsurprising that all these actions have been criticized by many in the tech and SEO industries as an attempt by Google to exert further control over the web.



Yet despite the condemnation, sites are drawn to AMP as it has some attractive benefits.

Advantages of AMP Pages

There are many potential advantages of AMP depending on your site – including less data consumption, improved server performance, a free CDN, and higher ad viewability.

But I want to focus on the two most commonly realized benefits of AMP for SEO.

Faster Page Load Times

While AMP itself isn’t a ranking factor, speed is.

That is especially true in 2021, with Core Web Vitals becoming a ranking factor.

If implemented correctly, the load time improvements are often sizeable.

Google claims the median load time for an AMP page is less than one second. This is well within the core web vital requirements.

Plus, speed often has a knock-on effect of a more optimized user experience, witnessed by:

  • Lower bounce rates.
  • Higher time on site.
  • Increased conversion rates.



Additional Visibility in Google Search Results

AMP pages are always accompanied by an enhanced appearance in Google SERPs on mobile.

The Lightning Bolt Icon in Google SERPs

At the most basic level, AMP pages are highlighted with the lightning bolt icon in the Google SERPs.

Some SEOs have argued this designation increases the click-through rate of pages, as users will choose AMP results knowing they will lead to a fast-loading page.

While this may result in a marginal increase in select industries, I’ve not seen any statistically relevant data to back up that claim for the mass market.

AMP + Structured Data = More Chances of Getting Rich Results

Secondly, AMP in combination with valid structured data increases the likelihood of appearing in a host carousel (shown for courses, movies, recipes, and restaurants) or with rich result features such as headline text and larger-than-thumbnail images.



Although AMP is not compulsory for these enhanced features.
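
As an illustration of the structured data side, Article markup is typically embedded as JSON-LD in the page’s `<head>`. This is a hypothetical sketch; the headline, URLs, dates, and names are placeholders, not values from any real page:

```html
<!-- Hypothetical NewsArticle structured data; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "image": ["https://example.com/hero-1200x675.jpg"],
  "datePublished": "2020-05-04T08:00:00+00:00",
  "dateModified": "2020-05-05T09:30:00+00:00",
  "author": {"@type": "Person", "name": "Jane Doe"},
  "publisher": {
    "@type": "Organization",
    "name": "Example Publisher",
    "logo": {"@type": "ImageObject", "url": "https://example.com/logo.png"}
  }
}
</script>
```

The same markup should appear on both the AMP and canonical versions of the page, and it can be checked with Google’s Rich Results Test.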

Swipe to Visit

Thirdly, there is an exclusive Swipe to Visit functionality for AMP pages in Google Images on mobile.

When a user has selected an image, they see a preview of the site header, which they can swipe up to visit the source page.

Yet, the main driver of additional visibility is that AMP is a requirement for inclusion in the coveted top stories carousel, with the exception of COVID-19 related content.



But that is set to change in 2021 with the Core Web Vitals update, which will allow both AMP and non-AMP pages, with ranking based on page experience metrics.

And with that announcement many SEOs have begun to wonder if the remaining pros of AMP outweigh the cons.

Disadvantages of AMP

From a Developer’s Perspective

  • By design, it is a restrictive framework and will likely always be so in order to deliver on the promised speed.
  • It’s an extra burden to implement and subsequently remain valid with the ever-evolving AMP standard. Plugins can give a head start, but they rarely work perfectly out of the box.
  • It creates technical debt, as both the AMP and canonical versions of the page code need to be kept in sync unless you go all-in on AMP native.
  • Light speed is not guaranteed without the AMP Cache. For sources that link to AMP pages without using an AMP Cache (for example, Twitter), additional performance optimizations are needed for optimal speed.

From a Sales Perspective

  • The mere presence of AMP inventory creates complexity: if you implement best practice with separate ad units for accurate reporting, you have double the ad units to manage.
  • The framework limits ad features. Notably, AMP doesn’t support “disruptive” ads, such as interstitials or expandables, and direct-sold ads can be complex to implement.

From a Marketer’s Perspective

  • It costs double the crawl budget for one piece of content, as Google wants to ensure parity.
  • For many publishers, it drives impressions but not necessarily engagement metrics due to the top stories carousel ‘swipe’ functionality encouraging users to read more from other sources.
  • It’s an extra burden to optimize. Like a regular page, just because it’s live doesn’t mean it’s SEO-friendly. You will need to partner with your development team to get the most out of AMP.
  • Google’s AMP Viewer dilutes brand identity as a Google domain, not the publisher’s, is shown in the address bar. This can be rather confusing for users who have been trained that the URL in the address bar has significance. The fix of showing the actual site at the top of the AMP pages takes up precious space above the fold. Signed exchange is a step in the right direction, but isn’t available for most traffic.

How to Optimize AMP Pages

It’s not always the case that AMP adopters see results rise.



When results don’t rise, there are two potential reasons.

  • Either AMP wasn’t the right fit for the site (we come to this in the next section).
  • Or it wasn’t implemented thoroughly and correctly. AMP is often not as simple as plug and play.

So what is involved from an SEO perspective to achieve the visibility boost with AMP?

Outside of stating the obvious that AMP pages should be crawlable and indexable, here are the top optimization actions.

Ensure Discoverability

Add information about the AMP page to the non-AMP page and vice versa, in the form of a rel=”amphtml” <link> tag in the <head> of the non-AMP page and a rel=”canonical” <link> tag in the <head> of the AMP page.
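
In practice the pairing can be sketched like this (example.com and the /amp/ path are placeholders; use whichever AMP URL pattern your site follows):

```html
<!-- On the canonical (non-AMP) page: point to the AMP version -->
<link rel="amphtml" href="https://example.com/folder/article/amp/">

<!-- On the AMP page: point back to the canonical version -->
<link rel="canonical" href="https://example.com/folder/article/">
```

Without this two-way pairing, Google treats the AMP page as an orphan and won’t serve it in place of the canonical.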

SEO-Friendly AMP URLs

There are many ways you could communicate the URL is AMP.

  • Parameter: ?amp
  • Subdomain: amp.domain.tld
  • Front-end language: domain/folder/article.amp.html
  • Subfolder: domain/folder/article/amp



The subfolder option is generally the most SEO-friendly and flexible.

This option along with front-end language are also the only two recommended by Google.

Consistent User Interface

While there may need to be minor variations due to AMP restrictions, the user interface and design scheme should be materially similar when looking at AMP vs canonical versions of the same page.

Fully Functional

Personalization and interactive elements such as navigation menu, social media sharing icons, related content, forms, login, and – yes – even ads, should work the same way as the canonical version.

Verify SEO Element Parity

Behind-the-scenes code, such as hreflang, H1s, image alt text, and especially valid structured data, should not only be present but identical on both the canonical and AMP pages; inconsistencies can hinder SEO visibility.
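For example, hreflang annotations (hypothetical URLs here) should be identical on both versions:

```html
<!-- The same annotations appear on both the canonical and the AMP page -->
<link rel="alternate" hreflang="en" href="https://example.com/folder/article/">
<link rel="alternate" hreflang="de" href="https://example.com/de/folder/article/">
```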

AMP-Friendly Logo

The logos used must meet the AMP logo guidelines, or they will display poorly, or not at all, in the top stories carousel.
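For reference, the publisher logo is declared in the page’s structured data. A minimal sketch, assuming a hypothetical logo file that fits within AMP’s 60px-high by 600px-wide guideline:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article",
  "publisher": {
    "@type": "Organization",
    "name": "Example Publisher",
    "logo": {
      "@type": "ImageObject",
      "url": "https://example.com/images/logo-600x60.png",
      "width": 600,
      "height": 60
    }
  }
}
</script>
```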

Don’t Add AMP URLs to XML Sitemap (Unless You’re Native AMP)

Only canonical URLs belong in XML sitemaps.



The rel="amphtml" link provides enough of a signal for Google to discover the AMP pages.

When a correctly paired AMP page is indexed by Google, it will be the version served to the user.

This is no small effort, and it leaves many marketers wondering: is AMP worth it?

Which Sites Should Implement AMP?

The official AMP website is full of case studies demonstrating the framework’s positive impact on publishers, retailers, and other industries.

On the flip side, there are also scathing articles from industry experts and many case study fails.

The decision on AMP is not clear cut for all sites.

The cliche of “it depends” rings true.

However, there are clear decision factors for the optimal answer.

If your users are primarily on desktop, AMP is not for you.

While AMP pages do work on desktop, they don’t display with rich features and aren’t served from the AMP Cache. As such, both of the main benefits become unavailable.



If you create “newsy” content and are already up and running on AMP, it’s worth keeping it optimized until AMP is no longer the gatekeeper of the top stories carousel.

At that point, check whether the top stories results in your sector are dominated by AMP pages or whether non-AMP sites often rank right alongside them – and, if so, what the requirements are.

If conditions are favorable, and your non-AMP pages can achieve the Core Web Vitals requirement of largest contentful paint (LCP) within 2.5 seconds of when the page first starts loading, test the impact of removing AMP (be sure to follow best practices).

For sites that are willing and skilled enough to get below the 2.5-second standard, the potential speed and organic session increases are unlikely to be a convincing case for converting to and/or maintaining AMP.

The time would likely be better invested in other opportunities.

For sites that can’t reach the 2.5-second standard alone, having key landing pages in the standardized solution from AMP can be a fast route for this SEO win.



But check whether the functionality can be implemented with AMP components.

And remember: unless you plan to move the whole site to native AMP, the increased speed does not apply to all page views, only to those that come from distributed sources that support AMP.

For sites that are yet to be developed or those going through a major overhaul such as a redesign or CMS change, ask yourself if native AMP is the best solution to provide all necessary functionality now and in the foreseeable future.

Spoiler alert: the answer is likely no.

Assessing the Impact of AMP

Whether you are embracing AMP, abandoning it, or continuing as an ongoing user, you should measure the true impact it has on user experience and site visibility.

Do this in four steps.

1. Confirm the AMP Code Validates on All Relevant Pages

Spot checking a couple of pages with the AMP Test Tool is a start, but not enough.



Google Search Console has a dedicated AMP status report, alerting you to the reason your AMP URLs aren’t eligible to be indexed (errors) or may not perform in the SERPs (warnings).

2. Verify the Structured Data Parses Correctly

For applicable AMP content types, the Rich Results Test Tool is helpful for one-at-a-time spot checks – be sure to enter the AMP URL, not the canonical.

I find the relevant Enhancement reports in Google Search Console to also be useful, although Article structured data isn’t covered.

A reputable SEO crawling tool is often the best option for scale.

3. Understand the AMP Drivers of Visibility on Google

In the Google Search Console Performance reports, there are a few dimensions to analyze:



  • Search type: Web, Search appearance: AMP non-rich results: Shows metrics for blue links with lightning bolts.
  • Search type: Web, Search appearance: AMP article: Shows metrics for visually decorated search results, such as those in carousels or with an image. Do note, these are also counted in rich results; they are not mutually exclusive.
  • Search type: Image, Search appearance: AMP on Image result: Shows metrics for image search results hosted on AMP pages within the Google Images tab.
  • Search type: Video, Search appearance: AMP article: Shows metrics for video results hosted on AMP pages within the Google Videos tab.
  • Search type: News, Search appearance: AMP article: Shows metrics for AMP pages within the Google News tab of search results, not the Google News app.
  • Discover, Discover appearance: AMP article: Shows metrics for AMP pages within Google Discover.

When analyzing the data, remember that filtering by search appearance aggregates data by page rather than by property in the table only, and is limited to the dimensions available.

Data in the graph will still be grouped by property.

This can lead to some large discrepancies on the image, video, and news tabs.

4. Understand the Drivers & Performance of AMP Sessions

Search Console only shows the Google side of the picture; you can get many more insights from Google Analytics.

But before Google Analytics can be trusted to accurately report on AMP, you must implement “sessions stitching.”

Ensure your Google Analytics setup utilizes client IDs to unify sessions across AMP & Non-AMP versions.
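One common setup (a sketch only; UA-XXXXX-Y is a placeholder for your own property ID) opts the AMP page into the Google AMP Client ID API via a meta tag and tracks pageviews with the amp-analytics component:

```html
<!-- In the <head>: opt in to the Google AMP Client ID API -->
<meta name="amp-google-client-id-api" content="googleanalytics">
<!-- The amp-analytics extension script must also be included in the <head> -->

<!-- In the <body>: basic Google Analytics pageview tracking -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXX-Y" },
    "triggers": {
      "trackPageview": { "on": "visible", "request": "pageview" }
    }
  }
  </script>
</amp-analytics>
```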

Also, double-check event tracking or other such conversion integrations are firing correctly on AMP pages.

Then you can delve into the number of sessions, conversions, and other KPIs driven by AMP by filtering by data source as a secondary dimension.



Some sessions may appear to come from unexpected sources until you understand how various platforms link to AMP content.

Common sources and their explanations in relation to AMP sessions, excluding manual UTM tags:

While UTM tagging your own social posts and on-site share buttons can clear up some of this confusion, there will always be some extraneous sources outside of Google properties; they are unlikely to be significant.

What is often most actionable is to understand the impact on user experience KPIs such as bounce rates, pages per session, or conversion rate.

Be sure to compare apples to apples.

This entails creating a custom report to compare AMP sessions (identified by the data source) to the closest corresponding non-AMP sessions (generally: device set to mobile, source / medium set to google / organic, and the landing page path regex-matched to the page types that have an AMP version), rather than to overall site performance.



To Sum Things Up

AMP’s reception and case study outcomes have been a mixed bag.

What’s clear is that many publishers have been enjoying the AMP-exclusive placement in the top stories carousel.

But as this feature is opened up to those who meet the page experience requirements, the visibility benefit of AMP will likely be much reduced. So much so that it may be hard to argue a case for it.

So that leaves us with speed – which will become even more important as Core Web Vitals become an SEO ranking factor.

If your mobile site is able to achieve a 2.5-second LCP for key landing pages, you’re unlikely to see a return on resource investment with AMP.

If hitting 2.5 seconds is not possible, supporting AMP for key landing pages is something to consider especially if you have a significant portion of mobile google / organic sessions that can subsequently enjoy the benefit of the AMP Cache prerendering.

via AMP & SEO: Everything You Need to Know via @jes_scholz


Categories : Uncategorized
Comments (0)

How to Make Your Social Media Content More Accessible

As many people in the United States have shifted to remote work and remote learning in light of the global pandemic, internet accessibility is more important than ever. 53% of Americans go as far as to say that the internet is an “essential” service during this time. Social media usage has increased significantly as well: Twitter alone saw a 24% increase in daily active users in the first quarter of 2020 compared to the same period last year. This includes official government accounts, which are increasingly using social media to announce guidelines during the Coronavirus pandemic. For instance, the CDC’s Twitter account has grown to 2.9 million followers as of July 2020.

Typically when we discuss accessibility concerns as digital marketers, we are referring to accessible website guidelines. This has been an area of focus for Portent for a while now, and recently we have been encouraging our clients to consider accessibility in their social media strategy as well. Social media platforms are still playing catch-up when it comes to accessibility features (including a lack of options for advertisers), but we have determined there are three key areas that businesses can immediately address.

ALT Text

At Portent, we talk about ALT attributes with our clients regularly. ALT text is read by screen readers in place of images, or displayed in place of images when the file is not loaded. ALT text on social media is similar, but executed on each platform vs. on-site. Most platforms have automatic ALT text functionality, but because it is written by AI, there are instances where the automatic ALT text does not accurately convey the meaning or emotion in the image. Also important to note, ALT text should not be confused with the link description, which is a standard field in most social media posts.

To ensure that your images are described appropriately to your entire audience, we recommend manually updating the ALT Text as part of your standard deployment process.

You can find the instructions for adding ALT text to each platform here:

Video Captions

Captions for social media videos have been a best practice for a few years, as more and more users engage with social media content with the sound off. Additionally, including captions ensures that those with accessibility concerns can consume the content as well. We always recommend including captions on both organic and paid videos.

via How to Make Your Social Media Content More Accessible


Categories : Uncategorized
Comments (0)

Why SEO & Machine Learning Are Joining Forces

The global datasphere will grow from 33 zettabytes (each equal to a trillion gigabytes) in 2018 to 175 ZB by 2025.

In marketing, our role as the stewards of much of this data is growing, as well.

As of last year, more data is being stored in the enterprise core than in all the world’s existing endpoints, according to a report by IDC.

The great challenge for marketers and SEO professionals is activating and using that data.

In 2025, each connected person will have at least one data interaction every 18 seconds and nearly 30% of the world’s data will need real-time processing.

There’s no way human marketers can handle this processing on our own.

And more and more, as our machine-learning-enabled tools process and analyze search data, they’re learning and improving their understanding of it as they go.



Machine Learning in Search

Perhaps the best-known use of machine learning in search is Google’s own RankBrain, an algorithm that helps the search engine better understand the relevance and context of – and the relationship between – words.

Machine learning enables Google to understand the idea behind the query.

Machine learning allows the algorithm to continuously expand that understanding as new words and queries are introduced.

And as algorithms get better at determining which content best meets the needs of each searcher, we are being challenged to create content that meets those needs – and to optimize it so that relevance is clear.

It’s no coincidence that as we’re experiencing this explosion in data, interest in SEO is growing, as well.

SEO & Data Science

SEO has grown to be a viable, respectable mainstream marketing career.

As I write this, there are 823,000 people on LinkedIn with “SEO” in their profile and 8,600 who specifically categorize their core service offerings as SEO.

Looking worldwide, those figures balloon to 3.2 million and 25,000, respectively.



But this is just a small sampling of the SEO industry.

There are those in SEO who identify as content marketers, digital marketing strategists or practitioners, site developers, analytics pros, consultants, advisors, and more.

Our industry is massive in size and scope, as SEO now touches nearly every aspect of the business.

So much more is being asked of SEO professionals now, thanks to that massive increase in data we have to deal with.

Yet according to our research at BrightEdge, only 31.5% of organizations have a data scientist at their company.

Working alongside machine learning offers tech-savvy SEO professionals a number of important advantages.

1. Enhanced Performance in Your Field of Specialization

Employers and clients alike are driven by results.

Do you know how to use the machine-learning-powered tools in your area of specialization?

Whether in paid search, technical SEO, content creation and optimization, link building or some other facet of SEO, those who can prove superior performance through the use of machine-learning-enabled SEO tools are increasing their own value.

2. Start Ahead & Stay Ahead

Search is a live auction. If you’re waiting to see what customers think and only then getting ready to respond, you’re already behind.

Machine-learning-powered tools enable marketers to activate real-time insights, to personalize and optimize content in the moment for each user’s individual needs.

3. Economies of Scale

You are exponentially more valuable as an SEO practitioner and leader if you can demonstrate the ability to scale your efforts.

The real power of machine learning is in its ability to convert more data than we know what to do with into actionable insights and automated actions that marketers can use to really move the needle.

To do that is hard.

For example, to build BrightEdge Autopilot we had to process over 345 petabytes of data over the course of many years to help fine-tune machine learning and automated products.



Machines aren’t angling for a promotion; they don’t harbor preconceptions or care about past mistakes.

They are entirely objective, taking opinions, personalities, and other potential bottlenecks out of the process of data evaluation.

What marketers are left with are pure, accurate data outputs that can then be activated at scale to improve search visibility and interactions with customers.

4. Room to Grow

Mastering your SEO toolset gives you more room to grow in your profession, and as a person who just so happens to love the work you do.

Machine learning, in particular, empowers us to reap insights from larger datasets and gives us access to far more intelligence than when we could only learn from what we manually analyzed ourselves.

It is your specialized insight and industry knowledge that determines which outputs are useful and how they should be applied.

Machine learning can tell you very quickly how your audience’s behaviors have changed during a major market disruption, such as our recent experience with COVID-19.



But how you interpret and respond to those changes is still very much the domain of marketing and SEO professionals.

Machine learning can help you recognize patterns in visitor behavior that point to opportunities and areas in need of improvement.

What technology cannot do is replace the creative and analytical human thought process and experience that determines the best next steps to take in response to those insights.

The people of SEO cannot be replaced. In fact, they’re more important than ever.

The tools we use may be quite sophisticated; machine-learning-enabled tools can even make decisions and implement optimizations.

However, it is the people of SEO who drive the creative and analytical processes that machines simply cannot replace:

  • Creative analysts.
  • Data scientists (who control input into the machines).
  • Analytics pros.
  • Content producers.
  • Culture builders and success evangelists.
  • Expert users who facilitate sales and help customers.
  • Strategic planners across digital channels.

And there are agile marketers who may do any combination of the above.



They are key in facilitating collaboration with other digital departments to ensure a truly holistic SEO strategy.

In their HBR article Collaborative Intelligence: Humans and AI Are Joining Forces, H. James Wilson and Paul R. Daugherty explain the three key roles humans undertake in every interaction with machine-learning-powered technology:

  • Train: We need to teach the machine to perform certain tasks.
  • Explain: We must make sense of the outcome of the task, especially if it is unexpected or counterintuitive.
  • Sustain: It is up to us to ensure that the technology is used logically and responsibly.

Applying this lens to our SEO tech, we see these three tenets hold true.

We need to decide which SEO tasks to intelligently automate and give our tools the proper input.

We need to take the output and make sense of it, focusing only on those insights with business-building potential.

We are responsible for ensuring that searcher privacy is protected, that the value of the technology outweighs the cost, and that it is otherwise being made good use of.

You can build your value as an SEO and learn to work more effectively with machine-learning-powered tech by building these skills:



  • Data proficiency: According to Stanford researchers, the share of AI jobs grew from 0.3% in 2012 to 0.8% of total jobs posted in the U.S. in 2019. AI labor demand is growing, especially in high-tech services and the manufacturing sector.
  • Communication: As the arbiter of so much customer data, it is critical that we communicate key insights and value in ways other department heads and decision-makers can understand.
  • Agility: More than a trait or quality, agility is a skill developed through constant experimentation.

Embracing machine learning and automation means building synergy with human creativity and skills.

It can make us more creative and effective by uncovering SEO insights and patterns we would never have recognized otherwise.

It can help us discover new topics, identify content gaps, optimize for specific types of queries and results, and more.

What’s more, it can save vital time on tasks that are too time-consuming, too repetitive and laborious, so we can scale performance.

And as that happens, we develop new skills and progress as part of a symbiotic relationship between people and technology.

via Why SEO & Machine Learning Are Joining Forces


Categories : Uncategorized
Comments (0)
