Googler Says Web 3.0 Won’t Kill SEO

All the craze right now is Web3, or Web 3.0, the new iteration of the World Wide Web based on the blockchain, which incorporates concepts including decentralization and token-based economics. And some are concerned that Web 3.0 will be the final nail in the coffin for SEO.

Well, it won’t. Don’t believe me? Ask Google’s John Mueller, who also said no, it won’t, on Reddit. Don’t trust John? Go look at the other comments on that thread, most of which say it is fear mongering…

The question posed in the thread was “Will web 3.0 kill the SEO? Just a fear of mine that soon I’ll be jobless.”

The response from John Mueller was a simple “no.” But here are some of the other responses:

No, because search engines will still be used on Web 3.0. Anything that is used on the internet now, will be on Web 3.0. This is because there is no replacement for search engines yet. People will still be using Youtube in the metaverse same as Google.

Web 3.0 isn’t yet what it plans to be. But I don’t expect it will impact SEO much, if at all, for some time to come.

I look at web 3.0 like voice search – it’s going to be nice to have, but no real impact on SEO – at least not initially.

I have been in the same technical field for a long time. I no longer list anything specific on my resume for the first 10 years or so. Very general statements. Because its all obsolete. My point is the technical things you do will evolve but the general aim of SEO will remain and will have new tools and processes.

I think you might not understand what some claim the web3.0 is, and I don’t blame you – this is not a criticism.

The best way to sum what is being pushed as the web3, in my opinion: it’s a way to store, access and transfer some types of information, by using “public databases”.

SEO, as you know, is about making some information (like content, or product listings) more accessible and easy to find online.

The web as you know it won’t change. You’ll still have search engines, marketplaces, social networks, e-commerce, etc.

As long as there are search engines there’ll be a need for SEO.

Enlighten me, why would Web 3 mean that people no longer use the internet to search for information? Are there Web 3.0 specific search engines that may topple Google’s dominance? If there are, can they be gamed or optimised for?

TL;DR: If Web 3 means people will search for information differently then SEO will most likely evolve to find a way to capitalise on that behaviour.

There was a story from Bloomberg yesterday titled “Facebook and Google’s Ad Addiction Can’t Last Forever.” Facebook and Google are the least diversified of the tech giants, relying on ads for 98% and 81% of revenue, respectively, hinting at potential future issues: “Technologists, for one, are also talking about a radical shift to Web3, where large online platforms will be replaced by systems underpinned by blockchain, a move that would require rethinking the companies’ revenue model. Regulators, meanwhile, are targeting Google and Facebook’s dominance of the digital ad space; and young people’s gravitation to gaming, messaging and TikTok has already threatened Facebook’s all-important engagement metrics with advertisers.”

There are more responses – but it seems like we are all safe, for now – at least until Web 4.0…

Forum discussion at Reddit.

Search News Buzz Video Recap: Google URL Inspection API, Manual Actions Galore, New Partner Programs, Earnings, MUM, Web3 & More

I posted the monthly Google webmaster report for February 2022, so check it out. Google released a new API for the Google URL Inspection Tool – it is exciting. Google seems to be clearing out the manual action backlog. Google also issued a slew of new manual actions for Google News and Google Discover. Some are surprised to hear that Google really doesn’t use MUM in Google Search for ranking; it does use it for two specific purposes: COVID vaccine names and a related video feature. Google said that no one was infected with malware over the untitled search spam issue. Google said the location of an internal link on your page does not really matter much. Google also said it does not give full weight to all links and does not count links on a domain level. Google Search Console’s snapshot chart in web search now works for domain properties. Google Search Console changed how it handles breadcrumb and HowTo structured data. Google added a single help document for SafeSearch with some awesome troubleshooting tips. Did you know Google may share your Google Discover likes with publishers? Google Merchant Center added auto-tagging for free listings. Google launched its new Partner Program after two years of delays. Google Data Studio can now bring in your Google Ads Performance Max data. Google Maps posted more details about its review spam and review enforcement policies and actions. Google is testing buying guides in the search results. Web 3 won’t be killing off SEO anytime soon. Google and Microsoft both released earnings and showed 32% increases in ad revenues. And if you want to help sponsor those vlogs, go to patreon.com/barryschwartz. That was the search news this week at the Search Engine Roundtable.

Sponsored by BruceClay, which has been doing search marketing optimization since 1996 and also has an amazing SEO training platform.

Make sure to subscribe to our video feed or subscribe directly on iTunes, Apple Podcasts, Spotify, Google Podcasts or your favorite podcast player to be notified of these updates and download the video in the background. Here is the YouTube version of the feed:


For the original iTunes version, click here.

Search Topics of Discussion:

Please do subscribe on YouTube or subscribe via iTunes or on your favorite RSS reader. Don’t forget to comment below with the right answer and good luck!

What Does an SEO Company Actually Do?

You can place your site in a desirable and noticeable spot on a search engine results page when you partner with a knowledgeable SEO company. That company should be supplying reports, answering your questions, and getting a great deal of work done – but do you know what an SEO company actually does?

The entire process can be rather involved, but it basically boils down to a great deal of research and building a well-structured, informative website that provides value.

SEO is an investment, and it’s a strategy that takes time. You need to understand what your SEO firm is doing to make sure that your customers can always find your site.

What is SEO?

The end goal for search engines is to deliver excellent results for users, so they use algorithms to evaluate websites and find the best possible match for a given search. SEO, or search engine optimization, is the process of making a website appealing to search engines.

High-quality content and optimized backend components of a website help increase the value of that website to search engines and your potential users. When a search engine identifies a website that provides useful information, it is more likely to include it on the results page for a relevant search term.

Now that we’ve covered the basic definitions of what we’re trying to achieve, let’s get into what we do to make it happen.

Monitoring the Algorithm

Algorithms are a huge part of SEO, but the thing is, not a lot is known about them. Search engines like Google keep the details under wraps and seldom offer insight into their inner workings. Experienced SEO companies study how search engines react to different websites to better understand how the algorithms work.

The more we learn about algorithm behavior, the better we can optimize a website and make sure it gets included on a results page. That means staying current with industry standards and doing plenty of testing and reworking to make sure everything is running smoothly.

Getting Started

Before an SEO company can lay out a strategy to help your website rank, it needs to understand your current status and what is happening within your market. Some of the questions an SEO will want to answer include:

  • How does your website currently perform?
  • What are the relevant keywords for your market and your business?
  • Does your site have unique title tags and meta descriptions? Does your site have schema markup?
  • Is your website user-friendly?
  • What websites are you linking to, and what websites link back to yours?

After learning all about your website, we take a look at the competition. We want to know what your competitors are doing right and what they are forgetting to do, since that could open opportunities for your website to shine. There are also particular expectations and requirements within a market, and it’s vital to ensure your website is following best practices.

SEO Services

After all the research has been done, your SEO company will produce a custom strategy to help your site rank. This strategy will address areas that need improvement to create a well-rounded, effective, and inviting site that appeals to users (and search engines). It will likely be a multi-pronged approach and could include:

Keyword Research

Keywords are perhaps the most popular aspect of SEO, and for good reason: they matter. Keywords line up with the search terms people are likely to use, and they clue search engines in to the gist of the content on a page.

An SEO company will explore all the possible keywords that relate to your services or products and determine how difficult they will be to rank for. It might be worth going after those tough keywords if they prove to be the most profitable.

Content Creation

Quality content pulls users into a website by providing practical information about your services or products along with relevant, useful tips. Great content helps generate sales by turning users into customers, and it shows search engines that your site has something of genuine value to offer.

Your SEO company should optimize your existing website content and provide a consistent stream of new content that relates to your products.

Mobile Experience

Your site needs to deliver a great experience on any device, and the majority of traffic is mobile. Your SEO company will make sure that your site can accommodate mobile users and ensure they have a positive experience.

Load Time

Even the most patient person will not stay if a website takes a long time to load. A laggy website or a slow load time will push users away, and search engines will take notice. Experienced SEOs know how to find the fixes that will speed up your site and make it load fast enough to keep your customers on the page.

Metadata

Metadata is, essentially, data about data. This information lives on the backend of web pages, and it helps search engines understand what is happening on a given site. Metadata helps search engines find the appropriate pages for a search query, so your strategists will use this opportunity to make your pages more attractive to users and search engines.
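As an illustration of how that backend information is read, here is a minimal, standard-library-only sketch that extracts the title and meta description a search engine would see; the sample HTML is hypothetical:

```python
# Sketch: pull the <title> and meta description out of raw HTML,
# using only Python's standard library HTML parser.
from html.parser import HTMLParser

class MetadataParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page head for a local business
doc = ('<html><head><title>Acme Plumbing</title>'
       '<meta name="description" content="24/7 plumbing repairs."></head></html>')
parser = MetadataParser()
parser.feed(doc)
print(parser.title)        # Acme Plumbing
print(parser.description)  # 24/7 plumbing repairs.
```

Real crawlers are far more elaborate, but the principle is the same: these backend fields are read directly and summarize the page.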

Ongoing Process

After the strategy is in place and your site has been optimized, it is time to hurry up and wait… sort of.

It can take months to see results from an SEO strategy, but that does not mean your company isn’t doing anything during this time. The whole process is ongoing. Data will be compiled as search engines and users interact with your website, and there is still plenty of work to be done after that first wave of data comes in.

Clickthrough rates and session duration provide more insight into how users interact with your website, and this is a chance to fine-tune everything further to better connect with your audience.
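As a quick illustration, both of those metrics boil down to simple ratios; the analytics numbers below are made up:

```python
# Hypothetical analytics data for one landing page over a month
impressions = 12_000      # times the page appeared in search results
clicks = 540              # times a user clicked through to the page
session_seconds = [34, 210, 95, 12, 187]  # sampled session lengths

ctr = clicks / impressions                              # clickthrough rate
avg_session = sum(session_seconds) / len(session_seconds)  # avg duration

print(f"CTR: {ctr:.1%}")              # 4.5%
print(f"Avg session: {avg_session:.0f}s")
```

A rising CTR with a falling session duration, for example, can hint that the snippet over-promises what the page delivers.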

There is always room to grow and improve, and there is always a competitor looking to take your spot. Adjusting and keeping up with your SEO strategy is essential.

What’s more, SEO is constantly changing. Some elements remain relatively constant, but search engines regularly update their algorithms. Sometimes updates can be substantial and dramatically change the course of the internet, but more often than not the changes are minor. These updates are intended to deliver better results, and they can affect how a search engine responds to a website.

A skilled SEO company stays current with updates and is ready to adjust your strategy accordingly.

Concentrate on What You Know

Creating, implementing, and monitoring an SEO strategy is time-consuming, especially if you do not have experience with this kind of work. An SEO agency can handle all of this, so you can concentrate on what you do best while taking care of the new leads coming into your optimized site.

We’re here to help you find out if we’re the best fit to be your SEO company.

SEO Roundup: February 4, 2022

The last two weeks have seen several feature releases from Google. In addition, Google documentation updates and explanatory content from John Mueller (Google’s Search Advocate) and other sources provided search engine optimizers with clarity about a handful of long-running uncertainties.

In terms of practical features, web admins can now take advantage of indexifembedded tags that tell Google how to handle embedded content. There’s also a new search section on mobile results. The upcoming release of the “Topics” feature, in line with Google’s deprecation of third-party cookies, is also an important development.

In other news, Google made a few changes to its online guidance. SafeSearch documentation has been merged, and a new note in Google Webmaster Guidelines lays out the relationship between “Car” and “Product” schema markup. John Mueller also provided some insights into how Google evaluates internal links and Danny Sullivan (Google’s Search Liaison) explained how “deduplication” works in relation to “Top stories.”

Let’s dig into the latest updates, announcements, and search-related analysis from the last two weeks.

Topics to Replace FLoC as Part of Google’s Privacy Sandbox Initiative

On January 25th, Google announced that it would be retiring “Federated Learning of Cohorts” (FLoC) and replacing it with an alternative targeting technology called “Topics.”

“Topics” is part of Google’s “Privacy Sandbox,” an initiative tasked with developing digital tools that allow publishers and advertisers to continue to leverage data about user behavior as cookies become redundant.

“Topics” will share subjects that individual browsers have expressed interest in with third-party sites, thus negating the need to provide confidential personal information. Google published a nifty little explainer video that shows how everything works.

In an official blog post, Google wrote: “With Topics, your browser determines a handful of topics, like “Fitness” or “Travel & Transportation,” that represent your top interests for that week based on your browsing history. Topics are kept for only three weeks and old topics are deleted.”

It’s still early days, and a developer trial will launch in Chrome shortly. What the final tech will look like remains to be seen. Nonetheless, it’s a change that will affect all businesses that rely on visitor data to serve ads and generate audience insights. It highlights the importance of web admins taking Google’s deprecation of cookies seriously.

New Robots Meta Tag (indexifembedded) Added to Documentation

Google has introduced a new robots tag – indexifembedded – that lets web admins stipulate that they’d like Google to index content that’s embedded in iframes (and some other HTML tags) on third-party pages (or elsewhere on the same site) even if the “unembedded” content on the parent page contains the noindex tag.
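As a rough sketch of the combination described above, assuming the tag names from Google’s announcement (the helper function here is ours, not a Google API), the parent page might serve something like:

```python
# Sketch: build the robots meta tags for a media page that should not be
# indexed as a standalone URL, but whose content may be indexed when it
# is embedded elsewhere. Tag names assumed from Google's announcement.

def robots_meta_tags(allow_embedded_indexing: bool) -> list:
    """Return robots meta tags for the parent (unembedded) page."""
    tags = ['<meta name="googlebot" content="noindex" />']
    if allow_embedded_indexing:
        # indexifembedded only has an effect alongside noindex, and only
        # Google supports it at the time of writing.
        tags.append('<meta name="googlebot" content="indexifembedded" />')
    return tags

for tag in robots_meta_tags(True):
    print(tag)
```

The same directives can reportedly also be sent as `X-Robots-Tag` HTTP headers; check Google’s documentation for the exact syntax before deploying.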

Google says that it may prove to be a particularly useful tool for media publishers, who often allow third parties to embed content but don’t want to publish the content on their own site.

You can learn more about the new tag in this recently published post on the Google Search Console blog. Currently, Google is the only search engine that supports this feature.

New Mobile Search Feature on Google Mobile

Google has released a new feature called “People search next” on its mobile search pages. A Google spokesman confirmed the rollout to Search Engine Land.

“People search next” appears alongside other features like “Related searches” and “People also search for.” It would appear that it is only available in the US at the time of writing.

“People search next” is interesting from an SEO perspective for two reasons. First, it provides content creators with new ideas for keyword-focused pages. Second, search result page widgets like this one potentially take up space occupied by results from third-party websites. It’s essential for SEOs to be aware of these changes as they may affect how ranking strategies are formulated.

Google Updates SafeSearch Documentation

Google has updated its SafeSearch documentation but the guidance remains the same. All documentation is now available in one place instead of spread across different subsections of the Google Search Central documentation.

If you haven’t already, you should make sure that your website is optimized for SafeSearch. The instructions show you how to check if some or all of your web pages are being filtered and how to remedy any mistakes on Google’s part.

Google Clarifies Car and Product Schema

Google has added a note instructing web admins about how to label “Car” markup in a way that doesn’t obviate eligibility for “Product” review snippets.

The note reads: “Currently Car is not supported automatically as a subtype of Product. So for now, you will need to include both Car and Product types if you would like to attach ratings to it and be eligible for the Search feature.”

In a nutshell, this means that you should use both car and product schema on vehicle listing pages. If you only use “Car” schema markup, product reviews may not appear in search results.
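A sketch of what that dual-type markup could look like as JSON-LD; the vehicle details and rating values here are invented for illustration:

```python
# Sketch: a vehicle listing declaring both Car and Product types so the
# aggregate rating stays eligible for review snippets, per Google's note.
# All listing details below are hypothetical.
import json

listing = {
    "@context": "https://schema.org",
    "@type": ["Car", "Product"],   # both types, per the note above
    "name": "2022 Example Roadster",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.6,
        "reviewCount": 128,
    },
}

# Serialize for embedding in a <script type="application/ld+json"> block
print(json.dumps(listing, indent=2))
```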

Google Removes Time Ranges From Recipe Schema Markup

If you publish recipes on your blog, then Google’s update to its “Recipe” documentation will be of note and you’ll need to make some minor changes to your schema markup.

All references to ranges have been removed and Google no longer supports time ranges for “Time” properties. Instead, Google advises publishers to use “an exact time; time ranges aren’t supported.”
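A sketch of a compliant snippet under the new rule, with a hypothetical recipe; note that each time property is a single exact ISO 8601 duration rather than a range:

```python
# Sketch: Recipe structured data using exact ISO 8601 durations
# (e.g. PT30M for 30 minutes), since time ranges are no longer supported.
# The recipe itself is made up.
import json

recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Weeknight Chili",
    "prepTime": "PT15M",   # exact duration, not a range like "PT10M-PT20M"
    "cookTime": "PT45M",
    "totalTime": "PT1H",
}

print(json.dumps(recipe))
```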

John Mueller Provides Some Insights Into How Google Evaluates Internal Links Based on Page Location

“Internal linking” refers to the practice of linking to different pages of a website so as to create an optimized “flow” of authority (or “link equity”). The argument runs that well-structured internal link architectures are correlated with higher rankings.

In an office-hours hangout, Search Advocate John Mueller said that the location of internal links on a page (header, footer, in-content, etc.) doesn’t matter from Google’s perspective. So it looks like SEOs don’t need to worry about exactly where they place internal links. It is likely a much better approach to focus on optimizing user experience.

Danny Sullivan Provides Insight Into Deduplication Process for “Top stories”

In reply to a tweet by Executive Editor of The Verge Dieter Bohn, Search Liaison Danny Sullivan shed some light on how Google’s deduplication process works in relation to “Top stories.”

In a nutshell, Google will remove a link to a webpage from the main results if that link appears first in “Top stories.” However, if the “Top stories” widget appears after the normal results, the link is not removed.

Danny Sullivan said, “…we deduplicate a link from web results if a link appears as the first link in Top Stories and if the Top Stories box appears before web results. If it comes after, we don’t.”
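The stated rule can be sketched as a small function (an illustration of the described behavior, not Google’s actual code):

```python
# Sketch: drop a URL from the web results only when it is both the first
# Top Stories link AND the Top Stories box appears above the web results.

def deduplicate(web_results: list, top_stories: list,
                top_stories_first: bool) -> list:
    if top_stories and top_stories_first:
        first_story = top_stories[0]
        return [url for url in web_results if url != first_story]
    return web_results

# Hypothetical result sets
web = ["a.com/story", "b.com/story"]
stories = ["a.com/story", "c.com/story"]

print(deduplicate(web, stories, top_stories_first=True))   # ['b.com/story']
print(deduplicate(web, stories, top_stories_first=False))  # both links kept
```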


Optimizing for the Google 3-Pack

What is Google 3-Pack?

The 3-Pack is Google’s method used to display the top three results for local SERP results. It references the user’s location to make the search more relevant. For instance, if a user searched “restaurants near me”, Google 3-Pack would display three restaurants near the user’s current location.

Google 3-Pack has undergone a couple of major changes, first in August of 2015 when the Google 7-Pack was cut down to three, creating the 3-Pack. At the time, the update placed more emphasis on links to the brand’s business websites. The way information was displayed also changed, making it easier for mobile users to navigate the 3-Pack by showing addresses and business hours, rather than phone numbers.

In December 2021, Google 3-Pack saw another major update. Google described the update as a rebalancing of its three key ranking factors for 3-Pack results: proximity, relevance, and prominence. The rather rapid impact on results made it clear that the rebalancing placed greater emphasis on proximity as a factor. The 2021 update also integrated the results with a map.
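To make the idea of a “rebalancing” concrete, here is a purely illustrative scoring model; the weights are invented to show how raising the proximity weight changes outcomes, and are not Google’s actual math:

```python
# Illustrative only: combine the three named local ranking factors
# (proximity, relevance, prominence) into one score, with an adjustable
# proximity weight. All weights and scores below are hypothetical.

def local_score(proximity: float, relevance: float, prominence: float,
                proximity_weight: float = 0.5) -> float:
    """Combine normalized 0-1 factor scores; remaining weight is split
    evenly between relevance and prominence."""
    remaining = (1.0 - proximity_weight) / 2
    return (proximity * proximity_weight
            + relevance * remaining
            + prominence * remaining)

# With proximity weighted heavily, a nearby average restaurant can
# outrank a more prominent one that is farther away:
print(local_score(proximity=0.9, relevance=0.6, prominence=0.5))
print(local_score(proximity=0.3, relevance=0.8, prominence=0.9))
```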

How do I rank locally on Google?

You cannot optimize specifically for Google 3-Pack because businesses that appear on Google 3-Pack are location dependent. In other words, the local SEO results change depending on the searcher’s location.

However, you can optimize your web presence in a way that will increase your chances of appearing in the 3-Pack.

How to optimize your web presence to increase the likelihood of appearing in the Google 3-Pack

  1. Make sure your Google Business Profile is filled in and up-to-date. This is the easiest component for you to control and is weighted heavily – about 25% – in your ranking potential.
  2. Solicit customer reviews on your Google Business Profile. Receiving multiple positive reviews can be beneficial when it comes to boosting your visibility, both as a ranking factor (prominence), and because Google reviews and stars will be plainly visible to users on the 3-Pack.
  3. Make sure your listings throughout the web are accurate and consistent. This includes business name, address, and phone number (NAP).
  4. Cultivate locally relevant links to boost your visibility. There are a number of ways you can cultivate reputable links, such as sponsoring local nonprofits and events or joining a Chamber of Commerce.
  5. Build standard backlinks to raise your organic rank. Backlinks are best cultivated through outstanding, well-distributed content that attracts readers.
  6. Build a strong presence on social media and other platforms to establish yourself as a Google local listing with loyal followers.
  7. Improve the user experience (UX) on your website, as Google favors better experiences. Since 2015, for example, Google has rolled out updates asking websites to be responsive and allow better usability for the mobile experience.
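Step 3 above, the NAP consistency check, can be sketched as a simple normalization before comparing listings; the business details are hypothetical:

```python
# Sketch: normalize name, address, and phone (NAP) before comparing
# listings, so cosmetic differences like "(555) 123-4567" vs
# "555-123-4567" don't count as mismatches.
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    norm = lambda s: re.sub(r"\s+", " ", s.strip().lower())
    digits = re.sub(r"\D", "", phone)  # keep phone digits only
    return (norm(name), norm(address), digits)

# Two hypothetical directory listings for the same business
listing_a = normalize_nap("Acme Plumbing", "12 Main St.", "(555) 123-4567")
listing_b = normalize_nap("ACME Plumbing", "12 main st.", "555-123-4567")
print(listing_a == listing_b)  # True: consistent after normalization
```

A real audit would also canonicalize abbreviations ("St." vs "Street"), but the principle is the same: compare normalized values, not raw strings.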

Appearing in the Google 3-Pack attracts attention to your brand and encourages people to engage with you. Keep these guidelines in mind when you create your web presence to maximize your chances of appearing on these lists.

Google Tests Search Results Without Descriptions Again

Google has confirmed it has been testing Google Search result listings without any descriptions, again. Google first tested this in 2015 and has been testing it again for the past couple of weeks.

Aishwarya Tapadar from Google confirmed this in a Google Web Search Help thread saying “this is a small experiment that will be ending in the next day or so.”

Here is a screenshot from that thread showing what this looks like:


Aishwarya Tapadar from Google added “we appreciate your patience and apologize for any inconvenience.”

I am not a fan of search results without descriptions, but I guess Google is collecting data for some legislative purpose?

Forum discussion at Google Web Search Help.

Search News Buzz Video Recap: Google Algorithm Fluctuations, FLoC FLoPs, New Google Robots Tag & More

This week, we had another unconfirmed Google search ranking update on January 22nd. Google has admitted defeat with its FLoC cookieless proposal and is now going with the Topics API. Google launched a new robots tag named indexifembedded that controls indexing of embedded content. Google Search Console had an image search reporting bug it is working to fix; it is just a reporting issue. Google Ads has new tools to help you transfer from smart shopping and local campaigns to Performance Max. Google probably won’t go live with a label in the search results for pages that meet the page experience update criteria. Newzdash data shows that 67% of Google search results have duplicated top stories and web search results. An SEO poll shows that most SEOs agree that if Google gave 100% transparency in the search results it would lead to poorer quality results. A dental office is named Dentist Near Me but is that a good local SEO strategy? We also have an SEO poll on near me queries. Google’s John Mueller said there is no schema for product images in search. Google launched a new search refinement named People Search Next. Google Maps now shows “updates from customers,” which might be a concern for review management. Google is deprecating the Google My Business API on April 20th; it will be replaced. Google is once again testing search results without snippet descriptions. Google confirms it is testing showing favicons in the search ads. Google Image Search is testing a related colored theme design. Google is discontinuing Cameos on Google. Google AdSense is separating out YouTube earnings from other earnings. Google Assistant now lets us say “stop” without saying “hey Google” first. And if you want to help sponsor those vlogs, go to patreon.com/barryschwartz. That was the search news this week at the Search Engine Roundtable.

Sponsored by Duda, the website building platform ranked #1 in Google’s recent CWV report, which is offering 15% off annual plans with a special promo code for you at agency.duda.co/barry.

Make sure to subscribe to our video feed or subscribe directly on iTunes, Apple Podcasts, Spotify, Google Podcasts or your favorite podcast player to be notified of these updates and download the video in the background. Here is the YouTube version of the feed:


For the original iTunes version, click here.

Search Topics of Discussion:

Please do subscribe on YouTube or subscribe via iTunes or on your favorite RSS reader. Don’t forget to comment below with the right answer and good luck!

Improve Engagement and Rankings with Meta Descriptions

What is a meta description? A meta description is the small blurb that appears underneath your website on the search engine results page (SERP). It is designed to provide users with a brief summary of the content on your page so they know whether the page will answer their question. Traditionally, meta description length has maxed out at 155 characters for desktop and 120 characters for mobile, with the exception of experimental periods such as when Google temporarily extended the length to 320 characters.

Why should I care about meta description length?

It is important to stay within the character limit on meta description length to avoid having part of your description cut off by the search engine and appear incomplete in the SERP. Remember that the limit for mobile search (120 characters) is shorter than for desktop (155 characters), so if a majority of your traffic is mobile, your meta description length should conform to that standard. Staying within the limit will create a better experience for users.
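To make the limits concrete, here is what a compliant tag looks like in a page’s HTML. The description text is a made-up example: at 103 characters it displays in full on both mobile (120) and desktop (155) results.

```html
<head>
  <!-- Hypothetical example description: 103 characters, under the 120-character mobile limit -->
  <meta name="description" content="Learn how to write meta descriptions that boost click-through rate, with character limits and examples.">
</head>
```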

Do meta descriptions matter?

Yes. The meta description is a valuable tool both for users and the search engine when it comes to SEO management. When the page appears on the SERP, users will scan it to see if it answers their query. While Google has stated that meta descriptions do not directly factor into search rankings, well-written meta descriptions can help improve click-through rate (CTR). In turn, this can raise your traffic and engagement, improving your rankings in the SERPs.

How do you write a meta description?

Here is a five-step breakdown, based on recommendations Google has made in its webmaster guidelines, of how to write effective descriptions that will improve the click-through rate from your listings on the SERPs and help you become an expert at SEO.

1. Accuracy and quality 

The primary concern for a quality meta description is always accuracy. The summary should correctly describe the content and provide motivation for users to click this particular link. Keep in mind that just like the rest of your content, keyword stuffing or using only lists of keywords provides little context or helpful information for the user. This creates a poor user experience and will not encourage clicks. Meta descriptions should be snippets of high-quality content.

2. Character limits 

Put the most important text near the beginning of the description. Google does not set character limits for meta description length, but it does limit the number of characters displayed to users on the SERP. Best practice is to keep meta description length between 120 and 150 characters. This ensures your entire description will appear on both desktop and mobile.

3. Consistency and originality 

Since meta descriptions do not always get displayed to users, site owners have the tendency to overlook their importance. Brands should make sure that every page has a unique, quality description, particularly for pages that have no text on them. The same description should not be used across multiple pages of the website. Each page of content offers something different for the user, and thus the meta description should be similarly unique and articulate what makes the individual page important.

4. Use a call to action 

Think of the meta description as the body copy in a search ad. Describe what the page has to offer and then use action language, like “Learn how to…,” “Discover how to…,” “Read about…,” “Take advantage of…,” or “Sign up for a free trial….”
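As a sketch, a description that leads with action language (hypothetical wording, 106 characters) might look like this:

```html
<!-- Hypothetical example: opens with "Discover" and closes with a "Sign up" call to action -->
<meta name="description" content="Discover how to improve your SERP click-through rate. Sign up for a free trial and start optimizing today.">
```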

5. Robots directives when needed 

Google allows site owners to use the “nosnippet” robots directive if they want to prevent the search engine from displaying any type of snippet in the SERP. This would make the result only show the title. The snippets that appear beneath your website links on the SERP play an important role in generating attention and traffic for your website. They demonstrate your relevance to prospective readers.
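The directive itself is an ordinary robots meta tag placed in the page’s head; with it present, Google shows only the title with no snippet:

```html
<!-- Asks search engines not to show a text snippet or video preview for this page -->
<meta name="robots" content="nosnippet">
```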

Meta descriptions can have a powerful impact on whether people click your result on the SERP and engage with your content. They are one of the most important ways you can control what is being shown in search results. Work to create descriptions that inspire and entice the user so that they are inclined to click and see what you have to say about the topic at hand.

Google Algorithm Updates: A Running Timeline of Major Changes

Latest Google algorithm updates:

  1. Product Review Updates (April, December 2021)
  2. Page Experience Update (2021)
  3. Core Updates (June, July, November 2021)
  4. Core Update (December 2020)
  5. Core Update (May 2020)
  6. Core Update (January 2020)
  7. BERT (October 2019)
  8. Core Update (September 2019)
  9. Site Diversity (June 2019)
  10. “Medic” Core Update (2018)
  11. Google “Fred” (2017)
  12. Interstitial Penalty (2017)
  13. Google AdWords SERP Update (2016)
  14. Google RankBrain Update (2015)
  15. Google Quality Update (2015)
  16. Google Mobile Update (2015)
  17. Google Hummingbird Update (2013)
  18. Google Penguin Update (2012)
  19. Google Panda Update (2011)

How have the latest Google algorithm updates impacted search results?

Every year, Google updates or adjusts its algorithm hundreds of times. The vast majority of the time, the updates do not noticeably impact the SERPs and website owners do not even notice them. However, there have been a few significant occasions when Google has made updates that caused obvious changes in rankings and traffic.

This is a basic overview of some of these major changes so you can understand how the algorithm has developed over the past few years.

Product Review Updates (April, December 2021)

In 2021, SEOs saw a new kind of update, the product review update. The first product review update, which Google pointedly described as not a core update, was targeted at English-language content and rolled out over a two-week period at the end of April. In terms of impact, the update was significant, though less so than a core update; its main effect was prioritizing the highest quality, most useful product reviews in the SERPs (search engine results pages).

The December update followed the blueprint of the April update, ostensibly improving upon it. The timing of the rollout, which took about three weeks and concluded just a few days before Christmas, rankled more than a few e-commerce retailers.

Page Experience Update (2021)

Google’s Page Experience update began rolling out in mid-June after an initial postponement. The update introduced key performance metrics known as Core Web Vitals that now factor into rankings. It was long anticipated and, by design, did not result in major ranking changes. The long rollout of two-and-a-half months and plenty of time to prepare also helped. The 2021 update impacted the mobile user experience with the desktop update rolling out in February 2022.

Core Updates (June, July, November 2021)

Despite a late start, in 2021 Google followed the three-core-update template it established in 2020, though it could be argued the June and July updates were parts one and two of the same update. Core updates tend to roll out quickly, and the three 2021 updates were no different. Still, many SEOs questioned the timing of the November update, which began right before Thanksgiving and continued for about two weeks, right through Thanksgiving, Black Friday and Cyber Monday, many retailers’ biggest selling season. Google continued to offer its standard guidance for core updates.

Core Update (December 2020)

Following E-A-T guidelines is once again invaluable for this Google core update. Google also recommends getting to know the quality rater guidelines in order to understand how Google’s systems work and how your content is rated. Creating informative, unique and optimized content that speaks to your readers will continue to help your site’s rankings.

Core Update (May 2020)

Even though COVID-19 hit in 2020 and businesses and sites struggled to keep up with how much more digital the entire world immediately became, Google went ahead with a huge core update known as the May 2020 Core Update. Because of the pandemic, search intent changed and Google made it easier for people to find relevant answers to their questions with the update.

Core Update (January 2020) 

The January 2020 core update broadly impacted search results worldwide. Because it does not target any one specific thing, Google recommends that site owners pay attention to E-A-T: expertise, authoritativeness, and trustworthiness. Content that aligns with these objectives will continue to see the best rankings. The better the content and the user experience it creates, the easier it should be for brands to see their material rise on the SERP, regardless of core updates.

BERT (2019)

BERT – Bidirectional Encoder Representations from Transformers – is a neural network-based technique for natural language processing. It can better understand the full context of a query by looking at all of the words in the search together, helping Google surface the relevant information you’re seeking. This update was so significant that Google needed new, more powerful hardware to serve these models in search.

Core Update (September 2019)

This update appears to have been broadly targeted at downgrading sites with low-quality content as well as a rollback to fix some unintended impact from prior core updates.

Site Diversity (2019)

The Site Diversity update is an adjustment that seeks to eliminate multiple listings from the same domain from the SERP. Multiple listings are now seen less often.

“Medic” Core Update (2018)

The “Medic” Core Update was a broad core algorithm update, one of the updates Google rolls out several times a year. While Google did not confirm the specific purpose of this update, it had a large impact on health, finance, and other Your Money or Your Life (YMYL) pages. SEO experts speculated that this update boosted the rank of high-quality articles offering advice on major life issues, such as finances and health.

The Google “Fred” Update (2017)

An unconfirmed algo update, Fred had an outsized impact on organic listings, with a number of sites experiencing traffic declines of 50% to 90%. Google never confirmed the exact parameters of Fred, but it seemed to crack down on sites that emphasized display ads and/or traffic-monetization widgets over content, especially when those elements were difficult to distinguish from actual on-page content.

Mobile Interstitial Penalty (2017)

This SEO penalty applied to sites running interstitial ads that blocked the user’s view of the content on the page. This was not a blanket penalty on all interstitials. Instead, it focused on intrusive interstitials on mobile and interstitials that require the user to dismiss them manually.

The AdWords Update (2016)

In Q1 2016, Google fundamentally changed the way paid search listings appeared on the SERP. It removed paid listings from the righthand column (where the Knowledge Graph element now appears) and integrated them into the top of the main listings as a 4-pack. The integration trend would continue, with the icons labeling listings as paid advertisements gradually being deemphasized over time.

The RankBrain Update (2015)

When RankBrain went live, it introduced artificial intelligence to the Google algorithm. This part of the algorithm monitors user behavior and responses to queries to ‘learn’ more about intent and the value of certain pages. Google has said it is the third most important ranking signal.

The Quality Update (2015)

This update, also known as Phantom II, was noticed a few weeks after the mobile update went live. This update rewarded sites that focused on the user experience and high-quality content while penalizing those with too many ads and certain types of user-generated content. Once again, thin content was hit hard. This is likely one of the reasons that thin, ad-heavy user-generated sites, like HubPages, were penalized while other user-generated sites with lots of high-quality content, like Quora, saw a boost.

The Mobile Update (2015)

The mobile update forced all sites to become mobile-friendly or risk being penalized in the SERPs. Rather than mobile optimization being reserved for brands at the forefront of the industry, every site now needs a responsive version.

The Hummingbird Update (2013)

The Hummingbird update was a change to Google’s algorithm to make it smarter at interpreting semantic search. It was designed to help Google better understand intent and context. This forced marketers to shift towards longtail keywords. It also encouraged marketers to develop pieces based more on user intent and needs rather than a single keyword.

The Penguin Update (2012)

About a year after the Panda update, the Penguin update was released, creating another push towards quality content. This update targeted spam by looking at backlinks. It rewarded those with quality, organic backlinks and penalized those with artificial backlink profiles.

The Panda Update (2011)

This update was first launched in 2011, but it has had several updates over the years. In the beginning of 2016, Panda was added to Google’s core ranking algorithm.

Panda targets spam and weak content that does not help the end-user. Thin content, duplicate content and content with too many ads are all penalized.

How do I succeed when the Google algorithm keeps changing?

If you look at the timeline of Google algorithm changes, you will notice that there is a clear purpose and pattern. Each algorithm update is geared towards improving user experience and helping searchers find the information they need as quickly as possible. The Google updates all focus on weeding out poor content and boosting the content that fills this need.

When developing content for your site, you need to:

  1. Think less about the search engine and more about your end-user
  2. Create content that will engage readers at every stage of the buyer’s journey
  3. Develop a site that is easy to navigate
  4. Use a variety of types of content, including images, videos, infographics, and text
  5. Perpetually monitor your site so that you can identify any changes in traffic rates and correct any drops as quickly as possible.

Google’s algorithm is always changing because it is trying to provide the best information as quickly as possible to its users. To keep a high SERP rank and presence no matter how the algorithm changes, create high-quality, user-friendly content.