4 steps to becoming an experience brand

Once a primary differentiator, reliable customer service has now become a mandatory commodity. With rising consumer expectations and automated technologies, experience has replaced this long-heralded advantage.

Brands positioned with a customer-first, always-on approach to experience optimization, and those that build for personalization, are poised to be market leaders. Becoming an experience-focused brand has been painted as more difficult than it is. The answers are right in front of us: your consumers have them, you just need to ask – and pay attention.

In working with more than 30 brands on their experience strategies, I’ve found four critical steps that help brands become customer experience leaders in their market. The simple formula is to identify, measure, build and test.

Identify audiences and journeys

Identify your audience

Let’s start with an exercise. Suppose money is no object, and you get to pick out a new vehicle. Take a moment to picture what you’d like to buy. Now that you have that vehicle in mind, let’s assume this is the vehicle everyone else wants. It seems ridiculous to assume that the vehicle you want is the vehicle everyone else would want. But how often do you create experiences using that same assumption? As you design an experience, you need to have an audience in mind, yet oftentimes experiences are developed in a vacuum without consumer feedback. In our current environment, audience strategy and experiences should never be developed without some type of consumer insight.

Here are a few questions to help you get started in assessing your audience(s).

  • Who is my current audience? 
  • What data sources do I have available to me (research, analytics, databases, etc.)? 
  • What do they prefer? What are their motivations? 
  • Who is – and isn’t – responding? 
  • Do my loyal customers look different from everyone else? What type of data and insights am I missing? 

Identify audience journeys

I often think of the journey as the foundation. The good news about building out an audience journey is that there are a lot of good approaches. I do not believe there is one single source of truth to creating an audience journey. The important thing is that you create one. If your budget, resources, and time only allow for a whiteboard brainstorm session, then do it. If you have behavioral data at your fingertips and can look at connected event stream data by specific channels and by individual, then do it. If you have the ability to conduct primary research, please do it.

After building a journey, the first mistake I see is brands trying to fix every possible interaction they’ve discovered. Prioritization becomes key; if you are able to gather consumer-driven insights to measure and help you prioritize experiences, that should be your next step.

How do they behave? How do they buy? What are the most common paths to purchase? What are all of the possible interactions?

Measure experiences

Beginning to think from the consumer’s perspective is the right first step, but it is far more effective to actually measure experiences from consumers’ direct interactions. Always-on customer-listening engines have been around for decades. Today’s new wave of measurement is more effective but needs to be further elevated. The Customer Effort Score (CES) has come to the forefront of this movement but is lacking in three critical components: measuring multiple interactions, measuring importance, and measuring revenue. A four-dimensional approach – effort measured across multiple interactions, weighted by importance and revenue – has the power to begin moving the needle.

The measurement of ease to work with a brand across interactions, prioritized within the journey, allows brands to identify the most critical points within the consumer experience. This enables brands to find quick wins to remove as much friction as possible. In the example provided in the image above, one would initially think that “compare plans” and “cancel subscription” should be the areas of focus, but a closer look at importance guides you to prioritize “compare plans” to have the greatest impact.
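To make the prioritization concrete, here is a minimal sketch of that kind of scoring. The interaction names, effort scores, importance weights, and revenue figures are hypothetical, and the weighting is just one reasonable way to combine the dimensions:

```python
# Minimal sketch of four-dimensional prioritization across journey interactions.
# All names and numbers below are hypothetical examples, not real benchmark data.

interactions = [
    # effort: 1 (effortless) to 5 (high friction); importance: 0-1; revenue: annual $
    {"name": "compare plans",       "effort": 3.8, "importance": 0.9, "revenue": 120_000},
    {"name": "cancel subscription", "effort": 4.2, "importance": 0.3, "revenue": 15_000},
    {"name": "update billing info", "effort": 2.1, "importance": 0.6, "revenue": 40_000},
]

max_revenue = max(i["revenue"] for i in interactions)

for i in interactions:
    # Weight friction (effort) by how much the interaction matters to consumers
    # and by the revenue attached to it (normalized to 0-1).
    i["priority"] = i["effort"] * i["importance"] * (i["revenue"] / max_revenue)

for i in sorted(interactions, key=lambda x: x["priority"], reverse=True):
    print(f'{i["name"]:<24} priority score: {i["priority"]:.2f}')
```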

What are their significant phases of interaction in their journey? Which interactions are the most important? What interactions are in desperate need of help? What is the revenue associated with each interaction?

Build

With a foundational and an architectural assessment in hand, you’ll be poised to build best-in-class experiences based on consumer insights. Along the way, an audit of data and technology will become critical to supporting the automation of personalized, people-based experiences. The alignment of key stakeholders across the organization will be another critical component to driving change, which is why a data-driven approach to prioritization from the consumer’s perspective is needed for the potential political battles you’ll be up against.

Another supporting point for your internal journey will be the results from prioritized quick wins. A four-dimensional prioritization of experiences allows the brand to hit the ground running, making immediate improvements to prove out the work, while also laying out critical interactions that may take more significant efforts to improve for long-term planning.

Who are the key stakeholders (detractors/supporters)? What quick wins are we going to tackle? What is our long-term experience roadmap? What technologies/data do I need? 

Test experiences

Another shift in the market over the years has continued in the same vein of always-on, quick-win optimization. Take, for example, website redesigns, as depicted in the image above. Traditional methods would call for significant redesigns every couple of years, requiring weighty amounts of time and money, with gaps and subpar experiences in between. There is a better way. If you are truly interested in meeting consumer expectations you’ll not only be measuring and tracking those experiences on an ongoing basis, but you’ll be consistently making updates to improve them.

What approach are we using today? What tools do I need to conduct testing? What should we test first? Who (internal and/or consumers) should I gather feedback from?

I believe Dentsu Aegis Network Americas CEO Nick Brien sums it up best when he says, “There’s been a fundamental shift in the balance of power. When I started in marketing, I lived in a brand-led world – you changed consumer behavior. But now we live in a consumer-led world. It’s about changing your brand behavior, it is about personalization, it is about relevance, it is about engagement.”



About The Author

Strategic brand and direct marketer, leading a team of experience and research strategists in using cognitive psychology and advanced analytics to develop insight-driven strategy with 30-plus brands such as Samsung, GM, SoFi, Lowe’s, MetLife, Dell, Boys & Girls Club and Regions Bank. Personally recognized by the ANA, MediaPost and the Drum Marketing with thought leadership on the subject of neuroanalytics in the Huffington Post, Bank Administration Institute and the Philanthropy Journal.

Research: The most common SEO errors

Now, after reading the title, you might think, “What’s new here? I see similar articles on different blogs at least every month.” Even so, I can say with confidence that you’ll find this post worthwhile.

This article is based on our own original research.

Every SEO specialist checks sites with the help of some SEO service. I work at Serpstat, one of the most popular all-in-one SEO platforms. Every year our team analyzes our users’ site audit results to find out which SEO errors really are the most common.

In this article, I’ll shed light on the results we gathered over the last year.

Serpstat research: Results we’ve got

During 2018, our users carried out 204K audits and checked 223M pages through Serpstat. Our team analyzed this data and compiled the statistics.

You can see all the statistics in the infographic below; here I just want to call out a few facts in words.

After the research, we discovered that most sites had problems with meta tags, markup, and links. The most common errors concerned headings, HTTPS certificates, and redirects. Issues with hreflang, multimedia, content, indexing, HTTP status codes, AMP (Accelerated Mobile Pages), and loading time were the least common.

We also analyzed country-specific domains to get more precise information. The data shows that 70% of “.com” domains have problems with links, loading time, and indexing – their most common issues. The situation is the same for “.uk” and “.ca” domains.

The most common mistakes and how to fix them

1. Meta tags

Meta tags are quite important even though they aren’t visible to website users. They tell search engines what a page is about and play a part in snippet creation. Meta tags affect your website’s ranking, and errors in them can hurt user signals.

According to our research, the first thing to check is the length of your title and description tags.
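If you want to script that first check yourself, here is a minimal sketch using the third-party requests and beautifulsoup4 packages. The URL is a placeholder, and the length ranges are commonly cited rules of thumb rather than official limits:

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Commonly cited character guidelines - treat them as rules of thumb, not hard limits.
TITLE_RANGE = (30, 60)
DESCRIPTION_RANGE = (70, 160)

def check_meta_lengths(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""

    for label, value, (low, high) in (
        ("title", title, TITLE_RANGE),
        ("description", description, DESCRIPTION_RANGE),
    ):
        if not value:
            print(f"{url}: missing {label}")
        elif not low <= len(value) <= high:
            print(f"{url}: {label} is {len(value)} chars (guideline: {low}-{high})")

# Placeholder URL - point this at your own pages.
check_meta_lengths("https://example.com/")
```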

2. Links, markups, and headings

External links (their number and quality) affect your site’s position in the SERP, as search engines evaluate link profiles very carefully. You should also keep internal link factors in mind (nofollow attributes and URL optimization).

The Serpstat team also found that errors with markup and headings are quite common, even though both are very important for websites. Markup and headings contain attributes that label and structure the data on a page. They also help search engines and social networks crawl and display the site correctly.

The most common errors in this category involve:

  • Nofollow external link attributes
  • Missing Twitter card markups
  • H1 doubling the title tag

3. HTTPS certificate

This certificate is one of the important ranking factors, as it ensures a secure connection between the website and the browser. If your website handles personal information, don’t forget to pay attention to it.

The most common mistake here is an HTTPS website referencing, or redirecting to, HTTP resources.
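A quick way to spot that kind of problem on a single page is to list every http:// reference it makes. Here is a minimal sketch (placeholder URL; requests and beautifulsoup4 again):

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def find_http_references(https_url):
    """Return http:// URLs referenced from a page that is served over HTTPS."""
    html = requests.get(https_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    insecure = []
    # Check the attributes that commonly pull in or point to insecure resources.
    for tag in soup.find_all(["a", "img", "script", "link", "iframe", "form"]):
        for attr in ("href", "src", "action"):
            value = tag.get(attr, "")
            if isinstance(value, str) and value.startswith("http://"):
                insecure.append(value)
    return insecure

# Placeholder URL - run this against your own HTTPS pages.
for ref in find_http_references("https://example.com/"):
    print("insecure reference:", ref)
```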

4. Redirects, hreflang attribute, multimedia

Redirects send users from the requested URL to another one you specify. According to our statistics, the most common error in this group concerns hreflang: with a multilingual interface, you need to apply the hreflang attribute to the same content in different languages so search engines understand which language version of your content to show each user.

Multimedia elements don’t affect SEO directly. However, they can cause poor user signals and indexing errors, and images affect the website’s loading time – which is why multimedia still matters. The same goes for the hreflang attribute: if you have a multilingual interface, apply it to the same content in every language.
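For context, hreflang annotations are usually expressed as <link rel="alternate"> tags in the page head (they can also live in HTTP headers or the XML sitemap). The sketch below simply generates that markup from a hypothetical mapping of language codes to URLs, including an x-default fallback:

```python
# Minimal sketch: generate hreflang <link> tags for alternate language versions.
# The language codes and URLs below are hypothetical examples.

def hreflang_tags(versions, default_url):
    lines = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in versions.items()
    ]
    # x-default tells search engines which version to use when no language matches.
    lines.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
    return "\n".join(lines)

# Every language version of the page should carry the full set of tags, itself included.
print(hreflang_tags(
    {
        "en": "https://example.com/en/pricing/",
        "de": "https://example.com/de/preise/",
        "fr": "https://example.com/fr/tarifs/",
    },
    default_url="https://example.com/en/pricing/",
))
```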

You can find more information about the errors in this section in the infographic.

5. Indexing

Search engines learn what sites are about while indexing them. If a site is closed to indexing, users can’t find it in the SERP. Weak spots that often lead to indexing errors include the following:

  • Canonical tags that reference a different page
  • Non-indexed pages (noindex)
  • iframe tags

6. HTTP status codes, AMP, and content

HTTP status codes are the responses a server returns to a user’s request. Errors here are serious problems that negatively affect a site’s position in the SERPs; a quick scripted status-code check is sketched after the list below.

AMP (Accelerated Mobile Pages) are stripped-down pages optimized for mobile devices; using the technology can improve the site’s loading time. Poor content also causes ranking positions to deteriorate.

The most common problems here are:

  • 404 error codes
  • missing AMP
  • generated content
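As promised above, here is a minimal status-code check: loop over a list of URLs and record each response code and redirect chain. The URLs below are placeholders (in practice, feed the list from your sitemap or a crawl export), and the sketch uses the third-party requests package:

```python
# pip install requests
import requests

urls = [
    # Placeholder URLs - replace with your own pages.
    "https://example.com/",
    "https://example.com/old-page/",
]

for url in urls:
    try:
        # HEAD keeps the check light (some servers require GET instead);
        # allow_redirects follows the chain to the final destination.
        response = requests.head(url, allow_redirects=True, timeout=10)
        hops = len(response.history)
        print(f"{url} -> {response.status_code} after {hops} redirect hop(s)")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```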

7. Loading time

A long loading time can worsen the site’s usability and waste crawl budget. The Serpstat team found that the most common problems here are associated with browser caching and with image, JavaScript, and CSS optimization.

You can view the detailed infographic here.

How to correct these errors

To find all the above-mentioned errors on your own site, you can create a project in the Serpstat Audit tool. You can check the whole site or just a separate page. The module checks 20 pages per second and finds more than 50 types of errors that can potentially harm your site.

In its reports, Serpstat sorts errors by importance and category and lists the pages on which each problem was found. It also offers recommendations on how to resolve a specific problem. Some items are not errors in the true sense (“Information”); they are shown only to make you aware of them.

Summary

There are a lot of errors that can damage your site and its rankings. Fortunately, you can find them all at once with the help of audit tools.

First, pay attention to the most common weaknesses:

  • Meta tags
  • Markups
  • Links
  • Headings
  • HTTPS certificate
  • Redirects
  • Hreflang attribute
  • Multimedia
  • Indexing
  • HTTP status codes
  • AMP
  • Loading time
  • Content

Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter.


20190718 ML Brief

Good morning, is your business using chatbots to communicate with customers?

According to new research from Drift and SurveyMonkey Audience, chatbots are catching up to email and phone as a key part of the conversational marketing mix. Email and telephone still dominate as the top channels for communication, but 33% of respondents reported having used online chat within the last 12 months – an indication that chat is gaining ground.

Almost half the consumers surveyed (44%) said they expect an interaction within five seconds when engaging with a brand face-to-face. On the digital side, 42% of respondents indicated they expect the same interaction time when communicating with a chatbot. The data surfaces a consumer insight that isn’t all too surprising: instant communication is a top priority for online shoppers wanting a quick and convenient way to solve customer issues.

So what does it mean for marketers? Investing more in one-to-one customer experiences (like chatbots or customer service chat) can be the key differentiator for positive interactions between brands and customers. Businesses that have been hesitant to integrate chat will find themselves playing catch-up when it comes time to implement the technology to meet the growing demands of consumers. 

There’s more to read below, including why running multi-channel campaigns can result in more favorable outcomes.

Taylor Peterson,
Deputy Editor

Delete your pages and rank higher in search – Index bloat and technical optimization 2019

If you’re looking for a way to optimize your site for technical SEO and rank better, consider deleting your pages.

I know, crazy, right? But hear me out.

We all know Google can be slow to index content, especially on new websites. But occasionally, it can aggressively index anything and everything it can get its robot hands on whether you want it or not. This can cause terrible headaches, hours of clean up, and subsequent maintenance, especially on large sites and/or ecommerce sites.

Our job as search engine optimization experts is to make sure Google and other search engines can first find our content so that they can then understand it, index it, and rank it appropriately. When we have an excess of indexed pages, we are not being clear with how we want search engines to treat our pages. As a result, they take whatever action they deem best which sometimes translates to indexing more pages than needed.

Before you know it, you’re dealing with index bloat.

What is index bloat?

Put simply, index bloat is when you have too many low-quality pages on your site indexed in search engines. Similar to bloating in the human digestive system (disclaimer: I’m not a doctor), the result of processing this excess content can be seen in search engine indices when their information retrieval process becomes less efficient.

Index bloat can even make your life difficult without you knowing it. In this puffy and uncomfortable situation, Google has to go through much more content than necessary (most of the time low-quality and internally duplicated content) before it can get to the pages you want it to index.

Think of it this way: Google visits your XML sitemap and finds 5,000 pages, then crawls your site and finds even more of them via internal linking, and ultimately decides to index 30,000 URLs. That is an indexation rate of 600% – an indexation excess of approximately 500%, or even more.

But don’t worry, diagnosing your indexation rate to measure against index bloat can be a very simple and straightforward check. You simply need to cross-reference the pages you want indexed against the ones Google is actually indexing (more on this later).

The objective is to find that disparity and take the most appropriate action. We have two options:

  1. Content is of good quality = Keep indexability
  2. Content is of low quality (thin, duplicate, or paginated) = noindex

You will find that, most of the time, resolving index bloat means removing a relatively large number of pages from the index by adding a “noindex” meta tag. However, this indexation analysis can also surface pages that were missed during the creation of your XML sitemap(s); those can then be added to your sitemap(s) for better indexing.

Why index bloat is detrimental for SEO

Index bloat can slow processing time, consume more resources, and open up avenues outside of your control in which search engines can get stuck. One of the objectives of SEO is to remove roadblocks that hinder great content from ranking in search engines, which are very often technical in nature. For example, slow load speeds, using noindex or nofollow meta tags where you shouldn’t, not having proper internal linking strategies in place, and other such implementations.

Ideally, you would have a 100% indexation rate, meaning every quality page on your site would be indexed – no pollution, no unwanted material, no bloating. For the sake of this analysis, let’s treat anything above 100% as bloat. Index bloat forces search engines to spend more of their limited resources than needed processing the pages they have in their database.

At best, index bloat causes inefficient crawling and indexing, hindering your ranking capability. But index bloat at worst can lead to keyword cannibalization across many pages on your site, limiting your ability to rank in top positions, and potentially impacting the user experience by sending searchers to low-quality pages.

To summarize, index bloat causes the following issues:

  1. Exhausts the limited resources Google allocates for a given site
  2. Creates orphaned content (sending Googlebot to dead-ends)
  3. Negatively impacts the website’s ranking capability
  4. Decreases the quality evaluation of the domain in the eyes of search engines

Sources of index bloat

1. Internal duplicate content

Unintentional duplicate content is one of the most common sources of index bloat. This is because most sources of internal duplicate content revolve around technical errors that generate large numbers of URL combinations that end up indexed. For example, using URL parameters to control the content on your site without proper canonicalization.

Faceted navigation has also been one of the “thorniest SEO challenges” for large ecommerce sites, as Portent describes, and has the potential of generating billions of duplicate content pages by overlooking a simple feature.

2. Thin content

It’s important to mention an issue introduced by the Yoast SEO plugin version 7.0 around attachment pages. This WordPress plugin bug led to “Panda-like problems” in March of 2018 causing heavy ranking drops for affected sites as Google deemed these sites to be lower in the overall quality they provided to searchers. In summary, there is a setting within the Yoast plugin to remove attachment pages in WordPress – a page created to include each image in your library with minimal content – the epitome of thin content for most sites. For some users, updating to the newest version (7.0 then) caused the plugin to overwrite the previous selection to remove these pages and defaulted to index all attachment pages.

This meant that having five images per blog post would add five thin attachment pages per post to the index, leaving only about 16% of those URLs with actual quality content and causing a massive drop in domain value.

3. Pagination

Pagination refers to the concept of splitting up content into a series of pages to make content more accessible and improve user experience. This means that if you have 30 blog posts on your site, you may have ten blog posts per page that go three pages deep. Like so:

  • https://www.example.com/blog/
  • https://www.example.com/blog/page/2/
  • https://www.example.com/blog/page/3/

You’ll see this often on shopping pages, press releases, and news sites, among others.

Within the purview of SEO, the pages beyond the first in the series will very often contain the same page title and meta description, along with very similar (near duplicate) body content, introducing keyword cannibalization to the mix. Additionally, since the purpose of these pages is better browsing for users already on your site, it doesn’t make sense to send search engine visitors to the third page of your blog.

4. Under-performing content

If you have content on your site that is not generating traffic, has not resulted in any conversions, and does not have any backlinks, you may want to consider changing your strategy. Repurposing content is a great way to maximize any value that can be salvaged from under-performing pages to create stronger and more authoritative pages.

Remember, as SEO experts our job is to help increase the overall quality and value that a domain provides, and improving content is one of the best ways to do so. For this, you will need a content audit to evaluate your own individual situation and what the best course of action would be.

Even a 404 page that returns a 200 (OK) HTTP status code is a thin, low-quality page that should not be indexed.
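A quick way to test for that “soft 404” behavior is to request a path that should not exist and confirm the server returns a real 404 (or 410) rather than a 200. Here is a minimal sketch, with a placeholder domain:

```python
# pip install requests
import requests
from uuid import uuid4

def soft_404_check(domain):
    # Request a path that almost certainly does not exist on the site.
    url = f"https://{domain}/{uuid4().hex}/"
    status = requests.get(url, timeout=10).status_code
    if status == 200:
        print(f"{domain}: non-existent page returns 200 (soft 404) - should be 404 or 410")
    else:
        print(f"{domain}: non-existent page returns {status}")

# Placeholder domain - swap in your own.
soft_404_check("example.com")
```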

Common index bloat issues

One of the first things I do when auditing a site is to pull up their XML sitemap. If they’re on a WordPress site using a plugin like Yoast SEO or All in One SEO, you can very quickly find page types that do not need to be indexed. Check for the following:

  • Custom post types
  • Testimonial pages
  • Case study pages
  • Team pages
  • Author pages
  • Blog category pages
  • Blog tag pages
  • Thank you pages
  • Test pages

Whether the pages in your XML sitemap are low-quality and need to be removed from search really depends on the purpose they serve on your site. For instance, some sites don’t use author pages on their blog but still leave those pages live, which is unnecessary. “Thank you” pages should not be indexed at all, as indexing them can cause conversion tracking anomalies. Test pages usually mean there’s a duplicate somewhere else. Similarly, some plugins or developers build custom features on web builds and create lots of pages that do not need to be indexed. For example, if you find an XML sitemap like the one below, its contents probably don’t need to be indexed:

  • https://www.example.com/tcb_symbols_tax-sitemap.xml

Different methods to diagnose index bloat

Remember that our objective here is to find the biggest contributors of low-quality pages bloating the index. Most of the time it’s very easy to find these pages at scale, since a lot of thin content pages follow a pattern.

This is a quantitative analysis of your content, looking for volume discrepancies based on the number of pages you have, the number of pages you are linking to, and the number of pages Google is indexing. Any disparity between these numbers means there’s room for technical optimization, which often results in an increase in organic rankings once solved. You want to make these sets of numbers as similar as possible.

As you go through the various methods to diagnose index bloat below, look out for patterns in URLs by reviewing the following (a quick scripted scan is sketched after this list):

  • URLs that have /dev/
  • URLs that have “test”
  • Subdomains that should not be indexed
  • Subdirectories that should not be indexed
  • A large number of PDF files that should not be indexed
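Here is the pattern scan referenced above: a minimal sketch that flags suspicious URLs from a list. The patterns and URLs are illustrative; adjust them to your own site:

```python
import re

# Illustrative patterns for the checks above - extend them to match your own site.
SUSPECT_PATTERNS = {
    "dev path": re.compile(r"/dev/"),
    "test page": re.compile(r"test", re.IGNORECASE),
    "staging subdomain": re.compile(r"^https?://(dev|staging|test)\."),
    "PDF file": re.compile(r"\.pdf$", re.IGNORECASE),
}

def flag_suspect_urls(urls):
    for url in urls:
        hits = [label for label, pattern in SUSPECT_PATTERNS.items() if pattern.search(url)]
        if hits:
            print(f"{url} -> {', '.join(hits)}")

# Placeholder URLs - in practice, feed this from a crawl or Search Console export.
flag_suspect_urls([
    "https://staging.example.com/page/",
    "https://www.example.com/dev/old-template/",
    "https://www.example.com/downloads/brochure.pdf",
    "https://www.example.com/blog/latest-post/",
])
```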

Next, I will walk you through a few simple steps you can take on your own using some of the most basic tools available for SEO. Here are the tools you will need:

  • Paid Screaming Frog
  • Verified Google Search Console
  • Your website’s XML sitemap
  • Editor access to your Content Management System (CMS)
  • Google.com

As you start finding anomalies, start adding them to a spreadsheet so they can be manually reviewed for quality.

1. Screaming Frog crawl

Under Configuration > Spider > Basics, configure Screaming Frog to run a thorough scan of your site: check “crawl all subdomains” and “crawl outside of start folder”, and manually add your XML sitemap(s) if you have them. Once the crawl has completed, take note of all the indexable pages it has listed. You can find this in the “Self-Referencing” report under the Canonicals tab.

screenshot example of using Screaming Frog to scan through XML sitemaps

Take a look at the number you see. Are you surprised? Do you have more or fewer pages than you thought? Make a note of the number. We’ll come back to this.

2. Google’s Search Console

Open up your Google Search Console (GSC) property and go to the Index > Coverage report. Take a look at the valid pages. In this report, Google tells you how many total URLs it has found on your site. Review the other reports as well; GSC is a great tool for evaluating what Googlebot finds when it visits your site.

screenshot example of Google Search Console's coverage report

How many pages does Google say it’s indexing? Make a note of the number.

3. Your XML sitemaps

This one is a simple check. Visit your XML sitemap and count the number of URLs included. Is the number off? Are there unnecessary pages? Are there not enough pages?
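If counting by hand isn’t practical, a short sketch like this one fetches a sitemap (following a sitemap index if present) and counts the URLs. The sitemap URL is a placeholder; it uses the standard library plus the third-party requests package:

```python
# pip install requests
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    if root.tag.endswith("sitemapindex"):
        # A sitemap index points at child sitemaps; count the URLs in each one.
        children = [loc.text for loc in root.findall("sm:sitemap/sm:loc", NS)]
        return sum(count_sitemap_urls(child) for child in children)
    return len(root.findall("sm:url/sm:loc", NS))

# Placeholder sitemap URL - swap in your own.
print(count_sitemap_urls("https://www.example.com/sitemap.xml"))
```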

Conduct a crawl with Screaming Frog, add your XML sitemap to the configuration and run a crawl analysis. Once it’s done, you can visit the Sitemaps tab to see which specific pages are included in your XML sitemap and which ones aren’t.

example of using Screaming Frog to run a crawl analysis of an XML sitemap

Make a note of the number of indexable pages.

4. Your own Content Management System (CMS)

This one is a simple check too; don’t overthink it. How many pages does your site have? How many blog posts? Add them up. We’re looking for quality content that provides value, but in this step we count it quantitatively. It doesn’t have to be exact, since the actual quality of a piece of content is measured via a content audit.

Make a note of the number you see.

5. Google

At last, we come to the final check in our series. Sometimes Google throws a number at you and you have no idea where it came from, but try to be as objective as possible. Do a “site:domain.com” search on Google and check how many results Google serves from its index. Remember, this is purely a numeric value and does not truly reflect the quality of your pages.

screenshot example of using Google search results to spot inefficient indexation

Make a note of the number you see and compare it to the other numbers you found. Any discrepancy indicates symptoms of inefficient indexation. Completing this simple quantitative analysis will point you to areas that may not meet minimum qualitative criteria; in other words, comparing numeric values from multiple sources will help you find low-value pages on your site.
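Once you have all five numbers, the comparison itself is simple arithmetic. A minimal sketch with hypothetical counts:

```python
# Hypothetical counts gathered from the five checks above.
counts = {
    "screaming_frog_indexable": 5200,
    "gsc_valid_pages": 31000,
    "xml_sitemap_urls": 5000,
    "cms_pages_and_posts": 4800,
    "google_site_operator": 29500,
}

baseline = counts["xml_sitemap_urls"]  # the pages you actually want indexed

for source, count in counts.items():
    rate = count / baseline * 100
    print(f"{source:<26}{count:>7}  ({rate:.0f}% of sitemap)")

# Anything far above 100% of your intended page set is a symptom of index bloat.
```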

The quality criteria we evaluate against can be found in Google’s Webmaster guidelines.

How to resolve index bloat

Resolving index bloat is a slow and tedious process, but you have to trust the optimizations you’re performing on the site and have patience during the process, as the results may be slow to become noticeable.

1. Deleting pages (Ideal)

In an ideal scenario, low-quality pages would not exist on your site, and thus, not consume any limited resources from search engines. If you have a large number of outdated pages that you no longer use, cleaning them up (deleting) can often lead to other benefits like fewer redirects and 404s, fewer thin-content pages, less room for error and misinterpretation from search engines, to name a few.

The fewer choices you leave to search engines about what action to take, the more control you retain over your site and your SEO.

Of course, this isn’t always realistic. So here are a few alternatives.

2. Using Noindex (Alternative)

Using a noindex meta tag at the page level (please don’t add a site-wide noindex – it happens more often than we’d like), or across a set of pages, is probably the most efficient method, as it can be completed very quickly on most platforms.

  • Do you use all those testimonial pages on your site?
  • Do you have a proper blog tag/category in place, or are they just bloating the index?
  • Does it make sense for your business to have all those blog author pages indexed?

All of the above can be noindexed and removed from your XML sitemap(s) with a few clicks on WordPress if you use Yoast SEO or All in One SEO.
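After noindexing a set of pages, it is worth verifying that the directive is actually being served. This sketch (placeholder URL; requests and beautifulsoup4) checks both the X-Robots-Tag response header and the robots meta tag:

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def is_noindexed(url):
    response = requests.get(url, timeout=10)

    # The directive can be served as an HTTP header...
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return True

    # ...or as a robots meta tag in the HTML head.
    soup = BeautifulSoup(response.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    return bool(robots and "noindex" in robots.get("content", "").lower())

# Placeholder URL - check a page you expect to be noindexed.
print(is_noindexed("https://www.example.com/tag/misc/"))
```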

3. Using Robots.txt (Alternative)

Using the robots.txt file to disallow sections or pages of your site is not recommended for most websites unless an SEO expert has explicitly recommended it after auditing your website. It’s incredibly important to look at the specific environment your site is in and how disallowing certain pages would affect the indexation of the rest of the site. A careless change here may result in unintended consequences.

Now that we’ve got that disclaimer out of the way: disallowing certain areas of your site means you’re blocking search engines from even reading those pages. So if you add a noindex tag and also disallow the page, Google won’t get to read the noindex tag or follow your directive, because you’ve blocked it from access. The order of operations, in this case, is absolutely crucial for Google to follow your directives.
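Python’s standard library includes a robots.txt parser, which makes it easy to confirm whether a URL is disallowed before you rely on an on-page noindex there. A minimal sketch with a placeholder domain:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

url = "https://www.example.com/tag/misc/"
if not rp.can_fetch("Googlebot", url):
    # A disallowed page is never fetched, so any noindex meta tag on it
    # will never be seen - lift the disallow first if you need the noindex read.
    print(f"{url} is blocked by robots.txt; an on-page noindex there won't be read.")
else:
    print(f"{url} is crawlable; on-page directives can be read and followed.")
```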

4. Using Google Search Console’s manual removal tool (Temporary)

As a last resort, an action item that does not require developer resources is using the manual removal tool within the old Google Search Console. Using this method to remove pages, whole subdirectories, and entire subdomains from Google Search is only temporary. It can be done very quickly, all it takes is a few clicks. Just be careful of what you’re asking Google to deindex.

A successful removal request lasts only about 90 days, but it can be revoked manually. This option can also be done in conjunction with a noindex meta tag to get URLs out of the index as soon as possible.

Conclusion

Search engines despise thin content and try very hard to filter out all the spam on the web, hence the never-ending search quality updates that happen almost daily. In order to appease search engines and show them all the amazing content we spent so much time creating, webmasters must make sure their technical SEO is buttoned up as early in the site’s lifespan as possible before index bloat becomes a nightmare.

Using the different methods described above can help you diagnose any index bloat affecting your site so you can figure out which pages need to be deleted. Doing this will help you optimize your site’s overall quality evaluation in search engines, rank better, and get a cleaner index, allowing Google to find the pages you’re trying to rank quickly and efficiently.

Pablo Villalpando is a Bilingual SEO Strategist for Victorious. He can be found on Twitter.


Report: Amazon Prime Day isn’t just for Prime members any more

Amazon’s days of owning the biggest shopping day(s) of the summer may be numbered. Large retailers – companies with more than $1 billion in annual revenue – experienced a 64% increase in sales during the first day of Prime Day this year, compared to their average Monday sales, according to Adobe. That’s up from last year, when the same retailers saw a 54% lift in sales.

“The first day of Prime Day saw a substantial increase in online spending in the U.S., suggesting that Amazon is no longer the sole winner of the summer shopping holiday,” says Adobe.

Sales lifts for small retailers

Adobe reports small, niche retailers are also benefiting from Amazon’s Prime Day. Businesses with less than $5 million in annual revenue saw a 30% increase in online sales during the first day of Prime Day 2019.

Overall, retailers outside of Amazon experienced an increase in web traffic to their sites during the first 24 hours of Prime Day, accounting for 66% of revenue lift.

Email driving revenue

Email marketing efforts delivered big for brands on Prime Day, according to Adobe: “Brands that delivered excellent email experiences saw a 50% lift in revenue. In comparison, those that lacked a good email strategy saw only a 17% lift.”

Adobe said that, overall, email campaigns accounted for a 7.6% higher share of revenue.

Amazon’s results so far

Amazon reported Monday’s Prime Day was the “biggest 24-hour sales day” in the company’s history. This is the first time Amazon extended Prime Day to two days, so there is still another day to go.

“Prime Day is off to a tremendous start for Marlowe with sales up 2,000% over Prime Day last year. Our Pomade – launched yesterday – is the fastest growing product we’ve ever had,” said a representative from Marlowe, an Amazon seller offering a line of men’s facial and hair products. Sweet Water Décor, another SMB on Amazon, reported a 255% lift in sales during the first day of Prime Day.

Why we should care

Adobe’s data shows that Monday’s Prime Day represented the third time e-commerce spending has exceeded $2 billion in a single day outside of the holiday season; Labor Day 2018 and Memorial Day 2019 were the other days that passed $2 billion. Prime Day is also now considered the kick-off to the back-to-school shopping season, according to many in the industry.

Many online retailers find themselves competing with Amazon year round, but the company’s summer shopping extravaganza has proven to be a boon for savvy advertisers who have figured out how to take advantage of Prime Day promotions.

A survey from Adlucent showed 68% of online shoppers plan to comparison shop outside of Amazon on Prime Day — giving retailers an opportunity to pull consumers away from Prime Day sales.


About The Author

Amy Gesenhues is a senior editor for Third Door Media, covering the latest news and updates for Marketing Land, Search Engine Land and MarTech Today. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs, SoftwareCEO, and Sales and Marketing Management Magazine. Read more of Amy’s articles.

Four cool keyword research tools you can use for free now

Keyword research is one of the most important digital marketing tasks. Furthermore, it lies at the foundation of any business strategy or campaign you are planning.

Keyword research provides useful insight into organic ranking opportunities, persona building, competitive research, product development — you name it!

Another reason why I love keyword research is that it’s a highly creative process. There is never such a thing as “enough tools” when it comes to keyword research. Each data source and the way the data is presented brings something new to the table. Sometimes when I feel stuck, all I need is to play with a new keyword intelligence tool.

With that in mind, I decided to create a roundup of free (and freemium) keyword research tools, i.e. those tools you can run right now, without the need to pay first.

Some of those tools are freemium (meaning you can pay for an upgrade) but all of them are quite usable for free (which is what I recommend doing first, before deciding whether you need to upgrade). Finally, I am not going to include obvious tools like Google Ads Keyword Planner and Google Search Console, as I am sure SEW readers are already well aware of them.

New tools inspire new tactics which is what I hope you’ll end up with.

1. Rank Tracker: Aggregated keyword suggestions from multiple sources

(Freemium)

Rank Tracker’s free version gives you access to its keyword research feature, which uses around 20 different keyword research sources, including Google Ads Keyword Planner, Google Suggest, Wordtracker, SEMrush and more.

Rank Tracker is a downloadable tool and you do have to provide your name and email to start downloading. Other than that, the installation takes seconds, and running it won’t kill your browser.

The free version includes a keyword analysis feature that helps you discover the most promising keywords to include in your content strategy. These metrics include:

  • Monthly search volume (according to Google)
  • PPC competition
  • Keyword difficulty that reflects the estimated level of organic competition of each query.

…keywords with the Keyword Difficulty score below 60 are the hardest to find but the easiest to rank for. When accompanied by a considerable and steady number of searches, they become perfect keywords to optimize your pages for. You may also have a look at KEI, visibility, and CPC parameters for a deeper analysis.

Rank Tracker keyword research tool you can use for free

You can export the whole list into an Excel file to play further.

The premium features include collaboration, cross-tool reporting, task scheduler, multiple projects, etc.

You can see the full list of features you’ll get free access to here.

2. Answer The Public: Google Suggest driven questions and more

(Freemium)

Answer The Public is a keyword research tool that is free to use and requires no registration. It uses Google Suggest data to discover questions, comparison-based queries and keywords containing prepositions.

Answer The Public lets you view the data in two ways: Visualization (i.e. a mind map) and Data:

Answer The Public keyword research tool you can use for free

You can also export all the results in a CSV file or save any visualization as a PNG file.

The recently launched premium version lets you target keywords by location, compare data and add team members for collaboration.

You can see the version comparison here.

Tip: You can also use this tool to upload your Answer The Public spreadsheet to add Google search volume to each question. This will help you focus on those questions that are often being searched in Google.

3. Text Optimizer: Related concepts and terms

(Freemium)

Text Optimizer is a semantic analysis tool that helps you identify the related concepts behind each topic or query. It uses Google’s search snippets to analyze a keyword’s context and come up with related concepts and entities that help Google understand and classify the topic.

Text Optimizer keyword research tool you can use for free

Text Optimizer is both a content optimization and a research tool that helps direct your whole content creation process.

Don’t get misled though: It’s not about stuffing your content with the suggested terms. Use the tool for deeper topic understanding and as a writing aid.

The premium version lets you use geo-targeting, build whole sentences to help with your writing, and access your historical records.

You can see it in action here.

4. Kparser: Clustered keyword suggestions

(Freemium)

Kparser is a freemium tool that runs the whole keyword analysis for free, without requiring registration. You won’t be able to export the keyword list unless you upgrade, but you can use the keyword filters on the left to group and cluster your list by a common modifier.

Kparser combines multiple keyword sources, including Google Trends, eBay, Amazon, and YouTube.

Kparser keyword research tool you can use for free

It’s a somewhat basic approach to keyword clustering, but it’s nice to have completely free of charge, and it helps you discover more queries to optimize for.

The premium features include unlimited searches, geo-targeting and more.

Read more about Kparser here.

Bonus: Analyze keyword performance

(Free trial)

Finteza is a nice, affordable alternative to Google Analytics with a strong focus on conversion optimization and monetization.

One of its most useful features is the search analysis section, which shows you which keywords brought the most clicks to your site. It’s a great way to identify more queries to focus on:

Finteza keyword performance analysis tool with free trial

If you select any of the queries and keep browsing the site, you’ll see data related to that keyword only, e.g. its conversion rate, the associated conversion funnel analysis and user demographics. Finteza also recently added a retargeting feature that lets you serve specific content based on the initial referral or engagement.

You can read more on Finteza’s traffic analytics here.

Which keyword research tools do you know that are usable free of charge? Please share yours in the comments!


Here’s another sneak peek at the SMX East agenda

I’m checking in with an update from the Search Engine Land editors: They’re wrapping up the SMX® East agenda. It’s going to be bigger and better than ever — 70+ in-depth sessions on SEO, SEM, search for multi-location brands, agency operations, conversion optimization, paid social, content marketing, voice search, and so much more.

I’ll be able to share the final agenda very soon, but until then… here’s another sneak peek at what’s in store:

Tactic-Rich Search Marketing Sessions

Last week, I revealed nine of the deep dive sessions coming to NYC November 13-14. Here are nine more to whet your appetite…

  • A Deep Dive on Google My Business
  • Clarifying The Murky Waters Of Attribution
  • Conversion Optimization For The Long Run
  • Future Directions For Local Search
  • Gearing Up For The Conversational Web
  • Google Posts, Google Q&A & GMB Insights
  • Harnessing The Power Of Online Reviews
  • Tackling The Challenges Of Enterprise SEO
  • The New Google Shopping

Notice any trends? This year’s agenda will feature new content concentrations on search marketing for multi-location brands and agency operations and management. And that’s in addition to all of our hallmark SEO and SEM sessions.

Opening Keynote

Rand Fishkin will kick things off with his opening keynote, “Google: From Everyone’s Search Engine to Everyone’s Competitor”. He’s also giving away copies of his new book, Lost and Founder, and hosting a book signing after the presentation!

Clinics: Expert Answers To Your Specific Questions

I. Love. Clinics. No PowerPoints, no presentations, no agenda — just a panel of friendly experts ready to answer your questions on SEO, SEM, Social Ads, Google Analytics, and more. (Each clinic has its own focus!) Bring your burning curiosities and specific cases to the table for a no-holds-barred Q&A!

Deep-Dive Workshops

Hungry for more? Our full-day, pre-conference workshops were designed for insatiable marketers like you. Choose an expert-led deep dive on one of the following topics:

Stay Tuned!

The official agenda will be unveiled next week… I’ll reach out as soon as it’s live. If you’re ready to register, now’s the best time — you’ll save up to $900 off on-site rates if you book your pass today!

See you in NYC 🙂


About The Author

Lauren Donovan has worked in online marketing since 2006, specializing in content generation, organic social media, community management, real-time journalism, and holistic social befriending. She currently serves as the Content Marketing Manager at Third Door Media, parent company to Search Engine Land, Marketing Land, MarTech Today, SMX, and The MarTech Conference.

Recent funding news indicates increased need for data compliance management

The past year has brought an enormous shift for digital marketers and organizations. In the wake of GDPR — and with CCPA going into effect in less than six months — data privacy compliance has become a focal point for organizations across the globe.

Data privacy compliance firm OneTrust announced that it has closed an impressive $200 million Series A investment from Insight Partners in its latest round of funding — bringing the valuation of the platform to $1.3 billion. The multi-million dollar investment demonstrates the mounting priority businesses are placing on operationalizing privacy compliance.

“It’s been an exciting three years at OneTrust, with our customers partnering with us to define and build the most widely used technology platform in a completely new market,” said Kabir Barday, CEO and fellow of information privacy at OneTrust. “This investment will help us to bring a new level of scale and support for our customers, coming at a timely juncture with just six months before California’s CCPA is set to be enforced.”

Why we should care

Just this week, the European Union slapped massive fines on British Airways and Marriott for data breaches that affected hundreds of millions of their customers. The sheer size of the fines — $230 million and $123 million, respectively — should be enough to put compliance at the forefront of your digital strategy (if it isn’t already). Third-party vendors like OneTrust are recognizing the need to provide tools and solutions for compliance management.

Data privacy is becoming a visible part of our strategies, particularly in digital marketing. As we learn to manage the different regional laws and global regulations, many organizations will turn to third-party vendors to help drive the compliance component of our strategies.

We can expect to see more data compliance management platforms coming into the marketplace as enterprises and their service providers implement new programs and new laws evolve.

More on the news

  • OneTrust already serves over 3,000 clients in more than 100 countries.
  • Products within the OneTrust platform include processes to automate CCPA and GDPR requirements, including “right to be forgotten,” “access,” and “do not sell”.
  • Earlier in the week, San Francisco-based TrustArc raised $70 million in its latest funding to help organizations build privacy and compliance programs.

About The Author

Jennifer Videtta Cannon serves as Third Door Media’s Senior Editor, covering topics from email marketing and analytics to CRM and project management. With over a decade of organizational digital marketing experience, she has overseen digital marketing operations for NHL franchises and held roles at tech companies including Salesforce, advising enterprise marketers on maximizing their martech capabilities. Jennifer formerly organized the Inbound Marketing Summit and holds a certificate in Digital Marketing Analytics from MIT Sloan School of Management.

The top retailer marketing strategies to compete with Amazon Prime Day

Since launching in 2015, Amazon’s Prime Day sale has claimed its place as an industry-wide shopping holiday, generating record-breaking revenues year-over-year, and eclipsing even Black Friday.

This year, the 48-hour Prime Day mega-sale kicks off Monday, July 15 and is shaping up to be the biggest online shopping day to date.

Amazon may have been the frontrunner of “Christmas in July” but big-box retail rivals have accordingly followed suit. Contenders like Walmart, Best Buy, Target and others have taken to sharing in the cyber frenzy, launching competing sales in tandem with Amazon’s event.

The Prime Day phenomenon has transformed the days during and surrounding the event into a profitable sales window for retailers in nearly every market. During last year’s Prime Day, brands ran cutthroat promotions, including Target touting a year of free same-day delivery with a purchase minimum, eBay suspending its membership paywall for a 36-hour period, Walmart peddling sales lower than Black Friday, and Best Buy offering loss-leader sales on electronics – just to name a few.

According to a Prime Day survey by Adlucent, 68% of respondents planning to shop on Prime Day said they will also be looking outside of Amazon to comparison shop, leaving ample room for competitors to take advantage of the holiday. Last year, Walmart was the biggest competitor, claiming around 50% of sales outside the Amazon marketplace, Adlucent reported. Target and Best Buy earned 33% and 32% share of outside revenue, respectively.

So what are retailers doing to capitalize on Amazon’s sale? We’ve compiled some of the key strategies that marketers should be considering during massive online shopping events like Prime Day, Black Friday and beyond.

Driving awareness with content and search

Clear, impactful messaging and high-quality content is a critical component for online retailers going head-to-head with Amazon.

The top brands rely on promotional messaging, competitive pricing, and optimized product page listings to build awareness and support sales. Descriptive product page copy, high-quality product imagery, and mobile responsiveness are among the key drivers for conversion lifts.

“A competitive, design thinking driven UX and UI can lead to more shopper engagement. Historically, we have seen that site visitors who interact with navigation/facets convert at a higher rate, buy more, and come back more often,” said Roland Gossage, CEO of GroupBy Inc.  “A competitive combination of product data enrichment, recommendations, and intuitive navigations can result in more conversions, higher order values, higher revenue per visit, and more returning customers.”

High-quality email content also drives Prime Day sales lift for competing retailers. During last year’s event, brands that used “Prime Day” in subject lines saw an enormous lift in open rates – 47% higher than the average of other shopping holiday campaigns, according to research from Yes Marketing. Email retargeting and planned segmentation strategies were also among the tactics used by big-box retailers during last year’s Prime frenzy, teasing with content directed at the most engaged consumers.

Brooke Willcox, director of digital business development at MNI Digital Media, said that a strong competitive marketing strategy for retailers on Prime Day “should start with a strong SEM campaign, with strategic keyword selection. Since users will be searching for deals, it’s vital that the brand/landing page pops up first.”

While bidding on PPC keywords for Prime Day is often an expensive tactic for small businesses, major e-commerce brands have been shown to invest heavily in hot-ticket keywords to secure top-of-the-page results. Smaller businesses can still ride the search wave with organic SEO, ensuring product pages are optimized, promotional messaging is well-defined, and high-traffic keywords are baked into titles and rich content.

Delivering on competitive shipping promises

Amazon Prime’s free one-day and two-day shipping has rapidly become the default expectation for many shoppers. Data from digital services and solutions firm Avionos suggests that Amazon’s shipping offerings are a major driver for consumers. When a product’s price point is bolstered by its quality, nearly half (49%) of online shoppers choose to purchase via Amazon instead of directly buying from other brands and retailers because of delivery efficiency.

But for online retailers competing with Amazon, prompt delivery may not be as decisive a factor for consumers as, say, transparency about when orders will be delivered.

In Walker Sands’ “Future of Retail” report, consumers said that faster shipping will make them more likely to shop online – but the true driving force is largely the convenience of door-to-door delivery. Of the surveyed consumers who purchased products online in the past year, 61% reported using standard shipping, while 42% went with two-day delivery.

Of all shipping promises, 77% of consumers surveyed in the report ranked free shipping as the most important option for online purchasing decisions. Still, the majority of consumers show a preference for reliable delivery, with high expectations that retailers will deliver products when they promise to.

Embracing retention through brand loyalty

Dedicated loyalty programs are a lynchpin for online retailers coasting on the Prime Day shopping mentality. Premium loyalty incentives – like tiered, paid, or value-based programs – have been shown to drive higher engagement and sustainable return customer behavior.

A recent study by Clarus Commerce indicated that nearly 86% of consumers who were satisfied with a brand’s paid loyalty program were likely to choose that retailer over a competitor offering a lower price for future purchases.

Retail rivals have been able to capitalize on the Prime Day mentality around impulse purchases and saturated shopping behavior by creating meaningful connections with customers after the sales are over. Personalized offerings, exclusive benefits, and content that goes beyond the discount signals value for customers who engage with Amazon competitors during Prime Day, laying a solid foundation for brand loyalty.

Fine-tuning sales operations and martech

With more than 29% more retailers expected to play in this year’s Prime Day arena, airtight sales operations and strong e-commerce technology are factors in delivering a positive customer experience and supporting promotional efforts.

Hazelcast CEO Kelly Herrell pointed out that mega-sale events like Prime Day “not only create new consumer demands, but also daunting technical challenges for retailers vying to keep up with the onslaught of buyers and transaction volumes.”

With Amazon alone selling more than 100 million products during last year’s Prime Day (equating to more than 1,150 transactions every second), retailers face the pressure to ensure that all technology touchpoints are optimized to withstand high-volume traffic while still delivering key funnel metrics.

“In this new climate, mere microseconds matter as even fleeting blips or delays can mean thousands lost in failed transactions – and unhappy consumers missing out on their desired purchase. Retailers who don’t build the right systems to support this type of split-second processing simply won’t survive the Prime-pocalypse,” Herrell said.


About The Author

Taylor Peterson is Third Door Media’s Deputy Editor, managing industry-leading coverage that informs and inspires marketers. Based in New York, Taylor brings marketing expertise grounded in creative production and agency advertising for global brands. Taylor’s editorial focus blends digital marketing and creative strategy with topics like campaign management, emerging formats, and display advertising.

How to grab featured snippet rankings with zero link building effort

Featured snippets, also known as “position zero” placements on Google, have been receiving their fair share of glory and blame lately. 

While some big corporations like Forbes have gone so far as to question whether Google is stealing traffic with featured snippets, content creators like me have found it easy to get more traffic, thanks to being able to rank small sites in featured snippets.

This post will give you a brief idea of how you can rank a page in Google’s featured snippet — without building any links to that page.

Understand the types

There are three major types of featured snippets that you can go for. As most of our clients are bloggers, we tend to go for either paragraph snippets or list snippets. The table snippet is another popular type you can target.

Here’s a quick graph from Ahrefs about the snippet type and their percentages.

graph about the snippet type and their percentages

Targeting the right keywords

Once you finalize the type of snippet that you would want to go for, it is time to dig deep into your keyword research to find keywords that suit your blog and match the requirements for the type of snippet that you are going after.

If you are going for a paragraph snippet, you will have to find keywords that primarily fall into these types:

  • How to
  • Who/what/why

example of finding keywords on snippets

If you are trying to rank for a list snippet (numbered list or bullet points), the idea is to structure your content so that it offers a step-by-step guide. In our experience, Google only shows a list in a featured snippet when the keyword tells Google that the searcher is looking for a list.

example of a listed featured snippet

For table snippets, the idea is to have structured schema data on your website that compares at least two sets of data on the page. You don’t need a properly formatted, column-based table to rank for table snippets, as long as the comparison and the schema are there.

example of a table structured snippet

Understanding the type and targeting the right keywords will do more than half of the job for you when it comes to ranking your website on the featured snippet with zero links.

However, this alone won’t win the battle against an already existing featured snippet. It will only work for keywords that don’t already have a featured snippet ranking on Google.

To grab featured snippets from the existing competition, you will need to go ahead and perform a few more steps.

Copying your competitor

Some will call it “being inspired”, but essentially, what you are doing is copying the structure of an existing featured snippet article and trying to make it better (both with content and if possible, with links).

What do I mean by copying the structure of an existing page and making it better? If you want to win the featured snippet for the keyword “best cat food brands”, and the page ranking at this moment already has a list of 20, you will have to create a list of 25 in the exact same format the current one is using.

Once that’s done, the final step is simply to make sure you have proper schema on the page.
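The author doesn’t specify which structured data type they add, so treat the following as a generic illustration only: a short sketch that builds a schema.org ItemList block as JSON-LD for a hypothetical “best cat food brands” list and prints it ready to paste into a <script type="application/ld+json"> tag:

```python
import json

# Illustrative only: a generic schema.org ItemList for a ranked list post.
# The brands are placeholders; the article doesn't prescribe a specific schema type.
brands = ["Brand A", "Brand B", "Brand C"]

item_list = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "name": "Best cat food brands",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": brand}
        for i, brand in enumerate(brands, start=1)
    ],
}

# Paste the output into a <script type="application/ld+json"> tag on the page.
print(json.dumps(item_list, indent=2))
```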

Note: It is very unlikely that this method will help you outrank an existing featured snippet unless you also rank in the top ten for that keyword.

How do we find keywords for featured snippets?

As you can imagine, finding the right keyword to target is winning half of the battle when it comes to ranking on featured snippets.

I use Semrush, but feel free to use your own tools. Here’s what our agency’s process looks like.

Let’s assume, for the purpose of this article, that I run a pet blog and I am interested in ranking for multiple featured snippets.

I would go to Semrush, and put one of my competitors on search.

example of competitor research on semrush

Source: semrush

Now click on “Organic Research”, select positions and from advanced filters, select – Include > Search features > featured snippet.

example of organic research

Source: semrush

This will give you a huge list of keywords that are currently ranking as featured snippets. As you can see, we found about 231 opportunities to target here:

listing of potential keywords for targeting

Source: semrush

It is time to add another condition to our advanced filters. Let’s select include > words count > greater than five. Here’s what the new result looks like:

example of using advanced filters in semrush

Source: SEMrush

From here on, simply sort the keywords by volume and then select the ones that match your target market. As with any keyword research, you will have to find keywords with low competition and moderate search volume. Personally, I try to go for keywords with fewer than 500 monthly searches.

Make sure you follow the initial three steps we discussed. You will almost always have a higher chance of ranking in a featured snippet by following this strategy.

Khalid Farhan blogs about internet marketing at KhalidFarhan.com. He can be found on Twitter @iamkhalidfarhan.
