Market Brew Integrates Google’s Chrome JavaScript Engine

Palo Alto, CA ‒ Market Brew, the leading A.I. platform for SEO, has swapped out its existing JavaScript engine for Google’s Chrome JavaScript engine.

Google publicly announced its JavaScript rendering capabilities in 2014, but Market Brew’s co-founder and CTO, Scott Stouffer, said his team had discovered that Google had these capabilities as far back as 2008. “We integrated our own JavaScript engine into the technology stack for our clients, so they could see how Google viewed their pages. This recent advancement to Market Brew’s platform is just a continuation of that mindset.”

Market Brew’s platform now precisely replicates what users see in Google’s Chrome browser. Market Brew can execute JavaScript, including AJAX requests, and parse CSS and other DOM-manipulating elements, which is where competing platforms struggle. “One of the benefits of using Chrome is that your JavaScript tests will be executed in the same environment as users of your site,” said Eric Bidelman, an engineer at Google.

“This is a step forward in scalability and performance for the Market Brew modeling platform, and clients will now benefit from understanding exactly what Google sees when it crawls their site,” said Stouffer. While Market Brew has maintained that its existing JavaScript rendering capabilities are first class, the integration with Google’s Chrome technology no doubt cements its place as an important piece of Market Brew’s search engine modeling offering.

Stouffer said that this decision also streamlines the company’s technology offering. “In the past, we had clients using workarounds. Prerendering, escaped fragments (the old AJAX crawling scheme), and other server-side hacks are no longer needed. What you see on a Chrome browser, what Googlebot sees, is exactly what Market Brew’s crawlers now see.”

Market Brew was started by search engineers in Palo Alto, as a unique alternative to the growing number of enterprise SEO tools and platforms pretending to provide insight into Google by simply regurgitating already public ranking data.

With Market Brew, there is NO black box — Market Brew is a “generic” search engine that calibrates (transforms) itself into whatever search engine environment the user wants. This unique process uses artificial intelligence to machine learn the behavior and characteristics of the target search engine, and adjust thousands of algorithmic weightings within its Search Engine Model. Once calibrated, users can explore the search engine model — almost like having their very own Google Simulator.

Market Brew’s patented Search Engine Model allows teams to precisely identify each type of issue within their site, and automatically prioritize those items by comparing millions of keyword and competitive environments to determine which opportunities provide the biggest upward movement for the least amount of optimization. And it does this every time a change is made to your (or your competitor’s) site.

Sign up for a demo at marketbrew.com to see why Market Brew is the trusted partner of CMOs and data scientists for top global brands.

Five ways SEOs can utilize data with insights, automation, and personalization

Constantly evolving search results driven by Google’s increasing implementation of AI are challenging SEOs to keep pace. Search is more dynamic, competitive, and faster than ever before.

Where SEOs used to focus almost exclusively on what Google and other search engines were looking for in their site structure, links, and content, digital marketing now revolves solidly around the needs and intent of consumers.

This past year was perhaps the most transformative in SEO, an industry expected to top $80 billion in spending by 2020. AI is creating entirely new engagement possibilities across multiple channels and devices. Consumers are choosing to find and interact with information via voice search, connected IoT appliances, and other devices. As a result, brands are being challenged to reimagine the entire customer journey and how they optimize content for search.

How do you even begin to prioritize when your to-do list and the data available to you are growing at such a rapid pace? The five points below are intended to help you do just that.

From analysis to activation, data is key

SEO is becoming less a matter of simply optimizing for search. Today, SEO success hinges on our ability to seize every opportunity. Research from my company’s Future of Marketing and AI Study highlights current opportunities in five important areas.

1. Data cleanliness and structure

As the volume of data consumers produce in their searches and interactions increases, it’s critically important that SEOs properly tag and structure the information we want search engines to match to those queries. Google offers rich snippets and cards that enable you to expand and enhance your search results, making them more visually appealing while adding functionality and opportunities to engage.

Example of structured data on Google

Google has experimented with a wide variety of rich results, and you can expect them to continue evolving. Therefore, it’s best practice to properly mark up all content so that when a rich search feature becomes available, your content is in place to capitalize on the opportunity.

You can use the Google Developers “Understand how structured data works” guide to get started, and test your structured data for syntax errors with Google’s Structured Data Testing Tool.
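To make this concrete, here is a minimal sketch of what such markup can look like, generated with Python for illustration. The schema.org Article properties are standard; the headline, author, and date values are placeholders, not real data.

import json

# Hypothetical JSON-LD markup for an article page, using schema.org's
# Article type. All values below are placeholders.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Five ways SEOs can utilize data",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2019-03-01",
}

# Wrap the markup in the <script> tag that belongs in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article_markup, indent=2)
    + "\n</script>"
)
print(snippet)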

2. Increasingly automated actionable insights

While Google is using AI to interpret queries and understand results, marketers are deploying AI to analyze data, recognize patterns, and deliver insights at rates humans simply cannot achieve. AI is helping SEOs interpret market trends, analyze site performance, understand competitor performance, and more.

It’s not just that we’re able to get insights faster, though. Many of the insights available to us now would have gone unnoticed if not for the in-depth analysis we can accomplish with AI.

Machines are helping us analyze different types of media to understand the content and context of millions of images at a time, and it goes beyond images and video. With Google Lens, for example, augmented reality will be used to glean query intent from objects rather than expressed words.

Opportunities for SEOs include:

  • Greater ability to define the opportunity space more precisely in a competitive context and understand the underlying need across the customer journey
  • Deploying longer-tail content informed by advanced search insights
  • Better content mapping to specific expressions of consumer intent across the buying journey

3. Real-time response and interactions

In a recent “State of Chatbots” report, researchers asked consumers to identify problems with traditional online experiences by posing the question, “What frustrations have you experienced in the past month?”

Screenshot of users' feedback on website usage experiences

As you can see, at least seven of the top consumer frustrations listed above can be solved with properly programmed chatbots. It’s no wonder the same research found that 69% of consumers prefer chatbots for quick communication with brands.

Search query and online behavior data can make smart bots so compelling and efficient in delivering on consumer needs that, in some cases, the visitor may not even realize it’s an automated tool they’re dealing with. It’s a win for the consumer, who probably isn’t there for a social visit anyway, as well as for the brand, which seeks to deliver an exceptional experience while improving operational efficiency.

SEOs have an opportunity to:

  • Facilitate more productive online store consumer experiences with smart chatbots.
  • Redesign websites to support visual and voice search.
  • Deploy deep learning, where possible, to empower machines to make decisions and respond in real time.

4. Smart automation

SEOs have been pretty ingenious at automating repetitive, time-consuming tasks such as pulling rankings reports, backlink monitoring, and keyword research. In fact, a lot of quality digital marketing software was born out of SEOs automating their own client work.

Now, AI is enabling us to make automation smarter by moving beyond simple task completion to prioritization, decision-making, and executing new tasks based on those data-backed decisions.
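As a minimal sketch of the idea (not any particular vendor’s implementation), imagine a rankings export with keyword, previous position, current position, and monthly searches columns. Instead of merely pulling the report, a short script can decide which drops deserve attention first; the file name and column names here are assumptions for illustration.

import csv

# Sketch: rank keyword drops by impact instead of just listing them.
# "rankings.csv" and its column names are hypothetical.
def prioritize(path="rankings.csv", top_n=10):
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            drop = int(row["current_position"]) - int(row["previous_position"])
            # Weight the position loss by search volume so that
            # high-traffic losses surface first.
            row["priority"] = drop * int(row["monthly_searches"])
            rows.append(row)
    return sorted(rows, key=lambda r: r["priority"], reverse=True)[:top_n]

for row in prioritize():
    print(row["keyword"], row["priority"])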

Survey on content development using AI

Content marketing is one area where AI can have a massive impact, and marketers are on board. We found that just 4% of respondents felt they were unlikely to use AI/deep learning in their content strategy in 2018, while over 42% had already implemented it.

In content marketing, AI can help us quickly analyze consumer behavior and data, in order to:

  • Identify content opportunities
  • Build optimized content
  • Promote the right content to the most motivated audience segments and individuals

5. Personalizations that drive business results

Personalization was identified as the top trend in marketing at the time of our survey, followed closely by AI (which certainly drives more accurate personalizations). In fact, you could argue that the top four trends, namely personalization, AI, voice search, and mobile optimization, are closely connected, if not overlapping in places.

Search insights are being injected into and utilized across emails, landing pages, paid advertising campaigns, and more, helping us better connect content to consumer needs.

Each piece of content produced must be purposeful. It needs to be optimized for discovery, a process that begins in content planning as you identify where consumers are going to find and engage with each piece. Smart content is personalized in such a way that it meets a specific consumer’s need, but it must deliver on the monetary needs of the business, as well.

Check out these 5 steps for making your content smarter from a previous column for more.

How SEOs are uniquely positioned to drive smarter digital marketing forward

As marketing professionals with one foot in analysis and the other solidly planted in creative, SEOs have a unique opportunity to lead the smart utilization and activation of all manner of consumer data.

You understand the critical importance of clean data input (or intelligent systems that can clean and make sense of unstructured data) and of differentiating between first- and third-party data. You understand economies of scale in SEO and the value of building that scalability into systems from the ground up.

SEOs have long nurtured a deep understanding of how people search for and discover information, and how technology delivers it. Make the most of your current opportunities by picking the low-hanging fruit for quick wins. Focus your efforts on putting in place the scalable, smart systems that will allow you to anticipate consumer needs, react quickly, report on SEO appropriately, and convey business results to the stakeholders who will determine future budgets.

Jim Yu is the founder and CEO of leading enterprise SEO and content performance platform BrightEdge. He can be found on Twitter.

Pinterest’s new head of engineering brings deep e-commerce experience

Pinterest has recruited Walmart’s former CTO Jeremy King as its new head of engineering. King will lead the team responsible for building Pinterest’s visual search engine and report to CEO Ben Silbermann.

Why you should care

As Pinterest closes in on an IPO date, the social network is beefing up its e-commerce chops. Adding King to the executive mix — an e-commerce technology expert who has been focused on creating “seamless shopping experiences” for companies like Walmart and eBay — should better position Pinterest to compete for social e-commerce dollars and market share.

Pinterest’s focus on e-commerce could be good news for marketers who’d like to see the platform move more aggressively in this area. Recent features for retail marketers include Shopping ads and Shop the Look pins.

“Not only is Jeremy a respected engineering leader, but from the moment we met him, we knew his values around putting the customer first were aligned with our own focus on Pinners. As we build products to inspire people to create a life they love, Jeremy’s technical experience and leadership are a perfect combination to build a visual discovery engine for all,” said CEO Ben Silbermann.

More on the news

  • As Walmart’s CTO, King oversaw the technology teams for the retailer’s U.S. retail stores and e-commerce for Walmart and Jet.
  • In addition to his C-level role at Walmart, King also served as an EVP at LiveOps and VP of engineering for eBay.
  • Pinterest has steadily built out its executive team over the last year, hiring Françoise Brougher as its first COO and, more recently, naming Andréa Mallard as CMO.

About The Author

Amy Gesenhues is Third Door Media’s General Assignment Reporter, covering the latest news and updates for Marketing Land and Search Engine Land. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs.com, SoftwareCEO.com, and Sales and Marketing Management Magazine. Read more of Amy’s articles.

Robots.txt best practice guide + examples

robots.txt best practice guide

The robots.txt file is an often overlooked and sometimes forgotten part of a website and SEO.

Nonetheless, a robots.txt file is an important part of any SEO’s toolset, whether you are just starting out in the industry or are a seasoned SEO veteran.

What is a robots.txt file?

A robots.txt file can be used for a variety of things: letting search engines know where to locate your site’s sitemap, telling them which pages to crawl and not crawl, and managing your site’s crawl budget.

You might be asking yourself, “Wait a minute, what is crawl budget?” Crawl budget is what Google uses to effectively crawl and index your site’s pages. As big as Google is, it still has only a limited number of resources available to crawl and index your site’s content.

If your site only has a few hundred URLs then Google should be able to easily crawl and index your site’s pages.

However, if your site is big, like an ecommerce site with thousands of pages and lots of auto-generated URLs, then Google might not crawl all of those pages, and you will be missing out on lots of potential traffic and visibility.

This is where prioritizing what, when, and how much to crawl becomes important.

Google has stated that “having many low-value-add URLs can negatively affect a site’s crawling and indexing.” This is where a robots.txt file can help with the factors affecting your site’s crawl budget.

You can use the file to help manage your site’s crawl budget by making sure that search engines spend their time on your site as efficiently as possible (especially if you have a large site), crawling only the important pages and not wasting time on pages such as login, signup, or thank-you pages.

Why do you need robots.txt?

Before a robot such as Googlebot or Bingbot crawls a webpage, it will first check to see if there is in fact a robots.txt file and, if one exists, it will usually follow and respect the directions found within that file.

A robots.txt file can be a powerful tool in any SEO’s arsenal, as it’s a great way to control how search engine crawlers/bots access certain areas of your site. Keep in mind that you need to understand how the robots.txt file works, or you may find yourself accidentally disallowing Googlebot or another bot from crawling your entire site and keeping it out of the search results!

When done properly, you can control such things as:

  1. Blocking access to entire sections of your site (dev and staging environments, etc.)
  2. Keeping your site’s internal search results pages from being crawled, indexed, or showing up in search results
  3. Specifying the location of your sitemap or sitemaps
  4. Optimizing crawl budget by blocking access to low-value pages (login, thank-you, shopping cart, etc.)
  5. Preventing certain files on your website (images, PDFs, etc.) from being indexed

Robots.txt Examples

Below are a few examples of how you can use the robots.txt file on your own site.

Allowing all web crawlers/robots access to all of your site’s content:

User-agent: *
Disallow:

Blocking all web crawlers/bots from all of your site’s content:

User-agent: *
Disallow: /

You can see how easy it is to make a mistake when creating your site’s robots.txt: the difference between allowing and blocking your entire site is a single forward slash in the disallow directive (Disallow: /).

Blocking a specific web crawler/bot from your entire site:

User-agent: Googlebot
Disallow: /

Blocking web crawlers/bots from a specific page on your site:

User-agent: *
Disallow: /thankyou.html

Excluding all robots from part of the server:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

This is an example of what the robots.txt file on theverge.com looks like:

The example file can be viewed here: www.theverge.com/robots.txt

You can see how The Verge uses its robots.txt file to specifically call out Google’s news bot, “Googlebot-News,” to make sure that it doesn’t crawl certain directories on the site.

It’s important to remember that if you want to make sure a bot doesn’t crawl certain pages or directories on your site, you call out those pages and/or directories in “Disallow” declarations in your robots.txt file, as in the examples above.

You can review how Google handles the robots.txt file in its robots.txt specifications guide. Google currently enforces a maximum file size of 500KB for robots.txt, so it’s important to be mindful of the size of your site’s file.
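If you would rather check this programmatically, a quick sketch like the one below will do it; the URL is a placeholder for your own site’s robots.txt.

import urllib.request

# Fetch a robots.txt file and compare its size against Google's
# documented 500KB limit. The URL is a placeholder.
URL = "https://www.yoursite.com/robots.txt"
LIMIT = 500 * 1024  # 500KB

with urllib.request.urlopen(URL) as resp:
    body = resp.read()

print("robots.txt is %d bytes" % len(body))
if len(body) > LIMIT:
    print("Warning: file exceeds Google's 500KB limit")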

How to create a robots.txt file

Creating a robots.txt file for your site is a fairly simple process, but it’s also easy to make a mistake. Don’t let that discourage you from creating or modifying a robots file for your site. This article from Google walks you through the robots.txt file creation process and should help you get comfortable creating your very own robots.txt file.

Once you are comfortable creating or modifying your site’s robots file, Google has another great article that explains how to test it to see if it is set up correctly.

Checking if you have a robots.txt file

If you are new to the robots.txt file or are not sure if your site even has one, you can do a quick check. Simply go to your site’s root domain and add /robots.txt to the end of the URL. Example: www.yoursite.com/robots.txt

If nothing shows up, then you do not have a robots.txt file for your site. Now would be the perfect time to jump in and test out creating one.

Best Practices:

  1. Make sure all important pages are crawlable, and content that won’t provide any real value if found in search is blocked
  2. Don’t block your site’s JavaScript and CSS files
  3. Always do a quick check of your file to make sure nothing has changed by accident
  4. Use proper capitalization for directory, subdirectory, and file names
  5. Place the robots.txt file in your website’s root directory for it to be found
  6. The robots.txt file is case sensitive; the file must be named “robots.txt” (no other variations)
  7. Don’t use the robots.txt file to hide private user information, as it will still be visible
  8. Add your sitemap’s location to your robots.txt file (see the example below)
  9. Make sure that you are not blocking any content or sections of your website you want crawled
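As an example of point 8, a sitemap declaration simply sits alongside your other directives; the URL below is a placeholder:

User-agent: *
Disallow: /cgi-bin/

Sitemap: https://www.yoursite.com/sitemap.xml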

Things to keep in mind:

If you have a subdomain or multiple subdomains on your site, then you will need a robots.txt file on each subdomain as well as on the main root domain. This would look something like store.yoursite.com/robots.txt and yoursite.com/robots.txt.

As mentioned in the best practices section above, it’s important to remember not to use the robots.txt file to prevent sensitive data, such as private user information, from being crawled and appearing in the search results.

The reason for this is that other pages might link to that information, and if there’s a direct link, it will bypass the robots.txt rules and the content may still get indexed. If you need to keep your pages from truly being indexed in the search results, you should use a different method, such as adding password protection or a noindex meta tag to those pages. Google cannot log in to a password-protected site/page, so it will not be able to crawl or index those pages.

Conclusion

While you might be a little nervous if you have never worked on a robots.txt file before, rest assured it is fairly simple to use and set up. Once you get comfortable with the ins and outs of the robots file, you’ll be able to enhance your site’s SEO as well as help your site’s visitors and search engine bots.

By setting up your robots.txt file the right way, you will be helping search engine bots spend their crawl budgets wisely and ensuring that they aren’t wasting time and resources crawling pages that don’t need to be crawled. This will help them organize and display your site’s content in the SERPs in the best way possible, which in turn means more visibility for you.

Keep in mind that it doesn’t necessarily take a whole lot of time and effort to set up your robots.txt file. For the most part, it’s a one-time setup to which you can then make little tweaks and changes to better sculpt your site.

I hope the practices, tips, and suggestions described in this article give you the confidence to go out and create or tweak your site’s robots.txt file, and at the same time guide you smoothly through the process.

Michael McManus is Earned Media (SEO) Practice Lead at iProspect.

Spring Clean Your Website

Spring is often associated with a fresh, clean start and a renewed sense of life. For many, getting rid of the old and bringing in the new takes the form of spring cleaning. While you’re thinking about the house, why not also think about refreshing and reviving your business’s website? After all, your website is your business’s presence online and is often the first interaction customers have with your company.

Does your website accurately display your company’s character, personality, and culture? Is it up to date?

Like your home, you may acquire a different taste in décor and choose to go beyond cleaning to overhauling a room’s look. Spring is a great time to do the same for your business website.

There are some things to consider when spring cleaning your website:

Update Content and Information

Does the content of your website still embody your company’s personality and mission? Is your last blog post or “news” item from a year or two ago? Are the company contact information and personnel biographies current?

Nothing looks more boring or unprofessional than a website full of outdated, incorrect information. Additionally, consider adding regular blog posts to your site. This can give your site a constant stream of fresh content that piques the interest of customers and catches the attention of search engines.

For e-commerce websites with online ordering and a catalogue of products, make sure the description for each product is correct and updated. Be sure to display current inventory, not discontinued items.

Test Out the Website’s Usability

Is your website easy to navigate, and do all the internal links work? If not, site visitors will be quick to leave. It is also a good idea to check whether your website is compatible with different devices such as phones and tablets. Also, look through all the content for grammar, spelling, and punctuation errors, as well as industry jargon. The tone and style of the content should be consistent on each webpage and should reflect the image and personality of the brand.

A website that looks nice and is easy to use increases the time visitors spend on it, thereby increasing their chance of converting.

Give it a Fresh Look

Like your personal home décor preferences, your business will occasionally undergo a brand refresh to update its look and match its evolving personality. The frequency of this change depends on the preferences of a business’s target customers and the industry the business is in.

When undergoing a brand or website redesign, it’s a good idea to work with a professional website designer, as they know their way around colors, font styles and sizes, and their emotional and psychological effects. They will know which complementary colors should be incorporated, as well as the appropriate website design and layout for the look and feel of your brand and the personality it wants its website visitors to experience.

In addition to changing up the layout, typography and color scheme, also consider adding new photos (that are optimized) along with fresh content. Adding an events calendar and current news can also revive the appearance of a bland website.

Make Sure the Links Work

Outdated or broken links can undermine your business’s credible, trustworthy, professional appearance. If internal and external links point to pages that are outdated or no longer exist, your page’s SEO and online visibility can suffer. Check each link on your website frequently and regularly.
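If you would rather not click through every page by hand, a minimal sketch like the one below checks the outbound links on a single page using only Python’s standard library. The page URL is a placeholder, and a real audit would also handle relative URLs, redirects, and rate limits.

import urllib.request
from html.parser import HTMLParser

# Collect absolute links from one page, then request each one
# and report its HTTP status.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith("http"):
                    self.links.append(value)

PAGE = "https://www.yoursite.com/"  # placeholder

with urllib.request.urlopen(PAGE) as resp:
    parser = LinkCollector()
    parser.feed(resp.read().decode("utf-8", errors="replace"))

for link in parser.links:
    try:
        status = urllib.request.urlopen(link, timeout=10).status
    except Exception as err:
        status = err
    print(link, status)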

Make Sure Third Party Website Apps and Extensions Work

Are your company’s social media and RSS feeds properly linked and connected to your website? Are third-party on-page analytics trackers up and running? Is the spam filter on your blog post comments turned on, or should you disallow commenting on old blog posts altogether to avoid spam? If your business has an app, does it work properly? If the third-party apps and extensions installed on your website fail to work properly, you miss out on your biggest online advantage, making your content shareable, as well as on valuable analytics information about your website users and their behavior and interaction with your website. Without this analytics data, it is hard to identify areas for improvement.

Evaluate and Clarify Your Site’s Calls-to-Action

Are the calls to action (CTAs) on each page of your website obvious and clear? If they aren’t, website visitors won’t know what action to take, which can lead to missed conversions, purchases, and e-newsletter subscriptions. When people come to your site, they not only want to quickly find the information they are looking for, but they also want to be told what to do with the information you provide.

Renew or Re-evaluate the Domain Name and Web Hosting Plan

It’s always a good idea to look at your website hosting plan each year, as many plans require annual renewal. Did the hosting provider take care of any and all glitches on your website? Did they provide adequate website protection? Were they responsive and easy to work with? Did prices go up, or do you feel like you’re overpaying for service? Your website can’t function without a hosting provider, so you need to be sure the one you choose is skilled, experienced, trustworthy, responsive, and comfortable to work with.

Similarly, a yearly evaluation of your website’s domain is also a good idea. Does it clearly communicate the company’s name and targeted keywords? Is it catchy, concise, and memorable? If you’re undergoing a company re-brand, will the current domain name fit the new brand identity and personality?

Spring is a great time to clean up your website and bring new life into your business. This entails ensuring that your website functions correctly, that it is easy to navigate, and that the content is accurate and current. Beyond being easy to use and fun to interact with, your website may also be in need of a redesign and a new look.

Whether you’re undergoing a major re-branding or simply want to make a few minor tweaks and improvements, the professionals at SEO.com can help. We are a full-service digital marketing company with professional developers, website designers, content and SEO specialists, and PPC professionals. Contact us today to learn more about how we can help you with your website spring cleaning.

Report: Facebook the top network for app-installs, Google, Apple follow

The top sites and networks for mobile app installs are, in order, Facebook, Google, Apple (Search Ads), Snap, and Twitter. This changes somewhat by app category and geography, but this is the hierarchy in North America according to the latest AppsFlyer Performance Index (registration required).

Facebook #1 overall, Snap most improved. Facebook remains the top network for mobile app installs overall. It’s also the ROI leader, while Snap saw the biggest improvement in ROI in the non-gaming category. Google is a strong number two in the majority of categories. Apple generally follows in third or fourth position, though not in all categories. Twitter, Snap (as mentioned), and others rank highly depending on the category: gaming vs. non-gaming, etc.

The analysis is based on more than 20 billion installs during the second half of 2018. It also examined more than 11,000 apps across more than 350 media networks.

Fraud is high, Facebook and Google share flat. Among the more important findings, AppsFlyer said that app-install fraud remains high. Indeed, the company reports that 30 percent of all installs are fraudulent. AppsFlyer added that the affiliate model for app marketing is losing ground because it is more vulnerable to fraud.

AppsFlyer further reports that although the perception is that Facebook and Google’s dominance and share of the app-install market are growing, their combined market share is large but flat. Especially in gaming, a number of networks (AppLovin, ironSource, Unity Ads, Vungle, Tapjoy) are growing and showing strength.

Why you should care. The report is useful to app marketers seeking to better understand which networks and platforms to use to market their apps. Facebook, Google, and Apple Search Ads are must-dos. After that, marketers can be more selective.

The other important thing in the report to be aware of is app-install fraud. According to estimates, app developers are spending an average of 31 percent of their budgets on app marketing. Billions of dollars globally are thus being wasted because of widespread fraud. Choose your networks wisely.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes a personal blog, Screenwerk, about connecting the dots between digital media and real-world consumer behavior. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

Eight tools you need for backlink generation

So you’ve created your website, following all the recommended SEO best practices.

That means you’ve included valuable, relevant keywords on your pages, made it mobile friendly and even started a blog that you’re updating frequently with original, relevant content.

But despite your best efforts, you’re not seeing as much traffic as you’d like, and your site is still ranking too low on Google’s Search Engine Results Page (SERP). It could be that your site is missing just one thing: backlinks.

Backlinks are links from another website that point to your website. Getting backlinks from websites with high domain authority that are relevant to your niche will help you rank higher on Google searches and grab your audience’s attention.

Why is there such an emphasis on backlinks? Google uses them to discover new pages, confirm those pages are legitimate, and determine their popularity. After all, Google doesn’t want to risk its own reputation by ranking subpar sites high on the SERP. According to a study by Backlinko, the number of domains linking to a webpage “correlated with rankings more than any other factor.”

Backlink generation isn’t easy, especially for new businesses or businesses just starting to build their web presence. However, with time, effort and the right tools, you can make sure you’re ranking high and receiving the views you deserve.

If you’re a business owner and want to boost your backlinks, here are eight tools to get you started.

1. MozBar

Screenshot of MozBar

MozBar is a free SEO toolbar you download onto your web browser. It shows you the domain authority (DA) of a given website, which gives you an indication of whether or not you should reach out for a backlink. If you earn a backlink from a website with a high DA, it will positively affect your own site’s authority.

DA ranges from 1 to 100, and the higher, the better. There’s no ideal number to look for, but generally, try finding sites with excellent content that relate to your field. If the DA is, say, 35, that won’t help you as much as a site with a 75, but it won’t hurt, either. Research sites thoroughly and make sure they aren’t spammy before pursuing them.

2. SEMrush

Screenshot of SEMrush

SEMrush, which helps with all types of marketing strategies, shows users a few key tools for backlink generation. When logged into the paid version, you can navigate to the mentions section and find which websites are mentioning you but not linking to you. Once you discover these mentions, you can reach out and ask for a link to your site (as long as the site is relevant and has a high DA), which will boost your rankings.

Another tactic is to go into the backlink audit and see who’s currently linking to your website. Check to see if the link appears underneath the proper SEO-rich keyword and if the site is legitimate and relevant. (If the site is not legitimate, you may want to reach out and ask them to take it down, since that backlink can potentially hurt your ranking.)

While on SEMrush, try the backlink gap tool, which shows you which backlink opportunities your competitors are not taking advantage of. Then, you can reach out and ask for those valuable backlinks instead.

3. Pitchbox

Screenshot of Pitchbox

Pitchbox is a platform to find websites that may want to spread the news about your business or backlink to your pages or content. You simply sign up for Pitchbox, log in, paste the link to the page/content you’re doing backlink generation for and add in some specific keywords you’re looking to target. Then, in a minute or two, Pitchbox will come up with (usually) hundreds of websites you can reach out to.

You can filter out or delete any websites with low domain authority, and go through the sites one by one to see which are valuable. You can reach out to these websites using a Pitchbox email template. Pitchbox will show you the contacts for that site (or allow you to manually input them), automatically fill in the person’s name and their website name, and send as many follow-up emails as you’d like.

When using Pitchbox, double-check the contacts to make sure they’re current. Another best practice is to email a maximum of two people at the website, since you don’t want to spam numerous people within an organization. If you’re having trouble with backlink generation, consider offering a backlink exchange. Just make sure, again, that the site you’re promising to link to is relevant to yours and not spammy.

4. Ahrefs

Screenshot of Ahrefs

Ahrefs is similar to SEMrush and allows you to use the platform’s backlinks checker to view your current backlinks. Since they’ve already linked to your content before, you can ask these sites to link back to your other pages as well. Ahrefs also allows you to disavow toxic backlinks that might hurt your ranking.

Another helpful backlink generation tool is the Ahrefs Site Explorer. By entering the name of your competitor, you can see all of their referring backlinks. Using that information, you can reach out to the same sites that are linking to your competitors and see if they want to link to a valuable piece of content from your site.

5. Google Alerts

Screenshot of Google Alerts

Let’s say you don’t have time to log onto SEMrush or Ahrefs every day and go through your mentions and backlinks. Instead, sign up for Google Alerts, which will email you when you’re mentioned somewhere. Visit the websites that mention you and try to find the contact information for someone you can reach out to there. If you can’t find them, log onto Hunter.io, which is a free tool for finding email addresses using only a domain name.

6. Broken Link Builder

Screenshot of Broken Link Builder

Somebody’s broken link can be your backlinking opportunity with Broken Link Builder. With this tool, you can find dead websites and their respective backlinks, and then offer up similar content to the website that was linking to the dead link. It’s a white-hat SEO tactic that benefits both webmasters and backlink seekers. Broken Link Builder only takes 30 to 60 minutes to generate a report for you to find valuable backlinking opportunities.

7. Majestic

Screenshot of Majestic

Majestic is a backlinking tool, like SEMrush and Ahrefs, that examines all the backlinks for your website as well as your competitors’, and allows you to perform very specific searches. You can search and filter backlinks however you choose, including by crawl or discovery dates, anchor text, link type, URL snippet, or merchant ID. Majestic also claims to have the largest index of any such service.

8. Linkody

Screenshot of Linkody

Linkody is another platform for tracking and performing research on backlinks. It tells you when you lose or gain links, and you can disavow bad links. You’re able to see your competitors’ backlinks and analyze your own link profile. You can choose to receive daily notifications in your inbox, view which links point to your landing pages and connect your Linkody and Google Analytics accounts for more backlink information. If you don’t want to pay for the service, you can use Linkody’s Free Backlink Checker to check two unique domains per week.

Tracking backlinks

With backlink generation, you need to track your efforts. A good place to do this is within a Google Sheet. Create a spreadsheet and share it with the team working on backlinks. They should input information like the date the backlink was pursued, the DA of the website, the URL of the website, the target URL of your content or page, the date the backlink was added, the contact’s email address, and any notes about the process. Then, when you’re doing another round of backlink generation, you can refer to your Google Sheet and reach out to the same people to see if they’d like to link to something else of yours.
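If you prefer to start the tracker from a script, here is a small sketch that writes those columns to a CSV you can import into Google Sheets; the sample row is placeholder data.

import csv
from datetime import date

# Columns for a simple backlink tracker, matching the fields described above.
COLUMNS = [
    "date_pursued", "site_da", "site_url", "target_url",
    "date_added", "contact_email", "notes",
]

with open("backlink_tracker.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    # Placeholder example row.
    writer.writerow([
        date.today().isoformat(), 62, "https://example.com/blog-post",
        "https://www.yoursite.com/guide", "", "editor@example.com",
        "Pitched via email; awaiting reply",
    ])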

Backlinks will always be part of Google’s ranking requirements. Understanding their importance and learning how to use these tools empowers you to do effective backlink generation that can increase your rankings and bring in more visitors to your site.

Mario Medina is a content strategist. He can be found on Twitter.
