Is The University Of Phoenix About To “Go Up In Flames”?

Market Brew’s highly-correlated search engine model reveals how the “online colleges” organic search results work.


When it comes to higher education, it's no secret that the trend everywhere is toward remote learning and online programs. Given the limitations of cost, geography, class size, and available resources, it makes sense that traditional universities the world over are pushing to expand their online course and degree program offerings. Aside from the obvious convenience it affords students, this shifting environment also offers all those "for-profit" institutions out there an "easy" way to boost revenues through an ever-growing base of students interested in seeking their education online.

Source: Sloan Consortium Annual Survey of Online Education.

The Competition and the Spoils

I say “easy” because as anyone who has ever tried to compete online for website traffic in a highly competitive industry knows… getting to the top of a search engine for a keyword like “online colleges” with 228M competing pages is no easy feat.

The highly competitive “online colleges” market.

Without the help of powerful SEO tools, this kind of task would be hopeless for the less sophisticated. Yet those who jump on the cutting edge of search analytics technology, by utilizing a scientific marketing platform like the one produced by the team at Market Brew, can crack the code of the search engines and seriously compete in this highly competitive industry of online education – and stand to gain a lot of money for their efforts! You can see from the chart below that even traditional brick-and-mortar institutions make a sizeable profit per student.

Source: CATO Institute.

Of course, this chart takes into account all the costs and spending of the traditional university model as well. But what if we look only at the purely online universities and their profits and spending – would we see anything different? According to an investigation into the for-profit higher education industry conducted by the U.S. Senate's Health, Education, Labor and Pensions (HELP) Committee, led by Senator Tom Harkin, the 30 for-profit institutions the committee examined spent an average of just $2,050 per student – and the University of Phoenix's parent company, Apollo Group, spent just $892 per student. With those figures in mind, you can imagine how much more profit these kinds of institutions stand to gain with every new enrollment. Source: American Public Media.

Cracking the Code

With so much profit at stake, it’s no wonder that companies like the University of Phoenix, Kaplan, and DeVry are pulling out all the stops to make sure they are capturing as much of this online “gold” in the form of website visitors as they can. But in an industry that has no doubt matured a lot over the past few years in their online marketing tactics, is the competition over, or are there still any surprises we can glean from the current environment online for this industry?

Let’s see what we can find out by taking a look at the insights within the Market Brew predictive search platform.

In this use case example, we are going to look at the current results for one of the most competitive keyword terms in this already very competitive industry of online higher education: "online colleges." Why did I choose that? After loading a few dozen keywords into the Market Brew analytics platform (I could have used many thousands, as there is no limit on how many keywords I can analyze and track) and kicking off a crawl and site analysis for a few of the top higher education institutions focused on remote learning (including the three listed above), I looked at which combinations of keywords and webpages carried the greatest ROI potential if I were to focus my time and resources on optimizing them. What kind of crazy math and wizardry did I have to perform to come to such a conclusion amidst the myriad of possibilities I could optimize for? Well, frankly… none.

Yes, you heard me right. None. The Market Brew system was designed to make SEO a no-brainer from an ROI perspective (great for all you CMOs and VPs of marketing out there!). I'll spare you the details of the complicated math – curious folk can check out our Automated SEO whitepaper – but suffice it to say that the system runs millions of simulations of potential changes to the underlying metrics that determine a webpage's organic search ranking for each keyword you have entered into the engine. It then returns a list of those potential optimizations (ordered by highest ROI potential), where they should be done (which webpage and keyword), and how to accomplish them (just click 'Switch to Task View' to see the optimizations ordered by task, with explanations and the ability to assign each task to team members).
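To make the idea concrete, here is a minimal, purely illustrative sketch of ranking candidate page/keyword optimizations by ROI. The metric names and numbers are invented for the example and are not Market Brew's actual model:

```python
# Hypothetical sketch: rank candidate optimizations by estimated ROI.
# All pages, keywords, and figures below are illustrative assumptions.

def roi(opt):
    # ROI = estimated revenue lift / estimated effort cost
    return opt["est_revenue_lift"] / opt["est_effort_cost"]

candidates = [
    {"page": "/", "keyword": "online colleges",
     "est_revenue_lift": 120_000, "est_effort_cost": 40},
    {"page": "/degrees", "keyword": "online degrees",
     "est_revenue_lift": 30_000, "est_effort_cost": 25},
    {"page": "/mba", "keyword": "online mba",
     "est_revenue_lift": 18_000, "est_effort_cost": 10},
]

ranked = sorted(candidates, key=roi, reverse=True)
for opt in ranked:
    print(f'{opt["page"]} / {opt["keyword"]}: ROI {roi(opt):,.0f}')
```

Under this toy model, the homepage/"online colleges" pair comes out on top even though other pages need less effort – exactly the kind of prioritization the task list encodes.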

It just so happens that the most valuable optimization I could focus on if I were working for the University of Phoenix is to improve my homepage’s performance for the keyword “online colleges.”

The most valuable optimizations (ordered by ROI potential) for the University of Phoenix.

You might be thinking right now, "Yeah, that's all well and good, but how does the engine really know which pages/products and keywords are the most valuable to me?" By default, the engine calculates optimizations based on search volume. To make the optimizations even more accurate, and more customized to your site's conversions, you have the option to import your own data into the engine by promoting and demoting URL/keyword pairs. This part is key to getting the most value out of the platform.

Using the Overrides feature, we are able to tell the engine which keywords and/or URLs are converting the best or driving the most revenue into the business. Input your revenue per keyword and/or webpage to customize the model even further.
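As a rough conceptual sketch of how such overrides might work (the data structures and numbers here are hypothetical, not Market Brew's API): a keyword's value defaults to search volume, but known revenue for a URL/keyword pair takes precedence when supplied.

```python
# Hypothetical sketch of the Overrides idea. Volumes and revenue
# figures are invented for illustration.

search_volume = {"online colleges": 90_500, "online degrees": 33_100}

# Revenue overrides imported from your own conversion data.
revenue_overrides = {("/", "online colleges"): 250_000}

def keyword_value(url, keyword):
    # Prefer real revenue data; fall back to search volume as a proxy.
    return revenue_overrides.get((url, keyword), search_volume.get(keyword, 0))

print(keyword_value("/", "online colleges"))   # uses the revenue override
print(keyword_value("/", "online degrees"))    # falls back to search volume
```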

Adding this key piece of quantitative information gives the system the context it needs to do what computers do best: crunch massive amounts of data! Once completed, the system arrives at a statistical approximation that accurately simulates the reality of your organic search market and gives you the kind of insight that is only possible with access to your own search engine – a search engine combined with an analytics platform that can make these calculations and decisions in the proper context of numerous changing variables (solving for the butterfly effect).

So let's get back to the example at hand… optimizing for the keyword "online colleges." Starting from square one, let's just see what our old friend Google has to say about the competition…

Google’s Search Engine Results For the Phrase “Online Colleges”.

When doing a typical Google search and looking at the results, it may not be clear how close or far the race to the top truly is. We can see that of the top 10 organic results, only 4 actual schools make it onto this coveted page of traffic gold. Among the top 3 competitors for this keyword marketplace, Kaplan leads the pack (currently 4th at the time of this writing), with the University of Phoenix and DeVry finishing a "close" 6th and 7th in the rankings. But how close are they really? And how reliable are these rankings? Might these results jump around, even exchanging places in the Google results from one hour to the next or one location to the next? How is an SEO professional supposed to judge this honestly and objectively, and find any meaningful signal with which to begin the long, arduous process of optimizing?

Well, if that SEO professional had access to Market Brew's search engine model, they would be able to look underneath the noisy signal that modern search engines like Google broadcast and expose the real story going on behind the curtain. Let's take a look at what we see inside the engine:

Market Brew’s Transparent Search Engine Results For the Phrase “Online Colleges”.

By eliminating all the other noise and focusing just on the top competitors in this race to become the king of "online colleges," we can see that, while the Google results alone might suggest any of these top three could win out, the numbers underlying the scoring layer of any major search engine (we allow you to model a number of different search engine environments) show that Kaplan is far and away the leader – at least as far as this keyword race is concerned. The real story here is the race for second place. Depending on when you search, Phoenix or DeVry may pull ahead of the other, which shows that Google already considers these two fairly close in overall relevance to the search term. But how close are they, and what are their relative strengths and weaknesses?

A quick hover over Market Brew’s search engine query score for each result begins to paint a clearer picture…

The Query Score Breakdown (QSB) for the University of Phoenix for the query “online colleges”.

Now these numbers at first glance might not make much sense, but as soon as we begin to compare with the raw numbers for DeVry we begin to see where the opportunities are…

The Query Score Breakdown (QSB) for DeVry University for the query “online colleges”.

So by comparing the breakdowns between these two (you can also view these detailed metrics side by side and do further analysis by exporting an Excel report), we immediately see how the disparities in Net Total Link Flow and in the semantic makeup of each page affect the query scores – and thus the distance between them that makes this race interesting. We already knew from the metrics underneath each result (without opening the query score breakdown) that the University of Phoenix has more Net Total Link Flow than DeVry, from which we can conclude that DeVry must be doing a better job on the semantic side of things (page content and HTML markup) to even be in the running. Now that we see these hard figures, we begin to understand what that gap really looks like and can hedge against it – which could mean the difference of hundreds of thousands of dollars, if not millions, if I were working from the perspective of the University of Phoenix and wanted to ensure that DeVry can't pass me up anytime soon.

Taking Action on the Insight

Keeping with the perspective of working for the University of Phoenix (someone at that administration should really buy me a beer for what I’m about to share for free 😉 ), the first thing I might want to do is get a better understanding of why we are underperforming from a semantic perspective for one of our top keywords. I will investigate how I might be able to change that scenario using the least amount of time/resources possible. In this instance I have already checked the Top ROI Optimization suggestions that the engine has identified to make sure I’m using my time wisely. I want to be working on an optimization that has the potential to drive serious value into the organization by way of pushing the needle on important areas with revenue impact potential.

I already know from the query score breakdown (QSB) that the basket of keyword terms that the engine has indicated represent the content footprint, or Market Focus, of the page in question (in this case it’s the homepage) is scoring around 25% of what DeVry’s page is scoring, so this is a major area of concern. Because the search engine is integrated into the analytics platform, I can quickly drill down directly from the search results themselves into the proper section of Market Focus data that I want to analyze. In this case, all I have to do is click on the term “university phoenix school” in the search results to be taken to more information regarding this metric.

Clicking the phrase takes you to the next screen to dissect what’s going on within the Market Focus basket for this page.

The Market Focus Details screen (content).

The powerful natural language processing engine lets us scroll through a list of the unique keyword groupings within the HTML content of the page (the engine can also read any JavaScript) and see how often they occur – what the page is really about versus what we attempted to optimize for. While the list here is quite lengthy, it is surprising that terms like "online college" are not more prominent in the content. Since every term on this list occurs just once or twice, there is no clear winner from a content perspective that a search engine could use to determine keyword relevancy. This is good news for us, because it means it won't take many changes to the content itself to begin to position certain terms (like "online colleges") above the rest.
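The underlying idea of counting unique phrase occurrences can be sketched in a few lines. This is a generic n-gram count for illustration, not Market Brew's actual NLP engine, and the sample text is invented:

```python
# Illustrative sketch: count two-word phrase occurrences in page copy.
from collections import Counter
import re

text = """University of Phoenix offers flexible programs.
Explore online college degree programs and online college courses."""

words = re.findall(r"[a-z]+", text.lower())   # tokenize, ignoring punctuation
bigrams = Counter(zip(words, words[1:]))      # overlapping two-word phrases

for (w1, w2), n in bigrams.most_common(3):
    print(f"{w1} {w2}: {n}")
```

Here "online college" surfaces as the only repeated phrase – the kind of signal a search engine can use to infer what a page is about.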

However, search engines are a bit more sophisticated than merely doing a shingle analysis of the on-page content itself… they also consider the surrounding link graph and any clues that those inbound links may give away as to what a page could be about. But not all links are created equal (as we well know), so when we click on the Anchor Text tab above, we can see how much relevance (by way of link flow) is flowing through each linked word of the anchor text on inbound links to paint a more detailed semantic picture.

The Market Focus Details screen (incoming anchortext).

Perusing this list clearly shows the overwhelming focus on branded terms in those inbound links, which sadly don't drive nearly as much traffic as more industry-focused terms. Luckily, there is already a good amount of link flow coming in for the term "online"; we are just missing much-needed anchor text link flow for the term "colleges" to boost the weighting of the full term "online colleges" within the basket itself:

The Market Focus Details screen (Market Focus Basket).

Combining the unique phrase occurrences with the incoming anchor text link flow, we see a whole basket of phrases, with associated weightings, that the search engines can use to determine our page's relevancy to any search query. Of course, while branded terms are quite natural and do drive traffic depending on the strength of your brand, having a Market Focus Basket entirely dominated by branded terms is not good. Serious steps need to be taken to shift the semantic picture of this homepage toward terms that will drive more traffic and revenue overall. Naturally, the quickest and easiest changes are increasing the number of occurrences of these high-traffic terms within the content, and changing the anchor text (or ALT text, in the case of images) of any internal links that may already be pointing to the homepage. If possible, changing the specific wording of links on other domains that send a decent amount of link flow to yours can help tremendously. Usually, however, a good PR piece with intelligently placed, keyword-optimized links will do the trick!
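A hypothetical sketch of how on-page counts and anchor-text link flow might blend into a single weighted basket (the blend weights and figures below are invented for illustration and are not Market Brew's actual scoring):

```python
# Illustrative blend of content occurrences and anchor-text link flow.
# Counts, flow values, and blend weights are assumptions for the example.

content_counts = {"university phoenix": 6, "online": 3, "colleges": 1}
anchor_flow    = {"university phoenix": 40.0, "online": 12.5, "colleges": 0.8}

def basket_weight(term, content_w=1.0, anchor_w=2.0):
    # Weight anchor-text flow more heavily than raw occurrence counts.
    return (content_w * content_counts.get(term, 0)
            + anchor_w * anchor_flow.get(term, 0.0))

basket = {t: basket_weight(t) for t in content_counts}
for term, w in sorted(basket.items(), key=lambda kv: kv[1], reverse=True):
    print(term, round(w, 1))
```

In this toy basket the branded term dominates by a wide margin – the same imbalance the Market Focus screen reveals for the real homepage.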


Now what about the penalty situation? Is the content on this page possibly negatively impacting the page's performance in any other ways?

Clicking the score takes you to the next screen to dissect what’s going on within the Webpage Scorecard for this page.

Clicking back to the search results again and this time clicking on the Market Brew Score takes us into a deeper look at how any potential penalties may be limiting the performance of the page.

A Webpage Scorecard revealing families of algorithms that have reduced this page’s ranking power.

Not surprisingly, given the lack of any single phrase occurring significantly more than the others (which we just saw in the Market Focus basket section), no keyword stuffing penalty is being assessed. That doesn't mean there aren't other potential issues, however. One of them – actually reducing the ranking power of the page by almost 12% – is content duplication within the internal pages of the site itself.

Viewing Duplicate Content Penalties.

After clicking the penalty location button for the Duplicate Content penalty, we are taken to this screen, which breaks down the top offending pages and the average percentage of content shared between each pair (each link is in respect to the homepage, because that is the penalty scorecard we are looking at in this example, but you can view this for any page on your site – or your competitors' sites, too!). With modern websites designed in a template-based fashion, with common elements like header and footer navigation repeated throughout the whole site, it is nearly impossible to eliminate duplication entirely. As a rule of thumb, we suggest keeping the level of duplication below 50% to stay in the safe zone and not risk being penalized by the search engines.
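For intuition, the duplication percentage between two pages can be approximated with shingle overlap. This is a generic illustration of the concept, not Market Brew's actual algorithm, and the page snippets are invented:

```python
# Illustrative duplicate-content check: Jaccard similarity over
# overlapping 3-word "shingles" from two pages' text.

def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def duplication(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "request info apply now online degree programs built for working adults"
page_b = "request info apply now campus degree programs built for working adults"

print(f"{duplication(page_a, page_b):.0%}")
```

Two template-driven pages that differ by a single word already share half their shingles, which is why boilerplate-heavy sites drift toward high duplication scores.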

Highlighting that Readily Shows Offending Content Duplication on the University of Phoenix Homepage.

In this case, a duplication percentage of 83% is significantly impacting the performance of the University of Phoenix homepage. Though it may not be something most SEOs would consider a top concern, with the power of this platform we can see statistically significant metrics like this that quite affirmatively impact the overall performance of key landing pages – and that could easily mean the difference of thousands of visitors if DeVry were to optimize the areas where they are currently underperforming and pass the University of Phoenix right up!


So basically, in just a few minutes of exploring the Market Brew analytics platform, I could have prevented the University of Phoenix from going ‘up in flames’ (forgive the pun) and saved them many thousands of dollars. But when it comes down to it, optimizing a website for any number of keyword search terms can often be a multi-pronged effort with changes positively impacting some areas while negatively impacting others. This can be frustrating and confusing for brands operating on merely a bit of non-contextualized intel from other tools and whatever historical best practices they have acquired. In this modern world of Big Data and predictive analytics, the high stakes game of for-profit higher education demands more.

About The Author

Giordan heads up the solutions engineering team at Market Brew. He is an honors graduate in Computer Science and Information Technology from Purdue University where he was a multiple NCAA Academic and Athletic All-American as a captain of the swim team. As a professional athlete on the USA National Swim Team, Giordan was a member of the silver medal winning relay at the 2006 World Championships in Shanghai, China and a finalist at the 2008 Olympic Trials. For more information about Market Brew, visit

Google Walkout: What the ‘five real changes’ demanded by staff will entail for the search giant

As I write this (1st November) thousands of Google staff in offices around the world have taken to the streets ‘to protest sexual harassment, misconduct, lack of transparency, and a workplace that doesn’t work for everyone,’ according to the movement’s official Twitter feed.

The mass walkout is just the latest public display of employee anger within the search engine. It follows criticism of the company’s involvement with Project Maven back in March, high profile resignations over the leaked Dragonfly project in August, as well as the “Rubingate” scandal uncovered by the New York Times last month which saw key Android developer Andy Rubin given ‘a hero’s farewell’ and a $90m exit package after claims of sexual misconduct were made against him.

It would appear that the treatment of Andy Rubin (amid accusations that Google itself admitted were completely credible) has been the key contributing factor to the well-orchestrated global protest.

Employees are united under a banner of five clear demands they want to see implemented within the organization. Let’s unpick these proposed changes one-by-one to get a better understanding of how a multinational company might go about making them and why they are so important for the future of the tech industry.

1. An end to forced arbitration in cases of harassment and discrimination for all current and future employees.

Forced arbitration refers to the policy of companies who only allow their employees the right to solve disputes via processes of internal arbitration.

Many organizations have forced arbitration clauses written into the employment contracts for their staff and while it isn’t a bad route to go down when employees have the option to solve disputes this way (this is known as voluntary arbitration), forced arbitration means that anyone at the company who wants to bring forward a case does not have the right to sue, to make a class action lawsuit, or to appeal – and nor do they have access to federal protections such as The Equal Pay Act of 1963.

A change within Google which would end forced arbitration in cases of harassment and discrimination would signal a fundamental shift in corporate culture which has, to date, more often sought to protect the wellbeing of the company above the wellbeing of those who work on the office floor. It would ensure that any employee who is embarking on the complex and, often, emotionally difficult process of raising a dispute would have the right to raise it outside of the internal arbitration framework if they want to.

There is some advantage for Google itself when internal disputes are resolved by internal arbitration – but this change could certainly help rekindle their image as a progressive organization.

2. A commitment to end pay and opportunity inequity.

Google is still plagued by a significant gender pay gap, as well as a lack of representation of women and people of color at board level. In the US in 2017, the Department of Labor said its audit of Google revealed ‘systemic compensation disparities against women pretty much across the entire workforce.’ This was followed by reporting from The Telegraph in the UK this year, in which the company admitted that mean salaries for women working at the search engine were 17% lower than those for men.

Of course, the ethical argument for pay and opportunity equity is clear, but there is increasing data pointing to the business and economic value of diverse workforces. McKinsey’s illuminating Delivering through diversity report, published earlier this year, found that companies in the top quartile for gender diversity were 21% more likely to experience above-average profitability than companies in the fourth quartile. The most ethnically and culturally diverse businesses saw even better comparative performance, with those in the top quartile 33% more likely to outperform those in the fourth quartile.

3. A publicly disclosed sexual harassment transparency report.

The value of this is quite straightforward. It will be useful for employees, future employees, and the public to have a full understanding of the levels of harassment which have occurred within Google’s walls. It will also go some way to drawing a line under the problematic nature of the Andy Rubin resignation, and will set the company on a course of renewed transparency.

4. A clear, uniform, globally inclusive process for reporting sexual misconduct safely and anonymously.

This change will further set Google on course to improving the lives of people at all levels of the business. It would improve the rights for victims of harassment and would provide better support as they go through the process of reporting misconduct. It would also give employees more confidence in coming forward should they be unsure whether to report something.

The organizers of the protest sum it up well, ‘The improved process should be accessible to all: full-time employees, temporary employees, vendors, and contractors alike. Accountability, safety and an ability to report unsafe working conditions should not be dictated by employment status.’

The goal to roll this out globally would be a positive step for the tech industry at large – showing that the organization can be united across borders in protecting the rights of all who work there.

5. Elevate the Chief Diversity Officer to answer directly to the CEO and make recommendations directly to the Board of Directors. Appoint an Employee Rep to the Board.

This will be the step which ensures numbers 1-4 are implemented and that there is the necessary accountability in place for these demands going forward. The CDO and Employee Rep will also be in the position to suggest changes in the realm of gender and ethnicity equity.

Five real positive changes for Google, its employees and the tech industry

It has been fascinating to see this protest play out across news sites and social media today. The walkouts in New York, Dublin, London, Berlin, Haifa, Zurich, Tokyo and elsewhere have shown that staff are united in wanting to see the company do better for their current employees, as well as the thousands of gifted programmers, developers, engineers etc. who are looking to the company as a potential source of work in the future.

It also gives Google a clear opportunity to listen to the needs of the very people who make the business work. If they implement these changes, and acknowledge that the actions of their staff are justified, then they can be seen as a leading light for workers’ rights and equity within the tech industry on an international level – an industry where such values are too often eroded.

Related reading

Search Buzz Video Recap: Google Halloween Update, Google News Changes, JavaScript SEO, Bing Ads, Matt Cutts & More

This week we covered what appears to be a targeted Google update that happened on Halloween. Google investigated publisher complaints around Google News and said they will release an update in the upcoming weeks to address some of them. Many SEOs now turn to EAT and the Google quality raters guidelines instead of links and PageRank. Google gives different advice for many reasons; here is one on removing versus improving low quality content. Google wants your feedback on how to improve the search results snippets. GoogleBot is closing in on modern browsers, which is a good thing for indexing. Google explained more details on lazy loaded scrolling events, and it was super interesting. Google cache is showing some people the mobile version of the site. Google says you should fill in the meta descriptions on your pages. The Google Home Hub does not support speakable markup, which is pretty embarrassing. Google My Business added a setting to stop the Google Assistant from calling you. Google My Business also dropped addresses for businesses that use service areas. Google has redesigned their hotel search results. Bing Ads sunset version 11 of their ads API. Google AdSense added a quality feature to require sites to be verified. Google’s mobile home page now has the Discover feed on it. Matt Cutts was interviewed on the Internet History Podcast and the SEO community should all listen to it. That was this past week in search at the Search Engine Roundtable.

Make sure to subscribe to our video feed or subscribe directly on iTunes to be notified of these updates and download the video in the background. Here is the YouTube version of the feed:

For the original iTunes version, click here.

Search Topics of Discussion:

Please do subscribe via iTunes or on your favorite RSS reader.

Product Showcase: Brand Ambassador

Continuing our series of product showcases that started with the recently launched, today we’re going to take a look at Brand Ambassador, and the potential to help build a profitable network of online influencers.

A brand ambassador represents your brand in a positive way and provides a more consumer-centric view of the products or services in question.

Anyone who can do that for your company is worth their weight in gold.

In an era where people are prone to mistrust regular advertising, these messages seem to come from people just like them – regular people they follow on social media and whose opinions they trust.

Potentially, anyone could fill this role for your company – a fan, a customer, a social media influencer, and anyone else who can spread the positive word about your company or products.

Of course, not all of them provide the same value to your company. And many of them may still need a little more incentive to become an active ambassador.

That is where Brand Ambassador comes in.

Brand ambassadors, like any other resource, can be hard to manage. And, if you’ve been building a community of ambassadors by sponsoring their work in the form of free products or other compensation, you need to make sure they are providing some real value in return.

In the world of social media, this is especially important. You need to know who is posting about you and to make sure the content the ambassadors are creating actually supports your brand.

We are talking about Brand Ambassador in this context because it has documented instances in which companies have been able to drive a 40% increase in sales in three months.

This product gives companies the ability to:

  • Manage the existing ambassador base to improve the quantity and quality of content creation and maintain commitment through better communication tools.
  • Obtain reliable up-to-date data on their ambassadors’ daily activity by centralizing all content and measuring impressions, engagement, and exposure generated.
  • Scale and empower their ambassador network to drive sales through referral codes that track clicks and conversions. This also means they can create custom ambassador incentive programs.

This software is designed to incentivize current paid influencers to join the platform by making it easier for them to create content across all social media channels. The entire process is quite simple. It goes a little like this:

  1. Invite your customers, friends, and followers to be a part of a community of Brand Ambassadors.
  2. Communicate directly with them so you can keep them involved with any campaigns, promotions, and any other important events.
  3. Analyze the results through the metrics that are tracked in the system. It gives the brand the ability to measure and understand the amount of influence a social media push (and each individual ambassador and influencer) really has.

Influencer marketing can have a huge impact on your brand. To show exactly how much, Brand Ambassador shared a case study with us in which it took just 4 months for a brand’s network to grow exponentially. In those few months, their network was able to generate more than 3k posts, earning more than 500k likes, and 25k comments. They calculate that the visibility grew to nearly 8.5 million impressions in a single month of activity.

Is It for You?

The right influencers can have a huge impact on your brand perception; the question is how much value each one is contributing.

This platform is designed to help you manage your influencers and motivate your advocates (and potentially provide rewards for their activities). You can expand and utilize user-generated content to leverage the power of these online personalities.

Brand Ambassador is offering an effective tool to help brands communicate with their influencers and ambassadors.

We’ve discussed before how important a network of influencers can be in the modern marketing world, but many people don’t know where to even start. Sure, to some extent, a good ambassador community will form naturally, but if you go about it the wrong way, you could hinder your efforts before you get things rolling.

This tool may be what you need to kickstart your community growth.

SEO Risks to Take and SEO Risks to Avoid

Many business people have a hard time seeing the value of SEO, and we understand that. There is a lot of information and misinformation out there about search engine optimization, and it can really increase the difficulty of this decision.

While an effective SEO strategy will take many months to yield positive results, and the tactics may not make sense to non-SEOers, it can significantly improve a company's online exposure and profits.

Many companies are naturally nervous about trying out something they're not familiar with, but let's face it: nearly any business decision carries some risk. Some risks can be avoided, and some can lead to sudden and serious growth.

The question that has to be asked, then, is what SEO risks are worth it, and which should be avoided.

Before we get into it, though, let’s put this out there first: the greatest SEO risk a company can take is to avoid SEO altogether. Everyone got that? Great. Let’s consider a few more.

SEO Risks to Take

1.  Making and Testing Large and Small Changes

The overall goal of SEO is to get traffic and, ultimately, transactions on your website.

Neither of those will happen if nobody clicks on your website in the first place.

So, what if you’re getting your website to rank well for certain keywords, but no one is actually clicking on your link?

There could be any number of reasons for this, and it can take some time to zero in on exactly why it isn’t performing as well as expected.

And the only way to do that is through A/B testing. You’re going to have to take one element at a time – whether that’s the meta description, the title, or the content – and test it against new variations.

That’s all well and good and even a little obvious. So what makes it a “risk”?

It will likely take a bit of trial and error to come up with the correct wording and layout combination that results in maximum website traffic and transactions. During this time, you may find a combination that doesn’t work well at all and ends up reducing what traffic you do have – at least for a while.

The risk is worth it, though, because once you find the best results, you’ll be able to focus on that element and continue to drive more traffic and get better returns.
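
As a hedged illustration of how you might decide when an A/B test has actually found a winner (the traffic numbers here are invented), a standard two-proportion z-test on the click-through rates of two title variants could look like this:

```python
from math import sqrt

def ctr_z_score(clicks_a: int, views_a: int, clicks_b: int, views_b: int) -> float:
    """Two-proportion z-statistic comparing the click-through rates
    of two title/meta-description variants."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Variant B's new title: 70 clicks from 1,000 impressions vs. 50 for A.
z = ctr_z_score(50, 1000, 70, 1000)
significant = abs(z) > 1.96  # rough ~95% confidence threshold
```

In this made-up example the lift looks impressive (7% vs. 5% CTR) but the z-statistic falls just short of the 95% threshold, which is precisely why you keep the test running rather than declaring victory early.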

2.  Getting and Giving High-Quality Backlinks

Why would one company feature a link to another company’s website and risk the web user leaving their page?

Backlinks are a well-established part of SEO, and most companies want to get as many of them as they can. They help increase rankings and build authority.

However, it’s not just about being the one with the most links. Sometimes you need to give a little back.

So, yes, you may risk losing a few web visitors by providing a link to other high-quality sites, but at the same time, you’re showing Google that you are using and referencing reliable sites with established authority.

Just keep in mind, webpages that knowingly feature links to low-quality, malicious, spammy websites are at risk of getting penalized by Google. You may also get penalized by getting too many links to your site from those poor-quality sites.

3.  Enhancing Your Site’s URL Structure

Ideally, your homepage URL should be short, containing only the company name: short, simple, concise, and easily remembered.

Subsequent pages, however, should have targeted keywords and be more specific about the content of the webpage.

Even so, you don’t want to let your URLs get out of hand. If they’re too long and descriptive, the search engine will truncate the display with an ellipsis ([…]) after a cut-off point.

So, it may be time to alter some of your URLs with an overhaul of the site’s structure.

The risk, here, is that any kind of change like this can impact your rankings. As you alter old URLs and 301 redirect traffic to the new ones, you may see some dips in traffic and rankings.

However, if you do it right, you can end up with a streamlined structure that appeals to both search engines and internet users.

4.  Overhauling Your Website

Every once in a while, websites need to get updated and redesigned. Website redesigns can be risky and expensive, not to mention time-consuming.

Eventually, though, your website may need a new facelift. Maybe it just looks extremely outdated. Then again, it may be optimized for search engines, but human users find it difficult to navigate. There could be any number of reasons to take another look at your website and maybe – just maybe – consider reconstructing it from the ground up.

Of course, just like changing the URL structure, these types of changes come with a risk to your rankings as Google tries to re-evaluate your site. For that matter, it comes with the risk of alienating customers who have grown accustomed to your website just the way it is.

Usually, though, Google understands that every website goes through these overhauls once in a while, so your rankings should bounce right back. You just have to be patient. Most of your customers will eventually get used to the changes, too. More importantly, an updated website has a better chance of bringing in many more new clients.

5.  Buying Expired or Available Domains

Some website owners, for whatever reason, don’t renew their domains, making them available for others to buy and use.

Buying domains with a history and redirecting them to your site can potentially be a quick and easy way to increase the number of valuable backlinks, adding some link juice to your site.

There are some serious risks with this technique, though, so you should only do so when you know exactly what you’re doing.

The domain, for example, has to be related to your business. It should be professional and legitimate, because if that domain still receives rankings and traffic, those visitors will be redirected to your site, and there is nothing more frustrating than arriving on a site that isn’t at all related to your original search.

Also, expired domains that were filled with spammy content and links will also be transferred over to your website, causing your site to potentially drop in rankings and get penalized by Google.

This tactic, however, is inexpensive and has the potential to drive serious traffic to your site if you follow the best practices.

Get a free website report and see how your site is currently performing.

SEO Risks to Avoid

Now that you have an idea of what SEO risks are worth taking, here are SEO risks that will likely do your business more harm than good:

1.  Poor Doorway Pages (or any doorway pages at all)

Doorway pages are simple and easy to create in batches to target specific keywords and keyword groups. Trustworthy SEOers avoid doorway pages as a rule because Google greatly dislikes them and penalizes sites that use them.

Google’s opinion of such pages should be reason enough for you to avoid this particular risk.

The only time Google will let doorway pages slide is if they offer unique, clear, and valuable content and information to the site visitor – in other words, only if they act just like the regular content on your website.

There is simply no reason to bother with them, so don’t risk it.

2.  Disallowing Neutral Backlinks

You want good backlinks to your website, not bad ones. What about the ones that are neutral, that don’t help, yet don’t hurt your website’s ranking and SEO?

Neutral backlinks may not give your website the SEO boost it needs, but they also won’t subject your site to Google’s potentially harsh penalties.

In fact, with Google’s Penguin update, some penalties for bad backlinks were relaxed, because the search engine realized that website owners don’t have control over every site that links to theirs.

As a result, it is harder for a site to be punished by Google for malicious backlinks.

The only way to tell whether the backlinks pointing to your website are bad, spammy, and low-quality is if you’ve noticed that Google has taken manual action on your site.

If Google has taken no action against your website, the backlinks pointing to it are safe, though they may not be of high enough quality to boost your site’s search rankings.

It is possible to disavow certain links, but you need to be careful about it. If you attempt to disavow all your neutral links, you risk potentially blocking sites that can improve your ranking.
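
If you do decide to disavow, the file Google's disavow tool accepts is plain text: "#" comment lines, "domain:" entries for whole domains, and bare URLs for individual pages. A small generator sketch (the domains are made up):

```python
def build_disavow_file(bad_domains, bad_urls):
    """Emit a file in the format accepted by Google's disavow tool.
    Only include links you are confident are harmful -- disavowing
    neutral links risks throwing away ones that help your rankings."""
    lines = ["# Disavow file generated for manual-action cleanup"]
    lines += [f"domain:{d}" for d in sorted(bad_domains)]
    lines += sorted(bad_urls)
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    bad_domains={"spammy-links.example"},
    bad_urls={"http://low-quality.example/paid-links.html"},
)
```

Keeping the bad list explicit and reviewable, rather than disavowing everything that isn't obviously helping, is the conservative choice this section argues for.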

3.  Deleting or Condensing Content or Entire Pages

It may seem like no big deal to delete a page from your website, especially if it is about a product or service your company has discontinued.

Once a page is deleted, the keywords it once ranked for are now gone. The same thing happens to the URL of the page, which also includes those page-specific keywords.

Instead of risking the loss of those rankings, consider keeping the webpage even if you’ve discontinued the product. Simply leave a message on the page for the visitor that redirects them to a similar page with a relevant product or service.

If you’re merging or condensing two pages into one, make sure to include 301 redirects on the old URLs to make sure that all the link juice and traffic isn’t lost.
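
A minimal sketch of such a redirect map (the URLs are hypothetical) shows the idea: retired paths answer with a 301 pointing at the surviving page, so link juice and visitors both land somewhere useful instead of hitting a dead end.

```python
# Hypothetical mapping of retired URLs to their closest live equivalents.
REDIRECTS = {
    "/products/old-widget": "/products/new-widget",
    "/blog/seo-tips-2016": "/blog/seo-tips",
}

def resolve(path):
    """Return (status, path): a 301 to the merged/replacement page for
    deleted or condensed URLs, or a plain 200 for live pages."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

In a real deployment the same map would live in your web server's configuration; the point is that every old URL has an explicit destination.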

4.  Using Exact Match Keywords in Anchor Text

It may seem logical to have your targeted keyword as the anchor text for a link to your website. After all, you want your site to rank for that keyword or phrase.

This practice was popular for SEOers in the past who had the same logic. Unfortunately, this practice got abused by “black hat” SEOers who used an excessive amount of exact match keyword anchor texts to link to their websites – and the links didn’t exactly come from the most authoritative sites.

Since then, Google has cracked down hard on this practice and will punish websites that overdo it. Don’t risk it. Look for more natural ways to link to your site and develop a more varied backlink portfolio.
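
One rough way to audit your own anchor-text profile is to measure what share of anchors exactly match the target keyword. The 30% threshold below is an assumption for illustration, not a number Google publishes:

```python
from collections import Counter

def exact_match_share(anchors, keyword):
    """Fraction of backlink anchor texts that exactly match the target
    keyword -- a crude over-optimization signal."""
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[keyword.lower()] / len(anchors) if anchors else 0.0

anchors = ["online trading", "online trading", "click here",
           "Online Trading", "our review", "homepage"]
share = exact_match_share(anchors, "online trading")  # 3 of 6 -> 0.5
risky = share > 0.3  # hypothetical threshold; tune against your own data
```

A varied portfolio mixes branded anchors, naked URLs, and generic phrases, so the exact-match share stays naturally low.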

5.  Making Too Many “Small” SEO Changes to a Site

Occasionally, it is a good idea to update the content on your website. In fact, Google favors fresh, updated content.

However, constantly changing the content and the look and feel of your website, even a little bit at a time, strictly for SEO purposes, will not go unnoticed by your website visitors or Google.

Making too many changes to your website, or making changes too often, will raise red flags for Google, which may see your webpage as suspicious and penalize your site.

Over time, your site visitors will also notice the changes (especially since most of them were likely made for search engines rather than for people). If this happens, visitors may find your site harder to navigate and less valuable. Some may even start to think your site is suspicious.

Balancing Risk and Reward

SEO is essential for any business to succeed. There are many risks to SEO, some of which are worth taking because they can produce favorable results for a business. Other risks can harm and hinder a company’s internet marketing strategy and online presence.

As risky as SEO is, the only thing riskier is for a company not to do any SEO at all.

Social media has its own share of risks. So before you jump into your next campaign, download and complete this checklist to ensure everything is ready to go.

Download your free social media checklist

‘Long’ Putin-Trump meeting planned at G20: Kremlin adviser

Russian President Vladimir Putin may hold a “long” meeting with US counterpart Donald Trump during the G20 summit late this month, a Kremlin adviser told state news agency RIA Novosti Friday.

A “long and substantial” meeting could take place “on the sidelines of the G20 (summit)” that starts on November 30 in Argentina, Kremlin adviser Yuri Ushakov was quoted as saying.

He added that the terms of the talks are still being decided by Washington.

Ushakov said a meeting is also planned in Paris on November 11, where both leaders will attend a ceremony marking the end of World War I.

But he said the Paris meeting will be “short”.

Putin and Trump last met in Helsinki this summer, after which the US leader was criticised at home for failing to publicly address sensitive issues with the Russian leader.

Moscow-Washington ties are under deep strain over accusations of meddling in the 2016 US presidential election. The two states are also at odds over Russian support for Bashar al-Assad’s regime in Syria’s civil war, and the conflict in Ukraine.

Pinterest Lets Businesses Promote Multiple Products in a Single Ad by @MattGSouthern

Pinterest has introduced a new ad format, which lets businesses promote up to 5 different products in a single advertisement.

Pinterest’s new “Promoted Carousel” lets advertisers include 5 images in one ad unit.

Of course, a Promoted Carousel could also be used to display 5 images of the same product, but businesses will likely get more mileage out of displaying different products.

As the company explains, this new ad format can be an effective way to raise brand awareness and drive performance goals like traffic and conversions.

“This format can present a product’s numerous features, drive additional purchases by showing multiple items in a Pin or increase awareness with a multi-image brand story.”

Each slide within a Promoted Carousel can have its own:

  • Image
  • Title
  • Description
  • Landing page
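
As a sketch only (this is not Pinterest's actual API, just the structure described above), a Promoted Carousel could be modeled as up to five cards, each carrying its own image, title, description, and landing page:

```python
from dataclasses import dataclass

@dataclass
class CarouselCard:
    image_url: str
    title: str
    description: str
    landing_page: str

@dataclass
class PromotedCarousel:
    cards: list

    def __post_init__(self):
        # The format described allows up to 5 cards per ad unit.
        if not 1 <= len(self.cards) <= 5:
            raise ValueError("a Promoted Carousel holds 1 to 5 cards")

ad = PromotedCarousel(cards=[
    CarouselCard("https://example.com/shoe.jpg", "Trail Shoe",
                 "Lightweight trail runner", "https://example.com/shoe"),
    CarouselCard("https://example.com/jacket.jpg", "Rain Jacket",
                 "Packable shell", "https://example.com/jacket"),
])
```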

The carousel gets displayed in users’ feeds just like any other pin.

Users can swipe through the carousel while it’s displayed in the feed, and tap on an individual card to enlarge the image and view the landing page.

Brands that had early access to Promoted Carousels are reporting results such as:

  • Lifts in brand awareness
  • Lifts in ad awareness and association
  • Improved ad performance
  • Three-times higher engagement rates

Promoted Carousels are now available for all business accounts in all of Pinterest’s ad markets.


E-Trade Could Get Schooled by Online Trading Academy

Summer vacation may be just around the corner, but Market Brew’s Director of Solutions uncovers a real-world scenario showcasing how the established online trading firms E-Trade and Scottrade could get schooled by the Online Trading Academy before Labor Day.


In this modern information age, many industries are drastically changing their business game plans to keep pace with an ever-changing technological landscape. Consider that the largest-growing demographic in the US workforce is the hyper-connected, tech-savvy Millennials (those born 1985-2000), expected to make up 50% of the US workforce by 2020: it’s no wonder that any business or industry hoping to avoid being rendered obsolete is quickly adopting new tools, strategies, and business models to cater to this increasingly sophisticated and self-directed group of consumers. Furthermore, in the aftermath of the 2008 financial crisis, in which Millennials watched in horror as their parents’ life savings and 401(k)s were drained overnight at the hands of unscrupulous businesses that had foolishly been handed the reins of control, there couldn’t be an industry more primed to capitalize on individuals’ growing desire to control their financial future than the very industry that precipitated the crash – trading brokerage firms and the investment banking industry.

With fears somewhat assuaged by the perception of increased safety measures under new regulation, and buoyed by the confidence of the bull market of the last couple of years, Millennials are increasingly putting their money into the markets. This time, though, they are not passive investors with one of the “Big 4” wirehouse firms like Morgan Stanley (as their parents were), but active investors with low-cost online brokerages that are producing new tools, software, and education day by day to cater to the busy, “on-the-go” lifestyles of this new global youth. As a professional trader myself, I can tell you that there has never been a better time to take advantage of this shift, which is empowering everyone from the hobbyist learning to trade through a plethora of online education resources to those on the bleeding edge programming entire trading systems.

Of course, the financial industry already knows that you are coming to join the party once again, the only question is which company or firm is going to get your business? As one might expect, the race to capture these new Millennials as clients is going to be decided in much the same way that these firms are already competing in the markets on every trade right now… it will be decided by those who have the fastest and best information and are able to employ it before anyone else.

Unlike the big wirehouse firms, these low-cost online brokerages catering to retail traders rely all the more heavily on their websites’ placement at the top of search engines to capture the attention of prospective investors and traders and steadily build their client base. And while some of these online brokerages, like E-Trade, have been around for a while and can afford to outspend their competition on ads, the coveted organic spots at the top of search engines for high-value keywords are increasingly vulnerable to being snatched up by smaller or newer companies with a knack for search marketing analytics and the determination to supplant these would-be mainstays. Of course, any serious attempt to compete with the big boys in the SERPs for this highly lucrative ‘online trading’ space will require either more resources to implement the reigning SEO “best practices” of the day more quickly and efficiently, or pivotal information that others simply do not have access to at a time when it pays to know it.

The Leading Indicator of Search Performance

In any industry where the potential profit from every new client is as high as it is in the online trading world, you can bet that the hundreds of firms popping up to cater to the growing Millennial base will make SEO a major priority, constantly changing their sites (from both a content and an architectural perspective) to make sure they are optimized to the hilt.

The major drawback with nearly all SEO tools these days is that their competitive intelligence can only report on things like traffic and keyword performance that have already happened (without even giving you the specific site changes that produced those results) – these tools provide intel only in a kind of “post-performance scorecard” fashion. Depending on the industry and the site in particular, the search engines may only crawl it every few days, or even every couple of months. This means that most tools (including Google’s own analytics) are merely showing you the END RESULTS of changes within your competitive landscape, after it’s too late to do anything about them!

It would be like a trader blindly submitting trades with access only to historical market data from two months ago and expecting a positive result. Everyone knows that’s ludicrous. And yet, SEOs have suffered with these long testing cycles and infrequent crawl times since the beginning – one of the major roadblocks to establishing immediately actionable data indicators.

But what if you had the X-ray vision of a search engine and were able to point them at any site you wanted, whenever you wanted, in order to see which site would outperform the other, and all of this weeks before Google sees it? Do you think that might give you an edge over your competition when it comes to dominating the SERPS? As any trader or financial analyst knows, to get an edge on the competition you need to be able to preempt the market, in essence, you need information that you can use as a true “leading indicator.” When you have something like this that can be proven to give a statistically positive result, then it’s lights out for the competition and the game will have changed forever. If there were a tool for SEO that could provide this kind of “insider information” that is sure to indicate a particular directional move in the market before it happens as there is in financial trading, don’t you think you would do everything you could to gain access to that kind of intel? Of course you would.

Luckily for you, and for all those newer players in the online trading industry who already know the value of technology that puts you a step ahead of the competition and lets you compete where the pocketbook alone cannot, there is a marketing analytics platform out there called Market Brew that does just that. It gets you the intel you need about your website, your competitors’ websites, and the specific factors impacting your performance at every level of the search engine environment you are competing in – up to 60 days or more before those changes even get noticed by the likes of Google. Only then does that data get passed on to the merely descriptive analytics tools in the SEO space, which rely on Google to tell you what ALREADY happened on a competitor’s website weeks or possibly even months ago!

Brew’ing Your Future

A big advantage of the Market Brew analytics platform lies in the complementary power of its transparent search engine coupled with an optimization engine. In contrast to today’s public-facing search engines, Market Brew’s search engine was designed to address some of the leading issues brands face in building their websites and content to please the search engines they hope will send them loads of free traffic.

A few of the key benefits that arise out of this starkly different approach to search engine optimization are:

  1. Immediate Causality – Brands can take a baseline snapshot of their documents using the optimization engine interface. They can then make changes to their documents and instantly request that a new snapshot be taken, which includes a recrawling and re-scoring of their documents. Changes can be immediately attributed to specific outcomes, allowing brands to establish a set of cause and effect rules that improve accuracy and shorten the life cycle of the optimization process.
  2. Algorithm Atomicity – The optimization engine can provide access to any set of scoring processes in the search engine. Dynamic run-time algorithms like local and social can be removed and basic core algorithms can be isolated. Each specific algorithm can be tracked. Consequently, brands can attribute specific penalties or errors to a given algorithm, which allows targeted changes with intended results.
  3. Known Investment – Brands can now see exactly how far two documents are apart, given a base algorithmic score, as well as additional overlays such as keyword and query score. This allows brands to efficiently spend time and direct resources to the appropriate document. They can provide “just enough” resources for one document, and then use the remaining resources on additional documents in their network. A known investment is established up-front, allowing them to assess the given ROI before proceeding with implementation.
  4. Compressed Timeframes – With on-demand features, the optimization engine enables brands to have a well-defined schedule of activity. This puts brands in control of their business and removes any risk associated with deliverables.

When you have the ability to objectively and concretely see how any changes on your site as well as all of your competitor’s sites directly impact your revenue generating potential, this effectively means that SEO can finally be placed alongside SEM as a fully trackable, consistent, and predictable approach to marketing with ROI you can have confidence in.

You are now able to go beyond a mere isolated and thus non-contextualized picture of on-page and off-page factors for a domain (a limitation shared by nearly all SEO crawler tools out there) and finally begin to understand how these on-page and off-page factors are interconnected – very practically revealing the positive and negative downstream effects from any changes you or your competitors make (in terms of their impact on any number of metrics – including ranking and revenue potential).

It just so happens that the most valuable optimization I could focus on if I were working for the University of Phoenix is to improve my homepage’s performance for the keyword “online colleges.”

Market Brew allows you to visualize the butterfly effect that each optimization has on the others.

If you want to “Brew” a fresh new future for your business, then join in the fun and watch as I walk you through the Market Brew world of predictive analytics for search engine optimization, and see how just one of these benefits – the known investment – could help the online trading education franchise Online Trading Academy get a major leg up on established trading firms like E-Trade and Scottrade in the hyper-competitive world of low-cost online trading.

The Competition at a Glance

We have covered some of the introductory steps to setting up a competitive market analysis inside the Market Brew system in other posts, so I won’t reiterate them here. It is worth mentioning, however, that to kick off any analysis we must first input our list of keywords (this essentially defines the totality of the search market we are choosing to shine a light on) as well as the domains that we will be crawling and analyzing.

For the sake of this case study, I used the trusty Google Keyword Planner to find some relevant terms and checked their search volume, bid prices, and competition levels. The most relevant and highly competitive term I found that I thought would make for an adequate demonstration was “trading online”.

With a high competition score of 0.91, a suggested PPC bid of $16.54, and decent search volume, ‘trading online’ is certainly worthy of our optimization efforts.

Taking a quick glance at Google, we can see how the competition for this term has played out so far for Trading Academy.

First page of Google results for ‘trading online’ reveals the top contenders.

Right away, disregarding the ads at the top of the page, we can see that the top 3 results are in fact the domains of three brands we will want to track (as opposed to articles or review sites that we wouldn’t bother to load into the engine). After loading in the keyword(s) and these 3 domains (plus a couple of others for comparison), I kicked off a crawl and waited for the analysis to finish (this typically takes anywhere from 1-3 hours, depending on the structure of the site). Since I already know which keyword market I am going to optimize for, I like to jump straight into the Market Brew search results to see just how far apart these 3 competitors really are. Is it a 3-way race for first? Or is the leader already dominating, leaving a race for second? Let’s see…

The Transparent Search Engine Reveals a More Telling Story

The Market Brew Search Results reveal that the race is really for the second place spot.

As we can see here, the same domain is pulling up the top spot in the Market Brew search engine just as it is in Google. With the search engine metrics revealed (one of the patented features of the Market Brew engine), we can now see that E-Trade and the Trading Academy are not really all that close to the top spot but are instead in a nail-biting race for second. With query scores this close, the two likely exchange spots in the SERPs depending on when you happen to search. With such a close race in hand, let’s see what we can find that might give a nudge to the underdog and only non-brokerage in the group – the Online Trading Academy.

To get a better understanding of what these final aggregate query scores are being driven by, we can hover over the percentage scores and see the breakdown.

The Query Score Breakdowns reveal what categories a URL is outperforming or underperforming in contrast to the competition.

The query scores are the combination of other categorical scores (Semantic and Net Total Link Flow) that measure a URL’s performance from an on-page and off-page perspective. By looking at these query score breakdowns, we can readily identify whether a particular optimization is needed to boost the semantic score of the page relative to that keyword, or whether a more general problem of link flow is diminishing the page’s potential ranking power. In this particular case, comparing the two categorical scores for E-Trade and the Trading Academy, we see that the Trading Academy is doing great when it comes to optimizing its content for this keyword; unfortunately, it is being hampered by a lack of link flow to this page (and likely to any other page that might be contextually relevant to this query).

It is truly powerful to finally be able to see the distances between results and peer into the component scores of any particular search result like this (and to export the scores to Excel for further analysis). But if I weren’t already determined to optimize for this keyword and wasn’t sure what my optimization priorities should be, manually typing in searches and comparing scores to find the best opportunity for a high-ROI advancement would be tedious and would likely end in a less-than-optimal choice of what to optimize, further wasting my time. Luckily, I can turn to the Optimization Engine within the platform and let it cut through the mass of data for me, cherry-picking the best opportunities to make serious revenue gains for the least amount of work.

What Does an Optimization Engine Do?

The Top Optimization Simulations screen gives you immediate awareness of the most valuable optimization tasks a digital marketing team should be working on, so that no time is wasted on pages or keywords that are unlikely to drive as much incremental revenue into the business.

We call this screen the Top Optimization Simulations because that’s exactly what the engine is doing to produce these scores – it’s simulating the potential cost-benefit of every page within a site moving up to any other spot in the query rankings for all the keywords within the Analysis Group that you have chosen to track / analyze. To give you a better understanding of what that calculation looks like:

Market Brew simulates all ranking improvement scenarios from the current spot all the way to the top for every keyword and determines which keyword and page combination present the best low-hanging fruit to focus one’s resources on.
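
A toy version of that cost-benefit simulation captures the idea: project the traffic gained by moving up, discount it by the effort required to get there, and pick the highest-scoring keyword/page combination. All traffic figures, effort estimates, and page names below are invented for illustration; Market Brew's actual scoring is more involved.

```python
def optimization_score(current_rank, target_rank, traffic_at, effort):
    """Toy cost-benefit: projected traffic gain from moving up a ranking
    position, discounted by the estimated effort to get there."""
    gain = traffic_at[target_rank] - traffic_at[current_rank]
    return gain / effort

# Hypothetical monthly clicks by ranking position for one keyword market.
traffic_at = {1: 1000, 2: 450, 3: 280, 4: 180, 5: 120}

scenarios = {
    ("trading online", "/learn-to-trade"): optimization_score(3, 2, traffic_at, effort=4.0),
    ("trading online", "/home"):           optimization_score(5, 4, traffic_at, effort=1.0),
    ("day trading",    "/courses"):        optimization_score(2, 1, traffic_at, effort=12.0),
}
best = max(scenarios, key=scenarios.get)
```

Note how the "low-hanging fruit" wins: moving from #5 to #4 yields a smaller raw gain than chasing the #1 spot, but it costs so little effort that it scores highest.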

The Optimization Engine has already done the millions of calculations necessary to determine that, based upon the keywords I have entered into the engine, the Online Trading Academy would be wise to focus its efforts first and foremost on the keyword “trading online”, and that the page most likely to rank highest for it is the URL whose keyword-and-page combination received the highest optimization score of 99.67. (Note: these optimization scores – and thus the top optimizations the engine recommends – could change once we input overrides that help it more precisely determine which keywords historically drive the most revenue for a business, and which pages tend to be more important from a conversion and revenue standpoint than others. Both allow a brand to bring its own data to bear and hone the optimizations based upon its specific historical performance.)

By clicking on the optimization score of 99.67 at the top of the list in the interface, we are taken to another screen that gives us a bit more information regarding this optimization.

Optimization detail pages give the user a high-level view of the types of changes that should be made and the relative value of each change in helping to achieve an optimization result.

This details page tells me essentially the same thing I uncovered earlier when I took a look at the Query Score Breakdowns, though in a much quicker and more readily identifiable manner, making it a no-brainer for SEO teams to stay on task and keep priorities in line. Rather than comparing query scores, I can immediately see from the pie chart that while the page does stand to gain a little by further optimizing some of the semantic factors, the real focus should be on increasing the Link Flow to this page.

OK, good to know, but how do I do that? Spend three months and XXX dollars on a link building campaign? I could, but as anyone in the SEO world knows, that is a lot of work and takes forever. Might there be some things I can do within the confines of my own site to change the link flow scores on this page? In fact, there are!

The Magic of Link Flow Distribution

If you will notice in the optimization details snapshot above, the engine has identified 2 options for on-site changes that could help to boost that link flow score:

  1. Eliminating the Algorithmic Penalties
  2. Reallocating link flow to the page using the Link Flow Distribution tool

Though the engine tells me precisely how much I could increase my link flow score by completely eliminating the algorithmic penalties occurring on that URL, I am going to focus on the second option here: Link Flow Distribution. Eliminating penalties is often fairly straightforward, thanks to Market Brew's tracking of numerous families of important algorithms that nearly all modern search engines take into account, but the improvement to my score would be fractional at best (since these penalties suppress my gross score by only a fractional percentage). A reallocation of link flow, on the other hand, boosts the gross amount of link flow to the page, often improving the link flow score by a factor of 4 or more. Link Flow Distribution is the clear first priority here.

Similar to the way Google and other search engines calculate a metric like PageRank, the Market Brew Link Flow Distribution tool runs a recursive calculation to determine how link flow is divvied up once it hits any page on the domain, regardless of which page receives link flow from an external source. (The calculation could theoretically run indefinitely, gaining marginal precision the longer it runs.) This means the tool can disregard external incoming link flow and instead determine how the internal linking architecture alone divides up the link flow points. That is incredibly useful for identifying structural and linking issues that promote internal pages that were never intended to be landing pages for organic search traffic. Link flow wasted on those pages can be redirected to the pages that are actually in a position to leapfrog a competitor in the SERPs and bring additional revenue to the company.
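Market Brew's exact calculation is proprietary, but the general idea can be illustrated with a generic PageRank-style power iteration restricted to internal links. Everything below is a simplified sketch under that assumption, not the platform's real code:

```python
# A generic PageRank-style power iteration over internal links only --
# one way "link flow" might be divvied up by a site's internal
# architecture, ignoring where external link flow enters the site.

def internal_link_flow(links, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}; returns {page: share of flow}."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    flow = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # Each page splits its flow evenly among its outlinks.
                share = damping * flow[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # Dangling pages spread their flow evenly across the site.
                for p in pages:
                    new[p] += damping * flow[page] / n
        flow = new
    return flow
```

Sorting the resulting shares produces exactly the kind of "most internally promoted pages" list discussed below: pages that the linking architecture favors, whether or not they deserve it.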

The top ten most internally promoted pages based on the site's current linking architecture.

Take notice of where the homepage is in this list, denoted by the single "/" in the path. Not only is it not at the top where one might expect, it's not even in the top five; instead it crawls its way into the 8th spot. Another thing you might notice is the absence of the power trading workshop page from this list. Although it is one of the key landing pages and receives decent link flow from its inbound links from other domains, once that link flow arrives, the site is clearly choosing to send that ranking power elsewhere, diluting this landing page's ability to rank as high as it should.

The page we want to optimize is currently receiving only 0.18% of the total Link Flow available within the site, making it the 164th highest page in terms of its Link Flow Distribution.

After scrolling down a ways, I see the page in question pulling up the 164th spot with a paltry amount of link flow being allocated. Clicking on the URL hyperlink takes me to the URL specific metrics within the system so that I can begin the process of digging into what might be causing this misallocation of link flow.

The Link Listing tool gives an exhaustive and detailed list of the link relationships for this URL, ordered by the amount of link flow shared by each link (in this case we are looking at the top internal incoming links for the power trading workshop page).

Browsing through this list I can quickly see that aside from a powerful image link in the main content of the homepage and a couple text links on the education page, the amount of link flow being shared with the power trading workshop page is nowhere near the amount it is sending out to other pages in the site (see below).

The Link Listing tool showing specific amounts of link flow share distributed by each link.

The reality: the net outflow of link flow for this page is far greater than the net inflow, which adds up to a major Link Flow deficit and a page vastly underperforming in the SERPs!

With a simple and straightforward reallocation or addition of inbound links, and by changing link factors to sculpt the amount of Net Link Flow Share flowing through links that can't be redirected, Online Trading Academy could easily get a 10x boost in its link flow score for this page, surpass E-Trade in the rankings, and possibly even pass Scottrade as well. These are relatively quick and easy changes that don't require time-consuming and costly link building campaigns. Considering the search volume of this keyword, its bid range, and the historical click-through rates for the top 10 spots in the SERPs, just jumping from the 3rd spot into 2nd could be worth at least an incremental $10K worth of PPC clicks every month! And that's just the boost from one keyword. Add the fact that this link flow boost would categorically improve this page's chance of ranking higher across ALL queries, and you really begin to see what focused and informed action can do in the SEO world.
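The back-of-envelope math behind that "$10K worth of PPC clicks" figure can be sketched as follows. All the numbers here are illustrative assumptions, not real keyword data:

```python
# Back-of-envelope estimate of what a one-position jump is worth,
# valued at the equivalent PPC cost of the extra organic clicks.
# Search volume, CTRs, and bid are illustrative assumptions.

def incremental_ppc_value(monthly_searches, ctr_current, ctr_target, cpc):
    extra_clicks = monthly_searches * (ctr_target - ctr_current)
    return extra_clicks * cpc

# e.g. 50,000 searches/month, CTR of 10% at #3 vs. 15% at #2, $4.00 bid:
value = incremental_ppc_value(50_000, 0.10, 0.15, 4.00)  # roughly $10,000/month
```

Under these assumed inputs, the jump from position 3 to position 2 is worth roughly $10,000 per month in equivalent PPC spend, in line with the figure above.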


At the end of the day, a few hours' worth of work, with the help of an integrated search engine and analytics platform like Market Brew, could move the needle enough to account for the salaries of a whole team of digital marketers... now that's what I call ROI!

About The Author

Giordan heads up the solutions engineering team at Market Brew. He is an honors graduate in Computer Science and Information Technology from Purdue University where he was a multiple NCAA Academic and Athletic All-American as a captain of the swim team. As a professional athlete on the USA National Swim Team, Giordan was a member of the silver medal winning relay at the 2006 World Championships in Shanghai, China and a finalist at the 2008 Olympic Trials. For more information about Market Brew, visit

How Market Brew Changed SEO

This article was reposted from Market Brew’s blog . . .


Over the past few years, we’ve received a lot of questions about how Market Brew was able to flourish in a space where most SEO software platforms took a hit to their accuracy and stability.

The short answer is due to Market Brew’s unique approach. Back in 2006, when the idea of a transparent search engine model was being formulated (and patented) by Market Brew’s founders, most of the SEO tools space was starting by scraping, or crawling Google search results on a daily basis, and indexing that data to be retrievable by URL or keyword lookup.

Most of these approaches were heavily dependent on Google's data. This data was fairly transparent back then, since Google's mechanism was pretty easy to understand. But as the data and its relationships grew more complex, these approaches failed.

Market Brew, on the other hand, started from the bottom up. The hypothesis was that, eventually, Google’s search engine would act as a very complicated black box, and using that as an input to any SEO tool would be fatal.

Today, Google couldn’t be more of a black box, and Market Brew couldn’t be sitting any prettier.

The approach, described below, was the first to unlock the black box, and it will inevitably be seen as the only approach that lets SEOs truly classify how Google's search engine changes on a day-to-day and month-to-month basis.

The Early Beginnings

The founders actually started out doing SEO like everyone else.

Back in 2006, they found early success by automating a lot of their internal SEO optimizations. After scaling their local successes nationwide, these large-scale optimizations started to become noticed by Google’s search engineers.

Over the course of the next year, they played a high-stakes game of “cat and mouse”, and each successive loophole that they exploited in Google’s algorithms was closed up. By late 2007, they had patented a “navigable website analysis engine”.

After many more years of research and analyzing hundreds of thousands of sites, Market Brew’s standard model was born. This, in turn, led to many other discoveries and inventions like Market Brew’s self-calibrating search engine model.

2008 – The founders were filling their closets with computers.

The first patent, filed in 2007, was the first of many that outlined the founders' vision for the future of SEO: eventually, search engines would get so complex that the ability to model this black box would be very important and useful.

The founders were banking on Google eventually removing or obfuscating much of the data that they were currently sharing with the SEO world. It was their own experience, which had been on the cutting edge of SEO optimization, that led to this conclusion.

Holding Their Ground

They were aliens in the SEO world. Who the heck needed this? In reality, all you needed at that time was a good website auditor. Google wasn’t that hard to figure out.

By 2009, hundreds of SEO tools had sprung up, mostly based on scraping Google data and regurgitating that data in more and more unique ways. The main players at this time were all going vertical: all-in-one solutions that tied together different APIs, mostly from Google.
2010 – Filling closets was so old school. Now the founders were filling entire rooms full of computers. All of this would soon disappear in a transition to the cloud.

The founders even considered a business model where they attempted to sell data streams to many of these all-in-one platforms. In the end, most of the platforms simply didn’t need that level of data yet. A good stream of backlinks, rank tracking, content analysis, and the website audit was all you needed.

Things changed very slowly at first. I recall setting up next to Blekko at Pubcon one year; they had just announced a feature where users could "explore their search engine" for a small fee.

Finally, a well-backed Google challenger had made the same realization that the founders of Market Brew had: SEOs would need to have outside help from a search engine of some sort.

Google Firewalls Its Black Box

Sure enough, Google began a slow march towards data obfuscation.

First, Google's backlink operator stopped working. This had been a treasure trove of data that gave website owners the ability to understand how backlinks factored into their final search rankings.

Then “(Not Provided)” started showing up in everyone’s analytics package. For years, website owners depended on understanding which keywords were driving traffic to their site. With this information, they could essentially cross-correlate search results to give them a clear picture of how many of Google’s algorithms worked.

After killing off crucial keyword data, Google cut off or obfuscated the Pay-Per-Click API to its Keyword Planner, leaving thousands of agencies without the ability to know which keywords to target.

By 2014, as we all know, Matt Cutts had departed as Google's official liaison to the SEO community.

It was official: the data and explanations had dried up. Subsequently, throughout this time many of the all-in-one platforms began to get burned. The intensity of this downturn, of course, was determined by how many data streams a vendor relied on Google for.

2011 – after shifting to the cloud.

Because Market Brew’s approach didn’t rely on scraping (using) any of Google’s data, they avoided a downturn from this anti-transparency march — something that couldn’t be said about the rest of the field.

Conventional SEO Tools Start To Fail

In late 2012, Google started quietly introducing Artificial Intelligence into its core search algorithms, and the seemingly direct relationships between inputs and outputs of its black box began to get really fuzzy.

The majority of SEO tools at this time modeled Google as a semi-transparent box that, with enough data, could be explained in a straightforward manner to its users. Unfortunately, what remaining data they relied on had dried up. And now, search engine mechanisms were as confusing as ever, even to Google engineers themselves.

There are two major things that happened to Google’s search results that represented existential threats to the way SEO tools were being used.

  1. The Caffeine Update: this made the information updated in its search results (things like META Title, Description, and matched snippets in the HTML itself) asynchronous with the scoring updates. For instance, changes that you had just made to your web page would be shown in the results, but then a few months later your rankings would change. It was now impossible to attribute rankings changes to specific optimizations, simply by relying on what was being shown on the search results.
  2. Dynamic Algorithmic Weightings: up to this point, Google had manually curated the weightings of its various algorithmic rules. Because this was a manual process, there were two advantages: first, these weightings didn’t change very often; and second, these weightings weren’t very different from one search result to the next. After the introduction of A.I., the weightings changed as much as daily, and each search result had its own set of weightings.

Google search results became incredibly hard to decipher at this point. The major tools that SEOs were using could not deliver reliable results, and all of a sudden approaching a search engine by reading its final output seemed silly.

The Unique Approach Gains Momentum

By 2013, the founders’ unique approach to SEO began to shine. With the lack of quality data from Google, most SEO ranking tools became lagging indicators with very little distinction on why those ranking shifts happened.

On the contrary, the search engine model approach was ready for primetime. In early 2013, the founders sat down in their Palo Alto, CA office and brainstormed on how they could take advantage of Google’s faucet of data being shut off.

They had successfully built and demonstrated a search engine model with families of algorithms that they would fine-tune (manually) whenever a major algorithmic shift would occur. But now, Google was threatening to change the mixture of these algorithms much more rapidly and without public fanfare (and documentation).

Large brands began to adopt the technology at a rapid pace. In late 2013, after months of R&D, the founders discovered one of the final missing pieces in the approach, one that would end up revolutionizing the way teams did SEO.

Market Brew realized the search engine model approach still had the same critical flaw of the conventional SEO tools: how do you deal with Google always changing its algorithms?

Because of their bottom up approach, they had a major advantage over everyone else. They could fine tune the model at a much more granular level. More control over inputs meant that their models could easily be trained to find the right mixture of algorithms for each environment.

There was only one problem: using a brute force approach, trying to simulate every possible permutation of algorithmic weightings, even on the fastest computers today, would take on the order of thousands of years to find a stable output that behaved like the real search results.

Fortunately, after thousands of hours and hundreds of thousands of dollars of R&D, they finally had a breakthrough: they had figured out a way to machine-learn the behavior and characteristics of any target search result in a matter of minutes. To do this, they borrowed a population-based Artificial Intelligence technique called Particle Swarm Optimization, a close cousin of genetic algorithms.
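A textbook Particle Swarm Optimization loop looks roughly like the sketch below. This is the generic technique, not Market Brew's implementation: each particle is a candidate vector of algorithm weightings, and the loss function (supplied by the caller) measures how far the model's output diverges from the target search results.

```python
# Minimal textbook Particle Swarm Optimization: a swarm of candidate
# weighting vectors drifts toward the best-known positions until the
# loss (divergence from the target search results) is minimized.
import random

def pso(loss, dims, particles=20, steps=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize loss(vector) over `dims` dimensions."""
    pos = [[random.uniform(0, 1) for _ in range(dims)] for _ in range(particles)]
    vel = [[0.0] * dims for _ in range(particles)]
    pbest = [p[:] for p in pos]                    # each particle's best
    pbest_loss = [loss(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_loss[i])
    gbest, gbest_loss = pbest[g][:], pbest_loss[g]  # swarm's best so far
    for _ in range(steps):
        for i in range(particles):
            for d in range(dims):
                # Velocity blends inertia, pull toward the particle's own
                # best position, and pull toward the swarm's best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            current = loss(pos[i])
            if current < pbest_loss[i]:
                pbest[i], pbest_loss[i] = pos[i][:], current
                if current < gbest_loss:
                    gbest, gbest_loss = pos[i][:], current
    return gbest
```

The appeal over brute force is that the swarm explores the weighting space collectively, converging on a stable mixture in minutes rather than enumerating every permutation.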

By the end of 2013, Market Brew was voted #1 out of 60 Silicon Valley Big Data Startups by judges from Oracle, Draper Fisher Jurvetson and Xignite.

Unlocking The Black Box

The main impediment to the search engine model approach was finding the correct combination of inputs into the model. After discovering what population-based techniques like Particle Swarm Optimization could do for this process, they were able to create a simple mechanism that allowed Market Brew users to take the generic search engine and transform it into a Google-like model of their choice in a matter of minutes.

Today, Market Brew clients do this many times for all kinds of models (mobile vs. desktop, local vs. national, etc.) and even use this process to track the changes in behavior and characteristics of Google when it is undergoing an algorithmic “shift”.

Most algorithmic changes today aren't new algorithms; rather, they are refinements to the weightings of already existing ones. From the Market Brew side of things, this translates into users re-calibrating their models whenever the models diverge from the real thing. Not only do they get a newly calibrated model, they also get an easy way to do a before/after comparison of the model settings.

Because Market Brew stores every data point in the model from the moment users start using the system, the dynamic inputs on the model can be compared across newly calibrated versions. This means that Market Brew turns into machine-learning the machine learner, so to speak. Google puts more emphasis on backlinks? Your search engine model re-calibration will indicate that.
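That before/after comparison can be sketched as a simple diff of two calibrated weighting vectors. The algorithm family names below are illustrative assumptions, not Market Brew's actual model settings:

```python
# Illustrative sketch of comparing two model calibrations: report which
# algorithm families gained or lost emphasis, largest shifts first.
# Family names and weights are made-up examples.

def compare_calibrations(before, after):
    """before/after: {algorithm_family: weight}; returns sorted deltas."""
    deltas = {k: after[k] - before.get(k, 0.0) for k in after}
    return sorted(deltas.items(), key=lambda kv: abs(kv[1]), reverse=True)

# If Google starts emphasizing backlinks, the re-calibrated model shows it:
shifts = compare_calibrations(
    {"backlinks": 0.30, "semantics": 0.45, "freshness": 0.25},
    {"backlinks": 0.42, "semantics": 0.38, "freshness": 0.20},
)
```

Here the largest delta surfaces first, so a jump in the backlinks weighting is immediately visible at the top of the list.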

Conceptual diagram of Particle Swarm Optimization in action with a single global minimum. Image credit: Jonathan Becker.

By 2016, the word was out that Market Brew had won its founders' decade-long bet. In early 2015, Search Engine Journal gave me the opportunity to describe this vision, and industry outlets like TechCrunch and Search Engine Land began interviewing me as an expert in artificial intelligence for SEO.

Thanks, Grandfather

It’s inevitable that engineers and data scientists ask: given a black box, how would it be possible to build a search engine model with thousands of inputs? How in the world would you know which algorithmic families to model?

Market Brew got really lucky. They’ve rarely used Google’s “I’m feeling lucky” button, but this is one time they used it.

If you remember, they started out in 2007. Back then, Google was pretty simple, and semi-transparent. Most of the new algorithms that were being introduced to its search engine were well documented.

Their model paralleled this complexity. At first, they just had a few major core algorithms in the model. This worked really well. They ran nightly QA regression tests across all of the data to make sure the models were stable representations of the real thing.

As per usual back in the day, Google would add a new algorithm. The regression tests would sound an alarm: something was not right. The models, no matter what mixture of algorithms, just couldn’t correlate against the real-world search results.

So they would start trying new algorithms in the model. One by one, they eliminated the possibilities, until they found an algorithm that perfectly brought the generic model back into stable correlation with real world search results.
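A regression test of the kind described above might boil down to a rank correlation between the model's ordering and the real search results. One standard choice, shown here as an illustrative sketch rather than Market Brew's actual QA code, is the Spearman coefficient, where a value near 1.0 means the model is a stable mirror of the real thing:

```python
# Spearman rank correlation between the model's ranked URLs and the
# real search results' ranked URLs. 1.0 = identical order, -1.0 = reversed.

def spearman(model_ranks, real_ranks):
    """Both args: lists of the same URLs, each in ranked order."""
    n = len(model_ranks)
    real_pos = {url: i for i, url in enumerate(real_ranks)}
    d2 = sum((i - real_pos[url]) ** 2 for i, url in enumerate(model_ranks))
    return 1 - (6 * d2) / (n * (n ** 2 - 1))
```

A nightly run of such a check against every tracked query would sound exactly the kind of alarm described above whenever no mixture of modeled algorithms could reproduce the real-world ordering.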

They continued this iterative tradition for almost five years, until Google stopped adding major algorithms to its core engine. By 2012, most Google updates were changes to the mixture of algorithms rather than additions of new ones.

Because they were able to incrementally fill one gap in the model at a time, they were able to, with great confidence, identify all the major pillars in Google’s modern search engine. Here is more information about artificial intelligence for search engine optimization.

About The Author

Scott Stouffer is a Co-Founder and CTO of Market Brew, an enterprise-grade technology that enables SEO teams to build models of search engines which, like a Google simulator of sorts, allow them to make changes to their websites and predict how their Google rankings will react 60 days from now. Mr. Stouffer is a graduate of Carnegie Mellon University and holds an M.S. in both Computer and Electrical Engineering. He has been behind the wave of technology at Market Brew. For more information about Market Brew, visit

Marketers must capitalize on the new wave of email innovation

The latest round of acquisitions – Adobe/Marketo, Salesforce/Rebel and Twilio/SendGrid – is exciting because it means email-led innovation is entering a new era. It's proof positive that email is not just surviving but thriving.

For email service providers, it's a warning not to let up on innovation and improvement. As a marketer, you should not be content to coast on past success, either. These acquisitions and innovations signal that email is an investment-worthy channel, with cool things happening to make email an even more valuable driver for revenue and engagement.

What are you — and your ESP — doing to capitalize on the changes happening all around you?

First, how we got here

The last two decades saw a steep innovation arc, driven by leading ESPs that invested heavily in platform growth and technology. Those investments spurred other participants to invest in innovation to stay competitive. Marketing automation, personalization, reporting and other critical areas benefited from this mass concentration of brainpower.

These acquisitions and investment rounds shaped the email landscape over the last 18 years:

Timeline of email technology acquisitions


Although these acquisitions signaled faith and innovation in email, they also had an unintended effect after 2013. Innovation stalled because companies no longer had competitive giants like Responsys or Exact Target to push them along.

Innovation rides a second wave

That leadership vacuum opened the door to smaller, more nimble companies outside the ESP space. Their innovative initiatives pushed companies along the innovation arc instead of leaving them to be pulled by the established players.

Marketing and advertising technology providers, digital agencies and ancillary SaaS programs now challenge every aspect of the email space.

Additionally, ESPs and other industry players are growing exponentially, faster than at any other time in our history — even including the boom-boom years around the turn of the 21st Century.

Now, it’s all about the data

Every brand and technology provider wants email at the center of its martech and ad tech strategies. Email itself has grown far beyond its original construct – to send a message from Point A to Point B. If that’s how you still see your email program, you need to rethink your concept of email because it doesn’t have a place on the innovation arc anymore.

Probably the most important part of this is the fact that email kills cookies. The primary email address beats cookies as the consumer identifier across channels. It’s the supreme identifier and the one you need to persuade your customers to give you. (Key to that persuasion? Building trust and delivering on your promises!)

The primary email address is the digital equivalent of a Social Security Number, through which innovation is unlocked and empowered. That rebranding of email is where we need to focus.
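One common industry pattern behind this "supreme identifier" idea is normalizing and hashing the primary email address so it can be matched across channels without passing the raw address around. The sketch below is illustrative of that general practice, not any particular vendor's implementation:

```python
# Illustrative sketch: normalize and hash an email address so the same
# person can be matched across channels regardless of formatting.
import hashlib

def email_identifier(email):
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Different formatting, same underlying identifier:
same_person = (email_identifier(" Jane.Doe@Example.com ")
               == email_identifier("jane.doe@example.com"))
```

Because the hash is deterministic, any two systems that apply the same normalization produce the same identifier for the same inbox, which is what makes the email address viable as a cross-channel key.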

Innovation is ready to ramp up again

With the recent Marketo, Rebel and SendGrid acquisitions, we see the new era for the maturation of email because each of these powerhouse companies has taken email in a new and exciting direction.

These acquisitions are about more than the eye-popping sums their buyers paid for them. These companies have shown us what they can do with their data and their capabilities.

As a group, these acquisitions say to the rest of us, “Email’s value goes way beyond sending a message.”

Here’s my short list of companies that are stretching email in new directions that we couldn’t have anticipated 10 years ago:

  • Monetization/Onboarding: LiveRamp, LiveIntent
  • Interactive messaging: Movable Ink, Liveclicker, (new player)
  • Machine Learning and Natural Language Processing: Phrasee, Persado
  • Email design and rendering: Litmus, Email on Acid
  • Retargeting/Remarketing: Traverse Data (new-ish player)
  • Validation: BriteVerify (Validity), Kickbox, Webbula

I mentioned earlier that this expanding innovation arc has implications for ESPs, and now you can see why.

ESPs that are content to coast, tweaking features under the guise of “improving on a theme,” or thinking it’s just a matter of time before somebody waves a big check in their direction because they’re “special,” will get left behind.

You know what happens to people who fall behind the pack, right? They’re the first ones to get eaten.

I’m not talking about orchestrating omnichannel marketing, the Never-Never Land, all-in-one fantasy that few if any providers have truly achieved in a frictionless way. Rather, I mean understanding what’s new and coming up with email and embracing it (and selling your executives on the need to dedicate time, money and people to make innovation happen).

I also don’t mean innovation for the sake of innovating, which leads to “shiny toy” syndrome. We know how futile and wasteful that can be.

Rather, think of innovation in terms of what it can do for your customers. How are you enabling them to push their limits, and what new tech developments could help you better serve them?

Bringing it home to those that matter

As you plan for innovation in 2019, whether you are looking to grow (customers, sales, revenue) incrementally or set off an earth-shattering kaboom, take stock of the technology you already use, what your competition is doing, what the top brands in your market are doing and the innovations you wish you could use.

SendGrid, Rebel and Marketo share a common characteristic: They grew beyond the classic definition of email. They pushed email beyond the limits of conventional wisdom. Instead, they expanded and redefined email, its capabilities and how people could use it to make their lives better. That’s what may have driven their value.

  • SendGrid appealed to large-volume senders like Uber, Spotify and Glassdoor, three industry-disrupting companies, none of which needed a traditional ESP.
  • Rebel (formerly RebelMail) developed a technology that allows consumers to buy items in an email without leaving the inbox.
  • Marketo improved and grew marketing automation in ways that accelerated B2B email communication.

Twilio, Salesforce and Adobe might have bought these companies, in part, because they saw the expansion of email as critical to driving business. They saw these companies push innovation, rather than be pulled by it.

How can you translate this to your own business? Focus your energy on the push of your own innovation and not allowing outside forces to pull it away.

You know your customers better than just about anybody else in your company. What do they want? What will they need, even if they don’t realize it yet?

Your worst-case scenario is to stall out, to wait until the time feels right to think about innovation. As you put your 2019 plans together, think about what you need to do to stretch your use of email so that it works better for your team, for your company and for your customers.

Then figure out how to make it happen.

This story first appeared on MarTech Today. For more on marketing technology, click here.

Opinions expressed in this article are those of the guest author and not necessarily Marketing Land. Staff authors are listed here.