FTC slaps Facebook with long-awaited $5 billion fine for privacy violations

Anticipated for months, the Federal Trade Commission (FTC) voted today to fine Facebook a record $5 billion for apparent violations of a 2011 consent decree that required the company to better protect user privacy, according to reports in the Wall Street Journal, New York Times and AP. The commission voted 3-2, with the two Democratic commissioners reportedly voting against the deal, according to the New York Times.

Fines and supervision. In addition to the fine, “Facebook agreed to more comprehensive oversight of how it handles user data . . . But none of the conditions in the settlement will restrict Facebook’s ability to collect and share data with third parties,” the Times says. “And that decision appeared to split the five-member commission. The two Democrats who voted against the deal sought stricter limits on the company, the people [familiar with the proceeding] said.”

Triggered by the Cambridge Analytica scandal, the FTC investigated Facebook for more than a year before deciding to impose the record fine. And the massive penalty may well signal a new, more aggressive enforcement posture by federal regulators toward technology companies in the absence of federal privacy legislation. The full terms of the settlement, which reportedly include ongoing oversight, will likely be disclosed early next week.

The largest FTC fine prior to this was a roughly $22 million penalty against Google in 2012 for circumventing the default third-party cookie blocking in the mobile Safari browser (“Cookiegate”).

They saw it coming. Facebook had been expecting the fine and prepared shareholders during its most recent quarterly earnings release. The company also set aside $5 billion in advance to pay for it. Accordingly, the penalty has probably already been factored into Facebook’s stock price. But even $5 billion is not that significant for a company with revenues of more than $55 billion in 2018.

For its part, Google has received several multi-billion dollar fines in Europe for various antitrust violations. Despite this, Google has emerged almost completely unscathed. Similarly, Facebook is unlikely to be materially impacted by this fine.

In the wake of Cambridge Analytica and other data-related controversies, Facebook has pivoted to more forcefully embrace privacy and regulation.

Why we should care. Before we can do any assessment of the marketing-related fallout from this settlement, we’ll need to see the formal terms of the agreement. However, as the New York Times story suggests, none of Facebook’s core ad capabilities appear to have been compromised.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes a personal blog, Screenwerk, about connecting the dots between digital media and real-world consumer behavior. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

YouTube Introduces New Ways for Channels to Make Money


YouTube will soon be rolling out more ways for content creators to earn money directly from viewers.

These new features were announced this week at VidCon, which is celebrating its 10th anniversary.

Super Stickers

Building on Super Chats, a feature introduced last year, YouTube is introducing Super Stickers.

Super Chats let viewers buy messages that stand out in a live stream, while Super Stickers will let viewers send animated stickers for a fee.


Creators earn a cut of the revenue YouTube makes from each Super Chat or Super Sticker purchase. Some streams are earning over $400 per minute, the company says.

For nearly 20,000 channels, super chats are now their number one revenue stream. Perhaps that number will grow even more with the introduction of stickers.

Channel Membership Tiers

YouTube is building off of another existing feature with the addition of membership tiers.

Previously, viewers could pay a flat monthly fee of $4.99 to get access to exclusive content.

Now, creators can introduce tier-based pricing, which would allow them to offer more content for a greater monthly fee.

Creators can set up to five different price points for channel memberships, with varying perks at each level.

This now puts YouTube in competition with Patreon, which is a platform used by many YouTubers to earn revenue from dedicated fans.

Patreon has always allowed creators to set up different membership tiers. Now that YouTube offers the same capabilities, it will be interesting to see how this affects Patreon’s business.

More Merch Partners

Creators now have more opportunities to sell merchandise with the addition of five new partners.

Joining Teespring are Crowdmade, DFTBA, Fanjoy, Represent, and Rooster Teeth.

Eligible creators who distribute merchandise through one of these companies can now take advantage of YouTube’s ‘merch shelf’ feature.

YouTube says thousands of channels have more than doubled their total revenue by using these new tools in addition to advertising.

Why an SEO should lead your website migration


Change is a natural part of a business, particularly when it comes to your digital presence.

The need to rebrand, switch CMS (content management system), consolidate your resources, or revamp your website’s architecture and user journey is ultimately inevitable. And whatever the goal may be, it is not uncommon for all such major initiatives to fall under the umbrella of a contemporary digital marketer.

How does Google feel about changes?

One thing to keep in mind, however, is Google’s tendency to be less than accommodating towards major website changes, especially URL changes. And who can blame them? Whilst Google’s algorithm may be able to detect semantic differences between websites, it’s somewhat unrealistic to expect it to also realize that the similarities between store.hmv.com and hmv.com mean they’re both the same brand.

Without acknowledging this, many domain changes result in staggering losses of traffic and rankings, and suddenly the best-known brand in an industry becomes non-existent within Google’s universe. It is therefore imperative to ensure the changes you’re making can be correctly understood by Google.

How to understand Google

Expecting a lone digital marketer to be a jack of all channels is quite unrealistic. But luckily, you don’t need to be. There’s a whole industry of people dedicating their days to figuring out how to think exactly like Google, and they can help you avoid the risk of decimating your hard-earned keyword rankings (unless you’re using black hat tactics, in which case those rankings aren’t very hard-earned after all). This industry is SEO.

Three pillars of SEO

Before we dive into the value of SEO, here’s a quick summary of the three key pillars:

  • Accessibility: The technical workings of the site. This includes everything Googlebot takes into account when understanding your site’s code – basically, all the tags and developer language that tell the crawlers how the site should be interpreted.
  • Relevance: Content your visitors and Googlebot came for, including all of the text and metadata on your pages, blog posts, and even videos – everything your visitors see.
  • Authority: Backlinks from other sites, with each one counting as a “vote” of confidence, which Google takes into account when ranking.

[Chart: the three pillars of SEO]

So with that crash course, we can now connect the dots between SEO expertise and high-level migration requirements.

Why you need SEO

Whilst a website’s appearance is important, first and foremost it’s crucial to understand how you’re going to explain the changes you’re making to Google. We suggest a handwritten note:

“Dear Google,

Don’t worry, some things are changing but we still love you, so here is a comprehensive, incredibly large map of URL redirects detailing the new versions of the exact same pages you know and ranked the first time around.”

On a more serious note, however, here are five ways in which the expertise of an SEO professional can propel your website towards successful migration.

1. Taking the complexity out of URL mapping and redirects

Since a site’s internal linking and page equity are an essential part of SEO, SEOs deal with redirect handling, URL mapping, and all the complications that come with them all the time. You have to make sure each redirect makes sense, and also that each page is able to take on its new status. Common issues at this stage can include (see the sketch after this list):

  • Incorrectly implemented redirects (302 or the dreaded 307) that may undermine your intentions
  • Extremely long redirect chains or even infinite loops, which will cause Google to rage-quit the page or even your entire site
  • Redirects to irrelevant pages, which Google may not mind too much but will annoy your users
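
To illustrate, here is a minimal Python sketch that walks each mapped redirect and flags the issues above: non-301 hops, long chains or loops, and unexpected final status codes. The URL list and hop limit are hypothetical placeholders, not a prescribed tool.

```python
# Redirect audit sketch: flags non-301 hops, chains/loops, and non-200 finals.
# The legacy URLs and MAX_HOPS value are assumptions for illustration only.
import requests
from urllib.parse import urljoin

LEGACY_URLS = [
    "https://store.example.com/product-1",
    "https://store.example.com/category/old-page",
]

MAX_HOPS = 5  # anything longer is a chain worth fixing


def audit_redirect(url):
    seen = []
    current = url
    for _ in range(MAX_HOPS):
        resp = requests.get(current, allow_redirects=False, timeout=10)
        if resp.status_code in (301, 302, 303, 307, 308):
            if resp.status_code != 301:
                print(f"WARNING: non-301 redirect ({resp.status_code}) at {current}")
            seen.append(current)
            # Location may be relative, so resolve it against the current URL
            current = urljoin(current, resp.headers.get("Location", ""))
            if current in seen:
                print(f"LOOP detected starting at {url}")
                return
        else:
            if resp.status_code != 200:
                print(f"WARNING: final status {resp.status_code} for {url}")
            if len(seen) > 1:
                print(f"CHAIN of {len(seen)} redirects for {url}")
            return
    print(f"WARNING: {MAX_HOPS}+ hops for {url} (possible infinite loop)")


for legacy_url in LEGACY_URLS:
    audit_redirect(legacy_url)
```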

Just in case you’re not convinced, here’s a scary graph of what happens when you don’t do this properly.

[Graph: repercussions of bad URL mapping. Source: Croud]

The process of telling Google what’s what extends beyond redirect mapping; it also includes on-page work – specifically, the canonical tag.

Fun fact: 301 redirects don’t actually stop Google from indexing your pages, so if you left it at that, you would just end up with some poor rankings and some confused users. Luckily, your friendly neighborhood SEO knows all about the various ways to help encourage Google to drop your old page out of the index as it goes along your new site.

2. Understanding your website’s behavior

So, you’ve done all the mapping and have set up just how to introduce Google to your new site. While that’s very exciting, we do have to remember the “understanding” part of these first several weeks. The primary reason for site migration is to provide a new and improved site that will (hopefully) gain more traffic and drive more business. However, without understanding how your original site performed, it’s very difficult to establish if your new site is actually superior. This, therefore, highlights the importance of benchmarking.

Of course, you may know how much traffic your ad campaigns – and even your website in general – are pulling in, but you’ll need to know more than that to be successful. As SEOs, our aim is to understand your site as much as the search engines do, which as explained above, is much more than just content on your pages.

To paint the best picture of your website before you migrate, use several tools that provide a variety of key SEO data points (a simple collection sketch follows this list):

  • Keyword rankings and their respective landing pages
  • Links to your site
  • Pages with 200 (and non-200) status codes
  • Crawl volume and frequency
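
As one hedged example, a short Python script can capture a point-in-time benchmark of status codes and response times that you can’t reconstruct after the migration. The URL list and output file name are assumptions.

```python
# Point-in-time benchmark sketch: records status code and response time per URL
# so post-migration numbers have something to be compared against.
import csv
import time
import requests

URLS = [  # hypothetical pages to benchmark
    "https://www.example.com/",
    "https://www.example.com/products",
]

with open("pre_migration_benchmark.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "status_code", "response_time_seconds"])
    for url in URLS:
        start = time.time()
        try:
            resp = requests.get(url, timeout=15)
            writer.writerow([url, resp.status_code, round(time.time() - start, 3)])
        except requests.RequestException as exc:
            writer.writerow([url, f"error: {exc}", ""])
```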

By aggregating the different metrics and views of each tool, you can create a beautiful, detailed portrait of how your website behaves, and how it’s interpreted by both search engines and users. Astute benchmarking will allow for in-depth, helpful post-migration analysis, particularly for those metrics that can only be recorded at a particular moment. There’s no way to tell how fast your pages loaded, or how many pages returned non-200 status codes last week. If you don’t gather this information beforehand, you won’t be able to fully report the impact of the migration.

After you complete the migration, you can gather this data again to truly judge your results. Everyone will remember to check the new traffic statistics, and even the new rankings, but only an SEO will remember to check that those numbers make sense and you haven’t accidentally orphaned half of your product pages. SEOs will make sure users aren’t just on your site, but crawlers are too. With proper data at your disposal, you can set about making iterative improvements which will undoubtedly be necessary.

3. Migrating your tracking tools

All this talk about performance and results is for naught if you can’t actually track any of it. Much like Google’s search engine, Google tools aren’t so keen on supporting your site migration either. Therefore, you have to make sure you’re ready to start tracking the new site, ideally without losing your old data.

Dealing with various tracking tools and codes all the time, an SEO has to be a Google Analytics expert too (it’s commonly a requirement on most resumes). So how do you avoid a scenario in which you either have no historical data and can’t measure success, or have two separate accounts and have to run the performance comparisons yourself? By making plans to migrate your tracking tools.

Ideally, you’ll use the same analytics tracking code on the migrated site, so that the old metrics can be compared directly to the new numbers once the migration takes place. Need some more persuasion?

Take a look at this graph detailing a successful site migration.

[Graph: a successful site migration. Source: Croud]
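
If you do keep a single analytics property through the migration, the before/after comparison becomes a few lines of analysis. This is a minimal pandas sketch, assuming you can export daily sessions to a CSV; the file name, column names, and launch date are all hypothetical.

```python
# Compare average daily organic sessions before and after a migration date.
# Assumes a CSV export with "date" and "sessions" columns (names are hypothetical).
import pandas as pd

df = pd.read_csv("organic_sessions.csv", parse_dates=["date"])
migration_date = pd.Timestamp("2019-07-01")  # hypothetical launch date

before = df[df["date"] < migration_date]["sessions"]
after = df[df["date"] >= migration_date]["sessions"]

print(f"Average daily sessions before: {before.mean():.0f}")
print(f"Average daily sessions after:  {after.mean():.0f}")
print(f"Change: {(after.mean() - before.mean()) / before.mean():+.1%}")
```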

4. Testing and the importance of the human touch

So you’ve planned all your new pages, and your new site is built. What’s next? Hopefully, it’s built in a staging environment and not actually live. If it’s not, you run the risk of causing all sorts of issues with duplicate content and ranking cannibalization.

However, your SEO can easily take charge of this with a robots.txt directive (which will haunt them until the site is live and they can change it). Despite its purpose, a staging environment doesn’t always reflect the search engine’s behavior since it lives in isolation. There’s no way to track backlinks or see exactly what it will look like in a SERP at this time.

Often, Googlebot doesn’t even fully crawl staging environments, because it’s seen as time-wasting. Therefore, your SEO’s brain is your very best test.

Everyone will check that the pages are set up as planned, but your SEO will be the one who can thoroughly re-test each individual redirect at 2 am. This will likely be the last time that any mistakes will be recognized before launch, so it’s critical to make sure that every redirect behaves as expected and that they are all 301 status codes.

Lastly, you’ll need to make sure that an XML sitemap containing all the legacy URLs stays live on the legacy site. This will be used to push Googlebot through the old URLs and onto the new site, expediting your meticulously mapped redirects.
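
Generating that legacy-URL sitemap doesn’t need a special tool. Here is a minimal sketch using Python’s standard library; the URL list and output file name are placeholders.

```python
# Generate a sitemap of legacy URLs so Googlebot revisits them and follows
# the 301 redirects to the new site. The URL list is hypothetical.
from xml.etree import ElementTree as ET

LEGACY_URLS = [
    "https://store.example.com/product-1",
    "https://store.example.com/category/old-page",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in LEGACY_URLS:
    loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
    loc.text = url

ET.ElementTree(urlset).write(
    "legacy-sitemap.xml", encoding="utf-8", xml_declaration=True
)
```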

5. Launching and mitigating loss

Finally, you’re ready to flip the switch and the champagne bottles are out. So you turn on the new site, and congratulations – you’ve just lost 20% of your traffic.

No, really, congratulations. In case you forgot the daunting chart we shared earlier in this post, website migrations can cause damaging losses, and sites that don’t prepare accordingly often never recover. However, if you’re smart and you hired an SEO expert to take charge of this project, they’ll have the task well in hand.

Your traffic loss is a product of search engines and users not recognizing your new site – temporarily. Your SEO will have made sure everything is set up properly, so Googlebot will quickly figure out that your new site contains all the same high-ranking, trustworthy content as your old site. It’s still a little miffed at you for changing on it, so you may only get back onto the second page of results.

You’ll still have some further optimizations to do, but it’s much easier to go from page two to page one, rather than page ten to page one.

Just remember, we’re guiding this migration from an SEO perspective. Googlebot is basically a person, so as long as it can read the site, we assume that users will enjoy their experience too.

Kailin Ambwani is a Digital Associate at global digital agency Croud, based in their New York office.


NinthDecimal Introduces Multi-Touch Attribution for Offline Store Visits

Location intelligence company NinthDecimal is rolling out what it calls “the industry’s first multi-touch attribution (MTA) solution for foot traffic measurement.” The approach takes a more holistic look at different consumer touchpoints and how they impact offline store visitation.

NinthDecimal President David Staas says the company already has “200 customers running live with 500 different campaigns” and that there has been a very positive response from brands and agencies. Staas characterizes the MTA approach as “a fundamental rethinking of foot-traffic measurement.”

Next-gen location analytics. The company sees MTA as the next evolution of campaign measurement and analytics. It differs from “traditional” multi-touch approaches in its relative simplicity (quick set-up) and offline measurement capability.

NinthDecimal also says it’s the only multi-touch offering among the company’s location-intelligence/location analytics competitive peers, which include Foursquare/Placed, PlaceIQ, GroundTruth, Factual, Blis, Cuebiq, ThinkNear, Ubimo and others.

For the past several years, mobile-location data has increasingly been used to measure the impact of a single digital channel, or sometimes two channels, on offline consumer actions and incremental store visitation. In this way, it has brought new audience insights and new data on the efficacy of media, by connecting the digital and physical worlds. However, there’s always been a quasi-last-touch attribution problem with online-to-offline analytics focused on the impact of a single channel or campaign. NinthDecimal is bringing a broader lens and attribution framework to digital and soon to cross-channel (traditional and digital media) measurement.

Multiple touchpoints weighted. According to the company, “the MTA approach fractionally applies credit for visitation across every relevant customer touchpoint… Brands can use MTA based insights to optimize across audience segments, creatives and other aspects of their marketing or content to have the greatest impact on real business metrics like revenues and customer growth.”
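
NinthDecimal has not published its weighting algorithm, so as a purely illustrative sketch, here is how fractional credit could be split across touchpoints using simple equal (linear) weights. The touchpoint data and the equal-weight rule are assumptions, not the company’s method.

```python
# Purely illustrative: split credit for store visits across the touchpoints
# that preceded them, using equal (linear) weights. NinthDecimal's actual
# weighting algorithm is proprietary; these touchpoints are hypothetical.
from collections import defaultdict

visit_touchpoints = [
    ["mobile_display", "video", "social"],  # touchpoints before store visit 1
    ["video", "ooh"],                       # touchpoints before store visit 2
]

credit = defaultdict(float)
for touchpoints in visit_touchpoints:
    share = 1.0 / len(touchpoints)
    for channel in touchpoints:
        credit[channel] += share

for channel, value in sorted(credit.items(), key=lambda kv: -kv[1]):
    print(f"{channel}: {value:.2f} visits of credit")
```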

In traditional location analytics, there’s a control and exposed methodology and a trailing attribution window (example: did those exposed to the (mobile, video, OOH) campaign show up in a store within 30 days?). Brands and agencies can then understand incremental lift and optimize campaigns that are driving foot traffic and sometimes in-store sales.

Many more variables in the mix. The MTA approach looks “back” at various categories of data — audiences, creatives, publishers and media touchpoints — before the offline visit. The media are weighted according to an algorithm and the data are aggregated (millions of impressions/exposures). NinthDecimal and its customers can then see which publishers, media and creatives are having an offline impact overall and on which audiences.

NinthDecimal says it gets data from a wide range of sources and says it sees 270 million devices monthly. Through direct and data partner relationships it can measure TV, search, social, display, video, OOH and print media. It also says it now has more than 200 audience attributes that marketers can target and optimize against.

Why we should care. Most digital marketers are still using first- or last-touch attribution methodologies, to say nothing of more sophisticated location analytics. Most enterprises have wanted multi-touch attribution for a long time, but it has been complex to set up and is often unreliable because it’s based on abstract formulas that may or may not reflect actual consumer behavior. The combination of MTA and store visitation data potentially solves some of these challenges for brands, retailers and others who ultimately care most about mapping and optimizing media against real-world business outcomes.


About The Author

Greg Sterling is a Contributing Editor at Search Engine Land. He writes a personal blog, Screenwerk, about connecting the dots between digital media and real-world consumer behavior. He is also VP of Strategy and Insights for the Local Search Association. Follow him on Twitter or find him at Google+.

Search engine results: The ten year evolution


The search landscape has always been evolving. Right now, the big discussion is around Google’s control over how much search traffic goes to publishers versus how much stays on Google.com.

This might end up being a really big issue for Google, as it challenges the long-held point of view that competition is just one click away.

Over the past ten years that I’ve been writing for this publication, I’ve written an article each year reviewing how paid and organic search can work together, and how brands appear in search listings. Over the past few years, this article has started to evolve into a look at how Google has changed the search page, including the addition of more paid search results, shopping, and local listings. As you will see from the data, there is certainly a trend, fueled both by Google’s growth objectives as a publicly traded company and by consumer behavior shifts (namely, mobile and local).

The first piece is the overlap of paid and organic listings. What I’m tracking here is the number of times a brand appears in both paid and organic search results, as a percentage of total paid results. For example, if there are three paid search ads, and GEICO, Progressive and Liberty Mutual all appear in both the paid and organic listings, that would score 100%. I’ve been tracking five verticals since 2010. What’s really interesting is that over the past few years the amount of overlap has, on average, gone up. This year, however, the overlap dropped by 44% year over year, which had a lot to do with the drops across financial services, travel, and technology.

[Graph: industry-wise overlap of paid and organic listings. Source: Google search data]
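
To make the overlap metric concrete, here is a tiny Python sketch that computes it exactly as described above: brands appearing in both paid and organic listings, as a percentage of total paid results. The brand lists are hypothetical.

```python
# Overlap metric sketch: brands appearing in both paid and organic results,
# as a percentage of total paid results. Brand lists are hypothetical.
paid_brands = ["GEICO", "Progressive", "Liberty Mutual"]
organic_brands = ["GEICO", "Progressive", "Liberty Mutual", "State Farm"]

overlap = [brand for brand in paid_brands if brand in organic_brands]
overlap_pct = 100 * len(overlap) / len(paid_brands)
print(f"Overlap: {overlap_pct:.0f}%")  # 3 of 3 paid brands also rank organically -> 100%
```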

Factors driving this change

I think this trend is driven by two factors:

  1. Increasing competition & costs – CPCs (cost per click) were up by five percent between Q4 2018 and Q1 2019
  2. Rise of other areas of optimization – local listings/Maps/Google My Business and Shopping. These additional areas have provided both a distraction and a release valve for additional traffic, and higher ROI.

[Graph: the factors driving this change year over year. Source: Google search data]

So what has been happening to the other areas of optimization, especially local and shopping? I have also been tracking these areas over the last three years, and the change is exactly what you would have expected. Over the past three years, the percentage of search terms that trigger local listings has more than tripled, from 11% in 2017 to 38% in 2019. Retail continues to see the map pack on 100% of listings. This validates the importance, to Google, brands, and consumers alike, of having a local presence. It also gives additional credence to optimizing and cleansing your location data, not just on Google but across the web.

[Graph: industry-wise growth of shopping ads year over year. Source: Google search data]

Shopping ads have come on in a big way

Shopping ads have also continued to have a strong presence and have grown slightly, up from 43% in 2017 to 47% in 2019. Shopping ads provide a more visual experience for the consumer, and some very strong conversion rates for brands. Google has also continued to evolve its shopping product, announcing a redesigned shopping experience in May. This included new ad formats, online-to-in-store options, and Smart Campaigns (which help encourage SMBs to get into the game). All these changes and enhancements demonstrate a commitment to the product and to its value for both consumers and brands.

What should you be doing as a search marketer?

So what is the impact of this data to us as search marketers? I think there are two key takeaways:

  1. Search is more than just keyword listings and optimizing your website. Yes, SEO and paid search are critical, but extending beyond them is also very important. Optimizing your product feed and local listings is very important to success. You must think about how you are there for your consumers regardless of what they are trying to do, buy, visit, or call.
  2. Having a winning strategy for critical search terms is important. You might not be able to afford top placement in paid search, or technical issues might be preventing you from having the top organic listing. However, that doesn’t mean you don’t need a presence for those terms over the long term. Tactics could include choosing specific times of the day to bid, or including schema markup for event listings. Whatever the case may be, there are now more ways than ever to be found, and having a very thoughtful strategy for your brand is key.

The search engine results page will continue to evolve as consumer behavior and technology evolve. Think about the continued expectations of online-to-offline buying behavior, real-time inventory, or the impact 5G will have on the marketplace. Remind yourself to take a look around at a macro level to see the trends, versus always focusing on detailed keyword-level optimizations. You will often find some great trends to help put your strategy in context.

P.S. Special thanks to Audrey Goodrick who helped pull together this data. Thank you for your help this summer Audrey.


Cheetah Digital’s acquisition of Wayin Inc. aims to bring first- and ‘zero-party’ data to marketers

Direct marketing provider Cheetah Digital announced that it has acquired data solutions firm Wayin Inc. Wayin specializes in data acquisition technology that lets marketers create interactive quizzes, questionnaires and games while collecting first- and so-called “zero-party” data from consumers who opt in to participate. Coined by Forrester, zero-party data refers to information that is actively provided by users rather than inferred.

Why we should care

As consumers become increasingly wary of handing over their personal data to brands, Wayin’s technology gives marketers the opportunity to create digital experiences that deliver value to the consumer in exchange for their data. Since zero-party data is proactively provided by the consumer, the solution could enable digital marketers in the Cheetah Digital ecosystem to stay in compliance with GDPR and other privacy regulations while creating more opportunities for advanced personalization. 

“Marketers have never faced a tougher challenge than they do today. Consumers are demanding personalized experiences, global consumer privacy regulations mount, trust in and effectiveness of third-party sources deteriorates, and CEOs demand greater efficiency of their marketing spend. Smart marketers are turning to zero- and first-party data, to not only be compliant and build customer trust, but deliver exceptional brand experiences,” said Sameer Kazi, CEO, Cheetah Digital.

Wayin’s enterprise platform joins the Cheetah Marketing Suite and Cheetah Loyalty platform.

More on the news:

  • Marketers can expect to be able to create and execute campaigns and experiences across any digital channel to generate consumer data.
  • Data generated from Wayin’s interactive experiences will flow into Cheetah Loyalty or Cheetah Digital Marketing suite, allowing marketers to target leads with triggered, personalized messaging based on users’ self-reported data. 

About The Author

Jennifer Videtta Cannon serves as Third Door Media’s Senior Editor, covering topics from email marketing and analytics to CRM and project management. With over a decade of organizational digital marketing experience, she has overseen digital marketing operations for NHL franchises and held roles at tech companies including Salesforce, advising enterprise marketers on maximizing their martech capabilities. Jennifer formerly organized the Inbound Marketing Summit and holds a certificate in Digital Marketing Analytics from MIT Sloan School of Management.

Ten ways to pump out a stream of great content without burning out


There’s always more content to write. 

Sometimes that can be encouraging, even exhilarating. You’ve got plenty of space for all your ideas, and countless opportunities to engage with potential customers and to build a stronger relationship with existing ones.

But producing a constant stream of content can be exhausting.

You’ll find yourself running out of ideas and running out of steam. And at that point, it can be really difficult to keep creating high-quality content on a regular basis.

Even if you’re in a position to hire someone to help, you’ll still need to have a fair amount of involvement in content production – supplying ideas and outlines, at the very least.

So how can you keep up with all the content you need to produce? Before we dig into some specific tips, let’s take a look at how much you actually need to create.

How frequently should you post on your blog and your social media accounts?

There are no rules here; different blogs do different things, often within the same industry. In the content marketing world, for instance:

  • Smart Blogger posts very in-depth pieces once a week
  • Copyblogger publishes three or four posts a week
  • Content Marketing Institute posts one piece each weekday

As a rough guideline, you’ll probably want to aim for at least one weekly post, one daily Facebook and/or Instagram post, and three or more posts a day on fast-moving networks like Twitter. (According to Louise Myers, the “general consensus” is that anything from three to 30 tweets per day is fine.)

So how do you keep up with this level of content, week after week?

How to create great content without burning out

Here are ten ways to keep up your content production without getting to the point of feeling so burned out that you simply give up.

You can use these as a step-by-step process, or you can pick and choose the ideas that’ll make your existing process go more smoothly.

1. Decide how often you’ll post content

While there’s no “right” answer to how often to post content, there’s definitely a “wrong” one: posting content whenever you feel like it, at wildly varying frequencies.

It’s best – for you and for your audience – to have a consistent posting schedule, both on your blog and on social networks. That might mean, for instance, two blog posts each week, one Facebook post each day (more may be counter-productive), and five Twitter posts each day.

While you might vary your schedule a little, having a clear idea of what to aim for makes it much more likely that you’ll write and publish regular posts.

2. Come up with a suitable pattern for your content

With social media, in particular, it’s helpful to “pattern” your content. This is also a useful practice for blog posts, especially if you post twice a week or more on your blog.

Rather than starting with a blank page when it comes to generating ideas, you can have a pre-set “pattern” for the content you’re going to create.

For instance, if you’re writing five Twitter posts each day, you might decide to have:

  • Two posts linking to other people’s great content
  • One post linking to your most recent piece of content
  • One post linking to a piece of content from your archive
  • One post that asks a question or prompts a discussion

3. Brainstorm lots of ideas

Simply coming up with ideas for content can take a lot of time. Instead of sitting down and staring at a blank page, try “batching” the idea generation process: set aside time once every week or two to come up with a whole list of ideas.

Some great ways to find content ideas include:

  • Common search terms within your industry: this is part of keyword research, and as well as being a useful SEO tool, it’s great for idea generation.
  • Questions that you frequently get asked by potential customers.
  • Problems that you faced when you were starting out in your industry.
  • Other people’s content – could you create something that tackles a topic in more depth, or from a different angle?
  • Your own content: can you go back to an old blog post and update it, or take some social media posts and weave them into a piece for your blog?
  • Asking influencers for their contributions – this might be in the form of a quote or two from one person, or a “round-up” post with quotes from lots of different experts.

4. Outline longer pieces of content

With short posts on Twitter and Facebook, you probably don’t need an outline – just a clear idea of what you’re trying to accomplish.

For blog posts, though, you’ll find it’s much faster to write when you’ve got a solid outline in place, especially if you’re producing long-form content. Again, it’s often a good idea to “batch produce” your outlines, by picking four or so ideas and outlining all those posts at once.

That way, when it’s time to write those posts, a lot of the hard work is already done. Plus, if you outline several posts in a single session, you’ll find it much easier to create links between them.

5. Write several short pieces of content at once

Instead of opening up HootSuite (or your favorite social media management tool or app) every single time you want to send a tweet or create a post, write lots of posts ahead of time.

You might want to queue up a week’s worth of posts all at once. Buffer is a great tool for this, allowing you to schedule posts to go out at any time you want – making it easier to reach potential clients in other timezones or those on unusual schedules.

6. Set aside focused time for longer pieces

Creating content requires a lot of focus – it’s not something you can easily do while you’re fielding phone calls or responding to emails every few minutes.

Block out periods of time (ideally two hours long) in advance, where you can shut your office door, ignore your email, and let calls go to voicemail.

7. Get an editor involved

While you may have no choice but to self-edit your content, if it’s possible, get an editor involved. This might be someone already on your team, or a freelancer external to your company.

A good editor will go far beyond correcting spelling mistakes and grammatical slips. They’ll help to ensure your content is well structured, that it flows smoothly, and that it’s as engaging as possible.

8. Have an assistant format and upload your content

If you’re uploading all your own posts on your blog and social media, you’ll be spending time finding images, selecting categories, adding hashtags, including links, and so on.

While these tasks are an important part of the content creation process, they don’t need to be done by you. Delegate as much of the repetitive work as possible to an assistant so that you can free up more time to write or design the content itself.

9. Get ahead and take time off

If content creation is starting to feel like a treadmill that you can’t get off, then you’re probably heading for burnout. Plan your schedule so you can get ahead, perhaps by creating an extra piece or two of content each week.

That way, you can take a week off from content creation occasionally (plus, you’ll also be covered for any unexpected events, like a particularly busy period, or illness).

10. Repurpose your existing content

There may well be excellent blog posts in your archive that rarely get read, and your social media posts will almost certainly only gather fleeting attention.

Instead of always coming up with fresh ideas and creating new pieces from scratch, how about reusing some of your existing content? That might be as simple as writing an updated version of a blog post and republishing it – or it could be something more involved, like turning a series of tweets into a blog post, or turning a post into an infographic.

Valuable, high-quality content is great for your business, your potential and existing customers, and your SEO. By trying some or all of the tips above, you can keep up the flow of content, without burning out.

If you have a tip for creating lots of great content, consistently, feel free to share it with us in the comments below.

Joe Williams is the founder of Tribe SEO. He can be found on Twitter at @joetheseo.


Expert advice. Actionable tactics. Lowest rates expire Saturday!

Join us at MarTech®, September 16-18 in Boston, for an unrivaled conference experience featuring 55+ sessions and keynotes, 60+ real-world experts, hundreds of martech solutions, AI-powered networking, and the actionable tactics you need to overcome the universal challenges of modern marketing:

  • Struggling to see results? The experts you’ll meet are ready to share the strategies, playbooks, and tactics they’ve used to maximize ROI, build successful teams, and translate data into meaningful action. See the agenda here.
  • Looking for the right solution? Efficiently evaluate 100+ martech solutions in the Expo Hall. See how those solutions are producing results with case studies and comprehensive tutorials in The Discover MarTech Theater and Solutions Track.
  • Does your boss need convincing? Return to the office loaded with presentations, data, and evidence from successful brands that will support and validate your initiatives and strategies.
  • Need a deep dive on a specific topic? Choose from any of the 8 expert-led workshops on topics including agile marketing, SEO operations, building marketing teams, and evaluating solutions. Check out the complete list of pre-conference workshops.

Register NOW & enjoy up to $900 in savings

Alpha rates expire THIS Saturday, July 13 at 11:59PM… now is your chance to save big. Pick your ideal pass and register now! Once these savings are gone, they’re gone – so don’t delay!

  • All Access: Complete access to all conference sessions, keynotes, networking events, exhibitors, sponsor presentations, amenities, and more. Book now and save $450 off on-site rates!
  • All Access + Workshop Combo (best value!): Dive deeper and learn more with a half-day, pre-conference workshop. Book now and save $900 off on-site rates! (Workshop-only passes are also available.)
  • Expo+: Searching for marketing technology tools? Focused on growing your network? Pick up a FREE Expo+ pass to enjoy unlimited Expo Hall access, full-length Solution Track sessions, sponsor presentations in the Discover MarTech Theater, select networking, downloadable speaker presentations, refreshments, free WiFi, and more.

See you in Boston 🙂


Opinions expressed in this article are those of the guest author and not necessarily Marketing Land. Staff authors are listed here.


About The Author

Scott Brinker is the conference chair of the MarTech® Conference, a vendor-agnostic marketing technology conference and trade show series produced by MarTech Today’s parent company, Third Door Media. The MarTech event grew out of Brinker’s blog, chiefmartec.com, which has chronicled the rise of marketing technology and its changing marketing strategy, management and culture since 2008. In addition to his work on MarTech, Scott serves as the VP platform ecosystem at HubSpot. Previously, he was the co-founder and CTO of ion interactive.

Giorgio Armani, ModiFace bring 3D AR to WeChat

[Screenshot: Giorgio Armani is introducing AR makeup try-ons on its WeChat mini program. Image credit: ModiFace]

Italian fashion label Giorgio Armani is catering to the growing online market for beauty in China by becoming the first luxury brand to incorporate 3D augmented reality makeup try-ons into its WeChat mini program.

Beauty group L’Oréal’s AR makeup platform ModiFace will be supporting Armani Beauty’s virtual makeup application on WeChat, one of the leading social media platforms in China. More than a quarter of beauty buys in China are made online, underscoring the importance of prestige cosmetics brands investing in ecommerce tools.

“Today’s consumer wants to be involved with something bigger than a brand boasting only status or legacy,” said Aleni Mackarey, chief operating officer at Base Beauty Creative Agency, New York. “She is looking for the brand that tells a story and creates a moment to make her feel like she is a part of the community.

“A try-on tool that allows her to be in the heart of the action earns her trust and keeps her coming back to see what’s next,” she said.

Ms. Mackarey is not affiliated with ModiFace, but agreed to comment as an industry expert. ModiFace was reached for comment.

Virtual makeup
Armani Beauty’s AR try-on is meant to mimic the virtual mirrors that are becoming more commonplace in Chinese bricks-and-mortar stores.

Users can virtually sample Armani products, such as lip colors, with the shades remaining consistent as consumers pose in front of their smartphone cameras. The app also allows users to take screenshots, save images, compare before and after images in a split-screen mode and share on social media.

[Image: Giorgio Armani is bringing virtual makeup to its WeChat mini program. Image courtesy of ModiFace]

The beauty industry has been at the forefront of bringing AR uses to the masses, but this is the first use of 3D makeup try-on through WeChat. With more than 200 million daily users, WeChat mini programs such as Armani’s represent a significant ecommerce opportunity for L’Oréal.

Since acquiring ModiFace in March 2018, L’Oréal has partnered with platforms, including Amazon and Facebook, to bring AR experiences directly to consumers.

The beauty group’s partnership with Facebook allows it to bring the interactive technology to a wider audience, since so many people already use the social network, eliminating the need for users to download the ModiFace app.

ModiFace technology is seamlessly integrated with Facebook, allowing L’Oréal’s brands direct access to consumers for makeup testing. In addition to Giorgio Armani, brands such as Lancôme and Yves Saint Laurent allow users to try on different makeup looks virtually from their inventory of products (see story).

Previously, ModiFace integrated its AR into Samsung’s live video experience on its Galaxy S9 and S9+ phones, letting consumers explore makeup looks without needing a separate app (see story).

Along with ModiFace, Giorgio Armani also has a partnership with AR beauty platform Perfect365. The mobile app has been downloaded more than 100 million times, and 65 percent of its users are women between the ages of 17 and 34 (see story).

Beauty in China
China’s beauty market is of great importance to luxury brands, and prestige players such as L’Oréal need to continue to innovate as Chinese consumers are increasingly interested in trying lesser-known labels.

According to a new report from Reuter: Intelligence, 85 percent of women and 70 percent of Chinese men are curious about niche brands, and 92 percent of male beauty buyers say they prefer indie options. These brands are growing competition for bigger labels, as consumers believe niche products put more investment into developing formulas than marketing.

Additionally, while Chinese consumers are heavily engaging and shopping on digital channels, they still value the bricks-and-mortar experience for beauty buying. The majority of consumers agree that physical stores enable them to try on products before buying (see story).

In China, 27 percent of all beauty purchases are made online, according to Euromonitor (see story).

L’Oréal’s sales were up 11.4 percent in the first quarter of 2019, propelled partly by double-digit growth in its luxury division, which includes Armani.

In addition to L’Oréal Luxe’s buoyancy, the company also saw strong sales increases in Asia Pacific, at travel retail and in ecommerce (see story).

Six HTTP status codes most critical to your SEO success


HTTP is the standard protocol defining how information passes between your visitor’s browser and the server hosting your site, and HTTP status codes are your handy way of knowing exactly what is happening within that process.

For web marketers, it’s well worth the effort to get familiar with these status codes. By understanding your site’s backend activity, you can recognize errors that demand attention and find opportunities to help improve (or at least not hinder) your SEO efforts.

A quick HTTP status code overview

HTTP status codes are three-digit numbers. The first digit indicates which of the five categories a code belongs to. The categories refer to either the type of response or the type of error, as follows:

1xx status codes: Information request

Status codes beginning with a “1” communicate that a server is processing information, but has not yet completed the request.

2xx status codes: Success 

These status codes indicate that a requested information transfer was successfully completed. For marketers seeking to improve SEO, these codes mean that no action is required; everything is working correctly.

3xx status codes: Redirection

These redirect codes communicate that your visitor requested information that was not available at the targeted address.

4xx status codes: Client error

These codes signal that the client (the browser accessing the site) has encountered an error when trying to receive server information.

5xx status codes: Server error

5xx codes point to server-side errors: the client request was issue-free, yet the server could not finish the transfer.
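
As a quick hedged example, the sketch below fetches a page and maps its status code to one of these five categories; the URL is hypothetical and any HTTP client would do.

```python
# Fetch a URL and report which of the five status-code categories it falls into.
# The URL is hypothetical.
import requests

CATEGORIES = {
    1: "Information request",
    2: "Success",
    3: "Redirection",
    4: "Client error",
    5: "Server error",
}

resp = requests.get(
    "https://www.example.com/some-page", allow_redirects=False, timeout=10
)
category = CATEGORIES.get(resp.status_code // 100, "Unknown")
print(f"{resp.url} returned {resp.status_code} ({category})")
```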

Six HTTP status codes that are arguably most critical to SEO

While there are more than 60 HTTP status codes to be aware of, some are more relevant from an SEO perspective than others.

The following six status codes are especially important to understand and watch out for.

1) 404 – Not found

A 404 page not found error is perhaps the most commonly known HTTP status code and can signal to marketers that a page is failing to deliver content to visitors.

The server cannot return information because the resource or URL doesn’t exist. Landing on a 404 page is detrimental to SEO because unavailable content leads to a bad experience for both your audience and the search engine crawlers that are so critical to your SEO success. To address these errors, ensure that URLs returning 404 are given a 301 redirect to an available and relevant page.

2) 301 – Moved permanently

You’ll recognize this code as the prescribed solution to the 404 errors just mentioned – a 301 status code means that the requested resource or URL has been permanently redirected somewhere else. This code is a valuable tool for sending visitors to relevant content that is available on the site.

Marketers can and should set up 301 redirects for pages that are no longer available so that their audience lands on useful content instead of error pages. The 301 code gives search engines the message to update their index for the page.
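
How you implement a 301 depends on your stack; as one hedged sketch, a Flask route (the old and new paths are hypothetical) can send old URLs permanently to their replacements. The same mapping could equally live in your web server or CDN configuration.

```python
# Minimal 301 redirect sketch using Flask; the path mapping is hypothetical.
from flask import Flask, redirect

app = Flask(__name__)

REDIRECT_MAP = {
    "/old-pricing": "/pricing",
    "/2018/summer-sale": "/offers",
}

@app.route("/<path:old_path>")
def permanent_redirect(old_path):
    target = REDIRECT_MAP.get("/" + old_path)
    if target:
        # 301 tells browsers and crawlers the move is permanent
        return redirect(target, code=301)
    return "Not found", 404

if __name__ == "__main__":
    app.run()
```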

3) 302 – Found

Similar to code 301, code 302 is another type of useful redirect to know. However, this one is temporary rather than permanent. A 302 code directs browsers to a new URL, ensuring that visitors reach relevant content – but stops short of instructing search engines to update the page index.

4) 307 – Temporary redirect

This code offers a more specific redirect method than the 302 code and has the browser perform the redirect instead of the server. This is useful for sites served on HTTPS that are on an HTTP Strict-Transport-Security (HSTS) preload list.

Side note: If you are running an HTTP site, it’s definitely in your best interest to migrate to HTTPS.

Thus, using codes 301, 302, and 307, marketers can optimize SEO by closely controlling search engine crawlers’ understanding of what content exists, and how they ought to crawl and index that content.

5) 503 – Service unavailable

This error indicates that the server cannot process a request due to a temporary technical issue. The 503 code informs search engines that processing was stopped on purpose and tells the search engine not to de-index the page (as it would when seeing other server errors). However, if the 503 error isn’t resolved over a long period of time, search engines can begin to view it as a permanent error that warrants deindexing. Therefore, marketers should address 503 errors as rapidly as possible to avoid deindexing of the unavailable page and the negative impact on SEO that would come hand-in-hand with that scenario.
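
For planned maintenance, the usual advice is to serve the 503 deliberately, along with a Retry-After header so crawlers know when to come back. Here is a bare-bones sketch using Python’s built-in http.server; the one-hour retry window and port are assumptions.

```python
# Maintenance-mode sketch: every request gets a 503 plus a Retry-After header,
# signalling a deliberate, temporary outage. The one-hour window is an assumption.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # ask crawlers to retry in an hour
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance. Please try again shortly.")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()
```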

6) 410 – Gone

This dramatic-sounding code means that a resource or URL is unavailable because it was deleted on purpose and was not redirected. When search engines see a 410, they will remove the page from the index instead of redirecting. Marketers should be sure to properly correct any page issues or implement effective redirects so that visitors arrive at content pertinent to their search needs.

By at least understanding the most relevant HTTP status codes and properly addressing the website fixes that can make or break SEO success, marketers can help ensure their sites function smoothly and offer the intended experiences for both search engines and potential customers.

Kim Kosaka is Director of Marketing at Alexa.com.
