How the Digital Marketer Must Change as Automation Grows

The 2017 digital marketing technology landscape

Image credit: ChiefMarTec.com

As the hype around marketing automation fades and becomes a regular part of the marketer’s job description, evaluating and adopting automated technology and platforms (“Martech”) is becoming a core competency for marketers across our industry.

88% of companies are already using marketing automation or plan to adopt some form of automated tech over the next two years. That’s 31% growth over last year.

Marketing automation is here (it’s been here for a while) and it’s not going anywhere. As the number of tools, platforms, and opportunities grows, we must change how we think, how we work, and how we communicate, as our potential reach spreads even further through automation.

A few thoughts on how digital marketers must change as automation grows in our industry.

The need for intellectual curiosity is as strong as ever

As the already enormous marketing automation and overall Martech space continues to grow, we as marketers not only have to learn how to rapidly evaluate the potential value and viability of new options, but also how to operate our shiny new platforms and toys once we do say “yes”. How should we think about evaluating each piece of new technology and each nice-to-have feature, from new data sources to new lead scoring to bot-based messaging?

This all starts with clear, strategic goals.

Marketers must know what they want to get out of their automated technology before exploring and learning about what’s out there. Simply put, you have to have a north star. Only once that’s established does it become all about tactical tool evaluation, deployment, and execution. As Unbounce put it in a great podcast interview with the VP of marketing from Uberflip last year: “A fool with a tool is still a fool.”

What’s imperative is that we not try to learn the ins and outs of every single new thing that hits the market. That’s simply not going to happen.

What’s vital is the ability to quickly identify what’s needed based on your strategic goals, what’s going to get you there, and what’s not going to get the job done.

Waste and repetition are bad. Failing to plan your communication is worse.

While automated marketing platforms take away much of the repetitive, tactical work that previously filled our day-to-day, marketers must think more strategically at every step before scaling up message distribution.

Many of us can now streamline the everyday tactical to-dos of communicating with prospects and customers through automated platforms. Programmatic media buying, customer relationship management, email outreach, social publishing, and even social listening for automating community management are all areas where marketing automation can be effectively leveraged. (We’ll hold comment on chatbot-driven customer service for now.)

Effective deployment of these tools can mean a lot more time for all the bigger projects we want to tackle that have been held hostage by the pressing demand of day-to-day tactical tasks.

As those everyday tasks and time requirements are slowly and surely reduced through Martech, we can and should shift to a more strategic mindset. To beat this point into the ground: you didn’t download the 7 productivity apps on your phone (not judging) so that you could spend more time fiddling with apps. You did it so that you could have more time to do the things that were more important to you, and would make a meaningful difference in your work and life.

Market analysis, competitive differentiation, customer persona mapping, and personalization of messaging based on deep segmentation. This is where we need to live and, if we’re successful, the way that we use these automation tools themselves should become even more effective. It’s a virtuous circle.

Those strategic pieces, which can commonly be passed over in favor of ever more tinkering with tools, should become the mainstay of your work and can be the difference in what sets you apart from your competitors. There’s a great article on Medium by Graham Gnall about how and why to automate yourself out of a job that makes this point perfectly.

Data literacy for marketers is no longer optional

Farfetch CMO John Veichmanis (among many others) put it perfectly in an interview with Digiday: “Data is the New Marketer’s Currency”.

Regardless of specialty within our industry, marketers must understand how to collect and read data available to them. Most importantly, the interpretation of that data from raw numbers into insightful, actionable decisions is what brings all of this together.

Veichmanis goes on to say in the Digiday interview:

I’m definitely not a data scientist. But there’s a huge appreciation of what that team can do and deliver… Our job is to build meaning around the data to serve our customer more appropriately. That goes back to the basics of what marketing is: understanding consumer preferences… Ten years from now, any marketing team at any level will have those capabilities.

As a rule, platform-centric numbers and statistics are convenient, with reporting tailored to whatever the tool does. But I’d argue that those pre-packaged dashboards and tool sets are built to highlight platform wins and, in the worst case, to sneakily hide potential pitfalls.

Whether you’re a giant with a 20-person analytics team or working in a startup wearing every marketing hat imaginable, you must be able to tag a website and pull your own data outside of any automated platforms you’re using.

If you’re on the latter end of the spectrum, Google Analytics is the best free tool out there for this.

From the information available within Google Analytics (again, for free), every marketer needs to be able to find data, understand what it means post hoc (what pattern of behavior do the numbers indicate), and be able to turn that data into actionable, clear insights that can guide proactive decisions.

Those insights and actions should come full circle to drive evolving strategy within your automated platforms.

Proliferation of Martech is not without risk

Considering for a moment the incredible wave of entrants into marketing technology over recent years: if we look hard, there are probably at least a few dozen ways to achieve every single part of our digital communication plan. For some of the more common tasks, there could easily be a hundred or more tools that offer a particular feature.

That can be dangerous.

Go back to that north star of clear business goals. We have to learn to cut through the clutter of bells and whistles and look clear-eyed at what’s most important to driving our strategy and meeting our business goals, and at what’s ultimately a distraction.

Once you’ve committed to a new platform with the features and functionality to drive the results you need, you have to know when it’s working and when it’s not. And you have to be able to evaluate that based on data available to you outside of those marketing platforms. Again, whether it’s an analytics team in lock-step with your marketing group or your own review of carefully tagged Google Analytics data, you must be able to answer without hesitation:

  1. “What did the tool actually deliver in context of our overall marketing and communication mix?”
  2. “Where did it fall short of expectations?”
  3. “What can and will we do about it?”

Where to from here?

When the marketing technology roadmap and the results are both running like clockwork, it’s time to get back to the essence of what makes great marketing. Whether that’s taking a look at your positioning, your brand pillars, your content strategy, or anything in between, getting to this higher leverage work and letting it guide your (wonderfully automated) day-to-day execution is where we need to live.

That’s when the fun starts.

The post How the Digital Marketer Must Change as Automation Grows appeared first on Portent.

Source: Conversation Marketing: Internet Marketing with a Twist of Lemon (Original)

What I learned from teaching StubHub about React and Redux

Hey Swiz,
We have a 2-day Intro to React and Redux workshop scheduled at Big Company next next Tuesday and our instructor dropped out.
1 day of React, 1 day of Redux a week later. 7 to 8 hours each.

Can you jump in?

~ Ben

It was Ben from Real World React, the React training company with a great meetup here in San Francisco, but no real website. They do things the hard way: sales and networking.

By the time we met with StubHub to discuss what they were looking for, solidified that I was in fact doing it, and agreed on the business side of our arrangement, it was 4 days until the first workshop.

I was trembling. How the hell do you prepare a full day workshop in just 4 days? From scratch because all your materials are React+D3v4.

2 days into that weekend from hell, I got an email:

Workshop pushed to Friday. 2x people signed up. Needed bigger conference room.

Phew… Now I gotta entertain ~26 people, and I have 3 extra days to prepare 😅

My previous workshops had been with Freddy. He did React in the morning, I did D3 and React+D3 integration in the afternoon. All of our workshops were either at conferences or self-organized for people who wanted to improve their careers and paid for their own ticket.

A full day workshop… that’s new territory.

Can I even keep people engaged for 7 hours all on my own? Are corporate training people just as motivated? If your boss says “Yo, we’re moving to React. Here’s a workshop we organized for you. Go learn,” will you still care enough to pay attention?

Like… challenge accepted, I guess?

Stressss

The first workshop… well, I wouldn’t say it was a disaster. People learned React fundamentals and asked a lot of questions about state management. My answer to most of those was “We’re going to talk more about that next week.”

I think I aimed the content a little low for the room. Half the feedback said that we could have covered more content in a whole day workshop and spent less time coding things that aren’t super relevant to React principles.

Examples are hard. Balancing how much to let your audience code on their own and how much to hold their hands is tough as nails.

On the first day, I missed the mark.

Everyone started super engaged then drifted off and got bored. By 4pm, I lost half the room. The rest stuck with me until 5 even though it was Friday.

See how tired I get towards the end?

I had exactly a week to prepare the 2nd workshop. It was focused on Redux, but I don’t use Redux on the day to day! A lot of my code uses MobX because there’s less to type. Hell, my day job is still all Backbone 😅

But so is StubHub. Their goal with my workshops was to accelerate the transition from Backbone to React.

Objective for the week: Create and prep a more ~~boring~~ real world example, something more e-commercey, the kind of stuff these people do in their jobs. Reduce the gap between what you’re teaching and what they’re doing so it’s easier for them to apply later.

Less coding, more hand holding.

And you know what? The 2nd workshop day went much better. People didn’t get bored, they didn’t drift off, and there was a palpable reluctance in everyone who left at 4pm.

That’s just the way it is in the corporate world, you know. Friday 4pm, you’re out. Can’t blame ’em. It was hard to stay focused and engaged for 7 hours straight for me, too.

We did take breaks, but when that 3:30pm/4pm hits… you just start fading and becoming less and less engaging and vivacious as one audience member put it. “Very Vivacious presenter”… what a lovely compliment. ❤️

The hard part of leading a workshop like that is that your energy dictates the energy of the whole room. You have to bring the energy. You have to be engaged and engaging and vivacious and high energy. You have to pull everyone back when their eyes start to close and their minds start to wander.

Questions were much better at the Redux workshop, too. It’s hard to put my finger on it, but I think that’s because Redux is a bigger intellectual leap from how they already do things than React is.

With React, you get HTML as a first-class citizen of JavaScript. Great.

With Redux, you get a completely new way of thinking about the architecture of your webapp. Whoa.

So… what did I learn from all of this? A few things 👇

  1. You can prepare a good full day workshop in 1 week. It’s stressful as fuck, but doable.
  2. Materials don’t matter as much as you think. With a good example project, you can wing it for 5 hours and everyone will love it.
  3. Use your livecoding practice. Write code live, talk about what you’re doing, take questions.
  4. Leave smaller blanks for people to fill. At a workshop, people want to write code and see it run on their laptop. But the blanks you leave for them should be small. Smaller than you think is worth leaving.
  5. Show some code, leave a blank, fill the blank, repeat. People are most engaged with short 2 to 3-minute blanks to fill. Write a bunch of code, leave a blank, tell them what to do, give them a few minutes, fill it in yourself, continue showing code.
  6. Take anonymous feedback. In person, everyone says they loved your thing. Or they say nothing. When it’s anonymous, they don’t fear telling you exactly how it is.
  7. Just do what the feedback says. When you get that feedback, you know what, just do whatever it says and everyone’s happy. Who woulda thought, eh?
  8. This stuff is still new to a lot of people. My biggest surprise was that when you step away from the bleeding-edge conversation on Twitter, the world is gigantic. We live in a little bubble where everyone knows more than we do and is far more advanced, and we feel left behind. People at big corporations, the people writing JavaScript every day for billions of dollars, still struggle with fat arrow functions and spread operators and transpiling and all the things we take for granted.
  9. Enterprise sales are hard. Ben says it took him around 6 months to organize this workshop from when he first started talking with StubHub. Corporate training is an enterprise sales business.

The post What I learned from teaching StubHub about React and Redux appeared first on A geek with a hat.

Source: A geek with a hat (Original)

Proposing Better Ways to Think about Internal Linking

I’ve long thought that there was an opportunity to improve the way we think about internal links, and to make much more effective recommendations. I feel like, as an industry, we have done a decent job of making the case that internal links are important and that the information architecture of big sites, in particular, makes a massive difference to their performance in search (see: 30-minute IA audit and DistilledU IA module).

And yet we’ve struggled to dig deeper than finding particularly poorly-linked pages, and obviously-bad architectures, leading to recommendations that are hard to implement, with weak business cases.

I’m going to propose a methodology that:

  1. Incorporates external authority metrics into internal PageRank (what I’m calling “local PageRank”) – taking pure internal PageRank, which is the best data-driven approach we’ve seen for evaluating internal links, and avoiding its issues of focusing attention on the wrong areas

  2. Allows us to specify and evaluate multiple different changes in order to compare alternative approaches, figure out the scale of impact of a proposed change, and make better data-aware recommendations

Current information architecture recommendations are generally poor

Over the years, I’ve seen (and, ahem, made) many recommendations for improvements to internal linking structures and information architecture. In my experience, of all the areas we work in, this is an area of consistently weak recommendations.

I have often seen:

  • Vague recommendations – (“improve your information architecture by linking more to your product pages”) that don’t specify changes carefully enough to be actionable

  • No assessment of alternatives or trade-offs – does anything get worse if we make this change? Which page types might lose? How have we compared approach A and approach B?

  • Lack of a model – very limited assessment of the business value of making proposed changes – if everything goes to plan, what kind of improvement might we see? How do we compare the costs of what we are proposing to the anticipated benefits?

This is compounded in the case of internal linking changes because they are often tricky to specify (and to make at scale), hard to roll back, and very difficult to test (by now you know about our penchant for testing SEO changes – but internal architecture changes are among the trickiest to test because the anticipated uplift comes on pages that are not necessarily those being changed).

In my presentation at SearchLove London this year, I described different courses of action for factors in different areas of this grid:

It’s tough to make recommendations about internal links because while we have a fair amount of data about how links generally affect rankings, we have less information specifically focusing on internal links, and so while we have a high degree of control over them (in theory it’s completely within our control whether page A on our site links to page B) we need better analysis:

The current state of the art is powerful for diagnosis

If you want to get quickly up to speed on the latest thinking in this area, I’d strongly recommend reading these three articles and following their authors:

  1. Calculate internal PageRank by Paul Shapiro

  2. Using PageRank for internal link optimisation by Jan-Willem Bobbink

  3. Easy visualizations of PageRank and page groups by Patrick Stox

A load of smart people have done a ton of thinking on the subject and there are a few key areas where the state of the art is powerful:

There is no doubt that the kind of visualisations generated by techniques like those in the articles above are good for communicating problems you have found, and for convincing stakeholders of the need for action. Many people are highly visual thinkers, and it’s very often easier to explain a complex problem with a diagram. I personally find static visualisations difficult to analyse, however, and for discovering and diagnosing issues, you need data outputs and / or interactive visualisations:

But the state of the art has gaps:

The most obvious limitation is one that Paul calls out in his own article on calculating internal PageRank when he says:

“we see that our top page is our contact page. That doesn’t look right!”

This is a symptom of a wider problem which is that any algorithm looking at authority flow within the site that fails to take into account authority flow into the site from external links will be prone to getting misleading results. Less-relevant pages seem erroneously powerful, and poorly-integrated pages that have tons of external links seem unimportant in the pure internal PR calculation.

In addition, I hinted at this above, but I find visualisations very tricky – on large sites, they get too complex too quickly and have an element of the Rorschach to them:

My general attitude is to agree with O’Reilly that “Everything looks like a graph but almost nothing should ever be drawn as one”:

All of the best visualisations I’ve seen are nonetheless full link-graph visualisations – you will very often see crawl-depth charts, which are in my opinion even harder to read and obscure even more information than regular link graphs. It’s not only the sampling but also the inherent bias of showing links only in the order discovered from a single starting page – typically the homepage – a view that is useful only if the homepage is the only page on your site with any external links. This Sitebulb article talks about some of the challenges of drawing good crawl maps:

But by far the biggest gap I see is the almost total lack of any way of comparing current link structures to proposed ones, or for comparing multiple proposed solutions to see a) if they fix the problem, and b) which is better. The common focus on visualisations doesn’t scale well to comparisons – both because it’s hard to make a visualisation of a proposed change and because even if you can, the graphs will just look totally different because the layout is really sensitive to even fairly small tweaks in the underlying structure.

Our intuition is really bad when it comes to iterative algorithms

All of this wouldn’t be so much of a problem if our intuition was good. If we could just hold the key assumptions in our heads and make sensible recommendations from our many years of experience evaluating different sites.

Unfortunately, the same complexity that made PageRank such a breakthrough for Google in the early days makes for spectacularly hard problems for humans to evaluate. Even more unfortunately, not only are we clearly bad at calculating these things exactly, we’re surprisingly bad even at figuring them out directionally. [Long-time readers will no doubt see many parallels to the work I’ve done evaluating how bad (spoiler: really bad) SEOs are at understanding ranking factors generally].

I think that most people in the SEO field have a high-level understanding of at least the random surfer model of PR (and its extensions like reasonable surfer). Unfortunately, most of us are less good at having a mental model for the underlying eigenvector / eigenvalue problem and the infinite iteration / convergence of surfer models is troublesome to our intuition, to say the least.

I explored this intuition problem recently with a really simplified example and an unscientific poll:

The results were unsurprising – over 1 in 5 people got even a simple question wrong (the right answer is that a lot of the benefit of the link to the new page flows on to other pages in the site and it retains significantly less than an Nth of the PR of the homepage):

I followed this up with a trickier example and got a complete lack of consensus:

The right answer is that it loses (a lot) less than the PR of the new page except in some weird edge cases (I think only if the site has a very strange external link profile) where it can gain a tiny bit of PR. There is essentially zero chance that it doesn’t change, and no way for it to lose the entire PR of the new page.

Most of the wrong answers here are based on non-iterative understanding of the algorithm. It’s really hard to wrap your head around it all intuitively (I built a simulation to check my own answers – using the approach below).
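A toy version of that kind of simulation makes the iterative behaviour concrete. This is not the author’s actual simulation – the site structure and page names below are made up – but it shows how adding a page redistributes PR across the whole graph:

```python
import networkx as nx

# A tiny symmetric site: the homepage links to two pages,
# and each of them links back to the homepage
site = nx.DiGraph([('home', 'a'), ('a', 'home'),
                   ('home', 'b'), ('b', 'home')])
before = nx.pagerank(site)

# Add a new page linked from the homepage (and linking back)
site.add_edges_from([('home', 'new'), ('new', 'home')])
after = nx.pagerank(site)

# Local PR always sums to 1, so the new page's PR is "paid for"
# by small reductions spread across the existing pages
```

Even in this trivial case, the homepage’s PR changes after the addition, which non-iterative intuition tends to miss.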

All of this means that, since we don’t truly understand what’s going on, we are likely making very bad recommendations and certainly backing them up and arguing our case badly.

Doing better part 1: local PageRank solves the problems of internal PR

In order to be able to compare different proposed approaches, we need a way of re-running a data-driven calculation for different link graphs. Internal PageRank is one such re-runnable algorithm, but it suffers from the issues I highlighted above from having no concept of which pages it’s especially important to integrate well into the architecture because they have loads of external links, and it can mistakenly categorise pages as much stronger than they should be simply because they have links from many weak pages on your site.

In theory, you get a clearer picture of the performance of every page on your site – taking into account both external and internal links – by looking at internet-wide PageRank-style metrics. Unfortunately, we don’t have access to anything Google-scale here and the established link data providers have only sparse data for most websites – with data about only a fraction of all pages.

Even if they had dense data for all pages on your site, it wouldn’t solve the re-runnability problem – we wouldn’t be able to see how the metrics changed with proposed internal architecture changes.

What I’ve called “local” PageRank is an approach designed to attack this problem. It runs an internal PR calculation with what’s called a personalization vector designed to capture external authority weighting. This is not the same as re-running the whole PR calculation on a subgraph – that’s an extremely difficult problem that Google spent considerable resources to solve in their Caffeine update. Instead, it’s an approximation, but it’s one that solves the major issue we had with pure internal PR: unimportant pages showing up among the most powerful pages on the site.

Here’s how to calculate it:
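The first stage is getting your crawl into a directed graph. A minimal sketch, assuming a crawler export with Source and Destination columns (the column names and URLs here are made up; in practice you would read the real crawl file):

```python
import csv
import io

import networkx as nx

# Inlined here for illustration; in practice this would be open('crawl.csv')
crawl_csv = io.StringIO(
    "Source,Destination\n"
    "https://example.com/,https://example.com/products/\n"
    "https://example.com/products/,https://example.com/products/widget\n"
    "https://example.com/products/widget,https://example.com/\n"
)

site = nx.DiGraph()
for edge in csv.DictReader(crawl_csv):
    site.add_edge(edge['Source'], edge['Destination'])
```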

The next stage requires data from an external provider – I used raw mozRank – you can choose whichever provider you prefer, but make sure you are working with a raw metric rather than a logarithmically-scaled one, and make sure you are using a PageRank-like metric rather than a raw link count or ML-based metric like Moz’s page authority:

You need to normalise the external authority metric – as it will be calibrated on the entire internet while we need it to be a probability vector over our crawl – in other words to sum to 1 across our site:
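That normalisation is just dividing each page’s raw score by the sum across all crawled pages. A sketch with made-up raw scores:

```python
# Raw (non-logarithmic) external authority scores for crawled URLs;
# pages missing from the provider's data can simply default to zero
raw_authority = {
    'https://example.com/': 8.0,
    'https://example.com/products/': 1.5,
    'https://example.com/products/widget': 0.5,
}

total = sum(raw_authority.values())
personalization = {url: score / total for url, score in raw_authority.items()}
# personalization now sums to 1 across the site
```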

We then use the NetworkX PageRank library to calculate our local PageRank – here’s some outline code:
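A sketch of that outline, assuming the crawl has been loaded into a DiGraph called site and the normalised external authorities into personalization (a tiny made-up example is inlined here so the snippet runs standalone):

```python
import networkx as nx

site = nx.DiGraph([('home', 'a'), ('a', 'home'),
                   ('home', 'b'), ('b', 'home')])
# Normalised external authority vector (sums to 1 across the site)
personalization = {'home': 0.7, 'a': 0.2, 'b': 0.1}

# alpha is the damping factor; 1 - alpha is the jump probability at each
# iteration, and each jump lands according to the personalization vector
local_pr = nx.pagerank(site, alpha=0.5, personalization=personalization)
```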

What’s happening here is that by setting the personalization parameter to be the normalised vector of external authorities, we are saying that every time the random surfer “jumps”, instead of returning to a page on our site with uniform random chance, they return with probabilities proportional to the external authorities of those pages. This is roughly like saying that any time someone leaves your site in the random surfer model, they return via the weighted PageRank of the external links to your site’s pages. It’s fine that your external authority data might be sparse – you can just set values to zero for any pages without external authority data – one feature of this algorithm is that it’ll “fill in” appropriate values for those pages that are missing from the big data providers’ datasets.

In order to make this work, we also need to set the alpha parameter lower than we normally would (this is the damping parameter – normally set to 0.85 in regular PageRank – one minus alpha is the jump probability at each iteration). For much of my analysis, I set it to 0.5 – roughly representing the % of site traffic from external links – approximating the idea of a reasonable surfer.

There are a few things that I need to incorporate into this model to make it more useful – if you end up building any of this before I do, please do let me know:

  • Handle nofollow correctly (see Matt Cutts’ old PageRank sculpting post)

  • Handle redirects and rel canonical sensibly

  • Include top mR pages (or even all pages with mR) – even if they’re not in the crawl that starts at the homepage

    • You could even use each of these as a seed and crawl from these pages

  • Use the weight parameter in NetworkX to weight links by type to get closer to reasonable surfer model

    • The extreme version of this would be to use actual click-data for your own site to calibrate the behaviour to approximate an actual surfer!

Doing better part 2: describing and evaluating proposed changes to internal linking

After my frustration at trying to find a way of accurately evaluating internal link structures, my other major concern has been the challenges of comparing a proposed change to the status quo, or of evaluating multiple different proposed changes. As I said above, I don’t believe that this is easy to do visually as most of the layout algorithms used in the visualisations are very sensitive to the graph structure and just look totally different under even fairly minor changes. You can obviously drill into an interactive visualisation of the proposed change to look for issues, but that’s also fraught with challenges.

So my second proposed change to the methodology is to find ways to compare the local PR distribution we’ve calculated above between different internal linking structures. There are two major components to being able to do this:

  1. Efficiently describing or specifying the proposed change or new link structure; and

  2. Effectively comparing the distributions of local PR – across what is likely tens or hundreds of thousands of pages

How to specify a change to internal linking

I have three proposed ways of specifying changes:

1. Manually adding or removing small numbers of links

Although it doesn’t scale well, if you are just looking at changes to a limited number of pages, one option is simply to manipulate the spreadsheet of crawl data before loading it into your script:

2. Programmatically adding or removing edges as you load the crawl data

Your script will have a function that loads the data from the crawl file and builds the graph structure (a DiGraph in NetworkX terms, i.e. a directed graph). At this point, if you want to simulate adding a sitewide link to a particular page, you can do so – for example, if this line sat inside the loop loading edges, it would add a link from every page to our London SearchLove page:

site.add_edges_from([(edge['Source'], 'http://ift.tt/1mh63dU')])

You don’t need to worry about adding duplicates (i.e. checking whether a page already links to the target) because a DiGraph has no concept of multiple edges in the same direction between the same nodes, so if it’s already there, adding it will do no harm.
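You can verify that behaviour directly:

```python
import networkx as nx

g = nx.DiGraph()
g.add_edge('page-a', 'page-b')
g.add_edge('page-a', 'page-b')  # duplicate add: silently a no-op
print(g.number_of_edges())      # still 1
```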

Removing edges programmatically is a little trickier – because if you want to remove a link from global navigation, for example, you need logic that knows which pages have non-navigation links to the target, as you don’t want to remove those as well (you generally don’t want to remove all links to the target page). But in principle, you can make arbitrary changes to the link graph in this way.
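A sketch of that kind of selective removal. The page names and the body_linkers set below are hypothetical; in practice you would derive the set of body-content linkers from link-position data in your crawl:

```python
import networkx as nx

site = nx.DiGraph([('home', 'promo'),
                   ('blog-post', 'promo'),
                   ('about', 'promo')])

target = 'promo'
# Pages known to link to the target from body copy rather than
# (only) from the global navigation we are removing
body_linkers = {'blog-post'}

for source in list(site.predecessors(target)):
    if source not in body_linkers:
        site.remove_edge(source, target)
```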

3. Crawl a staging site to capture more complex changes

As the changes get more complex, it can be tough to describe them in sufficient detail. For certain kinds of changes, it feels to me as though the best way to load the changed structure is to crawl a staging site with the new architecture. Of course, in general, this means having the whole thing implemented and ready to go, the effort of doing which negates a large part of the benefit of evaluating the change in advance. We have a secret weapon here which is that the “meta-CMS” nature of our ODN platform allows us to make certain changes incredibly quickly across site sections and create preview environments where we can see changes even for companies that aren’t customers of the platform yet.

For example, it looks like this to add a breadcrumb across a site section on one of our customers’ sites:

There are a few extra tweaks to the process if you’re going to crawl a staging or preview environment to capture internal link changes – because we need to make sure that the set of pages is identical in both crawls so we can’t just start at each homepage and crawl X levels deep. By definition we have changed the linking structure and therefore will discover a different set of pages. Instead, we need to:

  • Crawl both live and preview to X levels deep

  • Combine into a superset of all pages discovered on either crawl (noting that these pages exist on both sites – we haven’t created any new pages in preview)

  • Make lists of pages missing in each crawl and crawl those from lists
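The set arithmetic behind those steps is straightforward; with made-up URL sets standing in for real crawl output:

```python
# URLs discovered crawling each environment X levels deep (hypothetical)
live_pages = {'/', '/products/', '/products/widget'}
preview_pages = {'/', '/products/', '/category/widgets'}

# Superset of everything discovered in either crawl
all_pages = live_pages | preview_pages

# Pages still to be fetched in each environment, crawled from lists
missing_from_live = all_pages - live_pages
missing_from_preview = all_pages - preview_pages
```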

Once you have both crawls, and both include the same set of pages, you can re-run the algorithm described above to get the local PageRanks under each scenario and begin comparing them.

How to compare different internal link graphs

Sometimes you will have a specific problem you are looking to address (e.g. only y% of our product pages are indexed) – in which case you will likely want to check whether your change has improved the flow of authority to those target pages, compare their performance under proposed change A and proposed change B etc. Note that it is hard to evaluate losers with this approach – because the normalisation means that the local PR will always sum to 1 across your whole site so there always are losers if there are winners – in contrast to the real world where it is theoretically possible to have a structure that strictly dominates another.
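For a specific page type, the comparison can be as simple as differencing the two local PR dictionaries (the URLs and values here are made up):

```python
# Local PageRank under the current structure and under a proposed change
pr_current = {'/products/a': 0.010, '/products/b': 0.008, '/blog/post': 0.030}
pr_proposed = {'/products/a': 0.014, '/products/b': 0.011, '/blog/post': 0.023}

product_pages = [url for url in pr_current if url.startswith('/products/')]
uplift = {url: pr_proposed[url] - pr_current[url] for url in product_pages}
# Because local PR sums to 1, gains here imply losses elsewhere (the blog post)
```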

In general, if you are simply evaluating how to make the internal link architecture “better”, you are less likely to jump to evaluating specific pages. In this case, you probably want to do some evaluation of different kinds of page on your site – identified either by:

  1. Labelling them by URL – e.g. everything in /blog or with ?productId in the URL

  2. Labelling them as you crawl

    1. Either from crawl structure – e.g. all pages 3 levels deep from the homepage, all pages linked from the blog etc)

    2. Or based on the crawled HTML (all pages with more than x links on them, with a particular breadcrumb or piece of meta information labelling them)

  3. Using modularity to label them automatically by algorithmically grouping pages in similar “places” in the link structure

I’d like to be able to also come up with some overall “health” score for an internal linking structure – and have been playing around with scoring it based on some kind of equality metric under the thesis that if you’ve chosen your indexable page set well, you want to distribute external authority as well throughout that set as possible. This thesis seems most likely to hold true for large long-tail-oriented sites that get links to pages which aren’t generally the ones looking to rank (e.g. e-commerce sites). It also builds on some of Tom Capper’s thinking (videoslides, blog post) about links being increasingly important for getting into Google’s consideration set for high-volume keywords which is then reordered by usage metrics and ML proxies for quality.

I have more work to do here, but I hope to develop an effective metric – it’d be great if it could build on established equality metrics like the Gini Coefficient. If you’ve done any thinking about this, or have any bright ideas, I’d love to hear your thoughts in the comments, or on Twitter.

Source: Distilled (Original

English Google Webmaster Central office-hours hangout

Join us for a Google Webmaster Central office-hours hangout on Oct 17, 4pm CET http://ift.tt/2ycVEy6 Add your questions at http://ift.tt/2cBVkA7 . This session is open to anything webmaster-related, like crawling, indexing, mobile sites, internationalization, duplicate content, Sitemaps, Search Console, pagination, multi-lingual/multi-regional sites, etc. This is a Hangout on YouTube Live. To join live, watch out for the link once the event starts, and use a webcam + headset. Feel free to drop by – we welcome webmasters of all levels!

Source: Google Webmasters (uploads) on YouTube (Original)

Proposing Better Ways to Think about Internal Linking

I’ve long thought that there was an opportunity to improve the way we think about internal links, and to make much more effective recommendations. I feel like, as an industry, we have done a decent job of making the case that internal links are important and that the information architecture of big sites, in particular, makes a massive difference to their performance in search (see: 30-minute IA audit and DistilledU IA module).

And yet we’ve struggled to dig deeper than finding particularly poorly-linked pages, and obviously-bad architectures, leading to recommendations that are hard to implement, with weak business cases.

I’m going to propose a methodology that:

  1. Incorporates external authority metrics into internal PageRank (what I’m calling “local PageRank”) – taking pure internal PageRank, the best data-driven approach we’ve seen for evaluating internal links, and fixing the issues that cause it to focus attention on the wrong areas

  2. Allows us to specify and evaluate multiple different changes in order to compare alternative approaches, figure out the scale of impact of a proposed change, and make better data-aware recommendations

Current information architecture recommendations are generally poor

Over the years, I’ve seen (and, ahem, made) many recommendations for improvements to internal linking structures and information architecture. In my experience, of all the areas we work in, this is an area of consistently weak recommendations.

I have often seen:

  • Vague recommendations – (“improve your information architecture by linking more to your product pages”) that don’t specify changes carefully enough to be actionable

  • No assessment of alternatives or trade-offs – does anything get worse if we make this change? Which page types might lose? How have we compared approach A and approach B?

  • Lack of a model – very limited assessment of the business value of making proposed changes – if everything goes to plan, what kind of improvement might we see? How do we compare the costs of what we are proposing to the anticipated benefits?

This is compounded in the case of internal linking changes because they are often tricky to specify (and to make at scale), hard to roll back, and very difficult to test (by now you know about our penchant for testing SEO changes – but internal architecture changes are among the trickiest to test because the anticipated uplift comes on pages that are not necessarily those being changed).

In my presentation at SearchLove London this year, I described different courses of action for factors in different areas of this grid:

It’s tough to make recommendations about internal links because while we have a fair amount of data about how links generally affect rankings, we have less information specifically focusing on internal links, and so while we have a high degree of control over them (in theory it’s completely within our control whether page A on our site links to page B) we need better analysis:

The current state of the art is powerful for diagnosis

If you want to get quickly up to speed on the latest thinking in this area, I’d strongly recommend reading these three articles and following their authors:

  1. Calculate internal PageRank by Paul Shapiro

  2. Using PageRank for internal link optimisation by Jan-Willem Bobbink

  3. Easy visualizations of PageRank and page groups by Patrick Stox

A load of smart people have done a ton of thinking on the subject and there are a few key areas where the state of the art is powerful:

There is no doubt that the kind of visualisations generated by techniques like those in the articles above are good for communicating problems you have found, and for convincing stakeholders of the need for action. Many people are highly visual thinkers, and it’s very often easier to explain a complex problem with a diagram. I personally find static visualisations difficult to analyse, however, and for discovering and diagnosing issues, you need data outputs and / or interactive visualisations:

But the state of the art has gaps:

The most obvious limitation is one that Paul calls out in his own article on calculating internal PageRank when he says:

“we see that our top page is our contact page. That doesn’t look right!”

This is a symptom of a wider problem which is that any algorithm looking at authority flow within the site that fails to take into account authority flow into the site from external links will be prone to getting misleading results. Less-relevant pages seem erroneously powerful, and poorly-integrated pages that have tons of external links seem unimportant in the pure internal PR calculation.

In addition, I hinted at this above, but I find visualisations very tricky – on large sites, they get too complex too quickly and have an element of the Rorschach to them:

My general attitude is to agree with O’Reilly that “Everything looks like a graph but almost nothing should ever be drawn as one”:

Even the best visualisations I’ve seen are full link-graph visualisations; you will also very often see crawl-depth charts, which are in my opinion even harder to read and obscure even more information than regular link graphs. It’s not only the sampling, but the inherent bias of showing links only in the order they are discovered from a single starting page – typically the homepage – which is appropriate only if that’s the only page on your site with any external links. This Sitebulb article talks about some of the challenges of drawing good crawl maps:

But by far the biggest gap I see is the almost total lack of any way of comparing current link structures to proposed ones, or for comparing multiple proposed solutions to see a) if they fix the problem, and b) which is better. The common focus on visualisations doesn’t scale well to comparisons – both because it’s hard to make a visualisation of a proposed change and because even if you can, the graphs will just look totally different because the layout is really sensitive to even fairly small tweaks in the underlying structure.

Our intuition is really bad when it comes to iterative algorithms

All of this wouldn’t be so much of a problem if our intuition was good. If we could just hold the key assumptions in our heads and make sensible recommendations from our many years of experience evaluating different sites.

Unfortunately, the same complexity that made PageRank such a breakthrough for Google in the early days makes for spectacularly hard problems for humans to evaluate. Even more unfortunately, not only are we clearly bad at calculating these things exactly, we’re surprisingly bad even at figuring them out directionally. [Long-time readers will no doubt see many parallels to the work I’ve done evaluating how bad (spoiler: really bad) SEOs are at understanding ranking factors generally].

I think that most people in the SEO field have a high-level understanding of at least the random surfer model of PR (and its extensions like reasonable surfer). Unfortunately, most of us are less good at having a mental model for the underlying eigenvector / eigenvalue problem and the infinite iteration / convergence of surfer models is troublesome to our intuition, to say the least.

I explored this intuition problem recently with a really simplified example and an unscientific poll:

The results were unsurprising – over 1 in 5 people got even a simple question wrong (the right answer is that a lot of the benefit of the link to the new page flows on to other pages in the site and it retains significantly less than an Nth of the PR of the homepage):

I followed this up with a trickier example and got a complete lack of consensus:

The right answer is that it loses (a lot) less than the PR of the new page except in some weird edge cases (I think only if the site has a very strange external link profile) where it can gain a tiny bit of PR. There is essentially zero chance that it doesn’t change, and no way for it to lose the entire PR of the new page.

Most of the wrong answers here are based on non-iterative understanding of the algorithm. It’s really hard to wrap your head around it all intuitively (I built a simulation to check my own answers – using the approach below).
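To give a flavour of that kind of simulation, here’s a minimal sketch in NetworkX (the five-page site is entirely made up – it’s just there to show the before/after calculation, not to reproduce either poll exactly):

```python
import networkx as nx

# A tiny five-page site: the homepage links to three section pages,
# and every page links back to the homepage.
site = nx.DiGraph()
site.add_edges_from([
    ('home', 'a'), ('home', 'b'), ('home', 'c'),
    ('a', 'home'), ('b', 'home'), ('c', 'home'),
])

before = nx.pagerank(site)

# Now add a new page, linked from the homepage (and linking back).
site.add_edges_from([('home', 'new'), ('new', 'home')])
after = nx.pagerank(site)

# The new page ends up with far less PageRank than the homepage --
# much of the authority it receives flows straight back out again.
print(after['new'], after['home'])
```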

All of this means that, since we don’t truly understand what’s going on, we are likely making very bad recommendations and certainly backing them up and arguing our case badly.

Doing better part 1: local PageRank solves the problems of internal PR

In order to be able to compare different proposed approaches, we need a way of re-running a data-driven calculation for different link graphs. Internal PageRank is one such re-runnable algorithm, but it suffers from the issues I highlighted above: it has no concept of which pages it’s especially important to integrate well into the architecture because they have loads of external links, and it can mistakenly categorise pages as much stronger than they should be simply because they have links from many weak pages on your site.

In theory, you get a clearer picture of the performance of every page on your site – taking into account both external and internal links – by looking at internet-wide PageRank-style metrics. Unfortunately, we don’t have access to anything Google-scale here and the established link data providers have only sparse data for most websites – with data about only a fraction of all pages.

Even if they had dense data for all pages on your site, it wouldn’t solve the re-runnability problem – we wouldn’t be able to see how the metrics changed with proposed internal architecture changes.

What I’ve called “local” PageRank is an approach designed to attack this problem. It runs an internal PR calculation with what’s called a personalization vector designed to capture external authority weighting. This is not the same as re-running the whole PR calculation on a subgraph – that’s an extremely difficult problem that Google spent considerable resources to solve with their Caffeine update. Instead, it’s an approximation, but one that solves the major issue we had with pure internal PR: unimportant pages showing up among the most powerful pages on the site.

Here’s how to calculate it:
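As a sketch, the first step is loading a crawl of the site into a directed graph (the filename and the ‘Source’ / ‘Destination’ column names are assumptions about your crawler’s export format – adjust them to match whatever your crawler actually produces):

```python
import csv
import networkx as nx

def load_crawl(path):
    """Build a directed link graph from a crawler's inlinks export.

    Assumes a CSV with 'Source' and 'Destination' columns -- adjust
    to match your crawler's actual export format.
    """
    site = nx.DiGraph()
    with open(path, newline='') as f:
        for edge in csv.DictReader(f):
            site.add_edge(edge['Source'], edge['Destination'])
    return site
```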

The next stage requires data from an external provider – I used raw mozRank – you can choose whichever provider you prefer, but make sure you are working with a raw metric rather than a logarithmically-scaled one, and make sure you are using a PageRank-like metric rather than a raw link count or ML-based metric like Moz’s page authority:

You need to normalise the external authority metric – as it will be calibrated on the entire internet while we need it to be a probability vector over our crawl – in other words to sum to 1 across our site:
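That normalisation is just dividing each page’s raw score by the total (a sketch – the `raw_mozrank` values here are illustrative, not real data):

```python
# Raw external authority scores (illustrative values) keyed by URL.
raw_mozrank = {'/': 5.2, '/category': 1.1, '/product': 0.4}

total = sum(raw_mozrank.values())

# Scale the scores so they sum to 1 across the site, making them
# usable as a probability (personalization) vector.
external_authority = {url: score / total for url, score in raw_mozrank.items()}
```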

We then use the NetworkX PageRank library to calculate our local PageRank – here’s some outline code:
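A sketch of that call, with toy inputs standing in for the real crawl graph and normalised authority vector:

```python
import networkx as nx

# Toy stand-ins for the real inputs: the crawled link graph and the
# normalised external authority vector (values sum to 1).
site = nx.DiGraph([('/', '/a'), ('/', '/b'), ('/a', '/'), ('/b', '/')])
external_authority = {'/': 0.8, '/a': 0.15, '/b': 0.05}

# alpha is the damping factor; 0.5 rather than the usual 0.85, so the
# random surfer "jumps" back in via external links far more often.
local_pr = nx.pagerank(site, alpha=0.5, personalization=external_authority)
```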

What’s happening here is that by setting the personalization parameter to be the normalised vector of external authorities, we are saying that every time the random surfer “jumps”, instead of returning to a page on our site with uniform random chance, they return with probabilities proportional to the external authorities of those pages. This is roughly like saying that any time someone leaves your site in the random surfer model, they return via the weighted PageRank of the external links to your site’s pages. It’s fine that your external authority data might be sparse – you can just set values to zero for any pages without external authority data – one feature of this algorithm is that it’ll “fill in” appropriate values for those pages that are missing from the big data providers’ datasets.

In order to make this work, we also need to set the alpha parameter lower than we normally would (this is the damping parameter – normally set to 0.85 in regular PageRank – one minus alpha is the jump probability at each iteration). For much of my analysis, I set it to 0.5 – roughly representing the % of site traffic from external links – approximating the idea of a reasonable surfer.

There are a few things that I need to incorporate into this model to make it more useful – if you end up building any of this before I do, please do let me know:

  • Handle nofollow correctly (see Matt Cutts’ old PageRank sculpting post)

  • Handle redirects and rel canonical sensibly

  • Include top mR pages (or even all pages with mR) – even if they’re not in the crawl that starts at the homepage

    • You could even use each of these as a seed and crawl from these pages

  • Use the weight parameter in NetworkX to weight links by type to get closer to reasonable surfer model

    • The extreme version of this would be to use actual click-data for your own site to calibrate the behaviour to approximate an actual surfer!

Doing better part 2: describing and evaluating proposed changes to internal linking

After my frustration at trying to find a way of accurately evaluating internal link structures, my other major concern has been the challenges of comparing a proposed change to the status quo, or of evaluating multiple different proposed changes. As I said above, I don’t believe that this is easy to do visually as most of the layout algorithms used in the visualisations are very sensitive to the graph structure and just look totally different under even fairly minor changes. You can obviously drill into an interactive visualisation of the proposed change to look for issues, but that’s also fraught with challenges.

So my second proposed change to the methodology is to find ways to compare the local PR distribution we’ve calculated above between different internal linking structures. There are two major components to being able to do this:

  1. Efficiently describing or specifying the proposed change or new link structure; and

  2. Effectively comparing the distributions of local PR – across what is likely tens or hundreds of thousands of pages

How to specify a change to internal linking

I have three proposed ways of specifying changes:

1. Manually adding or removing small numbers of links

Although it doesn’t scale well, if you are just looking at changes to a limited number of pages, one option is simply to manipulate the spreadsheet of crawl data before loading it into your script:

2. Programmatically adding or removing edges as you load the crawl data

Your script will have a function that loads the data from the crawl file and builds the graph structure (a DiGraph in NetworkX terms – short for Directed Graph). At this point, if you want to simulate adding a sitewide link to a particular page, you can do so – for example, if this line sat inside the loop loading edges, it would add a link from every page to our London SearchLove page:

site.add_edges_from([(edge['Source'], 'http://ift.tt/1mh63dU')])

You don’t need to worry about adding duplicates (i.e. checking whether a page already links to the target) because a DiGraph has no concept of multiple edges in the same direction between the same nodes, so if it’s already there, adding it will do no harm.
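You can verify that no-harm behaviour directly – adding the same directed edge twice leaves a DiGraph unchanged:

```python
import networkx as nx

g = nx.DiGraph()
g.add_edge('a', 'b')
g.add_edge('a', 'b')  # adding the same directed edge again is a no-op
```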

Removing edges programmatically is a little trickier – because if you want to remove a link from global navigation, for example, you need logic that knows which pages have non-navigation links to the target, as you don’t want to remove those as well (you generally don’t want to remove all links to the target page). But in principle, you can make arbitrary changes to the link graph in this way.
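A sketch of that kind of conditional removal (the target URL, the page set, and the notion of known “content linkers” are all illustrative – in practice that logic would come from your crawl data):

```python
import networkx as nx

# A toy graph where every page carries a sitewide nav link to the target.
site = nx.DiGraph()
target = '/category/widgets'
for page in ['/', '/blog/widget-roundup', '/about']:
    site.add_edge(page, target)

# Pages known to link to the target from within their content
# (illustrative) -- we keep those edges when stripping the nav link.
content_linkers = {'/blog/widget-roundup'}

for page in list(site.predecessors(target)):
    if page not in content_linkers:
        site.remove_edge(page, target)
```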

3. Crawl a staging site to capture more complex changes

As the changes get more complex, it can be tough to describe them in sufficient detail. For certain kinds of changes, it feels to me as though the best way to load the changed structure is to crawl a staging site with the new architecture. Of course, in general, this means having the whole thing implemented and ready to go, the effort of doing which negates a large part of the benefit of evaluating the change in advance. We have a secret weapon here which is that the “meta-CMS” nature of our ODN platform allows us to make certain changes incredibly quickly across site sections and create preview environments where we can see changes even for companies that aren’t customers of the platform yet.

For example, it looks like this to add a breadcrumb across a site section on one of our customers’ sites:

There are a few extra tweaks to the process if you’re going to crawl a staging or preview environment to capture internal link changes – because we need to make sure that the set of pages is identical in both crawls so we can’t just start at each homepage and crawl X levels deep. By definition we have changed the linking structure and therefore will discover a different set of pages. Instead, we need to:

  • Crawl both live and preview to X levels deep

  • Combine into a superset of all pages discovered on either crawl (noting that these pages exist on both sites – we haven’t created any new pages in preview)

  • Make lists of pages missing in each crawl and crawl those from lists
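In code, combining into the superset and finding what each crawl missed is simple set arithmetic (a sketch, with toy graphs standing in for the real crawls):

```python
import networkx as nx

# Toy stand-ins for the live and preview crawls of the same site.
live = nx.DiGraph([('/', '/a'), ('/a', '/b')])
preview = nx.DiGraph([('/', '/a'), ('/', '/c')])

live_pages = set(live.nodes())
preview_pages = set(preview.nodes())

# The superset of everything discovered on either crawl...
all_pages = live_pages | preview_pages

# ...and the list each crawl still needs to fetch so that both
# end up covering exactly the same set of pages.
missing_from_live = all_pages - live_pages
missing_from_preview = all_pages - preview_pages
```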

Once you have both crawls, and both include the same set of pages, you can re-run the algorithm described above to get the local PageRanks under each scenario and begin comparing them.

How to compare different internal link graphs

Sometimes you will have a specific problem you are looking to address (e.g. only y% of our product pages are indexed) – in which case you will likely want to check whether your change has improved the flow of authority to those target pages, compare their performance under proposed change A and proposed change B etc. Note that it is hard to evaluate losers with this approach – because the normalisation means that the local PR will always sum to 1 across your whole site so there always are losers if there are winners – in contrast to the real world where it is theoretically possible to have a structure that strictly dominates another.

In general, if you are simply evaluating how to make the internal link architecture “better”, you are less likely to jump to evaluating specific pages. In this case, you probably want to do some evaluation of different kinds of page on your site – identified either by:

  1. Labelling them by URL – e.g. everything in /blog or with ?productId in the URL

  2. Labelling them as you crawl

    1. Either from crawl structure (e.g. all pages 3 levels deep from the homepage, all pages linked from the blog, etc.)

    2. Or based on the crawled HTML (all pages with more than x links on them, with a particular breadcrumb or piece of meta information labelling them)

  3. Using modularity to label them automatically by algorithmically grouping pages in similar “places” in the link structure
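As a sketch of option 1, labelling by URL pattern and aggregating local PageRank per label makes it easy to compare page types between scenarios (the URLs, patterns, and scores here are all illustrative):

```python
from collections import defaultdict

# Illustrative local PageRank output: URL -> score.
local_pr = {
    '/': 0.30,
    '/blog/post-1': 0.05,
    '/blog/post-2': 0.04,
    '/shop?productId=1': 0.02,
    '/shop?productId=2': 0.01,
}

def label(url):
    # Label pages by URL pattern -- these patterns are examples.
    if url.startswith('/blog'):
        return 'blog'
    if 'productId' in url:
        return 'product'
    return 'other'

# Total local PageRank captured by each page type.
totals = defaultdict(float)
for url, score in local_pr.items():
    totals[label(url)] += score
```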

I’d like to be able to come up with some overall “health” score for an internal linking structure – and have been playing around with scoring it based on some kind of equality metric, under the thesis that if you’ve chosen your indexable page set well, you want to distribute external authority as evenly throughout that set as possible. This thesis seems most likely to hold true for large long-tail-oriented sites that get links to pages which aren’t generally the ones looking to rank (e.g. e-commerce sites). It also builds on some of Tom Capper’s thinking (video, slides, blog post) about links being increasingly important for getting into Google’s consideration set for high-volume keywords, which is then reordered by usage metrics and ML proxies for quality.
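As a sketch of what such an equality-based score might look like, here is a standard formulation of the Gini coefficient applied to a list of local PageRank values (0 means authority is spread perfectly evenly; values approaching 1 mean it is concentrated on a few pages) – a starting point, not a settled metric:

```python
def gini(values):
    """Gini coefficient: 0 = perfectly equal distribution,
    approaching 1 = authority concentrated on very few pages."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    # Standard formulation over values sorted ascending (1-indexed).
    return sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1)) / (n * total)
```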

I have more work to do here, but I hope to develop an effective metric – it’d be great if it could build on established equality metrics like the Gini Coefficient. If you’ve done any thinking about this, or have any bright ideas, I’d love to hear your thoughts in the comments, or on Twitter.

Source: Distilled (Original)

You Need This 2018 Marketing Calendar & Free Template! by @annaleacrowe

What separates good marketers from great marketers?

Planning.

Now is the perfect time to begin planning for 2018. If you haven’t started yet, it isn’t too late.

Make sure your planning starts with this 2018 marketing calendar and free template.

As a marketer, I’m always looking for inspiration. But out of all the awesome marketing campaigns I see throughout the year (like the My Oreo Creation contest or Starbucks teaming up with Lady Gaga), when it comes time to plan my clients’ marketing calendars I can only seem to remember one or two.

Plus, there are so many special days, weeks, and months to remember.

In other words, there are so many amazing content marketing opportunities ahead!

That’s why I created this calendar and template.

To simplify your life. (And mine, I won’t lie.)

This is the 2018 marketing calendar you’ve been waiting for.

Happy planning!

Free Marketing Calendar Templates

Before we dive into each of the months for 2018, here are my free marketing calendar templates for 2018.

Marketing Calendar for 2018 + Google Calendar

Here is the marketing calendar for 2018.


Here is the Google Calendar link. 

Here is the Google Calendar ID: fd7nlomqn83tq8c60dl9hk7cgs@group.calendar.google.com


Now, let’s move ahead to your 2018 marketing calendar, in all its holiday glory.


2018 Marketing Calendar

I’ve broken down this calendar by month, so if you want to jump to a specific month, just click on the links below:

January

While some may spend the beginning of their year recovering from the holidays and New Year’s celebrations, for most, January is a time for a change.

From weight loss to home improvement goals, many Americans make New Year’s resolutions to improve themselves in some way, so it’s the perfect time to capitalize on fitness, healthy eating, and career ambitions.

Encourage your customers to stick to their goals. Focus on themes that support and motivate them to start a great year.

Monthly Observances
Weight Loss Awareness Month
National Blood Donor Month
National Thank You Month
National Hobby Month
National Tea Month
Girl Scout Cookie Season Begins

Weekly Observances
January 1-7 – Diet Resolution Week
January 14-20 – Hunt For Happiness Week
January 15-19 – Sugar Awareness Week
January 22-26 – Clean Out Your Inbox Week
January 22-26 – National School Choice Week
January 28-February 2 – Meat Week

Days
January 1 – New Year’s Day
January 1 – National Hangover Day
January 2 – Personal Trainer Awareness Day
January 4 – Trivia Day
January 5 – National Bird Day
January 6 – Cuddle Up Day
January 7 – Golden Globes
January 8 – Clean Off Your Desk Day
January 9 – Girl Scout Cookie Pre-Sales Begin
January 10 – National Bittersweet Chocolate Day
January 11 – National Human Trafficking Awareness Day
January 14 – Dress Up Your Pet Day
January 15 – Martin Luther King, Jr. Day
January 17 – Ditch New Year’s Resolutions Day
January 18 – Get to Know Your Customers Day
January 18 – Winnie the Pooh Day (Author A.A. Milne’s birthday)
January 19 – National Popcorn Day
January 20 – Penguin Awareness Day
January 21 – National Hugging Day
January 21 – SAG Awards
January 24 – Compliment Day
January 24 – National Peanut Butter Day
January 25 – Opposite Day
January 26 – National Green Juice Day
January 26 – Spouse’s Day
January 27 – Chocolate Cake Day
January 28 – Grammy Awards
January 28 – Fun at Work Day
January 29 – National Puzzle Day
January 31 – Backward Day

Examples of holiday marketing for brands:

February

From Valentine’s Day to the Super Bowl, February may be short, but it’s jam-packed full of holidays.

Every marketer knows they’ll have their hands full with chocolate hearts and footballs this month. Why not embrace it?

This month is all about love and fun: being together and having a good time.

Monthly Observances
Black History Month
American Heart Month
National Heart Month
National Weddings Month
National Cherry Month

Weekly Observances
February 1-7 – Eating Disorder Awareness Week
February 8-16 – Fashion Week
February 10-16 – Freelance Writers Appreciation Week
February 13-19 – International Flirting Week
February 14-20 – Random Acts of Kindness Week
February 14-21 – Condom Week

Days
February 1 – National Freedom Day
February 2 – Groundhog Day
February 2 – Wear Red Day
February 2 – Bubble Gum Day
February 4 – Super Bowl Sunday
February 5 – World Nutella Day
February 6 – National Chopsticks Day
February 7 – Give Kids a Smile Day
February 8 – Boy Scout’s Day
February 9 – National Pizza Day
February 10 – Umbrella Day
February 11 – Make a Friend Day
February 12 – Lincoln’s Birthday
February 13 – Mardi Gras/Fat Tuesday
February 14 – Valentine’s Day
February 14 – Ash Wednesday
February 15 – Single’s Awareness Day
February 16 – Chinese New Year
February 17 – Random Acts of Kindness Day
February 18 – Drink Wine Day
February 19 – President’s Day
February 22 – Washington’s Birthday
February 22 – Margarita Day
February 22 – Walk Your Dog Day
February 24 – National Tortilla Chip Day
February 26 – Girl Scout Cookie Booth Sales Begin
February 26 – National Pistachio Day
February 28 – Floral Design Day
February 29 – Leap Day (The next leap year is in 2020.)

Examples of holiday marketing for brands:

March

March is a time of wearing green and welcoming spring.

There’s so much to look forward to this month, from St. Patrick’s Day festivities to Academy Award celebrations.

In other words, there’s always an excuse for a party.

Monthly Observances
National Women’s History Month
National Nutrition Month
National Peanut Month
National Music in Our Schools Month
National Craft Month
National Irish Heritage Month
American Red Cross Month
March for Meals
The Great American Cleanup

Weekly Observances
March 12-17 – Girl Scout Week
March 12-17 – Campfire Birthday Week
March 23-28 – National Sleep Awareness Week
March 26-31 – National Cleaning Week

Days
March 1 – Peanut Butter Lover’s Day
March 2 – National Read Across America Day (Dr. Seuss Day)
March 2 – Employee Appreciation Day
March 3 – Day of Unplugging
March 3 – World Wildlife Day
March 4 – Academy Awards
March 6 – Dentist’s Day
March 8 – International Women’s Day
March 8 – Popcorn Lover’s Day
March 11 – Daylight Saving Time Begins
March 12 – Girl Scout Day
March 12 – Napping Day
March 13 – Jewel Day
March 14 – National Pi Day
March 15 – The Ides of March
March 16 – World Sleep Day
March 17 – St. Patrick’s Day
March 20 – First Day of Spring
March 20 – National Agriculture Day
March 22 – World Water Day
March 23 – Puppy Day
March 25 – Palm Sunday
March 26 – Purple Day for Epilepsy Awareness
March 29 – Mom & Pop Business Owner’s Day
March 29 – Baseball Opening Day
March 30 – Good Friday
March 30 – National Doctor’s Day
March 31 – Crayon Day

Examples of holiday marketing for brands:


April

April: finally you can fill your posts with pastels and use baby animals in your blog!

Spring is in full bloom so remind your followers to check the house for any unfound eggs (before they start to smell), and give them pics of fresh flowers to help them forget about tax season.

Monthly Observances
Earth Month
National Volunteer Month
National Autism Awareness Month
Keep America Beautiful Month
National Garden Month
Stress Awareness Month
National Poetry Month

Weekly Observances
April 5-8 – Masters Week
April 15-22 – National Volunteer Week
April 16-22 – Animal Cruelty/Human Violence Awareness Week
April 22-28 – Every Kid Healthy Week
April 22-28 – National Princess Week
April 23-29 – Administrative Professionals Week

Days
April 1 – April Fool’s Day
April 1 – Easter
April 2 – World Autism Awareness Day
April 2 – National Peanut Butter and Jelly Day
April 3 – Don’t Go To Work Unless it’s Fun Day
April 4 – School Librarian Day
April 7 – World Health Day
April 7 – National Beer Day
April 9 – Winston Churchill Day
April 10 – Golfer’s Day
April 11 – National Pet Day
April 12 – National Grilled Cheese Day
April 13 – Friday the 13th
April 13 – Coachella Music Festival Begins
April 15 – National Titanic Remembrance Day
April 17 – Tax Day
April 22 – Earth Day
April 25 – Administrative Professional’s Day
April 27 – Arbor Day
April 28 – National Superhero Day
April 30 – National Adopt a Shelter Pet Day

Examples of holiday marketing for brands:


May

Finally! The weather’s getting warmer and spring break is upon us.

Get your customers to feel those spring vibes with lots of colors, florals, and a whole bunch of sunshine.

And, of course, every marketer can’t forget May is all about Mom. Remind your followers that she deserves the best flowers, tastiest chocolates, and lots of fun this month.

Monthly Observances
ALS Awareness
Asthma Awareness
National Celiac Disease Awareness Month
Clean Air Month
Global Employee Health and Fitness Month
National Barbecue Month
National Bike Month
National Hamburger Month
National Salad Month
National Photograph Month
Gifts from the Garden Month
Lupus Awareness Month
Military Family Appreciation Month

Weekly Observances
April 30-May 4 – National Tourism Week
April 30-May 4 – Drinking Water Week
April 30-May 4 – National Pet Week
May 6-12 – Nurse’s Week
May 7-11 – Teacher Appreciation Week
Food Allergy Awareness Week (second full week of May)

Days
May 1 – May Day
May 1 – Mother Goose Day
May 4 – Star Wars Day
May 5 – Cinco De Mayo
May 5 – Kentucky Derby
May 6 – National Nurse’s Day
May 8 – World Red Cross and Red Crescent Day
May 8 – National Teacher’s Day
May 9 – National Receptionists Day
May 11 – Eat What You Want Day
May 12 – World Fair Trade Day
May 13 – Mother’s Day
May 15 – National Chocolate Chip Day
May 16 – Love a Tree Day
May 18 – National Bike to Work Day
May 18 – NASCAR Day
May 19 – Armed Forces Day
May 20 – Be a Millionaire Day
May 21 – Victoria Day (Canada)
May 25 – Geek Pride Day
May 25 – National Wine Day
May 26 – Sally Ride Day
May 28 – Memorial Day

Examples of holiday marketing for brands:

poppin mother's day

June

No June gloom here! The sun is shining, the birds are singing, and there’s so much to do.

Besides playing hooky at the beach, the LGBTQ community is going to have a-rockin’ parade, the Stanley Cup Finals will keep you at the edge of your seat, and hey – it’s national Adopt a Cat Month! So settle down with your new furry friend and enjoy this start to Summer.

Monthly Observances
Men’s Health Month
National Safety Month
Acne Awareness Month
LGBTQ Pride Month
National Adopt a Cat Month
Aquarium Month
Candy Month

Weekly Observances
June 4-10 – Pet Appreciation Week
June 12-18 – Men’s Health Week

Days
June 1 – National Donut Day
June 2 – National Rocky Road Day
June 4 – Hug Your Cat Day
June 4 – National Cheese Day
June 5 – World Environment Day
June 7 – National Chocolate Ice Cream Day
June 8 – World Oceans Day
June 8 – National Best Friends Day
June 9 – Donald Duck Day
June 10 – Iced Tea Day
June 13 – National Weed Your Garden Day
June 14 – Flag Day
June 16 – World Juggler’s Day
June 17 – Father’s Day
June 18 – National Splurge Day
June 19 – National Kissing Day
June 21 – First Day of Summer / Summer Solstice
June 21 – National Selfie Day
June 22 – National Take a dog to Work Day
June 29 – Camera Day
June 30 – Social Media Day
Stanley Cup Finals
NBA Finals

Examples of holiday marketing for brands:

July

Get ready to break out those red, white, and blues and enjoy some fireworks because…. it’s July!

It might be toasty this time of year, but there’s nothing quite like ice cream on a warm summer day to cool you off.

Plus, you’ll want to soak up that sun while you can before autumn “befalls” us!

Monthly Observances
Ice Cream Month
National Grilling Month
National Picnic Month
National Independent Retailer Month
National Blueberry Month

Weekly Observances

July 15-21 – Capture the Sunset Week
July 16-22 – Independent Retailers Week
July 28-30 – World Lumberjack Championships

Days
July 1 – National Postal Worker Day
July 1 – International Joke Day
July 2 – World UFO Day
July 4 – Independence Day
July 5 – National Bikini Day
July 7 – Chocolate Day
July 8 – Video Games Day
July 11 – National 7-Eleven Day
July 12 – Pecan Pie Day
July 13 – Friday the 13th
July 13 – Rock Worldwide Day
July 15 – National Ice Cream Day
July 16 – World Snake Day
July 17 – World Emoji Day
July 18 – World Hepatitis Day
July 19 – National Daiquiri Day
July 20 – National Moon Day
July 21 – #NoMakeUp Day
July 22 – Parent’s Day
July 24 – Amelia Earhart Day
July 24 – Cousins Day
July 26 – Aunt and Uncle Day
July 30 – Father-in-Law Day
July 30 – International Day of Friendship

Examples of holiday marketing for brands:

August

Back to school season can be such an exciting time of year (even if you’re not going back to school). There’s that smell of new beginnings and freshly sharpened pencils in the air, and we could all benefit from the back-to-school office supplies sale.

And yet, some of us are still in the summer state of mind – catching those last rays of sun and spending as much time as we can at the pool.

Monthly Observances
Back to School Month
National Golf Month
National Breastfeeding Month
Family Fun Month
Peach Month

Weekly Observances
August 6-12 – National Farmers’ Market Week
August 7-13 – PGA Championship Tournament
August 13-19 – National Motorcycle Week
August 13-19 – Feeding Pets of the Homeless Week

Days
August 1 – National Girlfriends Day
August 2 – National Ice Cream Sandwich Day
August 3 – International Beer Day – First Friday in August
August 5 – American Family Day – First Sunday in August
August 5 – International Friendship Day
August 8 – International Cat Day #InternationalCatDay
August 9 – Book Lover’s Day
August 10 – National S’mores Day
August 12 – Middle Child’s Day (Go Jan Brady!)
August 13 – Left Hander’s Day
August 16 – National Tell a Joke Day
August 18 – Bad Poetry Day
August 19 – World Humanitarian Day
August 21 – Senior Citizen’s Day
August 26 – Women’s Equality Day
August 26 – National Dog Day
August 30 – Frankenstein Day
August 31 – National Trail Mix Day

Examples of holiday marketing for brands:

September

September can bring summer separation anxiety. Maybe you’re missing the bonfires or lazy pool days, or maybe you’re convinced you didn’t get enough wear out of your new summer outfit – but never fear!

September is full of events, from Oktoberfest to coffee day, so you’re sure to find plenty to celebrate.

Monthly Observances
Wilderness Month
National Preparedness Month
National Food Safety Education Month
Fruit and Veggies—More Matters Month
National Yoga Awareness Month
Whole Grains Month
Hispanic Heritage Month
Little League Month
Better Breakfast Month

Weekly Observances
September 9-15 – National Suicide Prevention Week
September 17-23 – National Indoor Plant Week
September 18-24 – Pollution Prevention Week
September 24-30 – National Dog Week

Days
September 1 – International Bacon Day
September 3 – Labor Day
September 5 – Cheese Pizza Day
September 6 – Read a Book Day
September 9–11 – Rosh Hashanah
September 11 – Patriot Day (9/11)
September 12 – National Video Games Day
September 13 – Uncle Sam Day
September 15 – Greenpeace Day
September 15 – Boys’ and Girls’ Club Day for Kids
September 16 – Wife Appreciation Day
September 17 – Constitution Day
September 17 – Citizenship Day
September 18-19 – Yom Kippur
September 19 – International Talk Like a Pirate Day
September 21 – International Day of Peace
September 22 – First Day of Fall
September 22 – Oktoberfest Begins
September 23 – Checkers Day
September 25– National Voter Registration Day
September 27 – World Tourism Day
September 28 – Native American Day
September 29 – Coffee Day
September 29 – World Heart Day

Examples of holiday marketing for brands:

Chipotle Instagram post captioned “Starting lineup.” (Sep 7, 2017)

October

Everything in October is either spooky or filled with pumpkin spice – and we love it.

You’ve got a whole month to decide on your Halloween costume, change your mind twice, and throw something together.

All the candy is enough to get anyone excited, but beware: a sugar crash can be pretty scary, too.

Monthly Observances
Breast Cancer Awareness Month
AIDS Awareness Month
Bully Prevention Month
Adopt a Shelter Dog Month
Celiac Disease Awareness Month
Financial Planning Month
National Pizza Month
Allergy Appreciation Month

Weekly Observances
October 1-7 – Great Books Week
October 1-7 – National Work From Home Week
October 15-21 – Mediation Week
October 15-21 – National Business Women’s Week
October 22-28 – National Red Ribbon Week

Days
October 1 – World Vegetarian Day
October 2 – Name Your Car Day
October 3 – National Techies Day
October 3 – National Boyfriend’s Day
October 4 – National Taco Day
October 4 – National Kale Day
October 5 – World Teacher’s Day
October 6 – World Smile Day
October 7 – Oktoberfest Ends
October 8 – Columbus Day
October 9 – Leif Erikson Day
October 10 – World Mental Health Day
October 11 – It’s My Party Day
October 13 – World Egg Day
October 16 – World Food Day
October 16 – Boss’s Day
October 17 – National Pasta Day
October 21 – National Pumpkin Cheesecake Day
October 24 – United Nations Day
October 27 – Make a Difference Day
October 30 – Mischief Night
October 31 – Halloween
October 31 – Day of the Dead Begins

Examples of holiday marketing for brands:

starbucks-facebook-cover-photo

 meundies-oct-email

November

From pumpkin pie to days off work, there’s so much to be thankful for during Thanksgiving time.

Use fabulous Fall colors in your posts, make your final preparations for Black Friday, and enjoy time with your family this month.

Monthly Observances
Movember
National Healthy Skin Month
Gluten-Free Diet Awareness Month
National Adoption Month
National Gratitude Month
Peanut Butter Lovers’ Month
National Diabetes Awareness Month

Weekly Observances
November 13-19 – World Kindness Week
November 13-17 – American Education Week

Days
November 1 – All Saint’s Day
November 2 – Day of the Dead Ends
November 3 – Sandwich Day
November 3 – King Tut Day
November 4 – Daylight Saving Time Ends
November 6 – U.S. General Election Day
November 11 – Veterans Day
November 12 – Chicken Soup for the Soul Day
November 13 – World Kindness Day
November 13 – Sadie Hawkins Day
November 14 – World Diabetes Day
November 15 – America Recycles Day
November 16 – International Tolerance Day
November 17 – Homemade Bread Day
November 22 – Thanksgiving
November 23 – Black Friday
November 24 – Small Business Saturday
November 26 – Cyber Monday
November 27 – Giving Tuesday

Examples of holiday marketing for brands:

VIDEO

December

We’ve been waiting all year for this. The winter wonderland weather, the tasty peppermint everything, and all the holiday gatherings.

Whether you’re celebrating Christmas, Kwanzaa, or Chanukah, or just looking forward to the new year, remember to have a ton of fun with your last bit of 2018.

Monthly Observances
National Human Rights Month
Operation Santa Paws
Bingo Month

Weekly Observances
December 2-10 – Chanukah
December 26-January 1 – Kwanzaa

Days
December 1 – World Aids Day
December 1 – Rosa Parks Day
December 2 – Chanukah Begins
December 4 – Cookie Day
December 6 – St. Nicholas Day
December 7 – Pearl Harbor Remembrance Day
December 9 – Christmas Card Day
December 10 – Nobel Prize Day
December 12 – Poinsettia Day
December 14 – Roast Chestnuts Day
December 15 – Bill of Rights Day
December 15 – Free Shipping Day
December 18 – Bake Cookies Day
December 19 – National Ugly Christmas Sweater Day
December 20 – Go Caroling Day
December 21 – First Day of Winter / Winter Solstice
December 23 – Festivus
December 24 – Christmas Eve
December 25 – Christmas
December 26 – Boxing Day (Canada)
December 26 – Kwanzaa Begins
December 27 – National Fruitcake Day
December 31 – New Year’s Eve

Examples of holiday marketing for brands:

What’s in Your Marketing Strategy for 2018?

It feels like you just finished planning your 2017 marketing strategy, and now you’re already planning for 2018. One minute you’re high-fiving while watching the Super Bowl, and before you know it your Halloween candy is already stale. After spreading holiday cheer all year, it’s time to think about what’s in store for next year.

It would be so great to hear what’s on your wishlist for your marketing strategy in 2018. What are your goals? Any cool marketing campaigns you’ve already planned for next year? What did you learn from last year?

Right now, I’m wrapping up next year’s summer festivities before moving on to fall.

It would be wonderful to hear from you: which marketing strategies were your favorites in 2017, and which marketing plans are you excited to explore in 2018? Tweet us at @sejournal or @annaleacrowe to share your thoughts!


Image Credits
In-Post Photo #1: Anete Lūsiņa / Unsplash
All screenshots are taken by author October 2017. 

Source: Search Engine Journal (Original)

A Savvy Marketer’s Guide To Running Successful Hashtag Competitions

If you’re looking for effective ways to build a bigger following on social media, look no further than the humble hashtag competition.

Since hashtags became a big deal in the social media world, they’ve become a quick way to explode your brand’s reach and amass user-generated content for savvy marketing.

But before we go into how to run a successful hashtag competition, let’s talk about why they’re so popular at the moment.

To be honest, it’s pretty straightforward: they’re cheap and easy to run, they tend to gain a lot of traction quickly, and they’re easier to manage than many other kinds of promotions. What’s more, they’re relatively quick to prepare for something that yields so many benefits. Hashtag competitions can help you build brand awareness, increase engagement with your audience, and ultimately increase sales.

One thing to be aware of, though: user-generated hashtag competitions are different from ‘tag a friend’, ‘like’, or ‘comment to win’ competitions. You know why?

Because they use hashtags. D’uh. A UGC hashtag competition is basically a giveaway where a brand asks its followers to post a piece of content on a social network, accompanied by a specific hashtag, for a chance to win a prize. That content might take the form of a photo, video, review, or even some text, like a recipe.

Either way, each time someone posts anything containing the unique hashtag, they’re entered to win a prize. The winner is generally chosen at random (technically making the contest a ‘sweepstakes’). Alternatively, the winner can be determined by criteria the brand decides upon, such as popularity or views, or by letting a panel of independent judges choose.

Clear on all that? Great! Let’s get down into the nitty-gritty of running hashtag competitions.

The best platforms for hashtag competitions

While hashtags can be used across Twitter, Facebook, Instagram, Pinterest, YouTube and more, they are most popular on Twitter and Instagram, so you’d be best off focusing your efforts on these platforms.

Twitter is where hashtags really took off: in the platform’s early days, they were used to organize conversations and make certain topics easy to find. To run a hashtag competition on Twitter today, ask people to use your hashtag in their post in order to be entered to win whatever prize you’re giving away. Keep in mind that Twitter has a restricted character count, so your hashtag shouldn’t be too long.

Twitter hashtag competition

To run a hashtag contest on Instagram, on the other hand, you’d simply create a brand hashtag of any length and ask participants to use your hashtag in whatever caption accompanies their image or video.

In this example, the Empire State Building is giving away $5000 to someone who has taken a photo touching the building.

The best practices for hashtag competitions

While hashtag competitions are relatively easy to set up, you’re more likely to meet your hashtag goals if you identify what you want to accomplish and use the best tools available to help you meet your goals.

Here’s a quick checklist of things to be mindful of:

  • Don’t violate trademarks. It may sound obvious, but you can’t use another company’s name in your hashtag. Increasingly, companies are trademarking hashtags that don’t include their brand name, like Mucinex’s #blamemucus. So do some research to ensure you’re not co-opting another company’s trademarked material, and check that your hashtag can’t be read as a double entendre.

Trademarks hashtag competition

  • Make your hashtag(s) easy to remember. Creating a hashtag that isn’t overly long, hard to remember or difficult to spell is easier said than done. However, put proper time and thought into coming up with one, as it’s the ammunition that will set your brand apart. Browse through this list of the most popular hashtags for inspiration. A great hashtag contest we’ve seen lately came from Dwayne Johnson (aka “The Rock”) who used #slowmochallenge for a contest linked to the premiere of ‘Baywatch’.

Dwayne Johnson hashtag competition

  • Always include a brand hashtag. Some brands create unique hashtags that catch on with their followers and can be used for contests and general posts. PetSmart, for example, uses #petsmartcart in many of its own posts, and PetSmart customers frequently use this hashtag too. While you may have a short-term hashtag specifically in mind for your competition, you should also incorporate your branded hashtag into the post.

Petsmart hashtag competition

  • Don’t go overboard. Hashtags can be hard to resist, but don’t go too crazy with them or you’ll look spammy. Yes, Instagram allows you to post up to 30 (and there are ways to include even more than that), but there aren’t many situations that actually call for maxing out your hashtags like this. According to studies, nine hashtags is the optimal number for Instagram posts – posts with nine hashtags reportedly receive five times more engagement in the form of likes and comments.
  • Write and post rules. Posting rules helps ensure that anyone who enters your promotion understands the guidelines and limits involved. You might want to restrict entries to certain countries, set a minimum age requirement, or cap how many times a person can enter. List all of these details very clearly on a landing page, rather than putting them into a comment or caption. That way your post looks much cleaner and your rules are more transparent to boot.
  • Award extra chances to win by adding a form. One of the reasons hashtag contests are so popular is that they’re easy to enter. However, if you’re interested in collecting email addresses to use in future marketing endeavours, you might consider adding a form for your contest since you may have some customers who don’t use Instagram or Twitter. People can then enter by uploading an image using your specific hashtag to this form. This is a great way to earn more participation in your hashtag competition and you can even give them extra chances to win if they enter using the form.

Watchnerds hashtag competition

  • Make it clear how you will choose a winner. When it becomes time to choose a winner, you have a few options. Software platforms like ShortStack can combine all your hashtag entries into a feed from which you can use a random entry picker tool to choose a winner. You can also let votes determine the winner, as mentioned earlier, or choose the one you like best based on some criteria you specified in your rules – e.g., funniest, most creative, etc. But whichever approach you take, provide details of it right from the start.
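The random-draw option above can be sketched in a few lines of Python. This is a hypothetical, minimal example: the entry format, the field names, and the one-entry-per-user rule are illustrative assumptions, not how ShortStack or any particular platform actually works.

```python
import random

def pick_winner(entries, seed=None):
    """Pick one winner at random from hashtag-contest entries.

    `entries` is a list of dicts with at least a 'user' key.
    Each user gets a single chance no matter how many times they
    posted the hashtag (a common contest rule -- adjust it to
    match whatever you published in your own rules).
    """
    # Keep only the first entry per user so nobody is double-counted.
    seen, unique = set(), []
    for entry in entries:
        if entry["user"] not in seen:
            seen.add(entry["user"])
            unique.append(entry)
    if not unique:
        raise ValueError("no entries to draw from")
    return random.Random(seed).choice(unique)

entries = [
    {"user": "@ana", "post": "photo1"},
    {"user": "@ben", "post": "photo2"},
    {"user": "@ana", "post": "photo3"},  # duplicate user, ignored
]
winner = pick_winner(entries, seed=42)
```

Fixing the seed is only for reproducible testing; for a real drawing you would omit it and, ideally, record the draw so entrants can see it was fair.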

The best UGC ideas from hashtag competitions

Once your hashtag promotion is over and you’ve chosen a winner, you should have amassed a nice amount of user-generated content that you can use in future marketing efforts. Since sharing UGC is a proven way to increase engagement, build trust and drive sales, your next best step is to start creating and sharing stories using the material you collected.

Sharing UGC gives you a unique opportunity to connect with your customers, and increasingly consumers say UGC is more ‘trustworthy’ than content created by a brand’s marketing department.

If there is a poster child for UGC marketing, it’s GoPro. The camera company has UGC campaigns running pretty much nonstop and the strategy has worked wonders for them. Customers use the camera to film, well, whatever they happen to be doing and are encouraged to upload their footage on GoPro’s website under one of many categories (Travel, Family, Fun, etc.).

The users who ‘win’ in categories get their content published on GoPro’s website and via various social media channels. GoPro also offers prizes and cash for people who win their ‘content challenges’.

GoPro hashtag competition

Any business could do something similar, showcasing the best entries either on existing social channels or on one you set up for the express purpose of promoting your users’ content. Winning videos and photos often get hundreds of thousands of views and likes so it’s a fantastic opportunity to increase brand recognition.

You can also use the UGC you collect to recruit brand advocates. Fashion and beauty brands, in particular, are very savvy about recruiting their followers to serve as brand advocates. Take Urban Outfitters. Their customers show off their style on their own platforms using hashtags like #UOHome and #UOBeauty. Urban Outfitters then curates the posts and pulls the ones they like onto a shoppable web page where customers can see how people are wearing and using various products.

Would you be more likely to purchase from a sterile image like this, which is on Urban Outfitters’ website:

Urban Outfitters hashtag competition

Or from this post, which appears on Urban Outfitters’ shoppable community page:

Urban Outfitters 2 hashtag competition

Remember, your social media channels are not just a place for your brand to grandstand about its products or services, they’re a place to build an engaged community. By showcasing users’ content on your website and social channels, you let your users tell their stories about their experiences with your brand. This is invaluable. There are even tools that will allow you to set up a feed that shows off all the content you’ve collected (which you should moderate, of course), similar to what Urban Outfitters has done.

Wrapping up

Hashtag competitions and user-generated content are influencing buyers of all ages and walks of life. In fact, a staggering 84% of millennials report that UGC on company websites influences what they buy (and how often).

What’s more, 43% of people are more inclined to purchase a new product if they learn about it via friends and family or on social media channels.

Are you likely to try a hashtag competition to gain exposure for your brand now? Do you want to know how ShortStack can help you leverage Instagram and Twitter? Just ask! I’d love to hear your thoughts in the comments.

Guest Author: Dana Kilroy is the communications director at ShortStack.

The post A Savvy Marketer’s Guide To Running Successful Hashtag Competitions appeared first on Jeffbullas’s Blog.

Source: Jeffbullas’s Blog (Original)

5 Advanced Ways to Increase Your AdWords CTR by @DustyVegas

Figuring out ways to increase your AdWords click-through-rate (CTR) can be frustrating.

Most internet advice focuses on the (very important) tactics of ad copy optimization, but that can only get you so far.

Eventually, search marketers will need to find advanced ways to increase their click-through rates.

If you find yourself in that position, read on!

1. Silo Traffic with Negative Match Keywords

Negative keywords are a crucial part of a healthy AdWords account. Just like the name suggests, they are the opposite of regular keywords.

Regular keywords tell Google what terms you want to bid on. But negative keywords tell Google where you absolutely don’t want to show up.

You should always add negative keywords to keep your ads relevant; however, there is a way to take your negatives to the next level and improve click-through rates.

Negative keywords come in three match types: exact, phrase, and broad. Adding one of those negatives to a campaign will prevent your ads from showing for that match type.

If you want to get as high of a click-through rate as possible, add your exact match regular keyword as exact match negatives to your broad and phrase campaigns. Your users will be funneled to ads you made for their specific queries.

The CTRs for your exact match keywords should be much higher than your other keywords. By adding your exact keywords as negatives to your broad and phrase campaigns, Google does not have the option of sending users to less-relevant campaigns (that happens a lot).

Bottom line: Funnel traffic to your exact keywords using negative keywords.
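As a sanity check on the logic, here is a toy Python model of that funneling. It is a rough sketch, not Google’s real matching or auction behavior, and the campaign names and keywords are made up for illustration:

```python
def route_query(query, exact_keywords):
    """Decide which campaign serves a query when your exact-match
    keywords are also added as exact-match *negatives* to the
    broad/phrase campaign.

    Toy model: a query identical to an exact keyword can only go
    to the exact campaign (the broad campaign holds it as an
    exact negative); everything else falls through to broad/phrase.
    """
    normalized = query.lower().strip()
    if normalized in exact_keywords:
        return "exact-campaign"
    return "broad-campaign"

# Hypothetical exact-match keyword list.
exact_keywords = {"running shoes", "trail running shoes"}
```

The point the sketch makes is simply that, with the negatives in place, Google no longer has the option of serving the high-intent query from the less-relevant campaign.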

2. Leverage Location Bidding

Users from different geographies search in different contexts.

Imagine two users searching for the query “November clothing”, but one lives in Iceland and the other in Miami. The meaning behind the query changes depending on the physical location where it was entered.

The key to improving CTR with location bidding is to first understand how geography affects user behavior down the funnel. You want to make sure that increasing your CTR for a specific geography is worth the investment.

Do users from New York City have a higher ROI than users from other locations? If so, that would be a good place to start. But first, you need to start tracking. If you can’t measure it, you can’t manage it.

Google offers tracking parameters called ValueTrack parameters. These parameters help search engine marketers track anything from the geography of a click to the device that drove it. You should implement all of them, but for now, focus on the two geography-based ones: physical location (loc_physical_ms) and location of interest (loc_interest_ms).

The physical location parameter tells you the location of the user. The location of interest will tell you the location they were searching for (if applicable). If a New Yorker were searching for hotels in Maui, the first parameter would relate to New York and the latter to Maui.

Make sure that you are passing these parameters into your analytics system. Once you can tie the data to user performance, you can see what geographies are worth investing time into.

Bottom line: Track the location information of clicks and use that information to improve click-through rates.
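To make that concrete, here is a minimal Python sketch of a tracking template using the two geography placeholders. The `{loc_physical_ms}` and `{loc_interest_ms}` placeholder names are real ValueTrack parameters, but the landing domain, the `geo`/`geo_interest` query parameters, and the numeric location IDs are all made-up stand-ins for illustration:

```python
from urllib.parse import urlparse, parse_qs

# A tracking template: Google replaces the {curly} ValueTrack
# placeholders with numeric location-criteria IDs at click time.
template = ("https://example.com/landing"
            "?geo={loc_physical_ms}&geo_interest={loc_interest_ms}")

def fill_template(template, click):
    """Simulate Google's substitution for a single click."""
    url = template
    for key, value in click.items():
        url = url.replace("{" + key + "}", str(value))
    return url

def read_locations(landing_url):
    """What your analytics code would do on the landing page."""
    params = parse_qs(urlparse(landing_url).query)
    return params["geo"][0], params["geo_interest"][0]

# Hypothetical click: one location ID for where the user is,
# another for the location they searched about.
click = {"loc_physical_ms": 1023191, "loc_interest_ms": 9032833}
url = fill_template(template, click)
physical, interest = read_locations(url)
```

Once values like these land in your analytics system, you can join them against conversion data to see which geographies deserve the extra CTR investment.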

3. Remove Duplicate Keywords

Keyword creep is a problem that every PPC account must deal with eventually. As time goes on, you add keywords that make sense at the time. Eventually, you step back and realize that your once tidy account has a problem with duplicate or irrelevant keywords.

Don’t worry if you pull a search query analysis and see that you have search queries matching to the same keyword in multiple places. It happens. Google will not prevent you from adding duplicate keywords in separate campaigns.

Running an analysis is easy. Navigate to the keywords tab for your account and click the search terms button in the top left. This will show you the queries and keywords that are driving paid search traffic to your website.

If you pull an analysis and see this problem, keep the keyword that performs better for your business. Pause the other one and watch your overall CTR rise as more traffic sees the appropriate ads.

Bottom line: Use only one instance of a keyword per match type to avoid splitting traffic.

4. Opt Out of Google’s Search Network

A lot of the clicks that marketers see in the AdWords UI don’t actually come from www.google.com. Instead, they come from Google’s vast network of partner websites, known as search partners, which are part of the Google Search Network.

Search network websites can range in quality. Some are legitimate while some are complete spam. Sadly, Google doesn’t let marketers blacklist specific search partner websites, so you’re either all in or all out.

The key is to figure out if the partner network is right for you.

Remember those ValueTrack parameters? There is one called the network parameter that lets marketers track the network of a click all the way to the final conversion.

You can implement the network parameter to break out your important KPIs by network. See if your search partner traffic drives a profitable ROI before shutting it off completely. If your KPIs are bad for search partner traffic, you can improve them by simply opting out of the network.

Bottom line: Google’s search network can send you a lot of bad traffic. Consider opting out of it.
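Assuming you have logged each click’s `{network}` ValueTrack value (“g” for Google search, “s” for search partners) alongside cost and revenue, the KPI breakout might look like this rough sketch. The data format is an assumption, and ROI here is computed as (revenue − cost) / cost:

```python
from collections import defaultdict

def roi_by_network(clicks):
    """Aggregate cost and revenue per network and compute ROI.

    `clicks` is an iterable of dicts with 'network', 'cost', and
    'revenue' keys -- a stand-in for your analytics export.
    """
    totals = defaultdict(lambda: {"cost": 0.0, "revenue": 0.0})
    for c in clicks:
        totals[c["network"]]["cost"] += c["cost"]
        totals[c["network"]]["revenue"] += c["revenue"]
    return {
        net: (t["revenue"] - t["cost"]) / t["cost"]
        for net, t in totals.items()
    }

# Toy data: google.com traffic is profitable, partner traffic is not.
clicks = [
    {"network": "g", "cost": 1.0, "revenue": 3.0},
    {"network": "s", "cost": 1.0, "revenue": 0.5},
    {"network": "g", "cost": 1.0, "revenue": 1.0},
]
roi = roi_by_network(clicks)
```

A breakout like this is what tells you whether opting out of search partners would actually help your account.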

5. Implement Dynamic Keyword Insertion

Maximizing your click-through rates can take a lot of time and energy, but dynamic keyword insertion can reduce that.

Dynamic keyword insertion allows advertisers to create a placeholder in their ads where the user’s search term will appear. The idea behind it is that nothing can be more relevant to the searcher than the query they actually searched for.

Let’s imagine that your company sells used Mazdas. As a search advertiser, you create campaigns and ad groups around the different types of Mazda cars – “Mazda 3” or “Mazda Miata”. Dynamic keyword insertion automatically injects the user’s hyper-specific search term into your ad copy.

You would create ads that include action keywords and then the dynamic keyword:

  • Find Your Mazda 3
  • Buy The Perfect Used Miata

If you want to improve your CTR with dynamic keyword insertion, start by following Google’s instructions here.

Bottom line: Dynamic keyword insertion makes your ads more relevant by inserting the search query.
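To see the mechanics, here is a simplified Python imitation of the `{KeyWord:default text}` substitution. The 30-character cap mirrors Google’s headline limit, but the substitution logic is a guess at the behavior for illustration, not Google’s actual implementation:

```python
def insert_keyword(ad_template, keyword, max_len=30):
    """Mimic dynamic keyword insertion for a headline.

    Replaces '{KeyWord:...}' with the matched keyword in title
    case; if the finished headline would exceed `max_len`
    characters, the default text is used instead.
    """
    start = ad_template.find("{KeyWord:")
    if start == -1:
        return ad_template  # no insertion tag present
    end = ad_template.index("}", start)
    default = ad_template[start + len("{KeyWord:"):end]

    def render(text):
        return ad_template[:start] + text + ad_template[end + 1:]

    headline = render(keyword.title())
    return headline if len(headline) <= max_len else render(default)

headline = insert_keyword("Buy {KeyWord:Used Mazdas} Today", "mazda miata")
```

A short query gets injected directly, while an unwieldy long-tail query falls back to the default text, which is why your default should always read as a complete, sensible headline.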

Source: Search Engine Journal (Original)

Hiring Freelance Content Marketers: How to Find the Perfect Fit by @rosiemay_r

Thinking of working with one or more freelance content marketers?

The reality is that you’ll likely have far too many freelancers to choose from.

For instance, did you know that from 2014 to 2016, there were more than 2 million freelancers in the U.S.? And the number of freelancers is still growing!

Not to mention, the freelance economy is booming worldwide as well.

That’s exactly why hiring freelance content marketers is both overwhelming and appealing.

The Value of a Freelance Content Marketer

Businesses will often have a head of marketing, a head of content marketing, or both.

And if a business is small enough, business owners will do their own marketing.

However, marketing is a lot of work!

But if you work with a freelance content marketing specialist, you get to lighten the load a bit.

For example, one content marketer might be a specialist in a specific type of blog content, and one might specialize in sales or landing pages.

What Happens If You Make the Wrong Choice?

Unfortunately, the wrong choice will waste both your money and your time.

And this often leads to you hiring extra people to compensate for poor results.

Yet, the worst-case scenario is much more likely to happen if you make hasty decisions and approach the hiring process blindly.

How Do You Choose the Right Candidate?

Want to know what makes a freelance content marketer a good choice for your organization?

Here are some qualities to look out for when figuring out who to hire.

The Candidate’s Reputation

Reputation can tell you a lot about the quality of the work that you’re getting. And you can spot that immediately based on factors such as:

  • Testimonials
  • Past clients
  • The press or “featured on” sections of a freelancer’s website

But what about newbie freelancers? Newbie freelancers can have a great reputation as well!

If someone is brand new, you can determine their reputation based on stuff like:

  • Guest posts they wrote
  • Work history
  • References

Social Proof & Its Impact on Reputation

On the freelance lifestyle website Freelancer FAQs, Sally Acquire put it best when she explained the importance of social proof:

“Social proof is an endorsement that you’re worth hiring. Other people have taken a chance on you already so the risk factor for hiring you goes down a notch.”

And that endorsement is more powerful than you think.

Because chances are you can remember at least one incident where you bought something because a friend or family member recommended it.

And that’s exactly why this principle applies to the hiring process as well.

Because if you recognize the company that shows up on the candidate’s testimonial page, it will likely influence your opinion.

But what do you do if there are no testimonials and the freelancer is unable to provide references?

If that’s the case, then the candidate may not be a good fit, because a freelancer is only worth working with if they can back up their claims with valid proof.

Numbers & Stats

Numbers and stats are the most important things to be on the lookout for when hiring freelance content marketers. And these numbers and stats cover key factors such as:

  • Number of comments
  • Social media shares
  • Google search rankings

So, what’s the best way to find these stats?

Freelance content marketers often include a content engagement highlight reel on their website.

What Content Engagement Highlight Reels Actually Look Like

Here are some content engagement stats from a blogger I follow on Twitter:

[Image: sample engagement stats from a freelance content marketer’s website]

So why exactly are these numbers significant? Because they prove that this writer can not only write but also put on her marketing thinking cap.

What do you do if you’re unable to find relevant stats?

If you can’t find relevant stats, just ask!

Here are some questions that will help you determine the value of any content marketer:

  1. How many views did your blog get this week?
  2. Can you show me some examples of results that the websites you’ve worked on generated?
  3. What levels of engagement have your most recent email marketing campaigns produced?

Don’t, under any circumstances, be shy about asking these questions. Clients ask these questions a lot more often than you think.

Other Relevant Skills

Being a great email marketer, content writer, YouTuber, etc., is all well and good. But there’s one problem with that.

The average content marketer’s work isn’t only about telling a story. They also need other relevant skills to reach their target audience.

Applying This Approach to a Hiring Scenario

Let’s say you’re working on an upcoming product launch and you’re thinking of hiring a sales page copywriter.

And let’s also pretend that you’ve managed to narrow down your choices to two freelance content marketers:

Candidate 1:

A WordPress expert, who also has:

  • Several years of work experience as a content marketer in a corporate environment
  • A relevant client list
  • A sales page-focused portfolio

Candidate 2:

A J-school grad with:

  • An impressive client list and portfolio
  • Some content marketing experience
  • A generalist approach to freelance writing specialties

Which one would you choose?

More likely than not, you’d choose Candidate 1.

But why is that the case?

  • They have a background in one of the most popular CMS platforms on the market
  • They know how to market content to a large audience
  • They specialize in the type of content you’re looking for

5 Questions You Must Answer Before Hiring

If you want a great return on investment from your content, you need to take a goal-oriented approach.

And one rule applies, no matter what type of content you’re looking for. The ideal creator of that content is someone who can listen and respond to your goals, values, etc.

Before you start researching freelance content marketers, here are some questions you need to know the answer to:

1. Why Do You Need This Type of Content?

Far too often, businesses take a one-size-fits-all approach to content, and that never works! No idea what I’m talking about? The one-size-fits-all approach starts like this: “Joe Schmoe found success with blogging, so I will too!” Because that isn’t always true, this is an essential question to start with.

2. Who is Your Audience?

You need to hire someone who understands and can speak to this audience.

3. What Content Does Your Audience Read & How Do They Typically Consume It?

Your audience might not consume content in the way that you immediately assumed. And if that’s the case you might want to consider hiring a specialist in an alternate type of content.

4. What Are Your Company’s Values & How Are They Different From Your Competitors’?

So what falls under the values category? Everything from customer interactions to how you create your products or services.

5. What Is Your Tone & Voice?

For example:

  • Are you on a first or last name basis with your customers?
  • Are things like humor and sarcasm part of how you communicate?
  • Do you use slang or emoticons?

Why These Questions Are Important

Any content marketer worth working with will ask you about your audience and your values. But they can’t help you unless you’ve sat down and put careful thought into the answers.

And knowing the answer to these questions will help you narrow down the list of candidates.

Because it will help you determine which candidate(s) will produce content that:

  • Suits your audience.
  • Resembles your typical tone and approach to things.

And if you want to work with freelance content marketers, you’ll make their job easier and get better results if:

  • You provide detailed answers to their questions.
  • You give them access to relevant resources. Anything that will help them get to know what your brand’s voice is like is perfect!

But what do I mean by “resources”? Here are a few examples:

  • Social media posts
  • YouTube videos
  • Brochures
  • Advertisements (print or digital)
  • Links to recent event listings via platforms such as Meetup and Eventbrite

Other Considerations

Cheap and fast projects will likely seem enticing, but they’re rarely practical.

Never lose sight of the fact that quality content isn’t about overnight success.

It’s about what will happen over the long-term when freelance content marketers try out targeted marketing campaigns, such as:

  • Putting your brochure on a community bulletin board.
  • Doing keyword research and other activities that are good for SEO.

Never, Ever Forget About Audience Engagement!

Stats and social proof are definitely important.

But you should never get too caught up in the game of “compare and contrast.”

For instance, suppose one candidate got 100 Facebook shares and another got 1,000.

Picking the candidate with the higher share count seems intuitive. But those shares won’t mean much unless the interactions go deeper than:

  • Pressing the “like” button (which is fine, by the way, when readers also comment on what they read)
  • Writing comments such as “nice post”

Depth of interaction is what meaningful audience engagement is all about.

When recruiting freelance content marketers, the ideal candidate is a conversation starter.

High-quality freelance content marketers can also make sharing info about a company an enjoyable experience.

And any content marketer worth working with will know exactly how to do that.

Image Credits
Featured Image: Created by Rosemary Richings, October 2017.
In-Post Photo # 1: Screenshot by Rosemary Richings. Taken October 2017.
In-Post Photo # 2: Created by Rosemary Richings, October 2017.

Source: Search Engine Journal

Is WordPress Secure?

The question of whether or not WordPress is secure is complicated. It’s evidently secure enough for the roughly one quarter of all websites around the world powered by WordPress, but it’s not without its flaws.

So, who is responsible for keeping WordPress secure? Of course, some of that responsibility ultimately falls on your shoulders. That’s why it’s essential to be aware of and abide by WordPress security best practices in order to keep every site you build as secure as possible.

However, the team behind WordPress does have some responsibility in all this, too. After all, there’s nothing you can do to protect the underlying core of WordPress yourself.

If the matter of WordPress security nags at you as much as it does pretty much everyone else trying to conduct business online, then keep reading.

I’m going to cover some of the history around WordPress security issues and what the WordPress Project is doing about them.

Want to know more? Check out our detailed step-by-step tutorial, The Ultimate Guide to WordPress Security.

A Brief History of WordPress Security Issues

Did you know that “[h]ackers attack WordPress sites both big and small, with over 90,978 attacks happening per minute”?

The issue isn’t necessarily that WordPress is a weak content management system, prone to hacking attempts and security breaches. It’s more likely a problem of visibility. WordPress is the most popular CMS around the world, so of course, it’s going to be an easy target for hackers.

WordPress is commonly discussed online (in blogs, forums, podcasts, and so on) so, consequently, the weaknesses of the platform are well-known. It would make sense then that hackers would primarily target WordPress websites, right?

Security is a major topic of discussion for any WordPress or web development blog, WPMU DEV included. That’s not to say that we’re to blame for publicly sharing WordPress’s flaws. In our community, this is mostly just common knowledge anyway. However, all this published information does make WordPress’s vulnerabilities painfully clear.

According to the WordPress Project (the team responsible for managing security for the platform), they issue security patches all the time. You know those automatic update notifications you receive when you log into the admin area? “WordPress has been updated to 4.7.2” or something like that? Well, usually when you see those minor versions go out, it’s because the team had to fix a security issue.

And these happen often:

The Panama Papers data breach from 2016 was, in part, traced to a vulnerability in a WordPress slider plugin.

This rundown from WPMU DEV covers a number of other documented WordPress security exploits. They might not all be as high-profile as the Panama Papers one, but it’s still concerning to know that, despite the WordPress Project’s best efforts – as well as the developers responsible for maintaining their plugins and themes – hackers are still finding a way in.

That said, it is reassuring to see how WordPress handled a very recent and far-reaching security breach stemming from the REST API.

Here’s how things went down:

  • In January of 2017, WordPress released update 4.7.2. Nowhere in the list of updates or patches was the security patch mentioned.
  • About a week later, WordPress notified users that there indeed was a security flaw detected and patched in that update.
  • The reason they gave for the delay in notifying users? They wanted to give site owners time to update the core before attackers realized the flaw existed and had been patched.

Of course, that didn’t stop hackers from defacing 1.5 million WordPress sites in the meantime. And WordPress users who never updated the CMS (or did so too late) remained vulnerable to the attack.

So, even though a patch was eventually issued by WordPress and they handled the announcement of it with much-needed tact, over a million sites were harmed in the process. And, worse, many website owners continued to be unaware of this defacement even after it happened.

Security patches seem to be coming out more frequently, with 2015 receiving the brunt of the abuse. As more and more of these occur, it’s important for you to know who is responsible for securing WordPress and what you can do from your side of things to ensure those threats stay out.

What You Need to Know About the WordPress Project (and Security)

Here is what you need to know about the WordPress Project and what they are doing to maintain the security of the core.

The WordPress Security Team

First, let’s talk about the WordPress Project. This security team comprises about 25 individuals, all of whom are experts in WordPress development or security. Currently, half of the people on the team work for Automattic.

This team of experts is responsible for identifying security risks in the core. They also review potential issues with third-party themes and plugins, and recommend how developers can harden their tools or patch known breaches.

While they typically work on their own to identify and resolve these issues, they do, from time to time, consult with other experts in the field, especially those from security and hosting companies.

How WordPress Identifies Security Risks

As you’d expect, the WordPress Project team works like a well-oiled machine. Here is how the security risk identification and resolution process works:

  • An issue is identified either by someone on the security team or from outside the team. Non-Project members can communicate these detected issues by emailing security@wordpress.org.
  • A report is logged and the security team acknowledges receipt of it.
  • Team members then work together on a walled-off, private server to verify that the threat is valid. This is where they track, test, and repair any security flaws detected.
  • The security patch then gets added to the next minor WordPress release.
  • For less serious repairs, WordPress simply notifies users within the WordPress dashboard whenever an automatic release occurs.
  • For more urgent issues, the release will go out immediately and WordPress.org will announce it on the News page of the website.

Of course, as we’ve seen with 4.7.2, WordPress doesn’t always immediately announce these security patches (for valid reasons), though it does always take immediate action to resolve them.

A Note About Automatic Updates

As of version 3.7, WordPress has had the ability to push minor updates automatically to all websites. This ensures that the WordPress security team can get urgent patches out in a timely fashion and not have to wait around for users to accept and make the update on each of their websites.

However, it is possible for WordPress users to opt out of these automatic core updates. If this is the case for you, please be aware that this may put your site at additional risk, especially if you don’t have the time to diligently monitor all your sites for the latest and greatest update.
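If you do opt out, the switch typically lives in wp-config.php. As a sketch of the documented core-update constants (the values shown are illustrative, not recommendations):

```php
// wp-config.php — WordPress core auto-update behavior is controlled
// by constants. Values below are illustrative, not recommendations.

// Disable the automatic updater entirely (this puts the burden of
// timely security patching squarely on you):
define( 'AUTOMATIC_UPDATER_DISABLED', true );

// Or fine-tune core updates instead:
//   'minor' — automatic minor/security releases only (the default)
//   true    — automatic major and minor releases
//   false   — no automatic core updates at all
define( 'WP_AUTO_UPDATE_CORE', 'minor' );
```

Unless you monitor your sites daily, leaving the default (automatic minor releases) in place is the safer choice.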

WordPress Plugins and Themes Security

Much like how it’s your responsibility to provide visitors with a secure website experience, WordPress plugin and theme developers are responsible for keeping their users (i.e. you guys) safe as well. While WordPress cannot manage the tens of thousands of plugins and themes out there, they can at least keep a close eye on them to ensure nothing seriously insecure slips through the cracks.

The WordPress Project is the team responsible for working with developers when a security issue is detected. Before that, however, there is a team of volunteers assigned to review each and every theme or plugin submitted to WordPress. That team will work with developers to ensure that best practices are followed.

Nevertheless, security vulnerabilities may still arise and that’s when the WordPress security team needs to step in to:

  • Provide documentation for WordPress developers on plugin and theme development and security best practices.
  • Monitor plugins and themes for potential security flaws. Any issues detected will then be brought to the attention of the developer.
  • Remove harmful plugins or themes from the directory if the developers are unresponsive or uncooperative.

WordPress will then notify its users via the WordPress admin when those security patches (or the removal of bad plugins and themes) are available.

OWASP’s Top 10

The Open Web Application Security Project (OWASP) Foundation was created back in 2001 with the purpose of protecting organizations from software and programs that could potentially do them harm. What you may be surprised to learn is that the WordPress Project aims to abide by OWASP’s Top 10 at all times.

The Top 10 is a list compiled by OWASP of known and very serious security risks. Having familiarized themselves with this list, the WordPress security team uses those trends to define their own top 10 list of ways to defend the core. Currently, their goal is to protect the core from the following risks:

  1. User account management abuse
  2. Unauthenticated access requests to the WordPress admin
  3. Unwanted or unauthorized redirects
  4. Exposing users’ private data
  5. Requests for access to direct object reference
  6. Server misconfiguration
  7. Unauthorized code injection
  8. Cross-site scripting from unauthorized users
  9. Cross-site request forgeries whereby hackers misuse WordPress nonces
  10. Corrupted third-party plugins, themes, frameworks, libraries, etc.
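Item 9 on that list is one place where WordPress gives developers a concrete defense: the nonce API. As a minimal sketch of how a plugin might use it (the `my_plugin_action` and `my_plugin_nonce` names are hypothetical, and this assumes a running WordPress environment):

```php
<?php
// Rendering a form: embed a nonce field tied to a specific action.
// 'my_plugin_action' / 'my_plugin_nonce' are hypothetical names.
function my_plugin_render_form() {
    echo '<form method="post">';
    wp_nonce_field( 'my_plugin_action', 'my_plugin_nonce' );
    echo '<input type="submit" value="Save"></form>';
}

// Handling the submission: reject the request unless the nonce checks out,
// which blocks the cross-site request forgeries described above.
function my_plugin_handle_submit() {
    if ( ! isset( $_POST['my_plugin_nonce'] )
        || ! wp_verify_nonce( $_POST['my_plugin_nonce'], 'my_plugin_action' ) ) {
        wp_die( 'Security check failed.' ); // likely a forged request
    }
    // ...process the form safely here...
}
```

Nonces only help when developers actually verify them on every state-changing request, which is why misused nonces still make the team’s list.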

WordPress Security Requires Your Vigilance

Having reviewed all this, it does put my mind a bit more at ease to know that there is a dedicated team working to keep the WordPress core secure at all times. However, that doesn’t mean I (or you) should be lulled into a sense of complacency.

As we’ve seen – even as recently as this past January with the 1.5 million defaced websites – no matter how good the WordPress Project is at monitoring and securing the platform, hackers will find a way in.

That’s why it’s important to play your role in all this and keep your sites secured from every angle. The Defender security plugin is a good place to start.

For more tips, don’t forget to subscribe to the WPMU DEV blog as this topic of “Is WordPress Secure?” and what you can do to better protect it will continue to come up time and time again.

Source: The WordPress Experts – WPMU.org