In the supply-chain world, the last mile is the final step that gets the Amazon package to your door.  It’s notoriously expensive and cumbersome, and it can be the difference between turning a profit or not.

Your last-mile equivalent is the reply form, paper or digital.  This last mile is a good place to apply what’s often called choice architecture, a wonky expression for the insight that decisions made on this form are heavily influenced by the form itself and what it conveys about:

  • Decision information – the presence or absence (and type) of information that informs the decision.  Is your language wonky?  Is your information relevant to the donor’s goals?
  • Decision structure – how the person will choose among options.  How easy is it?  How much mental effort is required?  Think layout.
  • Decision assistance – what is the mental cost of not following through and not giving?  How can you close the intention-action gap?

Designing your last mile with this framework in mind is where all those nudges come into play.  Those nudges have come under attack lately on two fronts.

One is the subtle attack of the amateur-hour wannabes who thumb through a New York Times bestseller on nudges, declare themselves experts, and sell snake oil to the unsuspecting.  This attack is pernicious because it slowly erodes the sector’s collective trust in the science: poorly understood theory and poorly designed experiments produce no net improvement.

The other attack is self-induced, as the academic inventors and purveyors of these theories undermine their own credibility with a replication crisis and the enormous pressure to publish or perish, even if that means doctoring p-values or simply falsifying data.

But the death of nudges, like that of direct mail, is premature.  In a meta-analysis of 455 studies from 214 publications, the headline is encouraging.

Last mile nudges work.

At least those designed by experts.  In fact, if an expert designs your last-mile nudge, there is an 85% chance it delivers some lift.  I’d wager that falls to coin-toss territory with the charlatans.


This chart might look wonky, but it shows that everything to the right of the solid vertical line worked all or most of the time to produce a positive lift in behavior.

But which types work best?

The 455 experiments were coded into three categories: providing more or different information, design and layout (decision structure), and ways to close the intention-behavior gap beyond the form itself (decision assistance).

All three work, as do all the sub-categories, as evidenced by the squares and diamonds sitting to the right of zero.

A few more takeaways:

  • Defaults work best.  Try defaulting to monthly instead of one-time in digital.
  • Design matters.  This is composition and effort in the chart.  I’m stunned at how little (read: no) user-experience research is done in our sector.
  • Social cues also work pretty well.  This means helping folks with analysis paralysis over which choice (e.g., amount) to make by letting them know what others have done.  We know this is especially true for acquisition.
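To make the takeaways concrete, here is a minimal sketch of how a digital giving form’s choice architecture might encode a monthly default and a social cue.  All field names, amounts, and the `render_choices` helper are hypothetical illustrations, not any real platform’s API.

```python
# Hypothetical sketch of a digital giving form's choice architecture,
# applying the three takeaways above. Names and values are illustrative.

giving_form = {
    # Default: preselect monthly rather than one-time.
    "frequency_options": ["monthly", "one-time"],
    "default_frequency": "monthly",

    # Structure: a few clearly laid-out amounts to reduce mental effort.
    "ask_amounts": [25, 50, 100, 250],

    # Social cue: anchor the undecided with what others typically give.
    "social_proof": "Most supporters give $50 a month.",
}

def render_choices(form):
    """Return amount labels, flagging the socially cued option."""
    cued = 50  # assumed typical gift, matching the social-proof line
    return [
        f"${a}" + (" (most popular)" if a == cued else "")
        for a in form["ask_amounts"]
    ]

print(render_choices(giving_form))
```

The point is not the code but the separation of concerns: the default, the amount structure, and the social cue are each a deliberate, testable design choice rather than an accident of layout.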