Last week I spent some time with LeapFrog Solutions, a Washington, D.C.-based think tank specializing in federal agency communication and change management. Next to me, in a conference room lined on one side with glowing windows and on the other with whiteboards, was Bob Derby, LeapFrog VP of Strategic Communications. On the phone was John Verrico, president of the National Association of Government Communicators and chief of media relations for the U.S. Department of Homeland Security’s Science & Technology Directorate. Bob is one of those management consultants with nearly 30 years’ experience who helps organizations plan their entire staffing and missions, and John is a rare data-communication guru hybrid who has helped federal agencies use analytics to track down terrorists.
Our mission was to help draft an upcoming seminar on Predictive Analytics, or, more precisely, on how to help enormous federal agencies take best practices from Corporate America and turn “big data” into actionable steps to achieve program outcomes. Frankly, it’s a head-scratcher.
The topic is vast: predictive analytics ranges from modeling credit scores and terrorist threats to machine learning, data mining, and marketing campaign forecasts. Modeling varies across industries, from pharmaceutical giants predicting drug adoption curves to insurance companies evaluating the future impact of global warming on policies for homes on the Gulf Coast. How in the world do you boil down the vast data flowing around every organization into a system for valid forecasts?
Yet as we drew diagrams around the topic, a simple model for predicting outcomes from complexity started to emerge.
When New York City’s Citicorp building almost fell down
First: What are predictive analytics?
In one of our favorite data stories, documented by Joe Morgenstern in The New Yorker, in 1978 William LeMessurier, lead designer of the Citicorp Center skyscraper in New York City, received a call from a young engineering student who had been tasked with evaluating the Citicorp building, then the seventh-tallest building in the world. The student said that he had modeled wind forces on the building and thought the columns at the bottom, placed in the center of each building side so that the bottom corners of the building jut out over open space several stories in the air, had been put in the wrong spots. The building, the student thought, might fall down in high winds.
LeMessurier chuckled at the student’s naivety and got off the phone, but later re-ran some numbers and found, to his horror, that the building actually might buckle in hurricane-force winds if they were what sailors call “quartering winds,” coming in at an angle and hitting two sides of the building at once. Hurricanes did hit Manhattan every 90 years or so. And thus began a secret race to reinforce the Citicorp building’s structure from the inside out, all due to an error in math.
The Citicorp story is an illustration of predictive analytics: You are trying to build something (here, a 59-story tower that won’t fall down); you need to evaluate how the internal systems you control (steel beams) support that outcome, but you also need to forecast the external factors (high winds) that put stress on your system. Your data must follow a chain of logic from outside to inside, prediction to event to result. If you model it right, you can control the outcome.
A simplified ‘gameboard’ for prediction
During our call, Bob and I started doodling in notebooks and on the whiteboard, and a lucid model for “Predictive Analytics” emerged. The first draft looked like this:
It’s extremely simple, really. The vertical Y axis, at left, shows two major areas: the external environment of things you cannot control, and the internal systems that you can control. The horizontal X axis, at bottom, shows time in three groupings: predictions, the period before your campaign or activity when you need to anticipate outcomes; events, the things that happen as your campaign is running; and response, how you react (in marketing-speak, this is often called the “optimization” period of your campaign).
Thus, on one side of the board, the things you can and cannot control; on the other, your forecasts, events, and response.
Now, within the board, analytics and systems are grouped into six areas:
1. Game. This is where you make predictions for external environmental factors beyond your control, but which, if gamed out, could be anticipated. (We use “game,” from “game scenarios.”) For example, a contender for president could have gamed out that a populist billionaire such as Donald Trump might enter the race and springboard off the undercurrent of economic dissatisfaction and fear in this country. No politician can control Donald Trump, of course, but his current success in the polls could have been predicted. Marketers can game out competitor moves. Business leaders can explore Michael Porter’s Five Forces model of competitors, suppliers, customers, market entrants, and market substitutes. Government agencies can game out scenarios of news events that might raise or lower public opinion of their missions. The future is uncertain, but major environmental influences on campaigns can be pre-conceived.
2. Forecast. This is where you model the variables you can control and forecast outcomes using various statistical methods. In marketing, this typically involves translating financial inputs to a campaign into forecasts of impact on awareness, response, conversions, acquisitions, and ROI. Forecasts can be tied to benchmarks of prior similar campaigns (overall) and of specific communication pathway performance, channel by channel, medium by medium. A $1 million investment, provided there are no shocking “game” influences from Step 1 above, should lead to XX,XXX results flowing out.
3. Test. This is real-time event management: testing the variables you can control. Messages, ad creative, media channels, influencer outreach, and conversion pathways should all be tested with different flavors, colors, images, media tactics, and path structures. Even the very audience you are trying to reach should be tested. A classic example comes from automotive: when Honda launched its boxy Element mini-SUV in 2003, it originally thought buyers would be young men in their 20s who wanted a cool beach-surfing-camping vehicle. Buyers turned out to be dads age 35+, who wanted a fun small SUV to carry kids around in without looking like a mom in a minivan. As test data flowed in, Honda rapidly pivoted its ads away from pictures of the Element on the beach with fold-down seats for fully reclining (wink, wink, young men) toward more family-focused advertising creative.
4. Monitor. This step means setting up rapid-response monitoring systems so you can react to the world’s events as they happen. Tied closely to Step 1 (Game) above, monitoring systems may include ongoing analysis of competitor organizations’ communications campaigns; tracking of news stories; or sentiment monitoring of consumer discussions about your mission on social media using tools such as Radian6. One great, simple approach to setting this up is to draw a “touchmap” of data flows from all major outside factors (audiences, organization outlets that touch audiences, competitors, sales systems, news events, market entrants) to your internal data systems. Then draw little stop signs wherever there is a gap between data you might need and your internal flows. Then fix the gaps.
5. Measure. This is a simple word for a vast construction, and we’ve written detailed guides to campaign measurement methodologies elsewhere. But at its core, measurement means evaluating your campaign from the audience perspective: how efficiently their attention is reached; how their awareness is increased; positive or negative shifts in sentiment, responses, and conversions; and the economic cost per desired action. Measurement maps the data dictionary and data flows against Step 4, Monitor. It is worth being clear here that measurement does not necessarily mean investing heavily in new technology systems. Most organizations we work with already have all the systems they need in place; instead of more investment in data systems or software, what is often needed is simple counsel in connecting the dots.
The danger of measurement is that the output can be overly complex, leading to dashboards that look like a mathematician lost his breakfast. We recommend setting up KPIs (key performance indicators) in a tree-branch structure, similar to drawing your family genealogy. What two or three main factors are you really trying to evaluate (analogous to your two parents)? For a marketing campaign, these might be lift in intent to use your product (a core brand metric) and cost per customer acquisition (a core direct-response metric). Behind them, what supporting data elements lead to these KPIs (analogous to your grandparents and great-grandparents)? And so on. By nesting your measurement findings into a hierarchical structure that leads to a few core outcomes, you can both measure real progress on the major terms and explore the more minor variables that create the chain to those results.
6. React. This is the punchline, the moment when you react to how the market around you has moved. But instead of reaction being “reactionary,” if you’ve successfully staged the preceding steps, you will be able to react smoothly and calmly to redirect your campaign as outside forces and audience results emerge. If the stock market crashes, tamping down demand for your gizmo, you will have anticipated it. If breaking news tosses a PR crisis your way that damages your brand, you will have a plan B and a subsequent plan C in place. And whatever breaks, you’ve pre-installed measurement systems to pull news in as quickly as possible and gamed out rapid-response pathways to maximize your influence.
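The six areas above can be read as cells of a two-by-three grid: what you control on one axis, time on the other. Here is a minimal sketch in Python; the axis labels and step names come from the model above, but the dictionary layout, and the reading that places Measure on the internal row and React on the external row, are our own illustrative choices.

```python
# The gameboard as a lookup table: (control axis, time axis) -> step.
# Row/column placement follows our reading of the model sketched above.
GAMEBOARD = {
    ("external", "prediction"): "Game",      # Step 1: game out outside forces
    ("internal", "prediction"): "Forecast",  # Step 2: model what you control
    ("internal", "event"):      "Test",      # Step 3: test your own variables
    ("external", "event"):      "Monitor",   # Step 4: watch the outside world
    ("internal", "response"):   "Measure",   # Step 5: evaluate your systems
    ("external", "response"):   "React",     # Step 6: respond to market moves
}

def step_for(control: str, phase: str) -> str:
    """Look up which of the six steps covers a given cell of the board."""
    return GAMEBOARD[(control, phase)]
```

For instance, `step_for("external", "event")` returns `"Monitor"`: during the campaign, outside events belong to your monitoring systems.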
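The benchmark arithmetic behind Step 2 (Forecast) can be sketched in a few lines: spend per channel times a benchmarked results-per-dollar rate yields expected results. The channel names and rates below are invented for illustration; real benchmarks would come from your own prior campaigns.

```python
# A toy benchmark-driven forecast: expected results per channel equal
# spend times that channel's benchmarked results-per-dollar rate.
def forecast_results(spend_by_channel, results_per_dollar):
    """Return expected results for each channel given benchmark rates."""
    return {
        channel: spend * results_per_dollar[channel]
        for channel, spend in spend_by_channel.items()
    }

# Hypothetical $1M media plan and made-up benchmark rates (results per $).
spend = {"search": 400_000, "display": 350_000, "email": 250_000}
rates = {"search": 0.012, "display": 0.004, "email": 0.020}

expected = forecast_results(spend, rates)
total = sum(expected.values())  # the XX,XXX results the $1M should produce
```

The point is not the arithmetic, which is trivial, but the discipline: every dollar in the plan gets an explicit, benchmark-backed expectation you can later test against.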
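The tree-branch KPI structure from Step 5 can also be sketched as nested data: a few core outcomes at the root, supporting metrics branching beneath them. The metric names here are hypothetical examples, not a prescribed taxonomy.

```python
# A sketch of a hierarchical KPI tree: two core outcomes at the root,
# supporting metrics nested beneath (empty dicts mark leaf metrics).
KPI_TREE = {
    "lift in intent to use product": {   # core brand metric
        "aided awareness": {},
        "ad recall": {},
        "sentiment shift": {},
    },
    "cost per customer acquisition": {   # core direct-response metric
        "cost per click": {},
        "conversion rate": {},
        "average order value": {},
    },
}

def leaf_metrics(tree):
    """Walk the hierarchy and collect the most granular supporting metrics."""
    leaves = []
    for name, children in tree.items():
        leaves.extend(leaf_metrics(children) if children else [name])
    return leaves
```

A dashboard built this way leads with the two root KPIs and lets viewers drill down branch by branch, rather than dumping every metric on one screen.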
It’s a very simple gameboard, filled with, yes, lots of complex work. But this work does not have to be expensive. We recommend as you consider predictive analytics that, instead of investing in a million-dollar data system, you throw this model on a whiteboard in your office, break out some yellow Post-It notes, and see how simply you can cover all the bases.
Update: We’ll be speaking on Predictive Analytics for the Future Nov. 5 in Washington, D.C. We’ll post details on the event in this space soon. If you would like to attend this morning session on Predictive Analytics, email email@example.com.