Category Archives: psychology

The novelty effect

This is a story about why stories no longer hold your interest.

About 30 years ago Cormac McCarthy sat down to write “Blood Meridian,” a gruesome Western tale with evocative passages like this:

“That night they rode through a region electric and wild where strange shapes of soft blue fire ran over the metal of the horses’ trappings and the wagonwheels rolled in hoops of fire and little shapes of pale blue light came to perch in the ears of the horses and in the beards of the men. All night sheetlightning quaked sourceless to the west beyond the midnight thunderheads, making a blueish day of the distant desert, the mountains on the sudden skyline stark and black and livid like a land of some other order out there whose true geology was not stone but fear.”

It’s beautiful stuff, but as I read it I find myself dozing after 10 pages or so, then picking up my MacBook to check more modern writing like this:

“I don’t know if I’ve ever seen @maddow deliver such a blistering, fed up, disgusted introductory monologue.”
“Love this Eury Perez guy. Our own Dave Roberts! #Nats”
“49 Things You Must Tell Your Baby…”

First, I have the author who would win the National Book Award and Pulitzer Prize. Second, I get Twitter gibberish about liberal talking heads and 49-things linkbait. Yet somehow Twitter, like its social-media brethren, feels more alive and interesting than all the beautiful words about horses poor Cormac can muster.

One possible explanation is what psychologists call “The Novelty Effect,” or the tendency to respond more strongly, emotionally or cognitively, to a new experience. This effect can sometimes be a downer — like your first airplane ride, where you clutched the armrests in a silent scream — or an upper, like your first kiss, or an A+ from a teacher, or a cash bonus greater than $1,000. You get more jacked when you see Santa Claus for the first time, and later meetings with the big elf never recreate the initial thrill. New technology creates similar juice by making us put more attention into new communications devices, and in turn we feel like we’re getting more out.

Google+ was a recent example. When the moderately improved version of Facebook first showed up, people loved the beauty of the layout, the thinner-but-more-meaningful social links, the feedback and debates that appeared rapidly beneath every post. G+ felt better than other tools at first because, in its novelty, it attracted our focus, we put more in, and the resounding ripples gave us more ego-boosting content out.

Eventually the love affair fades, like a sexy iPhone 4S suddenly looking boring next to the 5, but social networks in general still provide more arrogant reflection than the cold hard TV tube or the silent pages of a book.

Quality has little to do with the appeal of novel technologies, because the newness itself is what forces our attention. Social media is over-rated for its supposed radical restructuring of human communications; what we really like is the snazzy interface, which gives us new ways to reflect on our own intelligence and charm. Like the classic Dilbert cartoon in which one coworker introduces a handsome new manager with the caveat, “don’t worry, he gets stupider the longer you know him,” eventually the new thing goes “ooga!” and we realize we must find something more new.

It’s hard to believe that a sudden skyline stark and black can’t hold our attention. But then, McCarthy never retweeted you.

(Archive) The technology of love

Sarah Jamieson is a metro office worker who aspires to be a writer. She’s 24, slightly overweight, but knows she’s attractive because John at the front desk keeps ogling her chest. Sarah isn’t dating, though, because work is stressful and the hours are long and it’s just too damned hard to find time to go out. The last guy, Brian, was a jerk focused on unbuttoning her blouse, and dating is for losers — so a break is in order. Each evening after taking the G train home she cooks a microwave dinner in her apartment over a Brooklyn grocery, pours a glass of white wine, and retires to a wooden desk, a gift from her grandmother, to write a post for her blog. While Sarah types, her Mac’s TweetDeck program flashes updates from online friends every 15 seconds or so — tidbits such as “RT @johnhenry57 Do you remember the first time you fell in love?” — and she feels the warmth of human connection, of belonging to a tribe, of knowing others who know her needs. John Henry lives in Britain, she thinks, unsure and too tired to click to his bio. She pecks out a final sentence, hits Publish Post, tells herself she’ll call her mother tomorrow, and goes to bed.

That story is fiction.

The reality is closer: Many people live two lives, one with a lover or cat at home and another far away in a fictitious corporate environment, a battle of spreadsheets for entities that exist only in legal documents with surnames such as Inc. or LLC, in small rooms under fluorescent tubes far from the sun. Hours there are traded for numbers, no more than ones and zeros, that flow like blood into electronic scoring tables called bank accounts, and then can be transferred for goods, food and shelter. Perhaps stunned by the fake ambience of math, these people take recess in online games that pretend to connect to other people, with scoring mechanisms telling them they are growing more popular.

This story is real.

How did our world splinter in two — a home life with flesh and blood, and a corporate matrix populated by artificial-numbered social reality? If veal is disdained by some who would never eat a calf kept in a small bin, not allowed to roam free, trapped indoors for life; then who would eat you? In the United States, 9 in 10 people commute to work by car, spending a collective 3.7 billion hours a year stuck in traffic, only to arrive at job sites that require 9 hours or more of input into devices that lead to numbers in banks. If humans are social creatures, driven by sexual urges to procreate and parental desires to protect our young, how did we mortgage our lust-and-love connections to spend so much time in artificial environs?

Why is that which is closest to our bodies now furthest from our souls?

Social scientist Geoffrey Miller posited in Spent that the world did not have to end up like this; rather, it was a series of unforeseen inventions, some helpful — such as trading markets or artificial currency — that allowed us to build and buy self-pleasuring items such as tickets to Tori Amos concerts or Hummers with poor turning radiuses. Unfortunately, Miller suggested, these inventions pushed us away from the bucolic values that once kept tribes cohesive and love close at hand.

Yes, you own a shiny iPod that can pump emotional music into your brain to bathe you in warmth, but you can’t hug your wife or kids at 3 p.m. while flying to Dallas or typing downtown. Technology has expanded our need set; we can fill our lives with near-perfect entertainment tools, the equivalent of 300 plays running concurrently in any hour on our TVs, pre-cooked meals of any flavor, voice transmissions around the globe … and yet most of this time is disconnected from the children who make us laugh or lover who brings us pleasure.

Is this too negative? Look around on the highway in the morning, at the cars crowding you, each with only one person inside its steel box. We have mortgaged our lives, and the answer lies in our drive for loyalty, for the stability of people or places or things that we can count on that will do us no harm. We crave predictability, because it helped our ancestors survive. The best way to predict the future is to find environments that have repeatable events driven by loyal people we trust. As environments have become more artificial, they’ve also improved in stability — and we find that loyalty pleasing.

Consider what loyalty is. Psychology has defined three aspects of faithfulness: emotional attachment (affective), perceived switching costs (continuance), and feelings of obligation (normative). Fear of switching and feelings of obligation are two potential motives for our inertia in staying in jobs, in living the same commute, in not fleeing the business world to go build sea-shell necklaces on a beach in Mexico. The false thrill of numbers in a bank has given us two of the three loyalty mechanisms we need to stay put in evolving society — we fear switching, and we’re obligated to go on.

But what of the other: emotional attachment? The affective aspect of loyalty is harder to fulfill, because it rests on such funny stuff as novelty, humor, friendship, compassion and love. You felt this as a child with your mother, and perhaps when dating as a teen or falling for your spouse — the incredible drive to stay forever with another being who is filling your emotional needs. Emotion is the strongest impulse for loyalty, for going on one path and neglecting all others.

About 15 years ago, technology began filling our loyalty gap.

Technology today has accelerated our fake relationships, the reinforcement of stability, of loyal beings who will give us what we need. Social media tools such as Facebook, Twitter, email (yes), texting, video-sharing, or Flickr all allow us to connect with others who seem to love us. Of course, they don’t, because love requires commitment and true understanding, but technology appeases those flaws by allowing each user to set up self-filters to screen the content most likely to simulate affection. Twitter brilliantly imposed a gaming-psychology device, a number of “followers” at the upper right that each user can track to see how many connections he or she has, a proxy for requited emotion. Facebook has taken another approach, installing an EdgeRank algorithm that pushes only updates from friends it deems interesting into your stream (based on how often you communicate with them, how many others have commented on the post, and how recent it was). The result is a warm flow of material that seems addressed to you by others who care, each item surrounded by popular comments showing a community of interest.
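The EdgeRank idea described above can be sketched as a toy scoring function — affinity times content weight times a time decay. The exponential decay and the 24-hour half-life here are illustrative assumptions; Facebook never published its exact formula.

```python
def edge_score(affinity: float, weight: float, age_hours: float,
               half_life: float = 24.0) -> float:
    """Toy EdgeRank-style score: closer friends, richer content,
    and fresher posts all rank higher in the stream."""
    decay = 0.5 ** (age_hours / half_life)  # freshness halves each half-life
    return affinity * weight * decay

# A post from a close friend an hour ago outranks a day-old
# update from a distant acquaintance.
fresh = edge_score(affinity=0.9, weight=1.0, age_hours=1)
stale = edge_score(affinity=0.2, weight=0.5, age_hours=24)
```

Sorting a candidate list of posts by this score, descending, yields a stream like the one described: items from the people you interact with most, freshest first.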

You are embraced by others who love the concept of you.

Yes, this sounds dark. Grave. Abysmal. But consider the deeper question: if we have lived for 500 or so years trading fictitious currency as a sign for the value of goods, instead of swapping real grain and furs, has the new set of follower numbers and social content that emulate real relationships provided an even more compelling fiction, one that will further remove us from the real world? Perhaps that view is wrong. Perhaps you, reading this, think you have your reality under control, that the emerging smart phones and tablets and social network apps are simple extensions of your communication, just as eyeglasses help you see and sneakers ease the pain of your run.

Maybe there is no seismic shift away from physical, flesh-touching, semen-and-tear-and-Band-Aid-stained reality at all. The glowing screens around us are only tools, not encroaching windows ensnaring us in false worlds. We’ll think of that as we turn off this computer and go kiss our kids in bed.

Originally posted at Sundayed in November 2010. Image: cambiodefractal

Ben Kunz is vice president of strategic planning at Mediassociates, an advertising media planning and buying agency, and co-founder of its digital trading desk eEffective.

Animal conflict, or why we compete

If you follow triathlons you’ve heard of Team Hoyt, a father who is an incredible athlete and a son with cerebral palsy. The dad has pushed (on special bikes and strollers) or pulled (swimming, towing a boat behind him) his son through six Ironman competitions and more than two dozen Boston Marathons. It’s an amazing story, and the videos on their website will make grown men cry.

Yet it raises the question: why?

We debated with some friends this weekend the meaning of Team Hoyt, and whether American culture in particular is becoming split between the weak and the strong, the TV-watchers and the Internet intellectuals, those who sit comfy eating donuts and those who train to get their body fat down to 6%. Our society has bifurcated into the lazy and the motivated. Could it be the lazy are now right?

Sociologists suggest competition is one of four main forms of social interaction — the others being conflict, accommodation, and assimilation. Darwin said competition was fundamental, the struggle for existence without which species would not survive. Machiavelli said it was the root of society, a war against all. Adam Smith expanded competition beyond the individual to our collective market intelligence, an invisible hand that guides society’s balance and growth. All suggest the world is not in equilibrium, and as we seek resources for ourselves, we must grasp for more.

Which poses an enormous conflict: If competition is good, and required to survive, and leads to progress, why does its fighting-against-others nature land at odds with the great spiritual and psychological beliefs of our time? Christianity’s turn the other cheek, Buddhism’s transcendental awareness, Maslow’s self-actualization at the top of the pyramid, and Freud’s Super Ego reining in childish impulses all suggest higher levels of morality require turning competition off. Competition is a selfish impulse to pull ourselves ahead of others, to be faster, to gain more resources, to win fame, to succeed where others fail — and as such harms others, something truly civilized beings should not do.

Could it be that competition may no longer be needed? Not long ago the world was a dangerous and brutal place. We are only a few generations removed from days when Roman soldiers went to war with sharp blades to hack their opponents into meat, when tribal victory meant killing all the other villagers, when disease could decimate cities and medicine was witchcraft. We still yearn to fight, because our parents had to. Like animals salivating at the scent of blood, we can’t turn the instinct off.

Which makes competition a force like gravity we cannot control. In 1938, psychologists James Vaughn and Charles Diserens of the University of Chicago wrote “the fact of competition is scarcely more psychological than the movement of the balls on a pool table when the initial player breaks the set. To a spectator the balls may seem to compete more or less in their progress toward the other end of the table. There is interference and modification of movement, but no control or awareness of the process on the part of the ball. It is a phenomenon of the resolution of physical forces.”

If so, we are all small variables acting through competitions as physical forces in the great hive mind of human society. We’re subatomic particles that can’t help but be flipped negative with an electrical charge. We act like ants, rushing to lift more load, somehow building a colony whose purpose we do not see clearly. We hate conservatives or liberals, taxes or military, our neighbors or the illegal aliens from next door (who, we fear, may take more of our resources). We are driven by instinct to succeed, even if such success has no logical merit, even when we’ve reached a saturation point in resources where we no longer have to strive for food or shelter, even when the definition of success means taking something away from the other.

It’s a beautiful thing, to strive so hard with so little logic. Team Hoyt, your journey confuses me. Inspired, I’m going for a run to beat some illusion in my mind.



Avatars and partial anonymity

Back in May 1996, when the Web was just getting out of its diapers, social scientist John Suler wrote of a new thing called “avatars” — little pictures online users were posting in chat communities to represent themselves. His observation was that the graphics — which could be faces, or bodies, or ASCII smileys — enabled a form of half-anonymity, in which who you are is protected and yet you feel free to express anything. The Id was unleashed, because the Ego paid no consequence.

Suler’s most brilliant insight was that, even then with lousy graphics, user avatars fit nicely with well-known personality types, including:

narcissistic = themes of power and perfection
schizoid = revealing detachment and indifference, perhaps combined with intellectualism
manic = energetic and impulsive
histrionic = attention-seeking and seductive

We haven’t evolved beyond this in 15 years. My avatar pics, upon reflection, tend to be schizoid, detached and intellectual, meaning I’m trying to look smart (or just think I look goofy when I’m smiling in real-life action as seen above). Narcissism runs rampant with many users posting avatars of perfect smiles, as if they just got laid, or histrionic with pouting lips and an iPhone visible in the mirror frame.

This protect-oneself-by-avatar-control psychology could explain why social media, with its rather antiquated focus on text typing beside a single photo, is so much more popular than video-conferencing — which is now technically simple and free but has yet to go mainstream as a major daily habit. We create avatars for ourselves because we want the freedom to reveal anything while controlling how much of our souls we expose. Wii dancing, for instance, will never make it to my G+ avatar box.


Originally posted on Google+.

Hire friendly, not smart?

If you want your business to succeed, you hire really super-smart people, right?

MIT suggests not necessarily so. A recent study of “collective intelligence” explored what it takes to build teams most likely to succeed at solving problems. MIT found that individual IQ, that thing we all like to believe we have so much of, mattered far less than the ability of the group to perform functions such as clarifying the challenge, brainstorming, making “collective moral judgments,” and structuring limited resources. Group camaraderie, and not individuals’ IQs, was the greatest input required for success.

MIT called this the “C factor” for collective intelligence and noted that women tend to have more of it. Women, by nature, have less testosterone, which in high levels can lead to emotional, impulsive or illogical decisions (“I’m right!” “We must do this!”) and depresses the sensitivity to others often required to really digest all the data inputs that solve thorny problems. The study suggests that so-called “social sensitivity” would be a better prerequisite for hiring staff and managers than super-smart IQ, and that more women in groups — still often missing in business settings — leads to higher collective intelligence.

MIT found several things any group can do better to increase collective IQ:

– Avoid having a single smart person dominate the discussion.
– Get everyone in the room to participate.
– Watch nonverbal communication as well; individuals may signal they are confident or impatient or frustrated, and those are all elements that can be drawn out to improve the group’s decision — what does the confident person know, or why does the frustrated person believe the team is on the wrong track?

It’s an intriguing concept, that what makes one individual burn bright (aggression and brains) could increase the odds that a team will fail, while friendliness drives success. Obviously strong leaders are needed and can thrive; Apple and Steve Jobs may be the best case study. Yet powerful leaders and bright people might crank up their empathy a bit and assess whether the dynamic of their supporting groups is friendly enough to allow the best chance for success.


Image: Paco CT

The psychology of polarization

Why is America so polarized? Back in 1961 MIT student James Stoner wrote a master’s thesis that suggested people in groups undergo a “risky shift,” making decisions that are more risky or extreme than the average group member would individually. This was counterintuitive — previously, psychologists thought groups would weigh facts and lean toward the moderate middle, like a jury building a logical consensus — but subsequent studies found Stoner was right. When we get in groups, we go to extremes.

Why? Several reasons are possible: groups diffuse responsibility (you don’t worry as much about the personal impact if the group suggests something radical); risk-takers exude confidence and so may lead groups to the edge; and as group members begin paying attention to an issue or problem (global warming is a hoax!), they worry less about the potential negative impacts (um, if it isn’t, we might destroy the planet). The best answer, perhaps, is that people make decisions by weighing “pro” and “con” arguments — but if you hang with a group that leans only one way, the information you are exposed to is biased in your direction, accelerating your viewpoint (since you really aren’t consuming a broad enough array of data to make a truly informed decision).

This explains the Tea Party, 2010’s healthcare arguments, Fox News, MSNBC, global warming deniers, oil company haters, and the pendulum swing in U.S. politics between conservatives and liberals every two to four years. Fragmented consumer media and feedback from social media have accelerated this, as we can subscribe to only the data sets that reinforce our bias. We’re shifting opinions, and that may be risky.


Originally posted on Google+. Image: Sadie Hernandez.

Spider-Man renewed and the novelty effect

So if the first Spider-Man film with Tobey Maguire came out only nine years ago, why in the world is Sony redoing the same Spidey 1 plot — this time, with buffer actor Andrew Garfield — set for release in summer 2012? Is our culture completely out of ideas?

Sony is actually making a clever move, rebooting what was a profitable franchise to include more grit, sex, and videogame offshoots that appeal to older demos. The challenge is how to manage the novelty effect, or the tendency of humans to respond more strongly to something that is new.

In psychology, the novelty effect is the heightened response humans have — in terms of stress, anticipation, or pleasure — from something new. Through our Darwinian ancestry we survived based on novelty; men who sought more mates were most likely to pass their genes on; women who invented communication charms were more likely to get those unfaithful men to stick around and help protect the children; clans who ate diverse foods and built new tools were most likely to be healthy and survive storms and wars. Sexual nuance, language, art, cooking, housing, and automotive sheet metal designs all grew out of our need for new things to survive.

The novelty effect is why Google+ seems so amazing, when it is really a slight rehash of Facebook, Twitter and Skype. It’s why your iPhone 4 looked so incredible last year, and why you’ll want to toss it aside when Apple launches a next-gen phone with a bigger touchscreen and no clunky home button. Novelty is why we sit through stupid films such as Transformers or Captain America with little new plot, because there are new explosions to see.

To test this idea, we asked a teenager to review the new Spider-Man trailer above. He said, “yes, the plot is the same — but check it out! Now, when Spider-Man flies, we’ll watch it from the first-person viewpoint, all in 3-D!”


The sexual impetus for your hatred of Gap’s logo

Word to the advertising community: The new Gap logo doesn’t suck. You’re just hung up about sex.

Before we explain, let’s review the rebranding kerfuffle. Gap, a purveyor of American denim and flannel, this week did what companies often do — redesigned its wordmark. The advertising world screamed bloody murder. Abe Sauer over at Brandchannel said the revamp “looks like it cost $17 from an old Microsoft Word clipart gallery.” David Brier of Fast Company called it “goop” and suggested its protagonists would get fired. Someone launched a site offering infinite versions of Gap-crappy logos, and Adweek named the mock @GAPlogo to its top 25 Twitter accounts. And when Gap backpedaled, suggesting it was open to new ideas, the blog ISO50 gathered more than 260 submissions.

What gives? Well, sex…

Ad gurus are steamed, you see, because Gap didn’t include enough nuance in its design, and nuance drives humans at the sexual core. It’s certainly not about the actual result, because Gap’s new use of the classic font Helvetica is similar to the wordmarks of other major brands — 3M, American Airlines, Panasonic, Toyota. Agency types are wringing their hands because such simplicity leaves their minds out of the game.

Nuance is a foundational human incentive because sex, food and shelter require it. For sexual attraction, humans look to symmetry as the core indicator of health and high-value sperm or eggs to produce strong offspring. Look at a photo of anyone you consider super attractive — Brad Pitt or Scarlett Johansson — and you’ll find near-perfect symmetry in their features. We focus on nuance because it signals reproductive health. In the long history of human evolution, nuance also led us to berries with more vitamins, tar for blocking shelter gaps, and metal better for battling enemies. Nuance is how we grow and survive.

If nuance is an over-focus of humans in general (Did you see the lines on the latest BMW? Did you try the latest Starbucks Via coffee?), it’s even more vital to ad agencies. Agencies are glorified temp workers, extensions of real marketing departments, often filled with extremely intelligent right-brain creatives who are rewarded for ideas that scale memes across the masses. This is hard, because the idea marketplace is crowded, so ad creatives explore every angle of every communication and possible response. When found, a slight nuance is often the edge required to succeed. Nuance is the key to breakthrough success.

Gap’s logo failed for the design community because it lacks their core value: nuance. The logo is achingly simple, based on the old 1957 Helvetica typeface that has been used for decades on New York City subway signs. The irony of the outcry is that the Gap logo’s Helvetica is one of the most beloved fonts among typography geeks; it is an everyman’s font because its thin lines are filled with nuance, such as the defined spur in the capital G or the slight curves at the terminus of the lowercase a. Heck, designers love Helvetica so much they often mock the competing font Arial as a bastardized Microsoft knock-off. You see Helvetica in the logos for BMW and Target. You could argue Helvetica is the most popular font for brand icons in the world.

In its wrap-up of the debate, Yahoo Finance noted Nate Jones as one commentator who actually liked the Gap wordmark redesign. Jones wrote the new icon “brings to mind visions of a streamlined, technologically dominant future America where everyone wears white suits and cool glasses.” Gap’s icon moved away from the nuanced differences. Gap just went simple.

And since simplicity is the opposite of what you want in food, shelter and sexual partners, no wonder you are pissed.

Reality within

The movie Inception with its dreams-within-dreams is a metaphor for our times. We live in a layered onion of reality with the physical world in one stratum and business, data and (now) social media illusions inside.

We explore this idea over at the Sundayed blog: “How did our world splinter in two — a home life with flesh and blood, and a corporate matrix populated by artificial-numbered social reality? If veal is disdained by some who would never eat a calf kept in a small bin, not allowed to roam free, trapped indoors for life; then who would eat you? In the United States, 9 in 10 people commute to work by car, spending a collective 3.7 billion hours a year stuck in traffic, only to arrive at job sites that require 9 hours or more of input into devices that lead to numbers in banks. If humans are social creatures, driven by sexual urges to procreate and parental desires to protect our young, how did we mortgage our lust-and-love connections to spend so much time in artificial environs?”

More here.

The pricing genius of the $0.01 iPhone case

Ah, mimicry. DefaultCase is making hay off of Apple’s iPhone 4 reception troubles by running contextual ads online next to articles about iPhones. The banners are designed to look like official Apple ads (same fonts, layout style), and clicking through to the site offers a killer promise — get an iPhone case that solves your antenna issue for only 1 penny!

The math is impossible, you say? Why, yes. Go to check out, and the company adds $3.99 for shipping and handling. USPS tells us the cost to ship a 3 oz. package is $1.22, leaving DefaultCase with a nice estimated $2.78 for each small piece of plastic. Great case study in how to manipulate prices to convey value, while also riding a major company’s bad press.
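The arithmetic behind that estimate can be laid out in a few lines, using the figures quoted above:

```python
case_price = 0.01   # the advertised penny
shipping = 3.99     # shipping and handling added at checkout
postage = 1.22      # USPS cost to ship a 3 oz. package

revenue = case_price + shipping        # what the buyer actually pays
margin = round(revenue - postage, 2)   # estimated take per case
print(margin)  # 2.78
```

In other words, the "1 cent" price is a framing device; nearly all of the real price is smuggled into shipping, where buyers rarely do the subtraction.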

P.S. The site also suggests the cases are a $35 value. A touch of reference pricing to sweeten the deal. Yum.
