
Savage Americans on Facebook: A future history


Teacher’s note: Students, your mental teleresponse is due Friday, June 27, 2249.
In the year 2211 humanologists exploring the wreckage of the Walmart Towers of Manhattan (once used to manufacture plastic goods that people wanted!) uncovered an archeological treasure in the form of a silver hinged apparatus with a strange Apple insignia. The flat box was made of the same aluminum material once used to wrap fish, only less wrinkled. Was it for food? Or planting seeds? Or throwing at animals? No! The device did nothing useful for survival. But our scientific team was able to connect it to a solarelectropulse and — there! — a glowing screen appeared.
Behold! A Facebook logo!
Our scientists had found a digital record of the legendary Inter Net!
The Inter Net until that point had been a fairy tale, a dark fable used to frighten little children, similar to silly stories of carbon pollution or slave labor or the ogre Glenn Beck, except that in this case people performed a mind-numbing exercise called online “surfing.” Until this discovery, humanologists were unsure if the ADHD disease that almost decimated our species had actually originated in metal boxes shipped from California. But that hinged aluminum fish-wrap material held the answer, because there — on the glowing screen — was Facebook!
Facebook, you see, was at one point the center of the Inter Net — a human throng virtualization system, empowered by silver-fish-foil Apple machines, that allowed the appearance of social behavior without anyone actually talking to anyone. We now call this behavior “lying” or “masturbation.” Historians had believed Facebook was similar to the Bible or Talmud fads, the recounting of stories that ritualized society’s hope for survival after death, but discovered upon the silver box’s “booting” that it contained an ever-flowing cascade of fictional real-time information from imaginary friends, parents, spouses and ex-girlfriends with large breasts.
Apparently this Facebook illusion had become a network insinuated deep into the bowels of the entire Inter Net, collecting data as users hit blue “Like” buttons unveiling their preferences for biased news, dancing cat videos, and pornography. (On the archived colored page of FoxNews, users could find all of these in the same location!) Researchers believe the Apple device had belonged to a marketing businesswoman who luckily documented the Facebook “Open Graph” on a blog, recounting its launch as an “application” enabler (clever software given away for free to lure consumers into sleep modes) and subsequent morphing into a vast database of consumer preferences used to create email, direct mail, television and telepathic prospect lists (See chapter on “Experian, Equifax and TransUnion: the Ironic Transition from Financial Transactions to Social Graph Unprivacy Laws”).
Facebook, students, became the nexus of all marketing information! It combined self-submitted profile data with Inter Net observations and mapped the connections between humans! It observed everyone and used that knowledge to help advertisers sell everything anywhere! (See chapter on “Terms of Service Rebellion War of 2013 and Robocalls of 2014”). And as we now know, Facebook’s launch of artificially intelligent avatars in 2019 who passed the Turing test while helping unhappy divorced men release sexual tension led to the development of blow-up robotic companions who today keep us all warm and comfy.
Unfortunately the silver-fish-foil-wrapped Apple device froze after 47 seconds of evaluation and our scientists were unable to capture the entire history of Facebook’s world domination. The last blog post uploaded into the Telecloud recounted 7.37 billion users interfacing on mobile devices in 2023 while avoiding Google pay-per-click ads (See chapter on “Search Engine Demise: The Shift of Consumer Modality following Chatroulette”). We will never know what our ancestors really did with Farmville or their old girlfriends, or if they knew the prophet Charlene Li was right, that social media would become like polluted, acrid, unbreathable air. It is our good fortune that virtual social media congregations are now behind us and those radiation-emitting mobile phones have been banned like cigarettes. Thank goodness after the head mutations of the 2100s, we all still have our left ears.
Special credit study guide: What color was Mark Zucker Burg’s hair?
(A) Red
(B) Brown
(C) Silver-fish-foil-wrapped with Apple logo on back
Answer: (C).
Note: From my old Sunday column inspired by Jason Moriber. Image by Jason Wadsworth.

Meet Seyyer, creator of fake AI video avatars that might replace you

Damn, technology moves fast. Nine months ago in Businessweek I predicted artificial-intelligence simulation would soon power video avatars that mimic real moving, chatting humans. Now a company called Seyyer has done it, with “cognitive video realization” that uses software to modify original videos so human faces can say, well, anything. Think Pixar imagery on steroids, beefed up to video photorealism. Here’s Seyyer’s new Ronald Reagan, back from the dead, chatting about politics. Here’s a list of attractive actors who can star in your next TV commercial, no filming required, thanks to Seyyer’s mouth-morphing software. Seyyer suggests its technology is fast and cheap, and it cheerfully promises this about its AI creations:

“They’re Alive! And they are brainy, funny, sassy, sexy, impossibly cute and available for your next commercial, pre roll video clip, or other video project. Our AI Avatar family is growing fast and ready to make your message come ALIVE.”

In the first stage, advertisers might rejoice while actors’ unions go out of business. It will now be incredibly cheap to get a beautiful person to say anything on film, because you simply have to type a script into a computer and software will push the face and vowels into position. No more expensive video shoots!

But in the second stage, this could truly disrupt society. Why not make virtual videos of your own face? Then tie your chatty image to a dataset such as Apple’s Siri, which can answer any question, and connect it to your social media profile history, so it can draw from your historical tone and wit? Toss in voice recognition software and your wife can have a chat with you after you’re dead, or you could handle business meetings by sending the fake other you off via video call while you play golf. With the right data and video verisimilitude, no one will know the difference. Creepily, the other you might be even more handsome (computer, please whiten my teeth) and more intelligent (add a feed from Wikipedia and I’ll answer any question). Don’t even get me started on the Kama Sutra.

When fake humans are funnier, smarter, and sexier than real humans, what happens to the rest of us? No matter. With the GOP in Tampa, it’s just nice to see Reagan’s back.

Sorry, Nintendo Wii. One gadget will not rule them all.

To understand why your iPhone and TV don’t talk to each other, and why you don’t want them to, let’s revisit the concept of personal space.

This concept is important because $144 billion is going into play in the next two years as most consumers wake up to the idea they have alternatives to cable. In the U.S., the typical consumer spends about $75 a month on cable subscription fees, which total about $74 billion annually. U.S. advertisers chase these viewers with another $70 billion in TV spots. Cable companies are fading as DVRs, Apple TVs, Google Nexus Qs, Netflix, Hulu, and YouTube all try to replace them. You’d think the holy grail of screen/device/phone/handset/tablet convergence would happen, because the big players want the money and you, poor dear, are sick of having three remote controls on your coffee table.
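Here’s the back-of-envelope math behind that $144 billion figure; the subscriber count is implied by the numbers above, not a separate source:

```python
# Back-of-envelope check of the market figures cited above (my arithmetic, not a source).
monthly_bill = 75                 # dollars per subscriber per month
subscription_market = 74e9        # dollars per year in cable subscription fees
ad_market = 70e9                  # dollars per year in TV advertising

implied_subscribers = subscription_market / (monthly_bill * 12)
total_in_play = subscription_market + ad_market

print(f"Implied subscriber base: {implied_subscribers / 1e6:.0f} million households")
print(f"Total market in play: ${total_in_play / 1e9:.0f} billion")
# ~82 million households; ~$144 billion in play
```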

Every month brings a new attempt at screen convergence. Nintendo just announced its new Wii U will include a touchscreen on the hand controller, so you can look down to play as well as look up at the TV. And Current TV, the struggling cable network, will broadcast the GOP convention from storm-washed Tampa, Florida, with half of the TV screen split to show a scrolling Twitter feed.

Convergence is almost here. But your psychology suggests it will never happen.

Deep inside, you want different gadgets to do very different things.

The big barrier is the concept of “personal space,” an idea birthed by anthropologist Edward T. Hall and psychologist Robert Sommer in two separate books in the 1960s. The basic idea is people have different “fields” of space around them, and we have unconscious and differing reactions to what happens in each space. At the closest, about 18 inches away, you have “intimate space” for people you love; in the mid-range, 18 inches to 4 feet away, you have “personal space” for working or speaking to friends; and from 4 feet to 12 feet away, you have a “social space” to gather news from strangers. Everything further out is public, the universe at large, and doesn’t really affect you.
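For illustration only, the three fields reduce to a simple lookup; the distance cutoffs come straight from the paragraph above, and the gadget examples preview the argument below:

```python
# Sketch: classify a distance into the proxemic zones described above.
def proxemic_zone(distance_inches: float) -> str:
    """Map a distance to intimate / personal / social / public space."""
    if distance_inches <= 18:
        return "intimate"        # whispers of lovers or family
    elif distance_inches <= 4 * 12:
        return "personal"        # working, talking with friends
    elif distance_inches <= 12 * 12:
        return "social"          # gathering news from strangers
    return "public"              # the universe at large

# The gadget-to-field mapping this post argues for:
for device, inches in [("smartphone", 6), ("laptop", 30), ("big-screen TV", 120)]:
    print(device, "->", proxemic_zone(inches))
```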

These invisible fields vary by culture, by your mood, by your surrounding environment, and by your social position. The Queen of England might demand a wider personal space than you; when you walk into an elevator or subway, you let the fields collapse, because the environs have changed.

But you need different things in each space. Robert Sommer wrote, for instance, that you might let strangers elbow you on the subway, but you “dehumanize” them in your mind, ignoring them as beings, because psychologically you still need to protect your intimate space field. You just can’t let all inputs act the same way at each distance.

Now, look at the gadgets around you. Modern electronics fit into each of these fields almost exactly. Smartphones are held up to our ears for the whispers of lovers or family in our intimate space; tablets and laptops and PCs sit a few feet away in our tool-oriented, friendly personal space; and big-screen TVs are perched comfortably on the wall about 10 feet away in the social space meant more for learning news from strangers. We want intimacy, personal friendships, and social entertainment, but we want them from different ranges.

Electronic gadgets have evolved to fit each of our three personal fields. Devices will never converge completely, because Twitter is your lover and Netflix just a new friend.

Game over? Nike+ sensors may soon authorize your play.

Nike is one of the world’s largest marketing machines, spending more than $800 million in U.S. advertising annually. But Nike must fight the law of large numbers (any company that clears $24 billion in revenue faces a potential growth plateau), so it relies on technology to keep consumers begging for more. In its 2012 annual report, Nike frets “if we fail to introduce technical innovation in our products, consumer demand for our products may decline.” So Nikes are awash in engineering: “Air” is compressed air pockets for cushioning; “Zoom Units” are flatter versions of the Air pockets; “Flywire” embeds cables wrapped in fabric to automatically tighten and loosen the shoes; and “Nike+” is a famous dongle that connects 5 million runners to iPod interfaces that track and share their running mileage.

The tech play works; in fiscal 2012, footwear sales in North America jumped another 15%.

So this month it comes as no surprise that Nike has launched its most expensive shoe ever, the LeBron X Nike Plus, priced at $315 and bleeding with new technology. Downscale customers can get the slightly less costly Nike Hyperdunk shown here for just $250 — with both models offering the latest in Nike air-cushioning technology and a remote monitoring fob that wirelessly measures speed, distance, and the height of your jumps.
Hacking the Nike+
What’s missing from the press is an upcoming advance: a patent by Nike’s partner Apple that would “authenticate” whether you are wearing the right shoes for all this technology to work. You see, one risk Nike faces with all this newfangled science is that consumers could hack it; for instance, avid runners on a budget often duct-tape Nike+ remote fobs to their less costly Avias. Like a spy in a movie thriller cutting the tracking device out from under his skin, Nike+ gizmos can be pulled out of one shoe and shared in others. If only 5% of Nike buyers “cheated” like this to downgrade to lesser brands, the swoosh would lose a billion dollars.
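The back-of-envelope behind that billion-dollar claim, using the $24 billion revenue figure from above:

```python
# Rough arithmetic behind the "billion dollars" claim (an estimate, not Nike's numbers).
nike_revenue = 24e9      # dollars, from the annual-report figure cited above
cheat_rate = 0.05        # 5% of buyers pairing Nike+ fobs with other brands
print(f"${nike_revenue * cheat_rate / 1e9:.1f} billion at risk")   # ~$1.2 billion
```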
Apple, which has a horse in this race (iPod and iPad sales boosted by Nike+ entanglement), recently won patent No. 7,698,101 to help protect itself and Nike from Nike+ hacking. The patent pushes the tracking technology forward in many clever ways, including embedding sensors into other types of clothing and monitoring the wear and tear on shoes … but a key point is “determining if the garment is an authorized garment; and electronically pairing the garment and the sensor only if the garment is authorized.” Sorry, runner with the Nike+ dongle duct-taped to your Avias; your iPod is now shutting down.
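To make that patent language concrete, here’s a minimal sketch of the “authorize, then pair” logic it describes; the registry of approved shoe IDs and every name in this snippet are my inventions for illustration, not anything Apple or Nike has published:

```python
# Hypothetical sketch of the patent's "pair only if the garment is authorized" gate.
# The AUTHORIZED_GARMENT_IDS registry and ID scheme are invented for illustration.
AUTHORIZED_GARMENT_IDS = {"NIKE-LEBRON-X-001", "NIKE-HYPERDUNK-002"}

def authorize_and_pair(sensor_id: str, garment_id: str) -> bool:
    """Pair the sensor to the garment only if the garment is authorized."""
    if garment_id not in AUTHORIZED_GARMENT_IDS:
        print(f"Sensor {sensor_id}: garment {garment_id} not authorized; shutting down.")
        return False
    print(f"Sensor {sensor_id} paired with {garment_id}.")
    return True

authorize_and_pair("NIKEPLUS-42", "AVIA-DUCT-TAPED")    # rejected
authorize_and_pair("NIKEPLUS-42", "NIKE-LEBRON-X-001")  # paired
```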
Neither Apple nor Nike has published statements on whether this new authentication technology is built into the new Hyperdunk shoes or when it will be released. Until then, the performance data and social networking enabled by Nike+ are fantastic for athletes who want to count leaps and bounds. Nike is now expanding its monitoring gadgets into wristbands, and it’s not hard to predict it will use this cool tech to penetrate other markets where it is weak, such as cycling.
But the sensors are watching. Soon, if you want to play, you better be wearing Nikes.

The wall as your TV set, China as your lover

It’s hard to believe, but a few years ago social-media gurus were proclaiming that TV was dead. Actually, television has left the box and is morphing into screens everywhere, including this massive interactive demo contraption by LG. Along the way television is having an affair with computer screens and their touchscreen lovechildren, tablets and smartphones, until the delineation of what is television and video and computer projections and interactive touch panels blurs into, well, screens.

You have to wonder where this trend will go, and if eventually it will snap back into some form of sanity before every tabletop and ceiling is converted to glowing panels. Cell phones, after all, went through a phase of miniaturization in the late 1990s and early 2000s until the iPhone suddenly made us stop, and then itch for slightly bigger screens. We may pause before we reach a Total Recall remix of video around every corner.

But I doubt we’ll stop. Homeowners will want a contrast of natural elements and visual effects — leather couches still feel good — but in a few years fully interactive screens will be in every room. Windows will darken into shades upon request. Physical objects will respond to finger swipes (thanks to Disney’s Touché technology). Body tracking sensors will recognize you when you walk into a room, and respond if you wave your fingers in the air.

The first executions, as always, will be social enabling (the hologram of you can be beamed into your mom’s living room for a visit) and entertainment (porn, as always, will lead the tech adoption). But I’m always more intrigued by the sociological implications. Facebook has killed the Christmas card and high school class reunion industries but also made people slightly more lonely, as online chat rooms replace bowling leagues and knitting clubs. The over-color-saturation, high-contrast visuals of social media make flesh and blood look less appealing. Most impactful (don’t screw with me, grammarians, language evolves too), the efficient connection of your personality to others, with distance and time zones no longer a barrier, means it’s easier to make like-minded friends who may live in Switzerland or Iran or China, rather than work through the awkward conversations with the blockhead living over the hedge next door. Love, friendship, and jobs may expand into vast virtual networks; the automobile gave us 40-mile commutes, but the perfection of visual interactive haptic holographic technology will let you converse with, touch and feel anyone on the other side of the planet.

Play it all the way out and technology could end wars, just as today’s commerce puts pressure on the U.S. not to battle any nation with a McDonald’s in it, as we learn to love humans based on their minds.

The downside is you might connect with a real-looking-and-feeling human who is just faking it. That glorious new job or girlfriend could just be a PC-colored avatar generated by a pimply geek running a boiler-room sweatshop.

Be careful when you fall in love with the beauty of tomorrow’s screens.

Still. Football is going to look so cool.

A gravitational-mass theory on why Digg died

Farhad Manjoo notes in Slate that Digg.com, the once-great crowdsourced social network, has died and been reborn as a plain-vanilla news portal. The site was once wildly popular, almost bought by Google for $200 million, but declining audiences led to its URL being sold to a new group, Betaworks, this year for only $500k.

Why did the old Digg die? Because all social networks eventually grow to the point where you lose your personal influence, and influence is the core reason you find human networks appealing.

First, understand that human networks follow the rules of gravity. Social physics suggests that any congregation of human beings (and other creatures, like birds on a telephone wire) follows a normal pattern. In the first phase, groups form in clusters centered around an invisible gravitational mass. The birds in this photo are most closely grouped near the center, with a few outliers on either side. People do this too, whether at parties, on the beach, or in social networks. In a 2003 paper at the University of Texas, psychologist James Pennebaker noted that the people-on-a-beach metaphor is the most obvious: compared to the position of the stairs leading down to the sand, humans will congregate around the first person to set up a towel near the bottom of the steps, something like this:

     |           |     |  |  ||  |||| ||||| ||||  ||||  |||  |     | |           |

This network effect happens in two dimensions (a line, like a narrow beach) or in three dimensions (such as parties, where people cluster around the bar or kitchen). Pennebaker posited that people act as gravitational attractors, with the “mass” being various attributes such as beauty, wit, leadership or strength. Our conflicting desires to listen to others and broadcast to others make us connect in unerring patterns of clusters. Most creatures form such clusters: flocks of birds, running horses, schools of fish.
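For fun, here’s a toy simulation of that beach; the pull-toward-the-crowd rule and its strength are my own simplifications for illustration, not Pennebaker’s model:

```python
import random

# Toy 1-D "beach": each newcomer drifts toward the crowd's center of mass.
# The attraction strength (0.6) and noise level are arbitrary illustrative choices.
positions = [50.0]                      # the first towel goes down near the steps
for _ in range(40):
    arrival = random.uniform(0, 100)    # a newcomer wanders onto the beach
    center = sum(positions) / len(positions)
    settled = arrival + 0.6 * (center - arrival) + random.gauss(0, 3)
    positions.append(settled)

# Crude histogram: most marks pile up near the center, with a few outliers.
for bucket in range(0, 100, 10):
    count = sum(bucket <= p < bucket + 10 for p in positions)
    print(f"{bucket:3d}-{bucket + 9:<3d} {'|' * count}")
```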

Then gravity fails.

There is an important clue as to why social networks, like weekend parties or birds on telephone lines, eventually disband — and this is because we eventually lose our influence. Most people are attracted to social networks because they want to be heard; the vast majority of people I follow on Twitter spend most of their time broadcasting ideas, emotions, humor or links. Trouble is, as groups grow, your influence dwindles. Pennebaker noted that “a person’s impact on individual others will decrease with the size of the audience.” He observed fraternity parties (nice work) where groups of 2 people spoke for 20 minutes, 3 people for 10 minutes, and 8 people for only about 2 minutes. Like a high school drama beauty who goes off to Hollywood and discovers she is now only one of thousands of attractive women, large crowds make us work harder, create the dissonance of competition, and dilute our individual impact. If we lose our personal gravity, we want to move on.
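Pennebaker’s party numbers fall almost exactly on a power curve. Fitting one is my exercise, not his, but it shows how fast individual airtime collapses as the group grows:

```python
import math

# Pennebaker's fraternity-party observations (group size -> minutes of conversation).
observed = {2: 20, 3: 10, 8: 2}

# Fit minutes = c * n**(-k) through the endpoints (my curve, not Pennebaker's).
k = math.log(observed[2] / observed[8]) / math.log(8 / 2)   # ~1.66
c = observed[2] * 2 ** k

for n in [2, 3, 8, 50, 500]:
    print(f"group of {n:3d}: ~{c * n ** -k:6.2f} minutes of airtime")
# The fitted curve predicts ~10 minutes for a group of 3, matching the observation,
# and individual airtime all but vanishes as the network grows.
```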

This is the second phase of social networks — when they grow to the point that our voices are drowned out. Social networks typically grow and consolidate, thanks to Metcalfe’s (flawed) model of network utility. Facebook, Twitter and Google+ are well aware of this problem, which is why, for instance, Facebook constantly tweaks its EdgeRank algorithm to try to make your posts connect with and rebound among your closest circle of friends. (Facebook is trying hard to feel small and intimate, despite having nearly 1 billion users.) Unfortunately, your mass as an influencer always grows smaller in contrast to the greater, growing mass of larger networks. Like a small cluster of people at a party where the conversation gets stale, eventually you will disband to go somewhere else where your charm, wit and intelligence are more recognized.
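You can state that tension in two lines of arithmetic (using Metcalfe’s n-squared heuristic, flawed as the post says it is): the network’s total value keeps climbing while your individual slice of the conversation keeps shrinking:

```python
# The network's value (Metcalfe's n**2 heuristic) rises while any one
# member's share of the conversation (roughly 1/n) falls.
for n in [100, 10_000, 1_000_000]:
    print(f"n={n:>9,}: network value ~ {n**2:.1e}, your share ~ {1 / n:.1e}")
```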

Digg didn’t die because it had a bad system; we all just discovered there was a newer, fresher party somewhere else, where people dug our personal gravity.

Image: Dmott9