Monday, May 3, 2010

Thoughts about "The Data-Driven Life" article in The New York Times Magazine, May 2, 2010

From the Author:
"People are not assembly lines. We cannot be tuned to a known standard, because a universal standard for human experience does not exist."

ME: Which is why User Experience professionals tend to get frustrated (and designers too, but that is an older and much richer story). Pat Whitney said it well when he spoke to the fact that "user research" and "data" drawn from behavior and automated sensor input have driven down costs and effort. Further, relying on older models that serve older media channels (like television and radio advertising) will not provide the awareness or understanding it would take to create competitive experiences in the very near future (see: now).


Comments:
"The map is not the territory." — Alfred Korzybski

ME: Richard Saul Wurman speaks to this. Maps are political artifacts that speak to policy, while the lives of people, their cultures, and so on form the basis of communities. We are used to looking at the map, but the map is becoming less and less relevant with the rise of what we call "globalism".
"I think the loss of our human-ness is more the result of inadequate tools that make us adapt to them instead of the tools adapting to us.

The philosophical paradigm shift this represents is on a scale with the spread of written language, the development of agriculture, or the Enlightenment. Whether we like it or not, integrating the computer into the minutiae of our daily lives means we are changing the game: externalizing the computing power of our own brains. The terror and the excitement people feel at this more and more obvious change is the most convincing evidence I can think of that it's real and it's accelerating."

ME: Jaron Lanier speaks to this in "You Are Not a Gadget". We tend to praise interfaces these days that would have been scoffed at ten years ago in favor of flash and glitter. It still amazes me that the wiki is "the bomb" these days, still referred to as radical. It seems we get lost in the endgame and the end result (or what we want it to be) rather than stepping back, as Pat Whitney said at AIGA's "The Death of Advertising", to abstract the real problems and human needs, intents, and agendas.

Further, bad interfaces that we are forced to rely on alter our epistemology and our mental constructs, not to mention cause great inefficiency in our workflow.

The last point is a great one. Does the fact that it's happening and being openly discussed mean it's too late to stop it? Do we wish to stop it? Can we slow it down? No. Moore's Law applies to us as well as to machines.
"I've met people like this.

I usually find them very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very very...

BORING."

ME: LMAO!!!! I wonder who wrote that... Anyway, Kurt Vonnegut was once asked if he spends most of his time with his writer buddies and communities. His reply was short and sweet: no. When asked why, he said something to the effect that it would be extremely boring, and that he would gain little of the insight and awareness he relies on when he crafts stories for people who are not writers (like most of the world). When I attended graduate school, I always counted myself fortunate that my life outside of campus was not spent with other "human-centered designers". My mom always said, "no one is more right or knows more than a graduate student". Not only can they be boring, but they can also be offensively ignorant of the world outside their own specialized realms.
"literacy was once a threat to humanity because of the way it "represented" the vagaries of human life. (I am reminded of the belief in some cultures that photographing the human form is kind of theft of the soul.) I am sure you are right that we will eventually find humanity in data, as we have in the written word.

However, it is not honest or responsible to confidently assert, for example, that early critics of the written word were simply wrong. History does not show that. History shows, rather, that the written word made its wielders more powerful. Don't forget: the written word has often been used to oppress. Think of Martin Luther and early Protestantism: it was largely a response to the way the Church had used literacy as a tool of oppression. Our idea that literacy liberates is basically a function of the fact that it equalizes the weak with their oppressors, not that it is "inherently" liberating.

Self-tracking will undoubtedly be used to oppress. It will wend its way into mainstream culture, eventually becoming something that employers expect of you as a matter of course. The temporal "productivity gaps" which we use to daydream, think about politics or other non-work-related ideas, or simply consolidate memories, will be targeted and eliminated. Also, it is almost inconceivable that self-tracking data will avoid eventually going public.

Only by grasping the subtle seriousness of this issue will we give ourselves a chance at actualizing a future that does not involve blanketing ourselves in highly granular control mechanisms.

It's probably inevitable but that doesn't make it good. Look at it this way: we will never know what the world would be like today if writing hadn't been invented, and conversely, there are an indefinite number of technologies that weren't invented hundreds of years ago, and we will never know what the world would be like today if they had been invented."

ME: Yeah, people's initial reaction to change, usually when it is inevitable and will disrupt current behavior, is to shoot it down. We in the digital innovation group experience this daily, especially when we're right on in our response to a problem or in our thinking about something. I know we've done a great job when the reaction to our work is "WTF!?" Even if it's wrong, the presentation serves as a "probe" to gain insight into what people think would be "right".

Further, what was missing from the comments and the article itself was any mention of how much of the input AND analysis of the "data" about us will be automated, so it won't require a "second life" of "reflection" to make sense of, make use of, or find value in the "personal data stream". They also missed the point about personal control and our tendency not to use things we can't control, especially when it has to do with our ability to deny or ignore various aspects of our inner lives.
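
To make that point concrete, here is a minimal sketch in Python of what automated input plus automated analysis of a personal data stream could look like. The "Reading" type, the field names, and the step counts are entirely made up for illustration; no real tracking device or API is assumed. The summary sentence arrives without the person keeping a "second life" journal of reflection:

from dataclasses import dataclass
from datetime import datetime
from statistics import mean

# Hypothetical record of one automatically captured reading, e.g. from
# a pedometer or sleep sensor; in practice this would arrive from a
# device rather than being typed in by the person being tracked.
@dataclass
class Reading:
    timestamp: datetime
    metric: str    # e.g. "steps" or "sleep_hours"
    value: float

def summarize(stream, metric):
    # Automated analysis: reduce the raw stream to one readable
    # sentence, so no manual "reflection" step is required.
    values = [r.value for r in stream if r.metric == metric]
    if not values:
        return "No %s data collected yet." % metric
    return "%s: %d automatic readings, average %.1f, latest %.1f." % (
        metric, len(values), mean(values), values[-1])

# A simulated week of automatically captured step counts.
stream = [Reading(datetime(2010, 5, day), "steps", steps)
          for day, steps in zip(range(1, 8),
                                [6200, 7100, 5400, 8800, 9100, 4300, 7600])]
print(summarize(stream, "steps"))
# Prints: steps: 7 automatic readings, average 6928.6, latest 7600.0.

The specifics here are placeholders; the point is simply that both the capture and the sense-making can run without the user's ongoing attention.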
