This post began its life in the Atomic web browser, in a tab holding Blogger's posting UI.
I was able to input the title (though I discovered that my breath acted as a command to hide the keypad) but was unable to begin writing these last two sentences due to some incompatibility between this supposedly more "real" browser and the "open source API schema". Thankfully I was able to switch to Evernote to write this post; I'll copy and paste it into the input box and format it using my laptop, which is sometimes a desktop. Some parts of this post may happen via SMS or cell phone. These smaller mobile devices feel sluggish as they catch up to the capabilities I tend to take for granted in my larger, clunkier machines. Five* years ago or so the iPhone came out. Before it, touch screens on mobile devices depended on stylus input, and larger-scale touch screens were tap-and-point affairs filled with puffy buttons (well suited to vending, service, and terminal applications).
This is one of the places where the iPad feels less like a "robust" machine and more like a toy version of what's to come. Though I like the thinking around multiple orientations and locking (something I wish the iPhone had), I tend to prefer landscape mode for the extra space it gives: more room for more stuff, or breathing room for focus (I tend to use the device horizontally, in bed or on the toilet).
I still wish I could multitask as fluidly as on a laptop or desktop; instead I feel trapped within a shuffle of transitions that seem and feel redundant while I wait for feeds or programs to load (sometimes not the fault of the device). My state, however, is saved: if I spaz and accidentally hit the recessed hardware home button, closing Evernote without hitting save, nothing is lost. But novel things that initially seem "cool" in an interface can quickly become repetitive nuisances that hinder or break the flow of using a tool or application.
I can't deny that it serves as a great photo frame, music player, and portable note taker, as well as a sharing device in a show-you kind of way. I sense slide shows coming back as it gets easier to wirelessly transfer images instantaneously to several places at once, like Flickr, where I can preview and witness a shoot unfold.
Physically, my breath seems to say "close keyboard" in certain positions while typing. Again I think of the next manifestations of keyboard input, like a simulated 3D tactile response that inflates the keys, so I don't have to scrunch or develop bad typing and spelling habits. It's much like a conversation on a cell phone: you're shown a possibility of how what you said could be interpreted, and sometimes you have to repeat yourself several times before the other person understands, sometimes through a crash or disconnection, other times through distortion of your intended input as represented by the device, be it a voice channel or a text input channel.
When I switched to Safari, the web browser native to the iPad's OS, I encountered the same input problems and again switched back to Evernote. At this point it seems fair to outline the pros and cons experienced thus far in my use of the iPad.
Screen brightness and size, compared to the other "mobile" or "micro" devices I use and own (a "netbook" loaded with both Windows XP and Ubuntu Linux, an iPhone 3GS, a 13" MacBook Pro with a custom 7200 RPM hard drive and maxed-out RAM, among other gadgets), are impressive, as is the resolution. What I can admit is that computers and their components are in fact shrinking and becoming more mobile in their use. In my early days of design and computing, a desktop was a necessity if one wanted to produce audio, video, or high-resolution graphics. Moore's law has arrived both faster and slower (there are myths on both sides) than many of us professional insiders will admit. The iPad isn't even a year old. All of these "game changing" devices are in their infancy.
Hardware mapping to function: Apple seems to have institutionalized the "home" metaphor by giving it a hardware key. It's like the early applications of the Esc key as the universal panic button: if I'm disoriented or want to switch to another application, I hit the home key. This landing-and-routing scheme supports single-tasking by requiring a user to pass through the gate of home before moving on to a sub-level within the architecture.

The screen orientation lock button as hardware, and the orientation scheme in general, are disorienting. There is a conflict between the lock toggle and the volume controls; despite owning the device and using it daily for several months, I still need trial and error to discern up from down.

Then there is the lock button. While I understand its necessity on the iPhone (it reduces butt dialing), I fail to see the value here, especially once iPad cases enter the mix. A case seems essential to iPad ownership, if only for the protection of a relatively frivolous and expensive gadget within the ecosystem of devices I use daily. In my experience the case also facilitates use, providing an inclined surface for typing on the keyboard or a stand for when the iPad is in what I refer to (among other things) as "passive viewing mode". What the lock breaks is the principle of on/off expectation: there is a software mapping for unlocking, yet locking itself is initiated via hardware, and there is no software-based lock equivalent. The same goes for the screen lock, and for volume. Why make these functions hardware-based when everything else on the device seems to be software-based?
Keyboard: here's where I get overly frustrated. No matter the position I sit in, no matter how hard I concentrate, no matter how much I practice, my error rate on a touch-screen keyboard is astoundingly high (and inefficient). For a while the flashiness of the UI salved my disdain, and at first I welcomed auto-correct. What I don't get is that Apple took something that is a universally understood design vernacular and "innovated" it in ways that rely more on accepting a learning curve and the limitations of the interaction than on using the input mode to foster more efficient input into the system, like switching "states" between symbolic and numeric input (see screen shot), or hiding and showing the keyboard (again, a discovery initiated by a learning curve). Lastly, I haven't figured out how "shift" works...
Oh! That's what the symbolic/numeric toggle button on the keyboard is for. It makes me wonder if Apple is trying to change the game not only with platforms and gadgets but with how we cognitively map our physical world onto a virtual one. I assume they own the rights or a patent on this QWERTY keyboard, as well as on the auto-suggest feature I have a love/hate relationship with.
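The mode shuffle I'm complaining about can be sketched as a tiny state machine (a toy model; the key labels and transitions here are my approximation of the behavior I see, not Apple's actual implementation):

```python
# Toy model of the iPad keyboard's mode toggling as I experience it:
# alphabetic -> numeric -> symbol, with the symbol layer only reachable
# from the numeric layer. Labels and transitions are illustrative.

class KeyboardModel:
    TRANSITIONS = {
        ("alpha", ".?123"): "numeric",   # toggle from letters to numbers
        ("numeric", "ABC"): "alpha",     # back to letters
        ("numeric", "#+="): "symbol",    # symbols hide behind numeric mode
        ("symbol", "123"): "numeric",
        ("symbol", "ABC"): "alpha",
    }

    def __init__(self):
        self.mode = "alpha"

    def press(self, key):
        # Unknown (mode, key) pairs leave the mode unchanged.
        self.mode = self.TRANSITIONS.get((self.mode, key), self.mode)
        return self.mode

kb = KeyboardModel()
print(kb.press(".?123"))  # numeric
print(kb.press("#+="))    # symbol
print(kb.press("ABC"))    # alpha
```

The point of the sketch: reaching a symbol from the letter layer always costs two mode switches, which is exactly the kind of toggle tax a hardware keyboard never imposes.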
Though I can see the value of ownership in locking out (and locking in) competition and fostering advocacy and adoption, I can't forget Sony's strategy, among others in the industry deemed overly focused on proprietary nuances that made "open" systems closed to everyone not subscribing to a brand. I can't help but think this is a very carefully planned and executed strategy on Apple's part. Not only are they innovative in terms of platforms, systems, and hardware/software, but they lead the pack in design thinking and business strategy.
That said, how could a closed system be a long-term strategy when we are barreling towards more "open" systems? In the short term, Apple profits from locking out other players, pitching its humanness to the public and positioning itself as the underdog, the misunderstood creative spirit, counter to the business-machine lands of Microsoft and Sun. People who whole-heartedly drink the Jobs punch ignore the fact that none of Apple's work, market position, or focus on being different would be possible without competition. Yet, like most businesses trying to eke out market share, the goal seems to be complete control: monopoly. Take their relationship with AT&T over any other carrier. I've never been able to stomach why a device should dictate the service I use to make it a communications channel. One of the best ways I could see someone being "different" in this space is by providing customers with options and choices, not to mention ubiquitously open systems of syndication, access, consumption, and management (of metadata and content/messaging).
What I am trying to say is that Apple isn't as "user friendly" once the surface is peeled back and the motives of the corporation become painfully obvious. Further, I would say their lock-in, and their forcing of users to adapt to shortcomings in thinking or user testing before release, actually stifles innovation and human evolution. But I represent only .00000000003% of the people who consume these products, owing to my education, interests, history of use, and background in HCI, human-centered design, and product interface design. In other words, I have the vernacular to articulate where, when, and how interfaces fail, while the other 99.99999999997% of the population has no clue and lives in a world where technology and gadgets take up far less time and space than they do in mine.
What Apple seems to do very well, time and time again, is be first to market with technologies that other companies fail to realize at the same pace or with the same prowess in delivery and value proposition. Perhaps that is where Apple truly leads: they are organized in such a way that they can produce, in a timely and efficient manner, products and services that appeal to the average "Jane".
Much of what I have written so far is about expectations, both personal and those set by the brand, the device, and the baggage I carry from previous experiences. Yes, I am hard on design and user interfaces. That's because I see the risks involved with what I refer to as a "captive audience" using a "GUI": the periphery disappears and focus on a boxed-in context is intense. At that point the device has undivided attention, and thus control over both physical and cognitive processes. It would not be impossible to actively design user interfaces that alter some very foundational physical and cognitive processes within us all, including what we say and how we say it (think about today's truncations and abbreviations, and the countless reports on the western human's declining focus and depth, or the herd's adoption of shortcuts, workarounds, and system failures that actively destroy vital abilities). As Neal Stephenson and Jaron Lanier have said in many ways in many forums to date: BEWARE. Be very conscious when using new technologies, and note when you are forced to change behavior to adapt to an offering hidden behind messaging like "it's all about you", because it never is when products, services, and agendas are involved in the value proposition equation. At the end of the day Apple is a publicly traded company and therefore beholden to shareholder buy-in, like every other business out there.
Back to the iPad... These gripes and critiques aside, I do find much pleasure in using my iPad in several areas not initially intended. There has been much debate about the death of print, and I am one of those old people stuck in a generation of publishing, of citation of sources, and of the unalterable nature of the printed word. The app I seem to use the most is Kindle. That is ironic, because it integrates with the Amazon product platform and has facilitated much spending by me outside the Apple Store ecosystem. The conduit was my lists on an existing platform focused on, and somewhat good at, a certain kind of product that warrants much of what we deem valuable on the net today and going forward (ubiquitous access to information, experts, and social communities of use...).
I am very much into the tactile interaction of a "multi-touch" screen. Having designed touch-screen interfaces in my past, and hating the poke input model, I love seeing ideas from the early days of Flash (called Spark) in terms of responsive UI that engages users more subtly, less literally or metaphorically, and more "intuitively" through true interaction and communication loops. However, looking through the Human Interface Guidelines document, I realize that within Apple's closed development structure there is little room for variation from, or defiance of, the standard patterns without a great deal of expertise, effort, and an extreme amount of patience from a developer. With the rise of HTML5, I hope we'll see a mass exodus from the App Store and a flocking towards a more open web that truly captures the advantages of the many channels and devices we use every day.
Some promising applications, like AirDisplay and Mobile Mouse, have been slow to deliver. The lag with screen sharing is prohibitive; lag in response to input is death for an interface. Still, the use case offers hope: I'm waiting for "token devices" that fluidly share with one another, allowing me to unmoor or shed weight when needed while maintaining a home base, or several home bases.
* pieces of the iPhone "GUI" were developed years before the iPhone appeared.
The default keyboard.
From numeric mode, I go to symbol mode. If this is a multitouch device, why not leverage the existing functionality of a multitouch keyboard like I'm used to on a "real" computer?
While in numeric mode, Apple remaps my punctuation keys which is again disorienting and causes much in the way of toggle-based mistakes on input. Where is my standard shift key?