Disclaimer: I have not yet used an iPad. The thoughts expressed here are conjecture based solely on what was shown at the media event.
Last week (January 27, 2010), months of rumours and hype finally came to a close when Apple unveiled the iPad, its much anticipated tablet form factor device. Such a device had been rumoured for years at varying stages of development. Part of the hype stems from Apple’s failed Newton and from the marketing momentum Apple has built over the past decade with the iPod and iPhone. Among the most hyped rumoured features were an interactive tactile interface, an OLED screen, and of course a full version of Mac OS X. None of these features is shipping in the iPad, which effectively makes it a larger iPod Touch.
Most readers will now say, “Okay… we know that.” This is also the point at which anti-Apple activists and many technically savvy people will stand up and slam the iPad. It’s entirely true that there are no real use cases for the iPad for technically savvy people who are already content with the hardware they use. There are, however, two reasons to embrace the iPad even if you dislike Apple or don’t see a use case for yourself. The first is that there are people who fit the target market well: those in need of a basic internet appliance. The iPad is a perfect computer for someone who needs basic internet access, email, and video content. In my view, the people who best fit this demographic are parents, grandparents, or anyone who simply wants a computer to use while watching TV. The second reason to embrace the iPad is that in the future it has the potential to be a paradigm-shifting device.
The iPad currently “fails” in the view of many people today, including my own, because the device is a relative of the smartphone. The first of these inherited flaws is the lack of multitasking. Multitasking being absent on a phone is understandable, as a phone must be a phone before it is an internet device; but the iPad doesn’t function as a phone (unless Apple is holding out on some form of calling), so multitasking should be there. Another drawback of the iPad is the lack of Flash. Flash is a double-edged sword: it is widely used on the internet, yet so poorly implemented that it runs poorly on Windows and even worse on Mac OS and Linux. Finally, there is the argument for buying a netbook instead of an iPad. The most obvious point in that argument is that a netbook has a physical keyboard while the iPad relies exclusively on touch; typing on a virtual keyboard laid flat on a horizontal surface would be awkward.
In its current state the iPad is certainly nothing more than a larger iPod Touch. It could become so much more in the future if certain obstacles, discussed later, can be overcome. Imagine your friend just got back from a trip to China and wants to show you some pictures. You could bring your iPad to his house and, simply by coming into range of his network, have access to all of his shared pictures at your fingertips. Another future use case, one that might already be possible through an application, is video on demand from either the iTunes Store or your personal media library: sit down on the couch, browse through your collection, and press play. Finally, perhaps the most futuristic use of the iPad would be browsing the internet by voice. One would simply open Safari and say what one wants to find without typing, further emphasising the slate form factor. Voice recognition is still a young and advancing field; few decent recognition algorithms have been created so far.
Technological progress is currently at a crossroads, and it’s not only Apple that has trouble moving its product space forward. It’s generally accepted that major corporations are leery of attempting a radically new product. The result is that legacy hardware and software remain dominant. Microsoft’s example of this is maintaining the same codebase for the last 20 years; just last month a 17-year-old Windows NT bug came to the surface. We often see 30-year-old Unix bugs crop up in Mac OS X, since OS X is Unix-based. The same can be seen on the web. The web was originally intended for computer-to-computer communication between academic and military installations; there was no mention of a web browser as we know it, let alone of seeing http:// before every web address. The general point is that it’s hard to leave legacy hardware and software behind, a constant pain for developers who want to move forward without abandoning their users.
We are increasingly standing at a crossroads between old and new technology. New technology such as the iPad is struggling to define its place in the world while competing with the ancient keyboard and mouse. The argument against moving forward is that most current applications would have to be rewritten to take advantage of new input devices, and we humans would have to be retrained. Of course there’s also the standard argument that the current system works fine and there’s no reason to change. We’re likewise seeing a confusing split between operating systems for different classes of devices, particularly when comparing iPhone OS to Mac OS or Windows Mobile to Windows 7. The final and perhaps hardest barrier to break is that new technology is expensive, both to purchase and to develop. The divide can only be bridged when technology becomes affordable for everyone. New technology will always be expensive at first; the question is how we can move forward.
To move technology forward as a whole, first and foremost it must be accessible. New technology should employ new methods while maintaining a small learning curve: if users cannot grasp a new device quickly (within the first few minutes), they will be turned off from using it. The next step toward a forward-looking path is cooperation. Cooperation can be hard in the corporate space, but cooperation in establishing open standards would go a long way toward moving technology forward. A counter-example would be a company (Apple, say) developing a new hypothetical proprietary interface for storage access. Finally, it may be that companies just need to step up and take a risk, to be old-fashioned entrepreneurs. It’s risky, but it can certainly pay off, as we saw with the iPod. The iPod was originally built for use with Mac OS only and used the FireWire interface, obscure at least to the PC crowd. This is one of many cases where entrepreneurship brought new technology to the mass market.
What does the iPad mean for computing? In the short term, not much. In the future, when other companies come out with slate-style computers, they will add new features and Apple will be forced to innovate; it’s this cycle that will ultimately drive the evolution of the form factor. The long term is more shrouded. My personal belief is that the notebook/desktop PC as we know it will slowly fade away as immersive entertainment becomes mainstream. The general trend could well be toward appliance-style computing: a computer that does one task and does it well. Before I end I want to share three potential situations that are entirely possible for the future:
If you have any questions or suggestions for future articles, please send me a PM on the forums.