By: Bill Ray
Jan. 1, 2000 12:00 AM
When Pandora was given gifts from the gods, she had many wonderful things, but she also had a box that she was told never to open. The box contained all the bad things in the world, and as long as it stayed closed, the world was a wonderful place full of joy and happiness. But Pandora's curiosity got the better of her, and she couldn't resist opening it just a little to see what was inside. However, the box couldn't be opened "just a little," and all the wrongs in the world flooded out, creating the world we see around us. Just as she snapped the box closed again, one more thing pushed its way out; it was "hope," which the gods had placed to help man cope with all the bad things.
Perhaps if Pandora had been less keen to see what would happen, we'd still be living in that world of perfect serenity, but without hope, the world would be a much less interesting place.
I mention the myth of Pandora as I think it has parallels with the current arguments about mobile telephones. Should we open the box and allow anyone to run applications on our mobile phones? Is it really possible to open the box "just a little," or would we be better off with a closed box and a world we can rely on?
This article seeks to present the arguments on both sides, showing how both approaches have their advantages, and even exploring the idea of opening the box "just a little." Just like Pandora, once we open the box, it will be very hard to get it closed again, so we need to be certain we're doing the right thing.
What Is This Box Anyway?
Our desktop computers run a huge range of applications. Some of these we developed ourselves, and are pleased to be able to use and share, while others came with hardware we've fitted. Most of the time they all seem to work together nicely. But the unreliability attributed to desktop computers (particularly those running Windows) can generally be traced to incompatible software, and the freedom we enjoy to develop and run applications is certainly responsible for the viruses and Trojans we now have to be constantly on guard against.
Various plans for stabilizing the desktop have been suggested, including the Trusted Computing Platform Alliance (TCPA) and Microsoft's Palladium. These are based on controlling which applications the user can execute. The idea is that only applications that have been digitally signed by some higher authority can run; without approval, your programs simply won't start.
The approval process will cost money, so no more hacked-together utilities, but also no more viruses or Trojans. Something like this can be seen in many office systems, where a locked-down desktop for the users results in greatly reduced support costs: stable desktop computing at the cost of the freedom to run what you want.
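At bottom, this approval mechanism is ordinary public-key signature verification: the device ships with the authority's public key and refuses to launch any binary whose signature doesn't check out. Here is a minimal sketch using the standard `java.security` API; the class and method names are my own illustration, not any real handset interface.

```java
import java.security.*;

// Illustrative sketch: a device checks an application's digital signature
// against the approving authority's public key before allowing it to run.
public class SignatureGate {

    // Returns true only if 'signature' is a valid signature over
    // 'appBinary' made with the private key matching 'authorityKey'.
    public static boolean isApproved(byte[] appBinary, byte[] signature,
                                     PublicKey authorityKey)
            throws GeneralSecurityException {
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(authorityKey);
        verifier.update(appBinary);
        return verifier.verify(signature);
    }

    public static void main(String[] args) throws Exception {
        // Stand in for the approving authority: generate a key pair
        // and sign a pretend application binary.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair authority = gen.generateKeyPair();

        byte[] app = "pretend game binary".getBytes();
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(authority.getPrivate());
        signer.update(app);
        byte[] sig = signer.sign();

        // The device side: the untampered binary passes...
        System.out.println(isApproved(app, sig, authority.getPublic()));
        // ...but flip one bit and the signature no longer verifies.
        app[0] ^= 1;
        System.out.println(isApproved(app, sig, authority.getPublic()));
    }
}
```

The point is that the check costs the platform almost nothing at install time, which is why building it into a phone is so much easier than retrofitting it onto the PC.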
The technical practicality of bolting such functionality onto the desktop PC platform is very much open to question, but getting the same functionality into a mobile phone is a lot easier. Indeed, most mobile phone platforms already have the capability, if we decide we want to use it.
The Case for a Closed Platform
Creating an application that crashes a mobile phone isn't difficult. Most of us spend our time trying to stop our applications from doing exactly that! I've written applications that, if used the wrong way, could require a mobile to be rebooted, and those applications have been made available to other users.
When someone is running one of my applications and their phone crashes, are they going to blame me? Or will they conclude that the phone itself is faulty? Of course, every application I've ever written pops up with perfect error messages explaining the problem and accepting full responsibility, but not every programmer is as fastidious as I am...
If we allow everyone and their brother to develop applications, then, inevitably, phones will become unstable. There's no point blaming the users for running "unsuitable" applications. Users neither understand nor want to understand the technical questions; they just see a cool game and download it. If their phone then crashes, it's not a good phone. Presenting users with warnings is next to useless, especially when download sites advise them to just click "OK." Users don't want to spend time learning the risks; they just want to use their phones.
The alternative, as Orange has done in the UK, is to completely control which applications the user is allowed to install and run. The SPV (Sound, Pictures, Video, apparently) uses Microsoft Smartphone, an interface layer on top of WinCE, and can run a very wide range of software, but only if it's approved by Orange. This is a decision that the network (Orange, in this case) has made, not Microsoft.
Orange doesn't want to deal with the technical support issues that arise when users, having installed the latest free game, call for help getting it working. Orange feels that stability is more important to its customers than flexibility. It will digitally sign applications it feels are of benefit to its customers, having checked them to make sure they don't contain anything nasty.
Viruses are one of the nightmares of running an Internet-connected PC. Most of us pay a regular fee for some sort of virus protection, and accept a cost in processing power and memory for constantly running a virus scanner. But if every application is digitally signed, then there should be no more viruses! Even worse are Trojans: programs hidden inside others, sometimes carried by a virus, which infect your system and then hang around gathering data such as passwords, banking details, and contacts before sending it off and deleting themselves.
Such attacks are increasingly common on desktop machines, with users often unaware that their security has been compromised until someone starts using the data. The potential for such a program on a mobile phone is terrifying. A Trojan on a phone could make calls on your bill, listen in to your communications, and even present you with prompts indistinguishable from genuine requests; every effort should be made to stop such programs becoming widespread. We can't rely on every programmer to be trustworthy, but we can rely on our network provider (who has a reputation to protect), and only with digitally signed applications can we be sure of what we're running.
The secured mobile phone also provides the perfect environment for Digital Rights Management (DRM). The next generation of mobiles will feature "lock-forward" functionality, where it will be possible to send a message to a phone that cannot be forwarded to other phones. This, combined with a secure platform, has enormous implications for the distribution of copyright content such as music and video. The ability to allow users to download content without fear of them making copies of it, or passing it on to their friends, is a very compelling proposition, and something the copyright owners are crying out for.
Taking this secure DRM further, with careful use of certificates it's possible to consider the mobile phone as a hardware key ring, holding licenses for all sorts of digital content from music to films to desktop computer applications. Using Bluetooth or something similar, a desktop computer could check for the appropriate key on a nearby phone before running.
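That "hardware key ring" idea boils down to a challenge-response handshake: the desktop sends a random challenge, the phone signs it with the license's private key, and the desktop verifies the response with the matching public key. The sketch below is purely hypothetical; the transport (Bluetooth or otherwise) is left out, and all the names are my own invention.

```java
import java.security.*;

// Hypothetical license handshake between a desktop application and a
// nearby phone acting as a hardware key ring. Everything here is
// illustrative; no real Bluetooth or DRM API is shown.
public class LicenseCheck {

    // Phone side: prove possession of the license key by signing
    // the desktop's one-time challenge.
    static byte[] answerChallenge(byte[] nonce, PrivateKey licenseKey)
            throws GeneralSecurityException {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initSign(licenseKey);
        s.update(nonce);
        return s.sign();
    }

    // Desktop side: accept the response only if it verifies against the
    // public key the content publisher issued with the license.
    static boolean licensePresent(byte[] nonce, byte[] response,
                                  PublicKey licensePublicKey)
            throws GeneralSecurityException {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initVerify(licensePublicKey);
        s.update(nonce);
        return s.verify(response);
    }

    public static void main(String[] args) throws Exception {
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair license = gen.generateKeyPair();

        // A fresh random challenge defeats simple replay of old responses.
        byte[] nonce = new byte[16];
        new SecureRandom().nextBytes(nonce);

        byte[] response = answerChallenge(nonce, license.getPrivate());
        System.out.println(licensePresent(nonce, response, license.getPublic()));
    }
}
```

Because the challenge is random each time, recording one successful exchange doesn't let an attacker fake the phone's presence later; the scheme's real weakness, as the article notes, is whether the phone's end can be kept secure at all.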
All the above is possible only as long as the operating system is secured. Attempts to secure content on desktop computers have always been broken, but if the hackers can't run their programs, they can't attack the security of the content.
The Case for Opening the Box
Without an open platform, application development will be limited to large companies with the resources to pay for licenses and testing. The days of the lone programmer changing the world are almost over, but applications like Napster still demonstrate that sometimes the bedroom programmer is better placed to make the real advances than the largest corporate R&D department.
By creating a community of developers, we can make enormous progress. A community limited to the employees of competing development companies will never achieve the same level of advance.
The Slightly Open Box
Programs created in a secure environment, like Java, are limited to doing only what is explicitly made available to them. A good example of this is Bluetooth: 100% Java programs still have no access to Bluetooth hardware, making it impossible for the home developer to deploy innovative Bluetooth applications on secured devices. Meanwhile, constant attempts to extend Java are threatening the standard itself. It's very debatable whether platforms limited in this way will ever provide the kind of fast-track innovation seen in the PC market.
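The sandbox model described above reduces to a whitelist: an application can reach only the capabilities the platform has chosen to expose, and everything else, Bluetooth included, simply isn't there. A toy illustration of the principle follows; the capability names are invented, not any real platform API.

```java
import java.util.Set;

// Toy model of a sandboxed platform: applications may invoke only the
// capabilities the platform explicitly exposes. Capability names are
// invented for illustration.
public class Sandbox {
    private static final Set<String> EXPOSED =
            Set.of("display", "network", "storage");

    public static void invoke(String capability) {
        if (!EXPOSED.contains(capability)) {
            // Nothing the application does can add entries to EXPOSED;
            // the platform vendor, not the developer, decides what exists.
            throw new SecurityException("capability not exposed: " + capability);
        }
        System.out.println("running " + capability);
    }

    public static void main(String[] args) {
        invoke("network");          // allowed: the platform exposes it
        try {
            invoke("bluetooth");    // rejected: never made available
        } catch (SecurityException e) {
            System.out.println("blocked: " + e.getMessage());
        }
    }
}
```

This is why the "slightly open box" frustrates innovators: the home developer can't argue with the whitelist, only wait for the platform vendor to extend it.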
What's in the Box?
As developers, most of us would prefer an open platform (depending on how much our livelihood relies on protecting copyright), but if we want users to accept an open platform, we'll have to make sure our applications don't expose the instability inherent in such a system.
Like Pandora, we might regret opening the box when the first GSM worm starts cutting off our phone calls.