Security Holes in the Mobile World
That Smart Phone application may be robbing you blind
By: Dean Coclin
Jan. 15, 2010 11:30 AM
The speed and connectivity of today's smart phone are unparalleled. In some ways it outpaces the desktop Internet itself, because your phone is always talking to the network, always available to send and receive information without your having to dial a number or open a browser.
However, this phenomenal capability also has a dark side. Convenience has a price. The same smart phone that is always connected and serving you content can also serve you malicious content, or steal from you on someone else's behalf. People are buying and downloading applications to their smart phones by the millions, and any of these applications can contain code that gives a hacker entry to your phone, with access to your address book, identity information and more, letting them damage your device and your data. Much like a malicious toolbar on the Internet, an ordinary-looking application can contain code that damages your phone and your wallet.
But it is not just your wallet. How about your company’s?
As more commercial applications develop a mobile interface, they open a new door to the corporate network for hackers.
Enterprise users are now using their mobile devices to perform the same functions that they previously performed on their desktop PC. Only now, these tasks can be done on a much smaller device from virtually anywhere at any time. One of the hidden dangers to which people aren't paying much attention is rogue code infecting mobile phones. That's unfortunate because although no major incidents have been reported yet, it's only a matter of time before some serious event occurs.
Depending on the type of application, a piece of malware could cause a phone to dial foreign numbers, send exorbitant volumes of text messages, copy keystrokes (a key logger) when owners log into their financial institution, or cause some other disturbance for the end user. It might flood the network with meaningless messages or render the device inoperable, driving up the carrier's help desk costs or getting your phone refused service on the cell network. The same criminals spoofing websites to gain access to your personal information have figured out that access to enterprise information is far more rewarding. And while major hacks into corporate sites seem like monthly news, mobile device hacks are lurking in the wings.
This is possible because smart phones today can browse the Internet and download code from many different places. In fact, many carriers offer "download sites" for their customers to use as a one-stop shop. In addition, vendors such as Handango provide applications for many different operating systems. Scammers can also advertise rogue code and point browsers to their website, tricking users into downloading an application that is not legitimate. Consider a phishing attack, for example, where an unsuspecting user receives an email with a link to "update" his bank account info. He is then directed to a rogue website where code is either silently downloaded, or he is offered a link to download a game, widget or some other application that looks legitimate but is really malware.
The fact is that mobile phones are here to stay and have become woven into the fabric of corporate information processing. Where once mobile devices existed simply as phones, they are now very intelligent data devices, getting smarter and more robust every day. This is a classic case of balancing convenience against absolute security. Security professionals need to consider what steps and policies they can adopt to ensure that the applications being downloaded by employees are safe and do not wind up causing a material information breach.
How Vulnerable Are Smartphones?
Is there an answer? The answer today is the digital signature that accompanies the application, whereby the developer digitally "signs" the application and a third party that issues the digital certificate vouches for the identity of the individual. This is much like a driver's license, where you can see an individual's photo and the fact that the license was issued by the state, which acts as the trusted third party. In this way, signed applications and content can be downloaded and we know who signed it and that it has not been tampered with.
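To illustrate the tamper-evidence idea, here is a toy Python sketch. Real platforms use public-key signatures backed by CA-issued certificates; this sketch substitutes a stdlib HMAC with a hypothetical signing key, which is enough to show why a modified binary fails verification:

```python
import hmac
import hashlib

# Toy stand-in for code signing: real mobile platforms use public-key
# signatures (e.g. RSA/ECDSA) tied to a CA-issued certificate.
SIGNING_KEY = b"issued-after-identity-vetting"  # hypothetical key

def sign_app(app_bytes: bytes) -> bytes:
    """Return a signature binding the application binary to the signer."""
    return hmac.new(SIGNING_KEY, app_bytes, hashlib.sha256).digest()

def verify_app(app_bytes: bytes, signature: bytes) -> bool:
    """Reject the app if it was modified after signing."""
    expected = hmac.new(SIGNING_KEY, app_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

original = b"legitimate game binary"
sig = sign_app(original)
print(verify_app(original, sig))             # True: untampered
print(verify_app(b"binary + malware", sig))  # False: tampered
```

Any change to the binary after signing, however small, produces a different digest, so the signature check fails before the code ever runs.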
One example of this in the mobile device world is Symbian, the world's most popular mobile operating system, accounting for 50% of smart phone sales. To create applications for Symbian's operating system, authors are required to fax identity documents (passport, driver's license, etc.) to confirm they are who they say they are. They must also include information about their business and pay with a credit card. This process is called vetting, and it is what the trusted third party does to confirm identity.
Interestingly, other mobile operating systems aren’t quite so thorough. In fact, some only require that authors pay a certificate fee with a credit card, which could, of course, be stolen. There is no vetting or trusted third party. Little can be done to identify the perpetrator in such cases.
Beyond this, some operating system vendors, Symbian among them, require that code be tested by a third-party test house before it is signed by a recognized commercial certificate authority. The test house runs the code through a battery of tests before putting its seal of approval on it, then passes it back to the certificate authority to sign before it is returned to the developer.
What are the others doing?
While Symbian has a robust process, technology and rigorous testing programs in place to prevent malicious code from being distributed globally and almost instantaneously, the approaches other large mobile operating system providers take vary greatly.
How Can We Better Protect Smart Phones?
Step 1: Make Sure Code Is Signed By Trusted Individuals
The first step in protecting mobile devices is to ensure that digital certificates are used to authenticate downloaded code. A digital certificate is an ID that contains information about the person, machine or program to which the certificate was issued. Certificates provide assurance that what you are about to use comes from a reliable source. In short, a certificate enables digital trust.
If you are a developer, certificates enable you to sign your work so that users can verify that the program and version they received is the code you wrote (i.e., it has not been tampered with). Mobile phone code developers use certificates today to ensure programs are valid before being downloaded to literally millions of devices globally.
The good news is that certificates are inexpensive and, in fact, most mobile device suppliers require that all code be signed before it is used. Certificates serve as a deterrent to malicious behavior, since we know both who signed the code and when they signed it. And since authors of malware don’t want this information to be known, protection is enhanced.
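The "require all code to be signed" policy can be sketched as an installer check: only packages carrying a valid signature from a known, vetted signer are allowed to install. The signer names and keys below are hypothetical, and a stdlib HMAC again stands in for certificate-backed signatures:

```python
import hmac
import hashlib

# Hypothetical trust store: in practice these would be CA-issued
# certificates for vetted developers, not raw keys.
TRUSTED_SIGNERS = {
    "AcmeSoft": b"acme-signing-key",
}

def is_installable(package: bytes, signer: str, signature: bytes) -> bool:
    """Installer policy: block unsigned code and unknown signers."""
    key = TRUSTED_SIGNERS.get(signer)
    if key is None:
        return False  # unknown or unvetted signer: refuse the install
    expected = hmac.new(key, package, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

pkg = b"expense-report-widget"
good_sig = hmac.new(b"acme-signing-key", pkg, hashlib.sha256).digest()
print(is_installable(pkg, "AcmeSoft", good_sig))  # True: vetted signer
print(is_installable(pkg, "Unknown", good_sig))   # False: not vetted
```

The deterrent effect described above comes from the trust store: a valid signature identifies who signed the code and when, which is exactly the information a malware author wants to hide.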
Step 2: Vetting
As noted, if a company allows workers to download “unsigned” programs from sites, rogue code could infect the device and then possibly the entire network. Digital signatures are a necessary component of the security solution, but aren’t enough. For example, how do you know that authors of code are who they say they are? In fact, the process of verifying the identity of authors varies widely.
Typically, certificates are issued to developers after an identity check. More thorough organizations use recognized commercial certificate authorities that follow standards from the OMTP (Open Mobile Terminal Platform, a mobile network operator forum) for identity validation, conducting email address, valid credit card and identity document (passport or driver's license) checks. These organizations may even translate foreign documents.
Step 3: More Vetting
Properly done, vetting ties all the disparate loose ends together to eliminate, or make extremely unlikely, any mischief. But there's one more step that is often missing. Some OS vendors hand developers certificates that sign code directly. In theory, that's fine. As long as the developer uses and stores the certificate properly, security directors can sleep at night. But what if that certificate is given to another developer? Or stolen? Or misplaced? Then the entire security process has been compromised. The proper way to ensure security is to maintain the signing key in a portal, so that developers must upload their code each and every time they create new software. In that way, the portal ensures the security of the signing key and the integrity of the code. Only the portal can sign the code with a key that will allow it to run on the phone. And since criminals don't like to be identified, this greatly reduces the risk of rogue code.
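The portal model can be sketched as follows. This is a hypothetical illustration, not any vendor's actual API: the portal holds the platform signing key, logs who uploaded what, and signs on the developer's behalf, so the key never leaves the portal even if a developer's machine is compromised:

```python
import hmac
import hashlib

class SigningPortal:
    """Hypothetical signing portal: the platform key never leaves it.

    Developers upload code; only portal-signed code runs on the phone.
    """

    def __init__(self) -> None:
        self._platform_key = b"held-only-by-portal"  # never distributed

    def sign_upload(self, developer_id: str, code: bytes) -> bytes:
        # The portal can record who uploaded what before signing,
        # preserving accountability even if a developer account leaks.
        print(f"signing upload from {developer_id}")
        return hmac.new(self._platform_key, code, hashlib.sha256).digest()

    def phone_accepts(self, code: bytes, signature: bytes) -> bool:
        """The device only runs code carrying the portal's signature."""
        expected = hmac.new(self._platform_key, code, hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

portal = SigningPortal()
sig = portal.sign_upload("dev-42", b"calendar app v1.0")
print(portal.phone_accepts(b"calendar app v1.0", sig))  # True
print(portal.phone_accepts(b"tampered build", sig))     # False
```

Because the key exists only inside the portal, a stolen developer certificate cannot by itself produce code that the phone will accept.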
Another advantage of this approach is that bad applications can be rescinded by revoking the certificate for that application. Because each application has a unique certificate, the revocation of the certificate for one application has no effect on the other applications. If a single certificate, such as the developer certificate, is used for multiple applications, this granular revocation capability is lost.
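Granular revocation reduces, in essence, to keeping one certificate record per application and checking a revocation set at run time. The certificate IDs and app names below are hypothetical:

```python
# Sketch of per-application revocation: revoking one bad app's
# certificate leaves the developer's other apps untouched.
issued = {
    "app-001": {"app": "Solitaire",  "developer": "AcmeSoft"},
    "app-002": {"app": "FlashLight", "developer": "AcmeSoft"},
}
revoked: set[str] = set()

def revoke(cert_id: str) -> None:
    """Rescind a bad application by revoking only its certificate."""
    revoked.add(cert_id)

def may_run(cert_id: str) -> bool:
    """An app runs only if its certificate is issued and not revoked."""
    return cert_id in issued and cert_id not in revoked

revoke("app-002")          # FlashLight turned out to be malware
print(may_run("app-001"))  # True: Solitaire is unaffected
print(may_run("app-002"))  # False: revoked
```

Had both apps shared a single developer certificate, revoking it would have disabled Solitaire along with the malware, which is exactly the loss of granularity the article warns about.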
Enterprises, too, can take a role in ensuring authenticity. For example, some OS providers do not require applications to be signed, but provide tools for enterprises to manage devices on their network. An enterprise could implement a policy that all code be signed before executing on the device.
Smart phones are not going away and won’t get dumber. By following these few simple and inexpensive steps – using certificates and proper vetting – consumer and business mobile users can be assured of safe application experiences.