The Electronic Frontier Foundation has drafted a Mobile User Privacy Bill of Rights that seeks to codify best practices for app developers.
The EFF's document starts from the premise that, given the sensitivity of the data many consumers store on their phones, manufacturers, carriers, app developers, and mobile ad networks need to respect user privacy in order to earn and retain the public trust. It builds on the EFF's existing Bill of Privacy Rights for Social Network Users and the recently released White House white paper "Consumer Data Privacy in a Networked World".
The document lists six rights that applications must respect:
Individual control: Users have a right to exercise control over what personal data applications collect about them and how they use it.
Focused data collection: App developers need to be especially careful about data unique to mobile devices - address book information, photo collections, location data, and the contents and metadata of phone calls and text messages. Applications should collect only the minimum amount of data required to provide the service and should attempt to keep personal information anonymous.
Transparency: Users need to know what data an app is accessing, how long the data is kept, and with whom it will be shared. Users should be able to access human-readable privacy and security policies, both before and after installation.
Respect for context: Applications that collect data should only use or share that data in a manner consistent with the context in which the information was provided. If contact data is collected for a "find friends" feature, for example, it should not be released to third parties or used to e-mail those contacts directly.
Security: Data should be encrypted wherever possible, and data moving between a phone and a server should always be encrypted at the transport layer.
Accountability: Ultimately, all actors in the mobile industry are responsible for the behavior of the hardware and software they create and deploy. Users have a right to demand accountability from them.
It then provides the following best practices:
Anonymizing and obfuscation: Wherever possible, information should be hashed, obfuscated, or otherwise anonymized. A "find friends" feature, for example, could match email addresses even if it only uploaded hashes of the address book.
Secure data transit: TLS connections should be the default for transferring any personally identifiable information, and must be the default for sensitive information.
Secure data storage: Developers should retain information only for the duration necessary to provide their service, and the information they store should be properly encrypted.
Internal security: Companies should provide security not just against external attackers, but against the threat of employees abusing their power to view sensitive information.
Penetration testing: Security systems should be independently tested and verified before attackers have a chance to compromise them.
Do Not Track: One way for users to effectively indicate their privacy preferences is through a Do Not Track (DNT) setting at the operating system (OS) level. Currently, DNT is mostly limited to web browsers, and only Mozilla's Boot2Gecko supports the Do Not Track flag at the OS level. But developers would benefit from a clear statement of privacy preferences, and should encourage other OS makers to add support.
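To make the anonymization practice above concrete, here is a minimal sketch, in Python, of the hash-matching approach the EFF describes for a "find friends" feature. The function name and sample data are hypothetical, not from the EFF document. Note that plain hashes of low-entropy identifiers like email addresses can still be reversed by brute force, so this technique mitigates rather than eliminates the risk of exposing the raw address book.

```python
import hashlib

def hash_contact(email: str) -> str:
    """Normalize an email address and return its SHA-256 hex digest,
    so the raw address never has to leave the device."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical client-side usage: upload only the hashes; the server
# matches them against hashes of registered users' addresses computed
# the same way, without ever seeing the plaintext address book.
address_book = ["Alice@example.com ", "bob@example.com"]
uploaded_hashes = [hash_contact(e) for e in address_book]
```

Because both sides apply the same normalization before hashing, matching still works even though the server only ever handles digests.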
EFF notes that some of these issues will need other parties such as mobile carriers to get on board, but this code of practice looks like a good place to start for app developers.