Security. We all want it — with our jobs, our bank accounts, in our relationships and even for our identities. Yet much of the population — business owners included — spends far too little time thinking about or understanding the security of the very devices on which we have come to depend.
Computers can be found in phones, cars, homes and appliances, and we trust them with our money, our passwords, our most sensitive personal data and our deepest, darkest secrets. But how much attention are we really paying to the safety of all of this information?
As the debate between Apple and the US Government rages on, let’s look at exactly how this feud could affect not just our iPhones and iPads, but our data security across all of the software applications we use.
In case you’ve been hibernating in a cave and missed out on last week’s big tech story, let’s recap: Tech giant Apple is refusing to comply with a court order to build a backdoor into its iOS software, which would assist the FBI in accessing encrypted data on an iPhone linked to one of the perpetrators of the San Bernardino shooting.
At first glance, the issue at hand may seem very specific (massive tech company rejects court mandate to access data on a single device) — but the outcome of this debate could have major implications across the tech world, from the programs we use on our devices to how our data is protected. On the broadest scale, the question becomes how secure your data is in the hands of the software and hardware companies in which you entrust it.
Apple’s position: From Apple’s standpoint, building new software with an intentional security weakness puts all Apple devices at increased risk (the company argues that it’s impossible to develop a backdoor for one specific device, as all of its devices run on variations of the same system). In the wrong hands, this piece of software could enable hackers, terrorists and others to access virtually any iOS system and decode sensitive, encrypted data — whether it belongs to tech moguls, government officials, public figures or companies themselves.
The FBI’s case: Encryption technology makes it incredibly difficult to access coded information on devices tied to criminal acts and investigations. Law enforcement needs a way to bypass these security measures in order to do its job. If tech companies aren’t willing to help voluntarily, their hand should be forced for the greater good of society.
Apple’s case against the FBI will not only set a precedent for how tech companies are able to react to government- or court-mandated requests, but also affect the future security of all of our computer-driven devices.
Here’s a breakdown of how the outcome would impact you and your computer, smartphone and tablet:
Your Personal Data — If built, this security backdoor (or master key, as Apple CEO Tim Cook has described it) would allow direct and immediate access to any information on your devices, encrypted or not.
Your Privacy — A ruling in favor of the FBI would set a precedent for the government to compel access to device data. Any time law enforcement believed it required special access to the contents of a device, it could demand that access without the usual case-by-case legal justification — eroding your privacy rights as a citizen or resident of this country.
Your Safety — On a broader scale, this debate isn’t just about giving law enforcement access to your iCloud photos and memos. It’s about developing code that enables the person or people in possession of said code to sidestep one of the strongest advancements to date in digital security. Code that can then be used to hack into any of your Apple accounts, software or devices — or that can be manipulated to hack encryption systems developed by other tech companies.
Software is a fluid and fast-moving industry, and because of that no universal regulations for data security currently exist. Sure, every company claims to have top-of-the-line data protection measures — and many of them do — but at the end of the day our data is only as secure as our government and the companies that build the applications we use allow it to be.
Encryption is one of the most powerful security technologies around: without the encryption key, it is computationally infeasible to read the scrambled data. And encryption is currently used by many companies in charge of sensitive client data, from financial businesses to healthcare providers to social platforms. (Check out this Time article for a really great explanation of encryption.)
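To see why the key matters so much, here is a minimal sketch of symmetric encryption in Python. This is an educational toy only (a hash-derived XOR keystream, not a vetted cipher like the AES used in real products such as iOS), but it illustrates the core property: the right key recovers the message, while even a slightly wrong key yields gibberish.

```python
# Toy illustration of symmetric encryption. Educational sketch only --
# real systems use vetted ciphers (e.g. AES), not this XOR construction.
import hashlib

def keystream(key: bytes):
    """Derive an endless stream of pseudo-random bytes from the key."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data with the key-derived stream; applying it twice decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

secret = b"meet at the usual place"
ciphertext = xor_cipher(b"right-key", secret)

# The holder of the key recovers the plaintext...
assert xor_cipher(b"right-key", ciphertext) == secret
# ...while a one-character-off key produces unreadable bytes.
assert xor_cipher(b"wrong-key", ciphertext) != secret
```

A government-mandated backdoor amounts to a second key that also unlocks the ciphertext — and, as the debate above highlights, anyone who steals that key inherits the same power.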
Yet adding an asterisk to encryption — a guarantee that certain parties have backdoor access to encrypted data, at any time and without case-by-case legal justification — weakens the entire encryption infrastructure. It also raises the issue of whether, as Julian Sanchez described it in Time, “technology companies can be conscripted to undermine global trust in our computing devices.”
This debate isn’t just about the safety of our devices, our software or even our digital information — it’s about whether we want to cripple the future of data security and erode any sense of confidence we as consumers have in our technology.
Share your thoughts on data security and the Apple-FBI debate by tweeting us @BiznessSoftware! And be sure to stay up to date on all of our exclusive News & Trends posts by visiting the Business-Software.com blog.
[Photo courtesy of Pexels.]