What You Should Know About Congress's Latest Attempt to Criminalize Encryption

 

A new draft bill in Congress would force tech companies to undermine or break their own security features and encryption any time law enforcement asks them to. Sound terrible? It is. Here’s what the bill says, and what you can do about it.

 

For those just catching up, Apple and the FBI had a big legal throwdown recently over an iPhone owned by Syed Rizwan Farook, the gunman in the San Bernardino mass shooting. The FBI demanded that Apple create a tool to get around the phone’s PIN lock. Apple argued this was an undue burden and would weaken the security of all iPhones. Eventually, the FBI backed down and found a third-party firm to unlock the iPhone, although there’s another phone in play right now, just across the country.

 

In response to the whole affair, however, Senators Dianne Feinstein (D-CA) and Richard Burr (R-NC) are currently working on a bill to make sure law enforcement can get what they need without having to beg. The Feinstein-Burr bill would, if passed, force tech companies to comply with court orders to turn over data, even if that data is encrypted or if the company can’t actually access it. A preliminary version of the so-called “Compliance With Court Orders Act of 2016” was released last Friday. This version isn’t necessarily final, but it’s already pretty terrible. Unless major changes are made, this bill is dangerous to anyone who values their security.

What This Bill Would Do

 

According to the draft released on Friday, any time a tech company is served with a court order for information, it must be capable of complying with that order, either by having access to the data itself or by helping the government find a way to get access to it. In other words, a company can’t say “That’s impossible” and call it a day. A tech company faced with such an order would have two options:

 

Turn over the information directly. If a company has data on its servers relevant to the court order, it would be required to hand that data over to law enforcement “in an intelligible format.” This means the company must be able to translate encrypted data into a readable format. Tech companies that offer encryption would therefore have to either hold the keys to decrypt the data themselves, making their customers’ data more vulnerable, or worse, only use encryption that the company itself could break, making the encryption effectively worthless. (The sketch after this list illustrates the difference.)

Help law enforcement get access to the information. If a company doesn’t have the data stored somewhere, it would have to provide “technical assistance as is necessary” to help the government get access to the data. In other words, tech companies would be forced to throw their weight into investigative forensics until the government decides the job is done. Notably, the bill places no limit on how much effort the government can demand from a company. There is, however, a provision stating that companies will be “compensated” for any costs incurred in providing technical assistance.
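To make that key-custody choice concrete, here is a minimal sketch in Python using the cryptography package. It is purely an illustration of the two models the bill pushes companies toward, not any real provider's design; the message, keys, and variable names are all invented.

# Illustrative sketch only -- not any real provider's architecture.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

secret_note = b"meet at 6pm"

# Model A: the provider generates and stores the key alongside the ciphertext.
# It can produce "intelligible" plaintext for a court order -- and so can
# anyone who compromises its key database.
provider_key = Fernet.generate_key()              # kept on the provider's servers
stored_blob = Fernet(provider_key).encrypt(secret_note)
print(Fernet(provider_key).decrypt(stored_blob))  # provider can decrypt: b'meet at 6pm'

# Model B: the key is generated on the customer's device and never uploaded.
# The provider stores only ciphertext and has nothing useful to turn over.
customer_key = Fernet.generate_key()              # lives only on the customer's device
uploaded_blob = Fernet(customer_key).encrypt(secret_note)
# Without customer_key, the provider cannot recover the plaintext from
# uploaded_blob short of breaking the cipher itself.

The bill’s “intelligible format” language effectively demands Model A, which is exactly the weakening described above.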

 

Using the San Bernardino case as an example, under this new law Apple would’ve been required to gain access to Farook’s iPhone, since it was the subject of a court order, regardless of how much Apple felt doing so could damage its business or its customers’ security. However, somewhat confusingly, the bill very deliberately doesn’t say how Apple must accomplish this. One section of the law reads as follows:

 

Nothing in this Act may be construed to authorize any government officer to require or prohibit any specific design or operating system to be adopted by any covered entity.

 

In other words, the FBI couldn’t come to Apple with a demand for a specific software feature to get around a phone’s encryption (which it did in the San Bernardino case). Instead, the bill simply mandates that Apple accomplish this somehow, and that Apple’s job isn’t done until the government decides it’s done.

 

The scope of the bill also extends to app stores. One section says that any company that “distributes licenses for products, services, applications, or software” must ensure that those products are capable of complying with the law. In other words, if Apple can’t ensure that an app developer is capable of handing over its customers’ data, Apple cannot legally allow that developer’s apps in the App Store. Once again, the bill doesn’t say how a company is supposed to make sure that every single app it distributes can comply with a court order. At best, it legally mandates a lengthy security audit of every single communications app in the store. At worst, it requires app store owners to dictate which security features developers can use. No matter how you interpret it, it’s a bad sign.

Everything Else Wrong With This Bill

 

In its current form, this bill is disastrous for tech companies and consumers. One of the biggest problems is that what it requires may not actually be possible in many situations. For example, WhatsApp recently enabled end-to-end encryption on all messages, and Apple’s iMessage has worked this way for years. Neither company can access the data sent through its service without physical access to an endpoint device, and any data obtained in transit is certainly not in a “readable format.” This bill would require that the company find a way to turn over that data in a form law enforcement can read and use, even though that is literally impossible. As policy analyst Julian Sanchez has put it, in some cases this bill is tantamount to asking a company to perform magic.

 

WhatsApp would have two choices. It could disable end-to-end encryption, which makes its product inherently less secure and upsets its customer base. Or it could build in a backdoor or maintain a database of its customers’ encryption keys, which undermines the platform’s security. It would be like requiring that the person who built your house keep a set of keys to your home and have a special door that only they can get into.
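To see why that trade-off is unavoidable, here is a toy end-to-end exchange using the PyNaCl library. This is a deliberately simplified stand-in, not WhatsApp’s actual implementation (which is built on the Signal protocol); the names and message are invented for illustration.

# Toy end-to-end encryption sketch -- not WhatsApp's real protocol.
# Requires: pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_private, bob_private.public_key).encrypt(b"see you at 6")

# The service in the middle only ever relays this opaque blob. Served with a
# court order, the blob is all it could hand over.
relayed = ciphertext

# Only Bob, holding his private key, can recover the message.
print(Box(bob_private, alice_private.public_key).decrypt(relayed))  # b'see you at 6'

Complying with a decryption order here would mean removing this scheme or adding a provider-held key on top of it, which is precisely the weakening the bill would force.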

 

Both choices make consumer security weaker, and open the door to other bad actors who may want to steal user data, messages, and anything else sent through the app. Under this law, strong security practices would be illegal. If a product or service is so secure that the company or the government can’t access or decrypt it, the company will have to weaken that security in order to comply with the law.

 

Another major problem is that this bill requires companies like Apple, Google, and Microsoft to police their app stores for secure apps and remove them. Not only would those companies have to weaken security for their own products, they would have to make sure that any app in their app stores also has weak security. In addition to banning secure apps, it would place an excessive burden on both app developers and app stores to make sure that each and every app complies with this law.

 

The bill also might not even be necessary. The All Writs Act (which we mentioned when we covered the Apple/FBI case) allows a court to order a company to assist in an investigation in whatever way is necessary, as long as compliance is not an unreasonable burden. In the case of Apple’s fight with the FBI, Apple argued that creating the tool the FBI wanted represented an unreasonable burden that would risk the security of many more iPhones than just the one in that case. However, Apple has helped law enforcement extract data from other iPhones under different circumstances many times before. The new bill would compromise the security of every device and app in the world simply to deal with a few outlier cases where the government can’t use existing laws—or, as the FBI proved in the San Bernardino case, existing security researchers and contractors—to get the information they need.

 

Supporters of the bill say that it’s necessary to fight the “going dark” problem. As security technology gets better, law enforcement’s job gets harder. In the past, advanced encryption and security layers were exclusive to governments and highly organized criminals. Now, anyone with a recent smartphone could potentially thwart a federal investigation. This puts the substantial and growing burden of keeping up with technologically advanced criminals onto law enforcement. However, as our own Editor-in-Chief Alan Henry explains, this is exactly how it should be.

 

The FBI and the NSA and the CIA shouldn’t come crawling to Silicon Valley to break phones or encryption. They should already have the capabilities to do so, and if not, wtf are they waiting for and where have they been for the past 20 years?

 

While “going dark” is a legitimate problem, conscripting the tech companies that make all of our gadgets and apps is a poor solution. Law enforcement agencies should be equipped with the tools they need to perform their investigations without compromising the security of users who have done nothing wrong. No US citizen should have to keep a weak lock on their front door just in case the government needs to knock it down some day.

How To Make Your Voice Heard On This Issue

 

Currently, the final version of this bill hasn’t been officially released, so there’s still plenty of time to speak up. You can contact the offices of Senator Burr and Senator Feinstein directly, use 4USXUS to look up your Senators and Representatives and get their contact information, or use Democracy.io to reach them without hunting for contact info yourself. Let your representatives know how you feel about the bill and how you think they should vote should it reach committee or, heaven forbid, the floor for a full vote. You can also contact the White House to let the President know that you support a veto, should the bill pass. Gizmodo also has a roundup of the computer security views of each presidential candidate currently in the race.

 

You can also use 4USXUS to follow the bill itself once it’s officially introduced (for example, that’s where you can see what CISA looks like). In the meantime, it’s always a good time to reach out to your Congressional representatives to make your voice heard. Right now it might seem unlikely that this bill will pass (it has already drawn a lot of very negative attention), but worse bills have passed when no one was looking, so take the time to make your opinion heard while you can.

 
