The dispute between Apple and the government over searching iPhones is back. After terrorists launched an attack in San Bernardino, CA, in December 2015, the FBI asked Apple for help in searching a now-dead suspect’s phone and then obtained a court order compelling Apple’s assistance. Ultimately the FBI was able to search the phone without Apple’s assistance, and the issue went away. But now the FBI is requesting help in searching the iPhone of the late Mohammed Saeed Alshamrani, the Saudi Air Force lieutenant who killed three people and injured eight others at Naval Air Station Pensacola, FL, in December.

Apple has declined to help, claiming that it has built the iPhone so that even Apple cannot get in.

This situation raises two serious questions. The first is whether we should allow companies to make devices that are “unsearchable.”

The Fourth Amendment does not ban all searches, only “unreasonable” ones. A search conducted under a warrant, issued by a judge upon a showing of probable cause, is presumptively reasonable. In nearly 20 years of handling computer crimes, I developed some familiarity with searches of computers and cell phones in a wide variety of cases. Every FBI computer forensic lab has a collection of iPhones that it is unable to search, because Apple has made the phones unsearchable. (It can take years to figure out how to break the encryption in a single generation of iPhones.) Sometimes the government obtains a court order compelling a suspect to unlock his phone, but such cases present difficult Fifth Amendment self-incrimination issues.

Unsearchable phones present serious problems for investigations of terrorism, as well as narcotics, kidnapping, and child sexual exploitation. If criminals and terrorists can use an iPhone to communicate and store files, knowing that it can never be searched, that puts us in a new world. It means that even if law enforcement officers can seize a suspect’s phone and obtain a search warrant, his communications stored on the phone (e.g., e-mails and text messages) and any saved files (e.g., images of the sexual victimization of children) will remain hidden.

Resolving this issue is not simple. Experts tell us that creating “back doors” to encryption systems makes those systems easier to break. Apple’s encryption system does not protect one device; it protects all of Apple’s phones. If Apple creates an encryption key, it could be used against every iPhone. Even if Apple keeps the key, there is the danger that it could leak out, endangering the privacy of every iPhone user. Coca-Cola, for example, has managed to keep its formula secret, but in August 2016 the National Security Agency lost control of some of its most powerful cyber tools, which were later used in major internet attacks. This could happen again.

This choice between privacy and security should not be one of absolutes. We make decisions like this daily. We would never be hurt in an auto accident if we never went outside. Most of us go outside and get into cars, in spite of the risks. In such cases, we make tradeoffs between security and convenience.

We need to make a similar decision about unsearchable phones. We should not leave this decision to a large international company, which exists to make money. If Apple thinks taking this stand will help it sell more iPhones, it will not help the government, regardless of the seriousness of the crime or the danger to the country.

The second key question here is when, and to what extent, the government can compel a private person to help with a search. Does it place too great a burden on Apple to order it to assist? It’s one thing to ask a landlord to unlock an apartment door with his key, and another to ask a company to perform services for the government that may include writing new software or creating new search tools. There is also the question whether, and when, the government should be allowed to get court orders forcing such help.

These are issues that the American people should discuss, with a view toward pressing Congress to pass legislation dealing with them. We should not leave these issues to international corporations, or to periodic lawsuits between these companies and the government.

Michael Levy served as an Assistant United States Attorney for 37½ years. From 2001 until 2017, he was Chief of Computer Crimes in Philadelphia.