FBI Director James Comey, Director of National Intelligence James Clapper, and NSA Director Mike Rogers continue to lament the ability of people to secure the privacy of their communications with end-to-end encryption that even governments cannot break. But the push for laws mandating “backdoor access,” or built-in security flaws for the state to exploit, has run into some unexpected opponents: former national security officials on the other side of the revolving door.
“My position is probably going to be a little surprising to people here,” Michael Chertoff, the former secretary of homeland security, told an audience last week at the Aspen Security Forum. “I think that it’s a mistake to require companies that are making hardware and software to build a duplicate key or a back door even if you hedge it with the notion that there’s going to be a court order.”
That’s long been the position of most technologists and many tech companies in the communications business. As founder of The Chertoff Group, where former national security officials draw on the knowledge and relationships they acquired as public servants to service the needs of corporations that pay them as consultants, Chertoff is “working with some companies in this area,” he disclosed.
Michael Hayden has led both the NSA and the CIA and served as principal deputy director of national intelligence. He is now a principal at The Chertoff Group. And according to The Daily Beast’s Noah Shachtman, who moderated a panel at the Aspen Security Forum, Hayden declared in an interview, “I think I come down on the side of industry. The downsides of a front or back door outweigh the very real public safety concerns.”
Michael Leiter has doubts about mandatory “backdoors” too. A former director of the National Counterterrorism Center, he now works for Leidos, a defense and homeland security contractor. Appearing on the same panel as Chertoff, he declared that “we are clearly going to a world where end-to-end encryption with temporary keys that disappear immediately after any communication occurs, that is the future. There is no way around that; we are not going to stop that. And because of that, for the technology issues, I don’t think there is a long term way to preserve the US government’s ability to intercept or get access to those.”
“We have to accept that the degree to which we undermine our national security by having that back door or front door, depending upon how you define it, is very real,” he added. “We have seen that because of the cyberthreat.” Policymakers can try to design backdoor access to communications, he said, “but reality is going to overtake you and it’s a funny thing that when technology and law conflict, law’s not going to change that technology for long, it’s going to overtake it. And you have to have a law which addresses reality, and not what you hope reality will be.”
Journalist Marcy Wheeler, one of the first members of the press to take note of the panel, observed that Chertoff’s answer is notable because of who he is. Through much of his career, “Chertoff has been the close colleague of FBI Director Jim Comey, the guy pushing back doors now,” she wrote. “It’s possible he’s saying this now because as a contractor he’s being paid to voice the opinions of the tech industry; as he noted, he’s working with some companies on this issue. Nevertheless, it’s not just hippies and hackers making these arguments. It’s also someone who, for most of his career, pursued and prosecuted the same kinds of people that Jim Comey is today.”
Chertoff’s paymaster isn’t the only thing that has changed.
Being in private industry exposes him to people with different values, incentives, institutional imperatives, and responsibilities from those he encountered when he traveled in national security circles. And the rest of Chertoff’s remarks, whatever motivated them, included cogent, hard-to-refute arguments against requiring “backdoors” and weakening encryption.
First of all, he said, “you’re basically making things less secure for ordinary people.”
Second, he said, “the really bad people are going to find apps and tools that are going to allow them to encrypt everything without a back door. These apps are multiplying all the time. The idea that you’re going to be able to stop this, particularly given the global environment, I think is a pipe dream. So what would wind up happening is people who are legitimate actors will be taking somewhat less secure communications and the bad guys will still not be able to be decrypted.”
Third, he explained, looking abroad, “what are we going to tell other countries? When other countries say great, we want to have a duplicate key too, with Beijing or in Moscow or someplace else? The companies are not going to have a principled basis to refuse to do that. So that’s going to be a strategic problem for us.”
He concluded by observing that “we do not historically organize our society to make it maximally easy for law enforcement, even with court orders, to get information”––and that past experience suggests “we’re not quite as dark, sometimes, as we fear. In the ‘90s –– when encryption first became a big deal –– there was a debate about a Clipper Chip that would be embedded in devices or whatever your communications equipment was to allow court ordered interception. Ultimately, Congress and the president did not agree to that. And talking to people in the community afterwards ... we collected more than ever. We found ways to deal with that issue. So it’s a little bit of a long-winded answer. But I think on this one, strategically, requiring people to build a vulnerability may be a strategic mistake.”
Again, none of these points is new. Opponents of “backdoors” have made them many times. It is nevertheless striking that even Chertoff, Hayden, and Leiter, alums of the national security state who are sympathetic to its needs, cannot be convinced that “backdoors” are a prudent solution to the problem of bad guys “going dark.”