Cybersecurity is too reactive, needs more upfront planning, professors stress to Congress


Stephen Wicker, Cornell professor of electrical and computer engineering, addresses privacy concerns at a Congressional briefing April 30. (Photo: Jim Faber/JFaberPhotography)

WASHINGTON, D.C. -- Most information systems could be a whole lot more secure, according to two Cornell faculty members who briefed Congress April 30.

"We're in a situation in which we're basically always putting patches on security and always cleaning up after databases have been hacked into," said Stephen Wicker, professor of electrical and computer engineering. "The world doesn't have to be like this. It can be much safer if we follow certain design practices."

Wicker, the Cornell principal investigator for the TRUST Science and Technology Center, addressed privacy concerns on Capitol Hill; Andrew Myers, associate professor of computer science, tackled security issues.

Privacy concerns arise whenever data is collected unnecessarily, Wicker told congressional staffers. Such data can have operational and opportunity costs.

"Certain systems like the cellular [phone] platform are not used in ways they could be used because they are recognized as being surveillance tools," Wicker said.

To avoid these costs, privacy needs to be more than an afterthought, said Wicker. "What I'm proposing is a set of privacy-aware design practices," he said. "When you apply these guidelines to the process of design, interesting things happen and potential train wrecks are avoided."

Take demand response and Advanced Metering Infrastructure (AMI), for example. AMI promises to cut summer peak electricity usage by as much as 20 percent by showing consumers their energy costs in real time. Studies show that, with this information, many more people will choose to delay power-hungry activities until evening, when electricity is less expensive.

But as currently proposed, AMI would collect data on individual power usage at a central location. "There's a lot of things I can tell about what's going on in your home based on minute-by-minute power consumption data," said Wicker. "And all of this wonderful information about what you do all the time in your home is going to be available to third parties, potentially for marketing purposes."
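One way to act on the privacy-aware design practices Wicker describes can be sketched in a few lines of code. The Java fragment below is purely illustrative; the class name, method names and tariff figures are invented for this example and are not drawn from any AMI specification. The idea it shows is simply that the meter can keep its minute-by-minute readings on the device and transmit only the aggregate charge, so the detailed usage trace never leaves the home.

    import java.util.ArrayList;
    import java.util.List;

    /** Illustrative sketch: a meter that bills locally and reports only a single total. */
    public class PrivacyAwareMeter {
        // Fine-grained readings stay on the device; they are never transmitted.
        private final List<Double> kwhPerMinute = new ArrayList<>();

        public void record(double kwh) {
            kwhPerMinute.add(kwh);
        }

        /** Hypothetical tariff: a higher rate from noon to 7 p.m., a lower rate otherwise. */
        private static double rate(int minuteOfDay) {
            boolean peak = minuteOfDay >= 12 * 60 && minuteOfDay < 19 * 60;
            return peak ? 0.30 : 0.10;
        }

        /** The only value sent to the utility: the total charge, not the usage trace. */
        public double billableTotal() {
            double total = 0.0;
            for (int minute = 0; minute < kwhPerMinute.size(); minute++) {
                total += kwhPerMinute.get(minute) * rate(minute % (24 * 60));
            }
            return total;
        }

        public static void main(String[] args) {
            PrivacyAwareMeter meter = new PrivacyAwareMeter();
            for (int i = 0; i < 24 * 60; i++) {
                meter.record(0.02);   // a flat 0.02 kWh every minute for one full day
            }
            // Only this single number would be reported to the utility.
            System.out.println("charge: $" + meter.billableTotal());
        }
    }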

Myers, whose research focuses on developing secure programming languages, said computer security costs billions of dollars annually.

One reason security has become such an intractable issue is that computers have become so complex, Myers said. "The computer you use every day probably has more than 50 million lines of code on it," he said. "One error could allow someone to take over your whole computer."

But as with privacy, the deeper issue is that most of the effort that goes into computer security is reactive, Myers said.

Security was not built into the design of older, "unsafe" languages like C, leaving hundreds of potential loopholes for criminals to exploit. "Safe" languages like Visual Basic close many of those holes but are no panacea, as the 2000 "I Love You" virus, written in Visual Basic Script, made clear. So Myers and his colleagues have been working on an even safer language called Jif, short for Java information flow.

With Jif, developers must specify security policies that the code is checked against. "The compiler is then enforcing this end-to-end security on your code, and the software construction process itself is checking security properties," said Myers. "So to a first approximation, you can't write code that's insecure."
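The flavor of that check can be suggested with a simplified example. The Java class below is not Jif code, and unlike Jif it enforces the policy at run time rather than at compile time; the names and the reader-set representation are invented for illustration. Each value carries a label listing who may read it, and the program refuses any flow that would expose the value to a reader outside that label.

    import java.util.Set;

    /** Simplified illustration of information-flow labels; Jif itself enforces such policies statically. */
    public class LabeledValue {
        private final String value;
        private final Set<String> allowedReaders;   // the policy: who may see this value

        public LabeledValue(String value, Set<String> allowedReaders) {
            this.value = value;
            this.allowedReaders = allowedReaders;
        }

        /** Data may flow to a channel only if everyone on that channel is an allowed reader. */
        public void sendTo(Set<String> channelReaders) {
            if (!allowedReaders.containsAll(channelReaders)) {
                throw new SecurityException("flow would leak data to an unauthorized reader");
            }
            System.out.println("sent: " + value);
        }

        public static void main(String[] args) {
            LabeledValue ballot = new LabeledValue("vote for candidate A", Set.of("electionServer"));
            ballot.sendTo(Set.of("electionServer"));   // permitted: the election server may read the ballot
            try {
                ballot.sendTo(Set.of("electionServer", "advertiser"));   // the advertiser is not an allowed reader
            } catch (SecurityException e) {
                System.out.println("blocked: " + e.getMessage());
            }
        }
    }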

Such code might someday make Internet voting possible, but a lot of fundamental scientific issues related to security need to be resolved first, Myers said.

Robert Emro is assistant director of communications and media relations in the College of Engineering.

 

Media Contact: Blaine Friedlander