When it comes to computer security, there are three main goals: privacy — making sure no one can steal your data; integrity — ensuring that your data has not been altered in an unauthorized way; and availability — ensuring you have access to the resources you need to do what you need to do.
Most research focuses on the first two, said Ning Zhang, an assistant professor of computer science and engineering at Washington University’s McKelvey School of Engineering in St. Louis. It’s easy to see why. “If you prevent me from using my credit card, that’s fine. It’s not as bad as if it was stolen and used by a thief,” he said. But what about a self-driving car driving down a road riddled with potholes? What about one doing 80 mph, surrounded by other vehicles doing the same? In that situation, a little availability — access to the brakes, perhaps — would be helpful.
Zhang’s student presented the research at the 43rd IEEE Symposium on Security and Privacy in San Francisco, May 23-25. It describes a new framework, RT-TEE, for system availability in cyber-physical systems such as self-driving cars. The framework gives the user an uptime guarantee on certain mission-critical controls so that, in the event of a cyberattack, the system remains safe.
The method described by Zhang is based on two principles: isolation between critical and non-critical components, and complete mediation of access to critical system resources. To keep critical components out of a hacker’s reach, they must be isolated from the rest of the complex system. “It’s like a castle,” Zhang said, referring to the isolated environment where computers keep potentially dangerous software away from their critical components.
To keep the trusted computing base small, this trusted execution environment exposes only very narrow functionality for the cyber-physical system: the ability to brake, disengage the throttle, or turn the wheel slightly. These features remain accessible to the vehicle operator even if the car’s operating system is attacked.
Maintaining availability is no small feat; after all, the operating system controls everything in the car. “If the system is controlled by a hacker,” Zhang said, “then of course it won’t give you control.”
This is where reducing the attack surface comes in: limiting the points at which an attacker, working through the compromised operating system, can affect the trusted environment. To that end, the secure environment responds only to a particular set of commands. If a request does not fall within that set, access is denied.
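The deny-by-default filtering described above can be sketched in a few lines. This is an illustrative toy, not RT-TEE’s actual interface; the command names and the `mediate` function are assumptions made for the example.

```python
# Hypothetical sketch of complete mediation: the secure environment accepts
# only an explicit allowlist of commands; everything else is denied outright.
# The command names below are illustrative, not taken from RT-TEE.

ALLOWED_COMMANDS = {"brake", "steer", "disengage_throttle"}

def mediate(command: str) -> bool:
    """Return True only if the command is on the allowlist."""
    return command in ALLOWED_COMMANDS

# A safety command passes; anything outside the narrow set is refused.
print(mediate("brake"))            # an allowed safety command
print(mediate("update_firmware"))  # denied: not in the allowed set
```

The design choice is deny-by-default: the monitor never tries to enumerate what is dangerous, only what is permitted.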
This process, known as a reference monitor, works in two parts. “First, for example, I say to you: ‘You can only write me letters. No calls. No email. No texting,’” Zhang said. If you send an email, it is deleted without even being read.
“Once I get a letter, it’s only allowed to make certain requests,” he said. Turning left or right may be acceptable. “But what if you ask for something else? I’m kicking you out. No access.”
Beyond that, there could be limits on how many degrees the wheel can turn, how long it can stay in that position, and so on. The request must come from a certain location and meet certain parameters to access this limited functionality.
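The parameter checks described above might look like the following sketch. The source name, limits, and `validate_steer` function are all hypothetical, chosen only to illustrate the idea of bounding each request.

```python
# Illustrative sketch (not RT-TEE's actual checks): even an allowed steering
# request must come from an expected source and stay within bounds on the
# steering angle and hold time before the secure environment acts on it.

TRUSTED_SOURCES = {"driver_console"}  # hypothetical trusted channel
MAX_ANGLE_DEG = 15.0                  # assumed per-request angle limit
MAX_HOLD_SECONDS = 2.0                # assumed limit on hold duration

def validate_steer(source: str, angle_deg: float, hold_s: float) -> bool:
    """Accept a steering request only if its origin and parameters check out."""
    if source not in TRUSTED_SOURCES:
        return False                  # wrong location: rejected unread
    if not -MAX_ANGLE_DEG <= angle_deg <= MAX_ANGLE_DEG:
        return False                  # angle outside the permitted range
    if not 0.0 < hold_s <= MAX_HOLD_SECONDS:
        return False                  # held too long (or not at all)
    return True

print(validate_steer("driver_console", 10.0, 1.0))  # within all bounds
print(validate_steer("infotainment", 10.0, 1.0))    # untrusted source
```

Each layer narrows what a compromised component could do even with an otherwise valid command.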
“Because they interact with the physical environment, cyber-physical systems must provide real-time execution of computational tasks such as controllers,” said Chenyang Lu, Fullgraf Professor of Computer Science and Engineering and a co-author of the paper.
“Traditionally this is done by a real-time scheduler in the operating system, which can, however, be compromised. A key advancement of RT-TEE is to provide a secure real-time scheduling framework that maintains real-time performance guarantees for safety-critical tasks, even when the rest of the system is compromised,” Lu said.
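One simple way to picture the guarantee Lu describes is a scheduler that reserves a time slot for the safety-critical task at a fixed period, so no other workload can crowd it out. The sketch below is a deliberately simplified illustration of that reservation idea, not RT-TEE’s actual scheduler; the task names and period are assumptions.

```python
# Toy illustration of a real-time guarantee: the safety-critical task gets a
# reserved slot every `critical_period` slots, so even a misbehaving
# (e.g. compromised) best-effort workload cannot starve it.
# This is NOT RT-TEE's scheduler; it only sketches the reservation concept.

def schedule(slots: int, critical_period: int) -> list[str]:
    """Build a timeline where every critical_period-th slot is reserved."""
    timeline = []
    for t in range(slots):
        if t % critical_period == 0:
            timeline.append("brake_control")  # reserved, cannot be preempted
        else:
            timeline.append("best_effort")    # whatever else wants to run
    return timeline

# Over six slots with a period of three, braking runs in slots 0 and 3
# no matter how much work the best-effort tasks request.
print(schedule(6, 3))
```

Because the reservation is enforced below the operating system, a hacker who controls the OS-level scheduler still cannot withhold those slots.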
The idea that someone’s car could be hacked is not a concern for the distant future. It has already been done. Zhang says that for self-driving cars – and all cyber-physical systems – it’s crucial that the third pillar of security – availability – is also protected.
“When it comes to the security of the critical systems I develop, I ask, ‘Would I be willing to sit in a car with such an advanced hacker attacking it?’ If I wouldn’t, then I’m not doing a good job.”
This work is supported in part by the National Science Foundation and Department of Homeland Security under grants ECCS-1646579, CNS-1837519, CNS-1916926, and CNS-2038995, and by the Fullgraf Foundation. The authors thank David Corman for his support on this project.
The McKelvey School of Engineering at Washington University in St. Louis promotes independent research and education with an emphasis on scientific excellence, innovation, and collaboration without boundaries. McKelvey Engineering offers highly regarded research and graduate programs across its departments, particularly in biomedical engineering, environmental engineering, and computer science, as well as one of the most selective undergraduate programs in the country. With 140 full-time faculty, 1,387 undergraduate students, 1,448 graduate students, and 21,000 living alumni, we work to solve some of society’s greatest challenges; prepare students to become leaders and innovate throughout their careers; and be a catalyst for economic development for the St. Louis region and beyond.