A new cybersecurity research agenda from Dan Geer in three minutes or less - some snippets
- We would need a lot less research if we put into practice what we already know. But we don't. Ergo, why we don't put into practice what we already know is itself a research-grade topic.
Comment: the main blocking factors are usability and integration. As to integration, security is not just "put in the policy and everyone will implement it"; it's integration engineering to make it faster and cheaper to do the right thing.
Security is not composable. However, in cyberspace, everything critical is a melange. Gilbert and Lynch's proof of Brewer's theorem finds that in a distributed system it is Consistency, Availability, and Partition Tolerance, choose any two. That tells me there is a research grade result for cybersecurity that will be found to be parallel.
Comment: most security isn't composable, but all the work on federation, SAML, ABAC, and other protocols means we're gaining the ability to perform more granular and dynamic access-control checks across age-old point-to-point boundaries. ABAC, PBAC, and RADAC are all examples of this, and as with the previous point they require integration engineering to be effective.
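To make the ABAC idea concrete, here is a minimal sketch of an attribute-based access decision. The rule set, attribute names, and clearance levels are all invented for illustration; real ABAC deployments express policy in a language like XACML and evaluate it in a policy decision point, but the shape of the check - attributes of subject, resource, and environment evaluated at request time - is the same.

```python
# Hypothetical ABAC sketch: access is decided by evaluating attributes of the
# subject, resource, and environment at request time, rather than by a static
# identity-to-permission mapping. All names and rules here are illustrative.

def abac_decision(subject: dict, resource: dict, environment: dict) -> bool:
    """Grant access only if every applicable rule is satisfied."""
    rules = [
        # Only members of the resource's owning department may access it.
        lambda s, r, e: s["department"] == r["owning_department"],
        # Confidential resources require an elevated clearance level.
        lambda s, r, e: r["classification"] != "confidential"
                        or s["clearance"] >= 2,
        # Environmental rule: block requests from untrusted networks.
        lambda s, r, e: e["network"] == "corporate",
    ]
    return all(rule(subject, resource, environment) for rule in rules)

subject = {"department": "finance", "clearance": 2}
resource = {"owning_department": "finance", "classification": "confidential"}
environment = {"network": "corporate"}
print(abac_decision(subject, resource, environment))  # True
```

The integration-engineering point shows up even in this toy: the attributes have to arrive from somewhere (an HR system, a network sensor, a classification service) in an agreed format before the decision function can run at all.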
In the 1990s, the commercial world pulled even with the military world in the application of cryptography. It is now doing the same with traffic analysis (heretofore the strategic redoubt of the intelligence community). While the intelligence community has had the pre-eminent sensor fabric, integrated messaging coupled to geo-location technology is the stuff of hegemony. This is a fact which is not lost on Russia, is not lost on China, and one hopes is not lost on Google. Is resistance to traffic analysis a research grade question, or is it merely wishful thinking?
Comment: PCI helped the private sector make comparatively massive investments in monitoring technology, but PCI solved the easy part - the back end. The domain knowledge required to integrate the audit-log messages and events is lacking in most enterprise deployments, and that is a limiting factor in making any of these investments successful.
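The integration gap described above is essentially a normalization problem: every sensor emits events in its own format, and correlation is impossible until they share a schema. A hedged sketch, with invented log formats and field names, of what that mapping work looks like:

```python
# Hypothetical sketch of audit-log integration: mapping two heterogeneous
# event formats onto one common schema so monitoring tools can correlate
# them. The input formats and schema field names are invented.

from datetime import datetime, timezone

def normalize_firewall(line: str) -> dict:
    # e.g. "1700000000 DROP 10.0.0.5" (epoch seconds, action, source IP)
    ts, action, src = line.split()
    return {
        "time": datetime.fromtimestamp(int(ts), tz=timezone.utc),
        "source": src,
        "outcome": "deny" if action == "DROP" else "allow",
        "sensor": "firewall",
    }

def normalize_app(event: dict) -> dict:
    # e.g. {"when": "2023-11-14T22:13:20+00:00", "user": "alice", "ok": False}
    return {
        "time": datetime.fromisoformat(event["when"]),
        "source": event["user"],
        "outcome": "allow" if event["ok"] else "deny",
        "sensor": "application",
    }

events = [
    normalize_firewall("1700000000 DROP 10.0.0.5"),
    normalize_app({"when": "2023-11-14T22:13:20+00:00",
                   "user": "alice", "ok": False}),
]
denies = [e for e in events if e["outcome"] == "deny"]
print(len(denies))  # 2
```

The hard part in practice is not the code but the domain knowledge the comment names: knowing that DROP and ok=False mean the same thing, and that the two timestamps are comparable, requires understanding each source system.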
My security research agenda would have three things on it.
After all, why did SAML succeed? Digital signatures and message encryption were not exactly new ideas, nor were capabilities, session management, or Single Sign-On. It succeeded for several reasons, but among them were these: it pushed PKI-style complexity down the stack, where developers needn't worry about 99% of the ridiculous complexity; and it recognized that there was - wait for it - a user(!) in the equation, and a browser, so there was not some abstract output but rather a set of user operations.
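"Pushing the complexity down the stack" can be illustrated with a toy assertion API: the developer sees only issue() and verify(), while the signing machinery stays hidden inside the library. To keep the sketch self-contained, an HMAC over a shared secret stands in for what SAML actually uses (XML digital signatures over X.509 key material); everything here is illustrative, not SAML's wire format.

```python
# Toy illustration of hiding signature complexity behind a two-function API.
# SAML uses XML-DSig and PKI; this sketch substitutes an HMAC with a shared
# secret purely so the example runs standalone.

import base64
import hashlib
import hmac
import json

_SECRET = b"demo-only-shared-secret"  # hypothetical; SAML would use X.509 keys

def issue(user: str, attributes: dict) -> str:
    """Identity-provider side: wrap the claims and sign them."""
    body = base64.urlsafe_b64encode(
        json.dumps({"user": user, "attrs": attributes}).encode())
    sig = hmac.new(_SECRET, body, hashlib.sha256).hexdigest().encode()
    return (body + b"." + sig).decode()

def verify(assertion: str) -> dict:
    """Service-provider side: check the signature, return the claims."""
    body, sig = assertion.encode().rsplit(b".", 1)
    expected = hmac.new(_SECRET, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(body))

token = issue("alice", {"role": "auditor"})
print(verify(token)["user"])  # alice
```

The point is the surface area: the developer never touches digests, encodings, or key handling, which is exactly the property that let SAML spread among teams with no cryptographic expertise.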
There were well-defined ways to interact with the protocols from both a user and a system perspective. That is what we need much more of to make ABAC, PBAC, and RADAC effective. Bob Blakley's work on moving from push to pull protocols in identity management should be high on the research agenda, because it shows how to get better use of these protocols in real-world systems. Federal work on NSTIC, OIX, and the like is enormously helpful, but as with PCI and logging it solves only the first part of the problem. More work is needed to make the front-end user and back-end containers inherently conversant with the new protocols.
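The push-versus-pull distinction can be sketched in a few lines. All class and attribute names below are invented, not drawn from any specific protocol or from Blakley's papers: in the push model the relying party receives a snapshot of attributes at sign-on, while in the pull model it asks the identity provider at decision time, so changes such as a revocation take effect immediately.

```python
# Hedged sketch of push vs. pull identity attributes (names are invented).
# Push: a copy of the attributes travels with the user and can go stale.
# Pull: the relying party queries the provider at each decision point.

class IdentityProvider:
    def __init__(self):
        self._attrs = {"alice": {"role": "trader", "active": True}}

    def push_assertion(self, user: str) -> dict:
        """Push model: hand out a snapshot of the attributes at sign-on."""
        return dict(self._attrs[user])

    def pull_attributes(self, user: str) -> dict:
        """Pull model: answer an attribute query at decision time."""
        return self._attrs[user]

    def deactivate(self, user: str) -> None:
        self._attrs[user]["active"] = False

idp = IdentityProvider()
snapshot = idp.push_assertion("alice")   # captured at sign-on
idp.deactivate("alice")                  # account disabled mid-session

print(snapshot["active"])                      # True  - pushed copy is stale
print(idp.pull_attributes("alice")["active"])  # False - pull sees the change
```

The trade-off the sketch omits is the real research question: pull buys freshness at the cost of availability and latency coupling to the identity provider, which is why getting front ends and back-end containers conversant with these protocols is integration engineering rather than a drop-in change.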