
Comments

Clive Robinson

Gunnar,

The problem you identify with "trust" is endemic in InfoSec. So much so that "InfoSec outlook" might become the new slang for "woolly thinking".

At the heart of the problem is "metrics", or more correctly the almost complete absence of metrics that can be used as part of "the scientific method".

Gunnar

@Clive,
Who said anything about science? I would settle for humble engineering: engineers don't "trust" steel; they understand its capabilities and constraints and design accordingly.

Andy Steingruebl

Are you saying you have the capability/capacity to fully "certify" each of those components of your architecture, including the ones you buy from a third party? Are they liable if the product has a bug that compromises your security?

If not, then you're trusting your security gateway providers, those who wrote your STS, etc. Implementation quality matters, and you can't simply architect it away. You probably can't adequately or meaningfully test for it, and you almost certainly can't hold any of your software/hardware producers liable.

So, you're trusting them right?

Gunnar

@Andy - it's a question of what kind of naivete and how much. Naivete by omission or naivete by commission. You can transfer liability (not the same as trust). How much do you pay, or get paid, for that naivete?

Blithe assumptions are the enemy of all design, and security based on trust-y thinking is among the worst at this.

Andy Steingruebl

That still doesn't answer the question, I'm afraid. You go with a certain security gateway, or PDP, and it was written by someone. If you can't audit it, or don't have a reliable audit (almost certainly the case), then you're trusting that they did it right, since holding them contractually liable for any negatives is essentially impossible. Go ahead and sue IBM/Datapower if (when?) their security gateway has a flaw. Try to buy that device with a contract that has "hard" security guarantees and liability. You can't.

That leaves you trusting that the vendor has sold you a quality product.

Gunnar

@Andy
Bond investors demand a higher rate of return on a bond issued by Spain than on one issued by Germany - no trust involved.

Greek bonds pay 12%
Spanish bonds pay 5.5%
German bonds pay 2.9%.

Trust has no basis when the goal is safety. The bond market is saying: I don't think Greece is going to pay me back, so they pay a (much) higher rate.
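
A rough back-of-the-envelope sketch of that pricing (purely illustrative: one-year bonds and zero recovery assumed, using the yields quoted above) - the spread over the German rate reads directly as an implied chance of not being paid back, not as a level of trust:

# Illustrative sketch: a bond spread read as a price on default risk, not as "trust".
# Assumes one-year bonds and zero recovery; yields are the ones quoted above.
risk_free = 0.029                       # German yield, taken as the "safe" baseline
for name, y in [("Greece", 0.12), ("Spain", 0.055)]:
    # Break-even default probability p solves: (1 - p) * (1 + y) = 1 + risk_free
    p = 1 - (1 + risk_free) / (1 + y)
    print(f"{name}: spread {y - risk_free:.1%}, implied chance of default ~{p:.1%}")

That works out to roughly a 9% spread and an ~8% implied default risk for Greece, and about 2.5% for Spain - a priced bet, not a relationship.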

Any transfer or acceptance of liability should be made in the context of cost. Perhaps the cost of buying from IBM is so cheap (versus what it would take to build from scratch) that it is worth taking on the liability. Again, it's not trust; it's cost and liability.

Note, you can certainly build a gateway from scratch. That's one reason why we have seen open source be so successful.

Clive Robinson

@Gunnar,

"Who said anything about science? I would settle for humble engineering, engineers don't "trust" steel they understand its capabilities and constraints and design accordingly"

And for that "understanding" the engineers need reliable methods of measurement, or metrics.

There is the old saw about the difference between scientists and engineers:

A scientist looks for a problem to investigate and characterize; an engineer looks for problems not just to investigate and characterize, but to solve, in as workmanlike a fashion as possible.

Engineering without science is the artisanal behaviour of the wheelwright, using patterns passed down from father to son.

It is interesting to see that the code cutters have at least progressed to "patterns".

I'm not sure I can say the same for most ICT security practitioners; they appear to still be artists applying themselves to cave walls and shaking the witch doctor's stick at problems.

And for whatever reason they don't appear to want to go out into the light of day; why, I don't know.

But religion went from cave paintings and shaking sticks to burning heretics at the stake, and some of those heretics were what we would now call doctors and scientists.

Let's hope that fate does not await you or me for "breaking the faith".

Adrian Lane

That was your best rant of the year. Well said!

-Adrian

Kanchanna

Trust has to be replaced with liabilities and actions - this holds even when trust is needed within an enterprise product that does not cross a domain boundary. In that case too, customers trust the enterprise and provide their personal data, and the enterprise asserts its liabilities by taking responsibility for the customers' data.
In the case of cloud computing, trust does not involve just two parties; it involves all the cloud collaborators. When a service in domain X invokes a service in another domain Y and passes data to it, domain X inevitably trusts domain Y to carry out ONLY the intended operation with that data and not to use it for any other purpose. Once the data crosses the domain boundary, we lose control over its execution flow, and the only way a collaboration can work in this environment is mutual trust between the parties involved. Domain X can provide assurance of data security within its own domain, plus assurance beyond the domain based on trust.
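
A minimal sketch of that hand-off (hypothetical names, a pre-shared key between the two domains assumed): domain X can sign a statement of the intended operation alongside the data, and domain Y can verify who released it and for what purpose - but nothing in the exchange lets X verify what Y actually does with the data afterwards. That unverifiable remainder is exactly the trust described above.

import hmac, hashlib, json

SHARED_KEY = b"x-and-y-pre-shared-key"   # assumption: X and Y share a key out of band

def release(data, intended_operation):
    # Domain X: bundle the data with a statement of what Y is allowed to do with it.
    envelope = {"data": data, "intended_operation": intended_operation}
    payload = json.dumps(envelope, sort_keys=True)
    sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def verify(message):
    # Domain Y: check who released the data and for what stated purpose.
    expected = hmac.new(SHARED_KEY, message["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["signature"]):
        raise ValueError("tampered or unauthenticated release")
    return json.loads(message["payload"])

# Domain X releases data scoped to a single stated operation...
msg = release({"customer_id": 42}, "address-verification")
# ...and domain Y can verify the statement, but X cannot verify Y's later use of the data.
print(verify(msg)["intended_operation"])

Assurance inside domain X is checkable; assurance beyond it rests on Y honoring the stated purpose.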

Brian Sniffen

This post and comment thread are quite surprising. I thought everyone knew the standard InfoSec definition of trust: a trusted system is one that can hurt you. Even Wikipedia gets this right.

Schneier appears to mean this in exactly the same way: we must trust our cloud providers. That is, the only way to benefit from their services is to let them choose whether to hurt us.

Is this definition surprising?
