
Tue, Sep. 29th, 2009, 12:19 pm
Signatures don't do what you think they do

Security people tend to think that we live in a secure world, one in which everyone is constantly auditing the behavior of everyone else, and the end result is widespread mutually ensured honesty. We don't even vaguely live in that world. We live in a trusting world, where most people are mostly good, and the need for auditing is much lower than it would be if everyone were greedy sociopathic automatons. I would not want to live in a world which worked that way, and it would probably not only be an unpleasant place to live, but an extremely unproductive one as well, as every attempt by anyone to get anything done would be destroyed by theft and corruption.

I say this not to engage in broad philosophizing but because I have a very concrete point to make about a very specific thing: signatures. People think of signatures as a strong form of physical evidence, useful in court proceedings for proving that a particular person really did sign a particular thing. While the fact that this belief is widespread does a good job of discouraging people from denying that things they signed bear their real signature, which is a good thing, the claimed difficulty of forging signatures is simply not true. Anyone can practice forging a signature from a few samples for a few hours and produce a passable replica, and anyone with decent drawing skill who practices a bit can get quite good at it. Meanwhile, people aren't very consistent about how they sign their own signatures, so even legitimate matches sometimes look fake. Thumbprints would be far better as a piece of evidence.

Despite that, signatures are still very important and good at what they're used for. What is that? It's to make clear that someone knows when they're entering into a binding agreement. You can't be forced into an enforceable contract just because you said 'yeah, whatever' when asked if you want to participate, and you can't be forced into a contract by being tricked into signing a document which says something different from what you think it says. The theory of contracts is based on parties mutually agreeing to be contractually bound, and requires that they all go through sufficient ceremony that it's clear when a contract has been entered into (sometimes merchants can get into binding contracts much more easily, but that's because they're expected to be more savvy; the law is big on protecting little old ladies from being suckered).

For example, take the use of signatures for receiving packages. There isn't even a contract entered into when a package is signed for, but the reasoning behind it is the same: it's to make clear that the person receiving the package knew they were receiving a package, and can't claim later that there was a misunderstanding. To the extent that the signature has any evidentiary power in this case, it's mostly that people by default put down their real name, and since the delivery person generally doesn't know in advance what a recipient's name might be, it's hard for someone to lie later and claim that no package was delivered at all.

The hoopla around cryptographic signatures is largely misplaced. A signature made on a web page that clearly stated what was being agreed to, with the signature drawn by moving the mouse like a pen in a drawing area, would do a much better job of indicating what signatures are supposed to indicate, and would probably be much easier to back up in court later.
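
To make the distinction concrete, here is a minimal sketch (in Python, assuming the third-party cryptography package is installed) of what a cryptographic signature actually establishes: that whoever holds a particular private key signed a particular sequence of bytes. Verification says nothing about whether a human read, understood, or intended to agree to those bytes.

```python
# Minimal sketch using the 'cryptography' package (an assumption, not part of the post).
# A valid signature proves only that the holder of this private key signed these exact
# bytes -- nothing about whether anyone knew or intended what they were agreeing to.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

document = b"I agree to the terms described above."
signature = private_key.sign(document)

try:
    # Succeeds if and only if this exact byte string was signed with this key.
    public_key.verify(signature, document)
    print("signature verifies: the key holder signed these bytes")
except InvalidSignature:
    print("signature does not verify")
```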

Now somebody please explain this to Bruce Schneier, because he doesn't get it.

Tue, Sep. 29th, 2009 08:32 pm (UTC)
jered

What you say about written signatures above is entirely correct. However, I don't know a single security professional who "[...] think[s] that we live in a secure world, one in which everyone is constantly auditing the behavior of everyone else[...]". Security professionals tend to be the most paranoid people around, and are critically aware of the flaws in most of the systems we depend upon.

I guess the big problem is one of naming. Cryptographic "signatures" do something far stronger than what handwritten signatures can do, hence the confusion a few years ago when Congress passed a bill to allow "electronic signatures" -- electronic representations of your John Hancock, and nothing more. It's too late to change the name, but really what we talk about in crypto are "authenticators" or "verifiers".

The interesting thing with the world today is that we now have the tools to inexpensively improve security, and yet there is no interest in doing so. For example, modern cryptography could easily make credit card number theft a thing of the past... and yet, when the credit card companies adopted a contactless smartcard standard they designed one in which the secret number is still passed off the device, where it can be intercepted. Mechanical keys are woefully insecure, yet there is widespread denial that this problem exists. Electronic voting systems could be designed to protect against tampering, and yet the common ones today are more easily tampered with than anything ever seen in the past.
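
A minimal sketch of the kind of challenge-response design alluded to here, in which the card proves possession of a secret without ever transmitting it. The names and flow (card_respond, issuer_verify) are illustrative assumptions, not any actual payment standard.

```python
import hashlib
import hmac
import os

# Hypothetical setup: 'card_secret' stands in for a key provisioned onto the smartcard
# (with a copy held by the issuer) and never sent over the wire.
card_secret = os.urandom(32)

def card_respond(challenge: bytes) -> bytes:
    # Runs on the card: proves knowledge of the secret without revealing it.
    return hmac.new(card_secret, challenge, hashlib.sha256).digest()

def issuer_verify(challenge: bytes, response: bytes) -> bool:
    # Runs at the issuer, which holds its own copy of the card's key.
    expected = hmac.new(card_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)                 # fresh nonce from the terminal per transaction
response = card_respond(challenge)
print(issuer_verify(challenge, response))  # True; an eavesdropper sees only a one-time value
```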

In security, the standard to which new technology is held is entirely based on what the previous generation technology could do. This is in conflict with nearly every other area of technology, where constant improvement (in speed, accuracy, or cost) is demanded. Perhaps this is because near-perfect security has been so rare in the past, or perhaps it is because humans are absolutely terrible at assessing risk. Far more valued than security is accountability -- we don't care if someone can steal our identity information as long as we can track them down and hold them accountable later.

I wonder if this ongoing lack of mathematically enforced security helps prevent us from becoming a society of "greedy sociopathic automatons"? That a lack of rigor in enforcing the social contract allows us to weed out the bad eggs that might be lurking among us? Probably not, I imagine social reciprocity runs much deeper than that, but it's an interesting thought.

Wed, Sep. 30th, 2009 12:38 am (UTC)
bramcohen

Digital signatures do a great job of producing evidence, but a relatively lousy one of making clear that the person knew what they were doing when they made the signature.

The credit card companies have an entrenched interest in keeping things the way they are. They expressly don't want later systems to be more secure, because that would directly cut into the need for the service they provide, and hence their profits.

Wed, Sep. 29th, 2010 09:43 pm (UTC)
jered

Wow, what an elaborate spam response to this post. Surely not an AI? Gold farmer?