I came up with something called the Philip Sidney game, which I'm rather proud of. The Philip Sidney game is all about Sir Philip Sidney, who, you'll remember, was an English poet, lying wounded on a battlefield in Holland, and lying next to him is a wounded soldier. The story has it that he handed his water bottle to the soldier, saying, 'Thy necessity is yet greater than mine.' Now, actually, I think they both died, so I can't see that it did them much good, but that's not in my model. My model is this: Sir Philip Sidney can either give the water bottle to the soldier or not give it. Now, the soldier can either be really in a bad way, so that if he doesn't get the water bottle he'll die, or he can be merely thirsty: he'd like the water bottle, but it's not really serious, and he's got a decent chance of surviving without it.

The question is, can there be honest signalling between the soldier and Sir Philip Sidney, such that if the soldier is really in a bad way he makes a signal saying, 'Look, I'm dying, give me the water bottle,' but he doesn't give that signal if he's not really in a bad way? At first sight you'd say the whole thing is absurd, because there's no way, in an evolutionary context, that Sir Philip Sidney is going to give the water bottle: how is his fitness going to be increased by giving it away? In a Darwinian context there's got to be something in it for the donor. This is really borrowing an idea from Bill Hamilton, you see: what I didn't tell you is that the soldier is actually Sir Philip Sidney's brother, so Sir Philip Sidney has a stake in the survival of the soldier. You can set this up all really quite simply, and it turns out to be quite simple to do the algebra. And it turns out that there are some contexts in which, for the signalling to be honest, so that only soldiers who are really in a bad way make the signal, it has to be true that making the signal is expensive. It's costly, exactly in Zahavi's sense. So you can make a model which is a much simplified version of what Alan [Grafen] is saying, and it works.

However, that was really not adding anything, except providing a very, very simple model which didn't require you to be able to do hard calculus in order to understand it. But something fell out of the model which is sort of obvious, though I hadn't actually thought about it until I did the model: there's a whole range of circumstances in which cost-free signals can be reliable. It isn't always true that signals have to be expensive in order to be honest. And, in fact, there are lots of contexts in which there isn't a conflict of interest between the soldier and Sir Philip Sidney: in those contexts, when it would pay Sir Philip Sidney to give the water bottle, it would pay the soldier to signal, and if it doesn't pay the soldier to signal, it doesn't pay Sir Philip Sidney to give it. There's no conflict of interest between the signaller and the receiver.
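Since the transcript gestures at the algebra without spelling it out, here is a minimal sketch in Python of one standard way of setting the game up. The parameter names a, b, c, d, r and the numerical values are illustrative assumptions, not from the transcript: payoffs are inclusive fitness in Hamilton's sense, a player's own survival probability plus relatedness times the other's.

```python
# A minimal sketch of the Philip Sidney game (illustrative parametrization):
#   a = survival loss to a truly needy soldier denied the water
#   b = survival loss to a merely thirsty soldier denied it (b < a)
#   c = survival cost of making the signal
#   d = survival cost to Sidney of giving up the water
#   r = relatedness between the two (brothers: r = 0.5)
# Inclusive-fitness payoff = own survival + r * the other's survival.

def honest_signalling_is_stable(a, b, c, d, r):
    """Check the four inequalities under which 'signal only when needy,
    donate only on receiving a signal' is a best reply for both players."""
    needy_should_signal   = (1 - c) + r * (1 - d) > (1 - a) + r * 1
    thirsty_should_not    = (1 - b) + r * 1       > (1 - c) + r * (1 - d)
    donor_gives_on_signal = (1 - d) + r * 1       > 1 + r * (1 - a)
    donor_keeps_otherwise = 1 + r * (1 - b)       > (1 - d) + r * 1
    return (needy_should_signal and thirsty_should_not
            and donor_gives_on_signal and donor_keeps_otherwise)

# Conflict of interest: with c = 0 the merely thirsty soldier would
# signal too, and honesty collapses; a Zahavian cost restores it.
print(honest_signalling_is_stable(a=0.9, b=0.4, c=0.0, d=0.3, r=0.5))  # False
print(honest_signalling_is_stable(a=0.9, b=0.4, c=0.3, d=0.3, r=0.5))  # True

# No conflict of interest: even a cost-free signal is reliable, because
# whenever it pays the soldier to ask, it pays Sidney to give.
print(honest_signalling_is_stable(a=0.5, b=0.05, c=0.0, d=0.2, r=0.5))  # True
```

The first pair of checks illustrates the costly-signalling result; the last illustrates the case that fell out of the model, where signaller and receiver agree and honesty needs no cost.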