* NUMERICAL ESTIMATE OF ENTROPY IN EXAMPLE (1) :
For instance in French, << et alors elle a ouvert la mouche pour
parler >> (literally << and then she opened the fly to speak >>) is
correctly understood as << la bouche >> (the mouth) instead of << la
mouche >> (the fly). Still, the set X of all possible choices is
large: X = { bouche, couche, douche, louche, mouche, souche, touche }
But the context rules out any Xouche word that is not a body part,
so the entropy in context is very low:
p(bouche) >= 0.9, and the other six Xouche words share the remaining
probability mass, at most 0.1 in total. The worst case (maximum
entropy) is when that mass is spread equally, 0.1/6 per word, so
S(example 1) <= -(0.9*log_2(0.9) + 6*(0.1/6)*log_2(0.1/6)) ~ 0.73 bits
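As a sanity check, here is a minimal Python sketch of that worst-case
calculation (the distribution { 0.9, 0.1/6 x 6 } is the assumption
stated above, not measured data):

    import math

    def entropy_bits(probs):
        # Shannon entropy in bits of a discrete distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Worst case consistent with p(bouche) >= 0.9: the remaining 0.1
    # of probability mass spread evenly over the 6 other Xouche words.
    probs_ex1 = [0.9] + [0.1 / 6] * 6
    print(entropy_bits(probs_ex1))  # ~0.73 bits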
* NUMERICAL ESTIMATE OF ENTROPY IN EXAMPLE (2) :
Now consider a Lojban sentence where somebody talks about the number
of people in a room: he would use one of the so'V quantifiers (so'a,
so'e, so'i, so'o, so'u) but mistypes the vowel V. The set X has 5
elements, and the context gives no reason to prefer any one of them.
THEN S(example 2) = -5*(0.2*log_2(0.2)) = log_2(5) ~ 2.32 bits
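The same check in Python (five equiprobable candidates at p = 0.2
each, as assumed above):

    import math

    # Five equiprobable so'V candidates: the entropy is log_2(5).
    probs_ex2 = [0.2] * 5
    print(-sum(p * math.log2(p) for p in probs_ex2))  # ~2.32 bits
    print(math.log2(5))                               # same value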
* 2.32 BITS IS MUCH BIGGER THAN 0.73 BITS