
Re: [lojban] Re: Are Natlang the best case for entropy in communication ?



* NUMERICAL ESTIMATE OF ENTROPY IN EXAMPLE (1) :

For instance in French: << et alors elle a ouvert la mouche pour
parler >> ("and then she opened the fly to speak") is correctly
understood as << la bouche >> (the mouth) instead of << la mouche >>
(the fly).  Still, the set X of all possible choices is big: X = {
bouche, couche, douche, louche, mouche, souche, touche }

But the context rules out every Xouche word that is not a body part,
so the entropy in context is very low.

p(bouche) > 0.9 and the sum of p(Xouche) over the six other X is
below 0.1; the worst case (maximum entropy) splits that 0.1 equally,
i.e. p(Xouche) = 0.1/6 each, so

S(example 1) < -(0.9*log_2(0.9) + 6*(0.1/6)*log_2(0.1/6)) ~ 0.73 bits
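This back-of-the-envelope bound can be checked with a short Python
sketch (the probabilities are the assumed worst-case values from the
text, not measured data):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Worst case for example 1: p(bouche) = 0.9, the remaining 0.1
# split equally among the six other Xouche candidates.
probs_fr = [0.9] + [0.1 / 6] * 6
print(round(entropy(probs_fr), 2))  # 0.73 bits
```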


* NUMERICAL ESTIMATE OF ENTROPY IN EXAMPLE (2) :


Now consider a Lojban sentence where somebody talks about the number
of people in a room: they would use so'V (one of so'a, so'e, so'i,
so'o, so'u) but mistype the vowel V.

The set X has 5 elements and the context gives no reason to prefer
any of them, so all five are equiprobable at p = 0.2.

THEN S(example 2) = -5*0.2*log_2(0.2) = log_2(5) ~ 2.32 bits
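The uniform case can be checked the same way (a minimal sketch,
assuming the five so'V candidates are equiprobable):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example 2: five equiprobable so'V candidates (so'a/so'e/so'i/so'o/so'u).
probs_jbo = [0.2] * 5
print(round(entropy(probs_jbo), 3))  # 2.322 bits, i.e. log2(5)
```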


* 2.32 BITS IS MUCH BIGGER THAN 0.73 BITS

-- 
You received this message because you are subscribed to the Google Groups "lojban" group.
To post to this group, send email to lojban@googlegroups.com.
To unsubscribe from this group, send email to lojban+unsubscribe@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/lojban?hl=en.