
Re: [lojban] Re: Are Natlang the best case for entropy in communication ?



> Also "scientificity" isn't a word.

It is a word in my natlang; sorry, it does not exist in English.



Come now, what you want is a proof that S(Example 2) is much bigger than
the mean value of S(Example 1) over all the Example 1s we can imagine.

I reply with a lower bound (a "minoration").   A lower bound IS a
mathematical proof.  If you asked me whether an elephant is bigger than
a mouse, even without precisely measuring both animals, I could reason
like this:

Size(Elephant) much bigger than 1 meter

Size(Mouse) is smaller than 0.30 meter

=> thus Size(Elephant) much bigger than Size(Mouse)


It is the same here; you simply do not perceive how much bigger S(E2)
is relative to all the S(E1).


But anyway, well, I surrender.   There must be studies somewhere
recording estimates of S(E1) for many E1 situations where one letter
is mistyped.

And I bet the mean value of S(E1) is between 1 and 1.58 bits, which is
the most natural estimate: it tells us that in a natural context there
are, in the worst cases, 2 or 3 possibilities (log2(2) = 1 bit,
log2(3) ≈ 1.58 bits).

You see, I state this as a prediction that can be falsified, in a
Popperian spirit.   This is what I bet.

And this is still less than 2.3 bits, which is THE PROVED VALUE of S(E2).
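As a quick sanity check of the arithmetic, here is a small sketch (assuming uniform distributions over the candidate completions, which is an assumption on my part since the underlying distributions are not given in this thread):

```python
import math

def uniform_entropy_bits(n):
    """Shannon entropy, in bits, of a uniform distribution over n outcomes."""
    return math.log2(n)

# 2 equally likely completions -> 1 bit; 3 -> ~1.58 bits,
# matching the claimed range for the mean S(E1).
print(uniform_entropy_bits(2))              # 1.0
print(round(uniform_entropy_bits(3), 2))    # 1.58

# For comparison, 5 equally likely completions give ~2.32 bits,
# close to the 2.3 quoted for S(E2) (again, an assumption about
# where that figure comes from).
print(round(uniform_entropy_bits(5), 2))    # 2.32
```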

-- 
You received this message because you are subscribed to the Google Groups "lojban" group.
To post to this group, send email to lojban@googlegroups.com.
To unsubscribe from this group, send email to lojban+unsubscribe@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/lojban?hl=en.