[lojban] Re: Loglish: A Modest Proposal
Steve,
My idea with the "qui" connector in Loglish is not that different from your
idea of using WordNet.
The idea is that, rather than memorizing separate words for each WordNet
sense, one uses context-specifiers to indicate which sense is intended.
So, for instance, you could say
Ben rock qui sway baby
Ben listen rock qui music
This avoids the need to memorize separate words for the different senses of
rock ("rock" as in "rock the baby" and "rock" as in "rock music").
I didn't say so in the Loglish language specification, but there is probably
a need for a qui terminator, in case the context-specifier is more than one
word. So one could say (using "quiha" for the terminator)
Ben listen rock qui music quiha
(The terminator is unnecessary in this case, but useful in the rare cases
where more than one word fills the position "music" occupies here.)
One could argue that qui is unnecessary because tanru can handle
disambiguation, but I think it's better to have a mechanism specific to
sense-specification, as distinct from compound-concept formation.
-- Ben
> -----Original Message-----
> From: lojban-list-bounce@lojban.org
> [mailto:lojban-list-bounce@lojban.org]On Behalf Of Steven Arnold
> Sent: Saturday, August 13, 2005 8:12 PM
> To: lojban-list@lojban.org
> Subject: [lojban] Re: Loglish: A Modest Proposal
>
>
>
> On Aug 13, 2005, at 4:00 PM, Arnt Richard Johansen wrote:
>
> > To quote your web page:
> >
> > # [...] avoid what's really annoying about Lojban (the lack of a full
> > # vocabulary).
> >
> > I suppose that lack of vocabulary will always be a problem in
> > knowledge representation systems, until someone develops AGI or a
> > way to extract a suitable dictionary from a text corpus.
>
> WordNet is a system that attempts to take a set of "core meanings"
> and associate those meanings with words from different languages. It
> is accessible over the Internet. I invented a language by writing a
> program in Python that fetched the list of core meanings and assigned
> words to them from a list. It was a very fast route to a 26,000+
> word dictionary. Granted, the dictionary needed a little data
> grooming -- there were a number of words that, to me, didn't deserve
> a separate term. There were also senses that I wanted to assign
> shorter words, since I expected them to be used more often. But
> I think the data grooming was by far the minor portion of the task,
> and by using Wordnet, I saved probably hundreds of hours of word
> development compared to doing it all by hand.
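For illustration, the word-assignment step described above might be sketched like this. This is not Steve's actual script; the sense list and frequency ordering are made-up stand-ins for WordNet's synset inventory.

```python
# Sketch: assign an invented word to each sense, with the senses
# expected to be most frequent receiving the shortest words.
import itertools
import string

def word_stream():
    """Yield candidate words in order of increasing length: a, b, ..., z, aa, ab, ..."""
    for length in itertools.count(1):
        for letters in itertools.product(string.ascii_lowercase, repeat=length):
            yield "".join(letters)

def build_dictionary(senses_by_frequency):
    """Pair each sense with the next available word; earlier senses get shorter words."""
    return dict(zip(senses_by_frequency, word_stream()))

# Senses listed from most to least expected frequency (illustrative only).
senses = ["be", "have", "say", "rock (stone)", "rock (sway)"]
lexicon = build_dictionary(senses)
print(lexicon["be"], lexicon["rock (sway)"])  # a e
```

With a real synset list (26,000+ entries) in place of the toy one, the same zip produces the whole dictionary in one pass; the "data grooming" then happens on the resulting table.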
>
> That, combined with using Markov chains for word generation, created
> an excellent base language in a very short time. I'd be happy to
> share the source code of these tools with anyone who is interested;
> email me privately for that.
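A letter-level Markov chain of the kind mentioned might be sketched as follows. The seed words, the order-2 choice, and the length cap are illustrative assumptions, not details of Steve's tool.

```python
# Sketch of Markov-chain word generation: learn letter-transition
# statistics from seed words, then sample new words from the chain.
import random

START, END = "^", "$"

def train(words, order=2):
    """Map each `order`-letter state to the letters observed to follow it."""
    table = {}
    for w in words:
        chars = [START] * order + list(w) + [END]
        for i in range(len(chars) - order):
            state = tuple(chars[i:i + order])
            table.setdefault(state, []).append(chars[i + order])
    return table

def generate(table, order=2, max_len=20, rng=random):
    """Walk the chain from the start state until the end marker (or max_len)."""
    state = (START,) * order
    out = []
    while len(out) < max_len:
        nxt = rng.choice(table[state])
        if nxt == END:
            break
        out.append(nxt)
        state = state[1:] + (nxt,)
    return "".join(out)

seeds = ["loglan", "lojban", "loglish", "gismu", "tanru", "cmavo"]
table = train(seeds)
print([generate(table) for _ in range(3)])
```

Trained on a real word list, the chain tends to produce strings with the letter patterns of its seeds, which is what makes it useful for generating pronounceable candidate words in bulk.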
>
> steve
>
>
>
> To unsubscribe from this list, send mail to lojban-list-request@lojban.org
> with the subject unsubscribe, or go to http://www.lojban.org/lsg2/, or if
> you're really stuck, send mail to secretary@lojban.org for help.
>
>