
Linguistics Discussion 2013 and Beyond

Cognition and Language > Artificial Intelligence


message 1: by Aloha, The Enthusiast (new)

Aloha | 113 comments Mod
This section is on computer intelligence, its design and engineering, in particular knowledge engineering. AI researchers study the human cognitive and language learning process in order to best design AI to "reason."


message 2: by The Pirate Ghost (new)

The Pirate Ghost (Formerly known as the Curmudgeon) (pirateghost) Okay, whose big idea was it to give speech to GPS navigators? Now my wife feels unemployed on long trips.

More seriously, does somebody know, or have a good article or link that gives, the status quo when it comes to speech and AI? Is Kindle Text-to-Speech the gold standard, or has there been more progress? 2001: A Space Odyssey, or... are we still at "Cheeseburger... onion rings... and a large orange drink... SAY IT"? (Hopefully I'm not the only one old enough to remember that comedic schtick.)


message 3: by Aloha, The Enthusiast (new)

Aloha | 113 comments Mod
Not sure where you would automatically find the latest info. How about searching here:


In particular:



message 4: by Aloha, The Enthusiast (new)

Aloha | 113 comments Mod
So far, I can't find AI journals that are available on the web, but here's a list:




message 5: by John (new)

John Brown | 17 comments I write computer programs that analyse text.
I started in university research in AI and logic programming around 1982, and went on to study for a PhD in Computational Linguistics 15 years ago. Head-Driven Phrase Structure Grammar (HPSG), with its English implementation in the LinGO grammar from Stanford, was all the rage then. I did not complete the PhD, because I found HPSG excessively complex.
Early parsers were written in the Prolog language, and any book on that area (Prolog for parsing/language-processing) makes a good introduction to recursive parsing of a Context Free Grammar, which Chomsky thought was a good theoretical model of natural languages such as English.
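(Editor's aside: since no Prolog is shown here, a minimal sketch in Python of the same idea, recursive parsing of a context-free grammar. The grammar, lexicon, and sentence are toy examples of my own, not from any of the books mentioned.)

```python
# Recursive-descent parsing of a tiny context-free grammar:
#   S -> NP VP, NP -> Det N, VP -> V NP
# All rules and words here are illustrative toy data.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}
LEXICON = {"Det": {"the", "a"}, "N": {"dog", "cat"}, "V": {"chased", "saw"}}

def parse(symbol, words, pos):
    """Try to expand `symbol` starting at words[pos].
    Returns (parse_tree, next_position), or None on failure."""
    if symbol in LEXICON:                     # terminal category: match one word
        if pos < len(words) and words[pos] in LEXICON[symbol]:
            return (symbol, words[pos]), pos + 1
        return None
    for rule in GRAMMAR.get(symbol, []):      # non-terminal: try each expansion
        children, p = [], pos
        for child in rule:
            result = parse(child, words, p)
            if result is None:
                break
            tree, p = result
            children.append(tree)
        else:                                 # every child matched
            return (symbol, children), p
    return None

tree, end = parse("S", "the dog chased a cat".split(), 0)
print(tree)
```

This naive version does no backtracking within a partially matched rule, which is exactly the limitation that makes Prolog's built-in backtracking (and its Definite Clause Grammar notation) so convenient for this job.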
Pinker has written a couple of very readable books on interesting issues that affect computer language understanding. Guy Deutscher's book "Unfolding of Language" is also very good.
Jurafsky and Martin is a very good technical textbook on "Speech and Language Processing".
"Wordnet" from Princeton is a computerised thesaurus/ontology from which a lot can be learnt. It is quickly downloadable free.
Dr. Deutscher likes the work of Adele Goldberg at Princeton, on Construction Grammar, and I am currently reading her, with great interest.
On LinkedIn there is a group called "Text Analytics". They all seem very keen on GATE (the General Architecture for Text Engineering), a lot of which is freely downloadable.
I am from an engineering background initially, and I found it useful to read the books that are used on University courses on English. You get hundreds of parse trees to look at, and quickly learn about language structure without having to understand parsing algorithms.
People like Chomsky, Deutscher, Goldberg, and Pollard and Sag (they invented HPSG) leave the writing of parsers to Computational Linguists.
Drew V. McDermott, who has a web-page with a lot of references (all rather technical), wrote a book around 1994 that said "forget parsing and use statistics". That viewpoint leads you into document retrieval algorithms (an old, well-established field), well described in Artificial Intelligence: A Modern Approach by Stuart Russell and Peter Norvig. Very big and expensive, though. That book is not so good on parsing. If you follow the statistical route, books on data-mining are very useful.
To sum up, the field is still in its early stages and is fragmented into a number of disciplines that don't talk to each other very much. What you need to read depends on where you want to get. A lot of people just learn to use the Gate architecture without understanding the linguistic fundamentals. And, to repeat myself, a lot just produce theories of the structure of language without writing any programs.


message 6: by John (new)

John Brown | 17 comments I just went looking for a book on Prolog and Language, and found the following, with a very good recommendation:

An Introduction to Natural Language Processing Through Prolog (Learning About Language) [Paperback]
Clive Matthews

"I teach a computational linguistics class that has to be accessible to students with zero computer experience. This is the only book I have ever found from which such students can actually learn a thing or two. A few years ago, it became unavailable in the USA, but now I am glad to see that it has returned to the market.
For learning Prolog as a beginning programmer, this book is unmatched. For learning the basics of NLP as a novice, this book is unmatched."


message 7: by John (new)

John Brown | 17 comments Curmudgeon,
Text-to-Speech is not really my field. I do know that the big challenge is getting the prosody right (the correct emphasis on each syllable). If you Google
"Dealing with Prosody in a Text to Speech system" you will get a lot of documents. There is a 2012 one by an Indian author; its URL is
.
There are also course notes from the University of Chicago, but these are not so recent.
I get the impression that the problem is far from being solved. I suspect that large tables of word collocations will be necessary, and these are more likely to be available on a cloud server than in a small Kindle.
GPS navigators should have a much easier job since the whole vocabulary is predetermined. As far as I know they use compression techniques that model the vocal tract, and are essentially piecing together pre-recorded spoken phrases.

