
GoogleTranslator is killing my future

Language Learning Forum : Languages & Work
63 messages over 8 pages
translator2
Senior Member
United States
Joined 6922 days ago

848 posts - 1862 votes 
Speaks: English*

 
 Message 41 of 63
23 June 2011 at 2:15pm | IP Logged 
The problem is that the latest incarnation of machine translation depends on bilingual texts translated by human beings. As more and more output (websites, forums, publications, etc.) gets generated by the machine (slowly introducing errors), the highest statistical match for a given phrase or segment will increasingly be a machine-generated translation rather than a human one. In other words, this kind of system would eventually collapse in on itself without a constant supply of human translations to model.
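The feedback loop described above can be sketched with a toy model (my own illustration, not a description of any real system): each "generation", some share of the training corpus is replaced by recycled machine output, so the proportion of clean human translation shrinks geometrically.

```python
# Toy model of the feedback loop: each generation, machine-generated
# text (carrying its own errors) is fed back into the training corpus,
# diluting the share of clean human translations. The recycle rate of
# 0.3 is an arbitrary illustrative number.

def corpus_quality(generations, human_share=1.0, recycle_rate=0.3):
    """Fraction of the corpus that is still human-translated after
    `generations` rounds of recycling machine output."""
    share = human_share
    for _ in range(generations):
        # Each round, recycle_rate of the corpus is replaced by
        # machine output, so only (1 - recycle_rate) of the human
        # signal survives.
        share *= (1.0 - recycle_rate)
    return share

if __name__ == "__main__":
    for g in (0, 1, 5, 10):
        print(f"after {g:2d} generations: {corpus_quality(g):.3f} human")
```

Even with a modest recycle rate, the human share collapses within a handful of generations, which is the "crash in upon itself" effect in miniature.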

Now, if we ever do get computers that can think like human beings and actually translate rather than find matches, a lot of people's jobs are in trouble and not just translators.

4 persons have voted this message useful



s_allard
Triglot
Senior Member
Canada
Joined 5433 days ago

2704 posts - 5425 votes 
Speaks: French*, English, Spanish
Studies: Polish

 
 Message 42 of 63
23 June 2011 at 2:23pm | IP Logged 
As I said in an earlier post, the fundamental problem in machine translation is how to design a machine that really understands L1 before generating L2. The issue goes way beyond statistical analysis of word combinations in massive samples. This is not how the human brain works.

What kind of language is relatively easy to machine translate? Short phrases in formulaic language from specialized fields. An example would be translating weather forecasts. The following phrase is quite easily translated:

The probability of rain is 50%.

As soon as you start using idioms, puns, metaphorical language and cultural or historical references, things become much more complicated. I would be curious to see how the following example would be translated:

Our latest offering is going to eat the competition's lunch.
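The contrast between these two examples can be made concrete with a minimal template-based sketch (the template and its French rendering are my own illustration, not any real system's output): a formulaic frame matches cleanly and fills a pre-translated pattern, while the idiom matches nothing and falls through.

```python
import re

# Formulaic phrases can be handled by matching a fixed frame,
# extracting the variable slot, and filling a pre-translated pattern.
# The French template is my own illustration.
TEMPLATES = {
    r"The probability of rain is (\d+)%\.":
        "La probabilité de pluie est de {0} %.",
}

def translate_formulaic(sentence):
    for pattern, target in TEMPLATES.items():
        m = re.fullmatch(pattern, sentence)
        if m:
            return target.format(*m.groups())
    return None  # idioms, puns and metaphors fall through unmatched

print(translate_formulaic("The probability of rain is 50%."))
print(translate_formulaic(
    "Our latest offering is going to eat the competition's lunch."))
```

The weather phrase translates mechanically; the idiom returns nothing, because no amount of slot-filling captures "eat the competition's lunch".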

What is a passable machine translation? Is it something where you have a gist--some vague idea--of what was said? Sure, it's better than nothing. And I'm sure there are many practical situations where anything will do.

Let's say, for example, that I'm making email enquiries with a dental office in Costa Rica about certain kinds of procedures. The translations don't have to be perfect, as long as we can agree on the price and what needs to be done.

I'm sure that's how a lot of people use Google Translate today. It's great when you want to get a rough idea of what has been said. But nobody in their right mind would translate a paragraph this way into a language they do not know and present it as is to their foreign counterpart. That is asking for trouble.

I see machine translation developing into sophisticated translation aids. The low-level straightforward stuff can be automated. Let the machine take care of that boring stuff. This will free up the translators to work on the really interesting high-level stuff that requires true human understanding.

Actually, this discussion reminds me of what it's like to watch a play or a TV series in a target language that we understand imperfectly. We get the gist of what is going on, especially with the help of all the contextual cues, but we are only too aware that we are missing a lot because there is much of the language that we don't really understand. This is exactly what distinguishes us from a native speaker. Even something as simple as the names of the characters will resonate differently for native and non-native speakers. That's the whole difference between getting the gist, or thinking we understand, and truly understanding.



Edited by s_allard on 23 June 2011 at 2:41pm

5 persons have voted this message useful



Guests
Guest Group
Joined 7379 days ago

0 - 22 votes
Logged on

 
 Message 43 of 63
24 June 2011 at 3:37am | IP Logged 
I have never seen an automated translation method used in practical situations.

Google Translate is far from being usable for legal documents or, for that matter, school essays and the like. I use Google Translate a lot just to make sense of what a phrase could mean in another language.

Even if Google Translate improved dramatically, translators would still need to be available to proofread the output. This is no different from software systems that diagnose patients: no matter how smart these systems are, a doctor will have the final say.



ScottScheule
Diglot
Senior Member
United States
scheule.blogspot.com
Joined 5231 days ago

645 posts - 1176 votes 
Speaks: English*, Spanish
Studies: Latin, Hungarian, Biblical Hebrew, Old English, Russian, Swedish, German, Italian, French

 
 Message 44 of 63
24 June 2011 at 4:18am | IP Logged 
ismyaccount wrote:
I have never seen an automated translation method used in practical situations.

Google Translate is far from being usable for legal documents or, for that matter, school essays and the like. I use Google Translate a lot just to make sense of what a phrase could mean in another language.

Even if Google Translate improved dramatically, translators would still need to be available to proofread the output. This is no different from software systems that diagnose patients: no matter how smart these systems are, a doctor will have the final say.


But what about when the computer is smarter than the doctor? Surely this is only a
matter of time, no?
1 person has voted this message useful



Guests
Guest Group
Joined 7379 days ago

0 - 22 votes
Logged on

 
 Message 45 of 63
27 June 2011 at 12:31am | IP Logged 
Hi ScottScheule,

You may have heard about Watson, a computer system that IBM is working on. You can find out more about Watson here:
http://www.youtube.com/watch?v=_1c7s7-3fXI

If you read the article in this link
http://www.foxbusiness.com/personal-finance/2011/06/03/ibms-watson-to-take-on-health-care/
you will see that IBM is not planning to replace doctors with Watson.

Why do you think the autopilot systems in most airplanes haven't replaced real pilots?

The list goes on... For example, the WalMart store closest to where I live has abandoned self-checkout.

What I am trying to say is that the advancement of technology doesn't always result in the elimination of human beings from the workplace.








Iversen
Super Polyglot
Moderator
Denmark
berejst.dk
Joined 6706 days ago

9078 posts - 16473 votes 
Speaks: Danish*, French, English, German, Italian, Spanish, Portuguese, Dutch, Swedish, Esperanto, Romanian, Catalan
Studies: Afrikaans, Greek, Norwegian, Russian, Serbian, Icelandic, Latin, Irish, Lowland Scots, Indonesian, Polish, Croatian
Personal Language Map

 
 Message 46 of 63
27 June 2011 at 12:55am | IP Logged 
translator2 wrote:
The problem is that the latest incarnation of machine translation depends on the use of bilingual texts translated by human beings. As more and more output (websites, forums, publications, etc.) gets generated by the machine (slowly introducing errors) the highest statistical match for a given phrase or segment would now be a machine-generated translation rather than a human one. (...)


You are correct. Garbage in, garbage out. But before this happens, machine translations will hopefully have improved (for instance by incorporating some of the things I mentioned in an earlier post in this thread), and there will still be tons of bilingual text around from the good old days of human translators, which can be mined again with better software. So the effect will not be catastrophic. And besides, we have agreed that there will still be some humans who translate (although they may be using software to speed up their work).


Edited by Iversen on 27 June 2011 at 12:55am

1 person has voted this message useful



translator2
Senior Member
United States
Joined 6922 days ago

848 posts - 1862 votes 
Speaks: English*

 
 Message 47 of 63
28 June 2011 at 3:14pm | IP Logged 
Click here for the full article from The Atlantic:

“The intriguing problem is the way that over-use of automatic translation can make it harder for automatic translation ever to improve, and may even be making it worse. As people in the business understand, computerized translation relies heavily on sheer statistical correlation.
Crucially, this process depends on “big data” for its improvement. The more Rosetta stone-like side-by-side passages the system can compare, the more refined and reliable the correlations will become. Day by day and comparison by comparison, the translation will only get better.

That’s the problem. The more of this auto-translated material floods onto the world’s websites, the smaller the proportion of good translations the computers can learn from. In engineering terms, the signal-to-noise ratio is getting worse.”

1 person has voted this message useful





Iversen
Super Polyglot
Moderator
Denmark
berejst.dk
Joined 6706 days ago

9078 posts - 16473 votes 
Speaks: Danish*, French, English, German, Italian, Spanish, Portuguese, Dutch, Swedish, Esperanto, Romanian, Catalan
Studies: Afrikaans, Greek, Norwegian, Russian, Serbian, Icelandic, Latin, Irish, Lowland Scots, Indonesian, Polish, Croatian
Personal Language Map

 
 Message 48 of 63
28 June 2011 at 4:05pm | IP Logged 
translator2 wrote:
As people in the business understand, computerized translation relies heavily on sheer statistical correlation.


The key to fundamental improvements in automated translation is to supplement the purely statistical methods with tidbits of knowledge from trustworthy dictionaries and grammars.

Those who first attempted machine translation used ONLY those things, i.e. dictionaries and some kind of formalized grammar. This demanded an immense amount of human programming, and the results didn't match the expectations. Actually, this is like the old grammar-translation school.

The big breakthrough came with the implementation of statistical methods, which once programmed demand little human intervention. This is actually like using the natural method in its strictest form.

And with thesis and antithesis we are now ready for the third stage: use the statistical methods to produce translations that include alternatives. Check the results against lists of proper names (Google knows all those from its search engine!) and against lists of function words, verb valence tables, etc. in some palatable form, and let the answers from these enquiries count heavily in the statistical analysis. And yes, somebody may have to program the rules of thumb in the beginning, but later neural networks may become relevant (I'm still somewhat sceptical about this technology).
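The third stage described above can be sketched in a few lines (all names, word lists and scores are made up for illustration, not taken from any real engine): the statistical component proposes scored candidate translations, and the knowledge sources add bonuses that can overturn the raw statistical ranking.

```python
# Sketch of hybrid reranking: statistical candidates are rescored
# against knowledge sources (a proper-name list and a small
# dictionary) before the best one is chosen. All data here is
# invented for illustration.

PROPER_NAMES = {"Watson", "IBM"}
DICTIONARY_OK = {"computer", "system", "answers", "questions"}

def rerank(candidates):
    """candidates: list of (translation, statistical_score) pairs."""
    def knowledge_score(text):
        bonus = 0.0
        for word in text.split():
            token = word.strip(".,")
            if token in PROPER_NAMES:
                bonus += 0.5   # preserved proper name: strong signal
            elif token.lower() in DICTIONARY_OK:
                bonus += 0.1   # dictionary-confirmed content word
        return bonus
    return max(candidates, key=lambda c: c[1] + knowledge_score(c[0]))

best = rerank([
    ("Watson is a computer system.", 0.40),
    ("Whatson is a calculating device.", 0.55),  # higher raw score, garbled name
])
print(best[0])
```

Here the candidate with the lower statistical score wins because it preserves the proper name and dictionary-confirmed words, which is exactly the point: let the knowledge checks count heavily in the final ranking.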

If these things aren't implemented while there are still decent human-made or human-controlled translations around for the statistical analysis, then the effort may drown in garbage - you cannot base sound decisions on junk. But there is still time...




Edited by Iversen on 12 July 2011 at 12:29pm



2 persons have voted this message useful








