Machine translation for the most part targets Internet and technical texts

Like all technology, machine translation (MT) has its limits, says Mike Dillinger, President of the Association for Machine Translation in the Americas and Adjunct Professor, Department of Psychology, San José State University.

At the invitation of the Department of Artificial Intelligence, Mike Dillinger has been giving a course on paraphrasing and text mining at the School of Computing. Dillinger considers that without clean and clear texts machine translation will not work well and that, despite technological advances, there will always be a demand for human translators to render legal or literary texts.

Machine translation, he adds, for the most part targets Internet and technical texts. MT’s Internet bias means that web content creators must be trained to ensure that their documents are machine translatable.

- As an acclaimed expert in machine translation, how would you define the state of the art in this discipline?

The state of the art is rapidly changing. A far-reaching new approach was introduced fifteen to twenty years ago. At the time we faced a two-sided problem. First, it took a long time and a lot of money to develop the grammatical rules required to analyse the original sentence and the “transfer” or translation rules. Second, it seemed impossible to account manually for the vast array of words and sentence types in documents.

The new approach uses statistical techniques to identify qualitatively simpler rules. It does this swiftly, automatically and on a massive scale, covering much more of the language. Similar techniques are used to identify terms and their possible translations. These are huge advances! Before, system development was very much a cottage industry; now systems are mass-produced. Today’s research aims to increase the qualitative complexity of the rules to better reflect syntactic structures and aspects of meaning. We are now exploiting the qualitative advances of this approach.
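
As an illustration of the kind of statistical technique described here, the following minimal sketch scores word-translation candidates from a tiny parallel corpus by how often words co-occur in aligned sentence pairs. The toy corpus and the Dice-coefficient scoring are assumptions made purely for illustration; they are not taken from the interview.

```python
# Illustrative sketch only: scoring word-translation candidates by co-occurrence
# in a toy parallel corpus. Corpus and scoring method are assumptions.
from collections import Counter
from itertools import product

parallel_corpus = [
    ("the house is small", "la casa es pequeña"),
    ("the house is big", "la casa es grande"),
    ("the book is old", "el libro es viejo"),
]

src_counts, tgt_counts, pair_counts = Counter(), Counter(), Counter()
for src, tgt in parallel_corpus:
    src_words, tgt_words = set(src.split()), set(tgt.split())
    src_counts.update(src_words)
    tgt_counts.update(tgt_words)
    pair_counts.update(product(src_words, tgt_words))  # count aligned word pairs

def dice(s, t):
    # How strongly source word s and target word t co-occur in aligned sentences.
    return 2 * pair_counts[(s, t)] / (src_counts[s] + tgt_counts[t])

# Print the highest-scoring translation candidate for each source word.
for s in sorted(src_counts):
    best = max(tgt_counts, key=lambda t: dice(s, t))
    print(f"{s:8s} -> {best:10s} (score {dice(s, best):.2f})")
```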

- Machine translation systems have been in use since the 1970s. Is this technology now mature?

If maturity means fitness for industrial applications, the answer is definitely yes. MT has been widely used by top-ranking industrial and military institutions for 30 years. The European Community, Ford, SAP, Symantec, the US Armed Forces and many other organizations use MT every day. If maturity means use by the general public to enter a random sentence for translation, I would say, just as definitely, no. Like all technology, machine translation has its limits. You don’t expect a Mercedes to run well in the snow or sand: it performs best on a dry, surfaced road. Nor do you expect a Formula 1 car to win a race on ordinary petrol or alcohol; it needs special fuel. Unfortunately, people very often expect a perfect translation of an unclear or error-riddled text. For the time being at least, machine translation will not work properly without clean and correct texts.

- Do you think society understands MT?

Not at all! It’s something I come across all the time. A lot of people think that “translation” means being able to tell what the author meant, even when he or she has not expressed it clearly and correctly. Many people therefore have great expectations about what a translation system will be able to do, and this is why they are always disappointed. On the other hand, those of us working in MT have to make a big effort to help society better understand what MT is good for and when it works well: this is the mandate of the association I chair.

- What is MT about? Developing programs, translation systems, computerized translation, manufacturing electronic dictionaries? How exactly would you define this discipline?

MT is concerned with building computerized translation systems. Of course, this includes building electronic dictionaries, grammars, databases of word co-occurrences and other linguistic resources. But it also involves developing automatic translation evaluation processes, input text “cleaning” and analysis processes, and processes for guaranteeing that everything will run smoothly when a 300,000-page translation order arrives. As these are all very different processes and components, MT requires the cooperation of linguists, programmers and engineers.
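
One of these components, automatic translation evaluation, can be illustrated with a minimal sketch: a crude unigram-precision score that compares a system’s output against a reference translation. This is an illustrative assumption only; production metrics such as BLEU are considerably more elaborate.

```python
# Illustrative sketch of automatic translation evaluation: unigram precision of a
# candidate translation against a reference. Real metrics (e.g. BLEU) also use
# longer n-grams and a length penalty.
from collections import Counter

def unigram_precision(candidate: str, reference: str) -> float:
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Candidate words that also appear in the reference, clipped to reference counts.
    overlap = sum(min(count, ref[word]) for word, count in cand.items())
    return overlap / max(sum(cand.values()), 1)

print(unigram_precision("the cat sat on the mat", "there is a cat on the mat"))  # ≈ 0.67
```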

- What are the stages of the machine translation process?

1. Document preparation. This is arguably the most important stage, because you have to ensure that the sentences of each document are understandable and correct.
2. Adaptation of the translation system. Just like a human translator, the machine translation system needs information about all the words it will come across in the documents. It can be taught new words through a process known as customization.
3. Document translation. Each document format, like Word, PDF or HTML, has many different features apart from the sentences that actually have to be translated. This stage separates the content from the wrapping, as it were.
4. Translation verification. Quality control is very important for human and machine translators. Neither words nor sentences have just one meaning, and they are very easy to misinterpret.

5. Document distribution. This stage is more complex than is generally thought. When you receive 10,000 documents to be translated into 10 different languages, checking that they were all translated, putting them all in the right order without mixing up languages, and so on, takes a lot of organizing. (A skeletal sketch of these five stages as a pipeline follows below.)
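
The five stages above can be sketched as a skeletal pipeline. Every function below is a hypothetical placeholder written for illustration; none of it comes from any real MT product.

```python
# Skeletal, purely illustrative pipeline for the five stages described above.

def prepare(document: str) -> str:
    """Stage 1: check and clean the source text so its sentences are clear and correct."""
    return " ".join(document.split())  # toy cleanup: normalize whitespace

def customize(lexicon: dict, new_terms: dict) -> dict:
    """Stage 2: teach the system customer-specific words (customization)."""
    return {**lexicon, **new_terms}

def translate(document: str, lexicon: dict) -> str:
    """Stage 3: separate content from formatting and translate it.
    Here a toy word-for-word substitution stands in for a real engine."""
    return " ".join(lexicon.get(word, word) for word in document.split())

def verify(source: str, target: str) -> bool:
    """Stage 4: quality control, e.g. check that something was actually translated."""
    return bool(target) and target != source

def distribute(target: str, language: str, outbox: dict) -> None:
    """Stage 5: route each finished translation to the right place, per language."""
    outbox.setdefault(language, []).append(target)

# Toy run of the whole pipeline.
outbox = {}
lexicon = customize({"house": "casa"}, {"small": "pequeña"})
src = prepare("  the   house is small ")
tgt = translate(src, lexicon)
if verify(src, tgt):
    distribute(tgt, "es", outbox)
print(outbox)  # {'es': ['the casa is pequeña']}
```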

- Is this technology a threat to human translators? Do you really think it creates jobs?

It is not a threat at all! MT takes the most routine work out of translators’ hands so that they can apply their expertise to more difficult tasks. We will always need human translators for more complex texts, such as legal and literary texts. MT today is mostly applied to situations where there is no human participation. It would even be cruel to have people translate e-mails, chats, SMS messages and random web pages: the required text throughput and translation speed are so huge that it would be excruciating for a human being. It is a question of scale: an average human translator translates 8 to 10 pages per day, whereas, on the web scale, 8 to 10 pages per second would be an extremely low rate (and even 8 pages per second already works out to roughly 690,000 pages per day). The adoption of new technologies, especially in a global economy, seldom boosts job creation. What it does do is open up an increasingly clear divide between low-skilled routine jobs and specialized occupations.

- Is the deployment of this technology a technical or social problem?

First and foremost, it is a social engineering problem because people have to change their behaviour and the way they see things. The MT process reproduces exactly the same stages as human translation, except for two key differences:
a) In translation systems, you have to be very, very careful about the wording. Human translators apply their technical knowledge (if any) to make up for incorrect wording, but machine translation systems have no such knowledge: they reproduce all too faithfully the mistakes in the source text. It is hard to get them to translate more accurately, but there are now extremely helpful automatic checking tools (a simple sketch of such a check follows this list). Symantec is a recent example of an organization that uses an automatic checker together with a translation system to achieve extremely fast and very good results.

b) Translation systems have to handle a lot of translated documents. What happens if an organization receives 5,000 translated documents per week instead of the customary 50? Automating the translation process ends up uncovering problems with other parts of document handling.
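
As a minimal illustration of such an automatic source-text check, the sketch below flags sentences that are likely to trip up an MT system before they reach it. The rules and thresholds are arbitrary assumptions, not any vendor’s actual tool.

```python
# Illustrative pre-translation source checker; rules and thresholds are made up.
import re

MAX_WORDS = 25                             # long sentences are harder to translate
VAGUE = {"it", "this", "that", "which"}    # references MT cannot reliably resolve

def check_sentence(sentence: str) -> list:
    """Return a list of warnings for one source sentence."""
    words = sentence.split()
    warnings = []
    if len(words) > MAX_WORDS:
        warnings.append(f"sentence has {len(words)} words; consider splitting it")
    if not re.search(r"[.!?]$", sentence.strip()):
        warnings.append("missing end punctuation")
    vague = VAGUE.intersection(w.lower().strip(",.;") for w in words)
    if vague:
        warnings.append("vague reference(s): " + ", ".join(sorted(vague)))
    return warnings

for s in ["Remove the cover and check it", "Tighten the screws."]:
    print(s, "->", check_sentence(s) or "OK")
```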

- You mentioned the British National Corpus, which includes a cross-section of texts representative of the English language. It contains 15 million different terms, whereas machine translation dictionaries only contain 300,000 entries. How can this barrier to building an MT system acceptable to society be overcome?

This collection of over 100 million English words is a good mirror of macro-level language features. One is that we use a great many words. However, word frequency is extremely variable: of the 15 million terms, 70% are seldom used! To overcome this barrier of variability in vocabulary usage, we now use the most common words to create a core system, to which 5,000 to 10,000 customer-specific words are added. This is reasonably successful. For web applications, however, it simply does not work. Even the best systems are missing literally millions of words, and new words are invented every day. At least three remedies are applied at present: ask the user to “try again”, ask the user to enter a synonym, and automatically or semi-automatically build synonym databases. As I see it, we will have to develop guidance systems for web content authors, such as have already been developed for technical documents. There are strong economic arguments for moving in that direction.
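
The synonym-database remedy can be illustrated with a minimal sketch: when a word is missing from the system’s dictionary, a listed synonym is tried before giving up. The dictionary and synonym table below are toy assumptions.

```python
# Illustrative handling of unknown words via a synonym database; all data is made up.
dictionary = {"car": "coche", "fast": "rápido", "buy": "comprar"}
synonyms = {"automobile": "car", "purchase": "buy", "rapid": "fast"}

def translate_word(word: str) -> str:
    if word in dictionary:
        return dictionary[word]
    if word in synonyms and synonyms[word] in dictionary:
        # Unknown word, but a known synonym is in the dictionary: use its translation.
        return dictionary[synonyms[word]]
    # Last resort: leave the word untranslated and flag it for the user or a human.
    return f"<unknown:{word}>"

print([translate_word(w) for w in ["buy", "purchase", "automobile", "gizmo"]])
# ['comprar', 'comprar', 'coche', '<unknown:gizmo>']
```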

- The Association for Machine Translation in the Americas that you chair is organizing the AMTA 2008 conference, to be held in Hawaii next October. What innovations does the conference have in store?

There is always something! Come and see! One difference this year is that several groups are holding their conferences together. AMTA, the International Workshop on Spoken Language Translation (IWSLT), a workshop by the US government agency NIST on how to evaluate translation evaluation methods, a Localization Industry Standards Association meeting attracting representatives from large corporations, and a group of Empirical Methods in Natural Language Processing (EMNLP) researchers will all be at the same hotel in the same week. Finally, as it is to be held in Hawaii, our colleagues from Asia will be there to add an even more international edge. For more information, see the conference web site.

Eduardo Martínez | alfa