In recent years, computing has been steeped in one key concept: big data.

However, the focus is gradually shifting towards ‘machine learning’ and, more specifically, ‘deep learning’. In a nutshell, this means that machines learn through sophisticated algorithms capable of detecting patterns. This represents a great leap in artificial intelligence and, in essence, another step towards machines learning to understand language.

It involves technology designed to make computers learn from actions performed either by humans or by the computers themselves and, with the right feedback, adapt to new or changing scenarios. In the case of translation, this means algorithms that analyse translators’ corrections in order to improve machine translation output. This is complemented by speech and text recognition programs, as well as programs for computer vision and motion detection. The machines detect patterns and in this way learn to predict results or responses; any exception or correction is systematically learnt.
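The feedback loop described above can be sketched in a few lines. This is a deliberately simplified toy (not Pangeanic’s actual system, and far from a real neural MT pipeline): a word-level translation memory whose output improves each time a human translator supplies a correction.

```python
# Toy sketch of learning from translator feedback: corrections are stored
# and reused, so repeated human input steadily improves the output.

class CorrectiveTranslator:
    def __init__(self):
        # Baseline machine suggestions (deliberately imperfect).
        self.memory = {"hola": "hi", "mundo": "globe"}

    def translate(self, word):
        # Fall back to the source word when nothing is known.
        return self.memory.get(word, word)

    def learn_correction(self, word, human_translation):
        # A translator's correction overrides the old suggestion from now on.
        self.memory[word] = human_translation

mt = CorrectiveTranslator()
print(mt.translate("mundo"))           # machine's first attempt: "globe"
mt.learn_correction("mundo", "world")  # human feedback
print(mt.translate("mundo"))           # after feedback: "world"
```

Real adaptive MT systems update model weights rather than a lookup table, but the principle is the same: each correction becomes training signal for future output.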

When machines or computers learn to think or react as a human being would, or start to reprogram one another, a paradigm shift is inevitable. It is the dreaded moment, much maligned by translators, of ‘the machines will never learn to translate like a professional translator’. Some warn that we could be living through the last decades in human history in which we, as humans, are the most intelligent beings. But who knows how computing will evolve? “It is impossible to predict the next few decades”, points out Francisco Casacuberta, Professor at the Polytechnic University of Valencia, who leads the PRHLT (Pattern Recognition and Human Language Technology Research Center). Gartner, one of the world’s leading technology consultancies, predicts increasingly intelligent digitalization, with some kinds of intelligent machines becoming widespread from 2020 onwards. At the cutting edge are virtual assistants (Amazon Alexa, Google Assistant, Siri and similar programs) and robots that interact with people, but apps are also taking on complex tasks (such as prioritizing our emails or messages). And all of this within a growing framework of virtual reality.

Quantum computing has been on the horizon for years. “If we made the leap to quantum computing, that would change everything”, says Alex Helle, Automated Translation Researcher at Pangeanic. Classical computers process information using a binary system (0/1); quantum computing aims to exploit the quantum states of atomic particles, which allow more positions and can therefore process even more information at a faster rate. Helle maintains that, in any case, the hardware/software duality will change: computing will become ever faster, handle more information, and storage capacity will advance alongside it.
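The “more positions” idea can be made concrete with a standard textbook observation (a toy illustration, not tied to any product): a register of n classical bits holds exactly one of 2**n values at a time, while an n-qubit state is described by 2**n complex amplitudes simultaneously.

```python
# Toy illustration of the classical-bit vs qubit state-space difference.
import math

def classical_states(n_bits):
    # A register of n bits is in exactly ONE of these 2**n values at a time.
    return 2 ** n_bits

def qubit_amplitudes(n_qubits):
    # An n-qubit state vector carries an amplitude for EVERY basis state.
    return 2 ** n_qubits

# An equal superposition of 3 qubits: 8 amplitudes, each 1/sqrt(8).
n = 3
amp = 1 / math.sqrt(2 ** n)
state = [amp] * (2 ** n)

print(classical_states(n))   # 8 possible values, one at a time
print(len(state))            # 8 amplitudes held at once
# Probabilities (squared amplitudes) must sum to 1:
print(abs(sum(a * a for a in state) - 1.0) < 1e-9)
```

This exponential growth of the state space is what makes certain quantum algorithms attractive, though turning it into a practical speed-up is the hard part.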

