Big Data may hold untapped potential, but the sheer volume of data involved can demand far more processing power than conventional systems provide.
An algorithm that takes advantage of quantum computing could come to the rescue. Researchers from MIT, the University of Waterloo and the University of Southern California have just published an article in the journal Nature Communications describing a new approach to understanding massively complex problems. “By combining quantum computing and topology — a branch of geometry — the new machine-learning algorithm can streamline highly complex problems and put solutions within closer reach.” (PCWorld)
Topology focuses on properties that remain unchanged even when an object is distorted, which makes it particularly useful for analyzing complex meshes of connections such as the US electrical grid or the global interconnections of the Internet. It can also help to single out the most important features of a massive data set. The downside of topological analysis is that it is computationally very costly, but this is precisely where the researchers think quantum mechanics can help.
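A minimal classical illustration of what “unchanged even when distorted” means: the number of connected components of a network (the zeroth Betti number, b0) stays the same no matter how the network is stretched or redrawn; it only changes when links are added or removed. This is not the researchers’ quantum algorithm, just a sketch of one simple topological invariant, computed here with union-find:

```python
# Count the connected components of a network: a topological invariant
# that is unaffected by distorting the layout, only by changing the links.

def count_components(num_nodes, edges):
    parent = list(range(num_nodes))  # each node starts as its own component

    def find(x):
        # Follow parent pointers to the component's root, compressing the path.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb  # merge the two components

    return len({find(x) for x in range(num_nodes)})

# A toy network: nodes 0-5 with two clusters of links and one isolated node.
edges = [(0, 1), (1, 2), (3, 4)]
print(count_components(6, edges))  # 3 components: {0,1,2}, {3,4}, {5}
```

However the six nodes are drawn or stretched, the answer stays 3 until an edge is added or removed.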
Feasibility tests have already started
“Topological methods for analysis face challenges: a data set consisting of n data points possesses 2^n possible subsets that could contribute to the topology,” the researchers note in the article. Take, for example, a data set with 300 data points. A traditional approach to analyzing all of the topological features of this system would require a “digital computer the size of the entire universe,” says Seth Lloyd, the Nam P. Suh Professor of Mechanical Engineering at MIT and lead author of the paper published in Nature Communications. In other words, it would require almost as many processing units as there are particles in the universe, making the problem effectively impossible to solve.
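The arithmetic behind the comparison can be checked directly. For 300 data points there are 2^300 candidate subsets, a number that dwarfs the commonly cited estimate of roughly 10^80 particles in the observable universe:

```python
# Worked arithmetic for the "2^n subsets" claim with n = 300 data points.
n = 300
subsets = 2 ** n
print(f"2^{n} = {subsets:.3e}")  # about 2.037e+90
print(subsets > 10 ** 80)        # True: more subsets than particles
```

So even one processing unit per particle in the universe would fall short by ten orders of magnitude.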
Tackling the same problem with the new algorithm on a quantum computer would be far more manageable. In quantum computing, information is represented by quantum bits, or qubits, which are analogous to the binary bits used in digital computers but can take on not only the states ‘0’ and ‘1’ but both at once. Here, the 300-point data set would require a quantum computer with just 300 qubits, and devices of that size could be available in the next few years, according to Seth Lloyd. “Our algorithm shows that you do not need a large quantum computer to tackle some major topological problems.” The same approach could be used to analyze the global economy, social networks or “almost any system that involves long-distance transport of goods and information.” According to the researchers, feasibility tests have already started.
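The counting argument above can be sketched in a few lines: the state of n qubits is a vector of 2^n amplitudes, so 300 qubits can in principle address every one of the 2^300 subsets of a 300-point data set. This is a hedged classical simulation of the bookkeeping only, not the quantum algorithm itself, and it uses a small 3-qubit example because a 300-qubit state vector cannot be stored classically:

```python
# Sketch of the qubit counting argument: n qubits have 2^n basis states,
# one per subset of an n-point data set. Simulated classically for n = 3.

def uniform_superposition(num_qubits):
    """State vector of num_qubits qubits, equal amplitude on every basis state."""
    dim = 2 ** num_qubits        # one amplitude per basis state (per subset)
    amp = 1.0 / dim ** 0.5       # equal weight, so probabilities sum to 1
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))                # 8 basis states for 3 qubits
print(abs(sum(a * a for a in state) - 1.0) < 1e-9)  # True: valid quantum state
```

Doubling the data set from n to n + 1 points doubles the number of subsets but adds only one qubit, which is why 300 qubits suffice where a classical enumeration would not.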