Graphs, quantum computing and their future roles in analytics

Graphs are used in mathematics, engineering and computer science, and they are growing as a technology in IT analytics. Here’s how they relate to quantum computing.

 


A graph is a collection of points, called vertices, and lines between those points, called edges. Graphs are used in mathematics, engineering and computer science, and they are growing as a technology in IT analytics.
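
To make that concrete, here is a minimal sketch in Python (the vertex names are invented for illustration) of a graph stored as a set of vertices, a set of edges and an adjacency map:

```python
# A minimal sketch of a graph: vertices are points, edges connect pairs of them.
vertices = {"A", "B", "C", "D"}
edges = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "C")}

# An adjacency map makes it easy to ask "what is connected to what?"
adjacency = {v: set() for v in vertices}
for u, w in edges:
    adjacency[u].add(w)
    adjacency[w].add(u)

print(adjacency["A"])  # {'B', 'C'} -- the vertices that share an edge with A (order may vary)
```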

“Graphs can be much more flexible than other [artificial intelligence] techniques, especially when it comes to adding new sources of data,” said Steve Reinhardt, VP of product development at Quantum Computing Inc., which produces quantum computing software that operates on graphs. “For instance, if I’m storing patient data and I want to add a dimension to track the unlikely event of testing positive for coronavirus after being vaccinated, graphs only consume storage proportional to the number of patients encountering the rare event.”

SEE: The CIO’s guide to quantum computing (free PDF) (TechRepublic)

Graphs can be heady stuff, so let’s break that down.

A database, whether SQL or NoSQL, would be a logical technology to use if you want to capture the many different relationships between data. Analytics programs then operate on that data and its interrelationships to derive insights that answer a specific business query.

Unfortunately, to process all of the data relationships in Reinhardt’s patient example, a relational database must scan every patient record to identify the subset of patients who tested positive for the coronavirus after being vaccinated. For an average hospital, this could mean working through hundreds of thousands of patient records and all of their relationships to the coronavirus and the vaccine.
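
As a rough sketch of that full scan, assuming hypothetical record fields such as `vaccinated` and `positive_after_vaccine`, a row-by-row filter looks something like this:

```python
# Hypothetical patient records, as a relational table might hold them.
patients = [
    {"id": 1, "vaccinated": True,  "positive_after_vaccine": False},
    {"id": 2, "vaccinated": True,  "positive_after_vaccine": True},
    {"id": 3, "vaccinated": False, "positive_after_vaccine": False},
    # ... hundreds of thousands more rows in a real hospital system
]

# Every row is examined, even though only a tiny fraction matches.
breakthrough_cases = [p["id"] for p in patients
                      if p["vaccinated"] and p["positive_after_vaccine"]]
print(breakthrough_cases)  # [2]
```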

Now let’s put that same problem into a graph. In the graph, patients, vaccinations and test results become vertices, and edges connect the vertices that share a common context. This shared context enables the graph to identify the subset of patients who tested positive for COVID-19 after they had a vaccine and to store only that subset of data for processing. Because a graph can identify the relevant subset of data through its relationships before the data gets processed, processing time is saved.
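
A graph-style sketch of the same question, again with invented names, shows Reinhardt’s point about storage: an edge is recorded only for the patients the rare event actually happened to, so finding the subset means reading one vertex’s neighbors rather than scanning every record.

```python
from collections import defaultdict

# Edges are added only when the rare event occurs, so storage grows with
# the number of breakthrough cases, not with the total number of patients.
graph = defaultdict(set)

def record_breakthrough(patient_id):
    graph["positive_after_vaccine"].add(patient_id)
    graph[patient_id].add("positive_after_vaccine")

record_breakthrough(2)
record_breakthrough(7)

# The subset is simply the neighborhood of the event vertex.
print(graph["positive_after_vaccine"])  # {2, 7}
```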

SEE: Big data graphs are playing an important role in the coronavirus pandemic (TechRepublic)

As IT expands into more data sources for its analytics and data stores, processing will grow more complex and cumbersome. This is where a combination of graphs and quantum computing will one day be able to process data faster than traditional methods.

“Graphs have a rich set of well-understood techniques for analyzing them,” Reinhardt said. “Some of these are well-known from analyzing graphs that occur naturally, such as the PageRank algorithm that Google originally used to gauge the importance of web pages, and the identification of influencers in social networks. … This is why we are focused on making these algorithms more practically usable.”
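
PageRank is widely implemented in open-source tools; as one hedged example, the NetworkX library can compute it on a small graph in a few lines (the link structure below is made up):

```python
import networkx as nx

# A tiny directed graph of hypothetical web pages linking to one another.
links = [("page_a", "page_b"), ("page_b", "page_c"),
         ("page_c", "page_a"), ("page_d", "page_c")]
G = nx.DiGraph(links)

# PageRank scores each page by how much "importance" flows into it.
scores = nx.pagerank(G, alpha=0.85)
for page, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```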

That sounds good to IT, but the sticking point is understanding enough about graphs and quantum computing to put them to use.

SEE: Research: Quantum computing will impact the enterprise, despite being misunderstood (TechRepublic)

“The goal is to develop solutions so users need to know nothing about the details of quantum computers, including low-level architectural features such as qubits, gates, circuits, couplers and QUBOs,” Reinhardt said. “Today, quantum processors are almost never faster than the best classical methods for real-world problems, so early users need to have appropriate expectations. That said, the performance of quantum processors has been growing dramatically, and the achievement of quantum advantage, superior quantum performance on a real-world problem, may not be far off, so organizations that depend on a computing advantage will want to be prepared for that event.”
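
For a flavor of one of those terms, a QUBO (quadratic unconstrained binary optimization) problem asks for the 0/1 vector x that minimizes x·Q·x. The toy sketch below uses an invented matrix and solves it by brute-force enumeration, which is only feasible for a problem this small; quantum and quantum-inspired processors target much larger instances.

```python
import itertools
import numpy as np

# A tiny, invented QUBO matrix: minimize x^T Q x over binary vectors x.
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.0,  2.0],
              [ 0.0,  0.0, -1.0]])

best_x, best_value = None, float("inf")
for bits in itertools.product([0, 1], repeat=Q.shape[0]):
    x = np.array(bits)
    value = x @ Q @ x  # quadratic objective
    if value < best_value:
        best_x, best_value = x, value

print(best_x, best_value)  # [1 0 1] with value -2.0 for this matrix
```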

And that is the central point: While graphs and quantum computing are still nebulous concepts to many IT professionals, it isn’t too early to start placing them on IT roadmaps, since they will certainly play roles in future analytics. 
