HPE Announces a New Paradigm for Computing in the Big Data Era

In May 2017, Hewlett Packard Enterprise announced the completion of a computer set to change the way organizations process big data. The computer is the latest prototype from HPE's The Machine research project, which aims to create a computer that abandons many of the design elements computer manufacturers have relied on for decades. In doing so, The Machine promises to give rise to supercomputers capable of processing the equivalent of all the digital data the human race has generated to date, and of doing so almost instantaneously.

Memory-Driven Computing

The Machine architecture uses multiple ARM-based systems on a chip (SoCs), each running an optimized version of Linux. The prototype features 160 TB of shared memory, but future versions of The Machine are expected to offer far greater capacity, measured in exabytes and perhaps even yottabytes. In other words, the system could potentially store and work with all of the digital data that exists today.

Quantum leaps in digital storage technology are nothing new. The feature that makes The Machine unique is something Hewlett Packard Enterprise calls Memory-Driven Computing. In The Machine, every system on a chip addresses a single pool of memory that serves as both system memory and bulk storage. Eliminating the need to copy data between storage devices, memory and individual systems promises to greatly reduce the time needed for reading and writing data.
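To make the idea concrete, here is a minimal sketch, in C, of what byte-addressable access to a large shared pool can look like from an application's point of view. It uses only standard POSIX calls, and the device path and mapping size are hypothetical; The Machine's actual fabric-attached memory interfaces are not covered here, so treat this purely as an analogy. The point is that once the pool is mapped, reads and writes are ordinary memory operations rather than storage I/O.

```c
/* Illustrative only: standard POSIX mmap used as an analogy for
 * byte-addressable access to a large shared memory pool.  The device
 * path below is hypothetical; The Machine's real fabric-attached
 * memory APIs are not shown here. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    size_t pool_size = 1UL << 30;              /* map 1 GiB of the pool */
    int fd = open("/dev/dax0.0", O_RDWR);      /* hypothetical pool device */
    if (fd < 0) { perror("open"); return 1; }

    /* Map the region so every read and write is an ordinary memory
     * access, with no read()/write() system calls or block-layer copies. */
    uint8_t *pool = mmap(NULL, pool_size, PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, 0);
    if (pool == MAP_FAILED) { perror("mmap"); return 1; }

    memcpy(pool, "record-42", 10);             /* write directly into the pool */
    printf("%s\n", (char *)pool);              /* read it back the same way */

    munmap(pool, pool_size);
    close(fd);
    return 0;
}
```

Notice that there is no read() or write() call on the hot path: the processor touches the data in place.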

Non-Volatile System Memory

With many processors working together to manipulate the same set of data simultaneously, storage speed has the potential to become a bottleneck. Even current flash memory technology has no chance of keeping up. The DIMM memory modules that PCs and servers use for system memory are much faster, but they're also volatile: they lose their data when the computers in which they reside are powered off. Hewlett Packard Enterprise has already developed a technology that uses batteries to supply constant power to DIMMs and prevent them from losing data. The company hopes to use that technology as a stepping stone toward a new type of extremely fast non-volatile memory for The Machine.
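For memory that doubles as permanent storage, software also needs a way to be sure that recent writes have truly reached the persistent medium before relying on them. The sketch below shows one conventional, generic POSIX way to do that with msync() on a MAP_SHARED mapping; it is not an HPE API, and it simply illustrates the kind of durability step a program working with non-volatile memory would take.

```c
/* Illustrative only: a generic POSIX durability step, not an HPE API.
 * After an application updates data in a MAP_SHARED mapping, msync()
 * blocks until the dirty range has reached durable media. */
#include <stdio.h>
#include <sys/mman.h>

/* 'addr' is assumed to be a page-aligned pointer returned by mmap(),
 * and 'len' the length of the recently written range. */
int persist_range(void *addr, size_t len)
{
    if (msync(addr, len, MS_SYNC) != 0) {  /* flush and wait for durability */
        perror("msync");
        return -1;
    }
    return 0;
}
```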

Photonics

After eliminating the processor and storage bottlenecks of traditional computer design, one major bottleneck remains: the connections between the computer's components. The copper interconnects on a computer's motherboard, and the cables linking one system to another in a network, don't allow for the extremely fast data transfer speeds that The Machine requires. Hewlett Packard Enterprise has already developed a new optical (photonic) technology for the cables connecting the systems in The Machine. Eventually, the company intends to build motherboards and other components using the same technology.

What Does The Machine Mean for the Future?

Computers with the ability to process extremely large data sets are already beginning to transform fields such as science, research, medicine, politics and business, and the more data an organization can process, the better informed its decisions can be.

What can we do with big data in practice? Consider, for example, the typical process of testing a new drug. In a clinical trial, each participant is randomly assigned either the experimental drug or a placebo. If the group receiving the drug shows significantly better outcomes than the placebo group, the drug is judged effective. What if the drug has potential negative side effects, though? What if it works 100 percent of the time, but only under a certain set of circumstances?

In medicine, the most that doctors can often say about drugs is that they work “most of the time” and that “very few” people experience side effects. With enough data, though, it’s possible that we could do even better than that. Imagine a system that’s configured to research a given medical condition. The system has access to:

  • The genomes and full medical histories of every known person who has ever had the condition
  • A complete record of each prescribed treatment for the condition — and each treatment’s outcome
  • A record of every reported negative side effect

With the above data, the system could potentially increase the success rate in treating the condition and reduce side effects by predicting which patients are likely to experience them and under what circumstances.

Final Thoughts

The above example is only one way in which big data may transform the world for the better. Ultimately, what we can do with big data will depend on the quality of the data we have and the skill with which developers build solutions for processing that data. No matter what, we'll need computers that are up to the task, and Hewlett Packard Enterprise appears to have developed exactly that.

For the latest HPE server builds, spare parts and expert service, contact our specialised HP Enterprise team at EMPR Australia.