The Machine is here!
Dec. 1st, 2016 11:07 am
Hewlett Packard Enterprise Demonstrates World's First Memory-Driven Computing Architecture
Proof-of-concept prototype represents major milestone for The Machine research project
LONDON, UNITED KINGDOM--(Marketwired - Nov 28, 2016) - Today, Hewlett Packard Enterprise (NYSE: HPE) announced it has successfully demonstrated Memory-Driven Computing, a concept that puts memory, not processing, at the center of the computing platform to realize performance and efficiency gains not possible today.
Developed as part of The Machine research program, HPE's proof-of-concept prototype represents a major milestone in the company's efforts to transform the fundamental architecture on which all computers have been built for the past 60 years.
(...)
HPE has demonstrated:
- Compute nodes accessing a shared pool of Fabric-Attached Memory;
- An optimized Linux-based operating system (OS) running on a customized System on a Chip (SOC);
- Operational photonics/optical communication links, including the new X1 photonics module; and
- New software programming tools designed to take advantage of abundant persistent memory.
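The press release does not describe those programming tools in detail, but the core idea of Memory-Driven Computing — a large pool of byte-addressable persistent memory that programs access directly with loads and stores, with no serialization or storage stack in between — can be sketched. The snippet below is a purely illustrative stand-in, not HPE's tooling: it uses an ordinary memory-mapped file (a common way to emulate persistent memory on conventional hardware) to show data written in place surviving a remap, the way fabric-attached memory would survive a node restart. The file name and values are hypothetical.

```python
import mmap
import os
import struct

# Hypothetical stand-in for a region of fabric-attached persistent memory:
# a memory-mapped file gives byte-addressable access whose contents outlive
# the mapping, loosely mimicking persistent-memory semantics.
PATH = "pmem_demo.bin"
SIZE = 4096

# "Allocate" the persistent region.
with open(PATH, "wb") as f:
    f.truncate(SIZE)

# Writer: store a 64-bit value directly in place -- no serialization,
# no write() syscall into a storage stack.
with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as pm:
        struct.pack_into("<Q", pm, 0, 8000)
        pm.flush()  # persist the range, analogous to a cache-line flush

# Reader (e.g. another compute node or a later run): remap and the
# data is simply there.
with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as pm:
        (value,) = struct.unpack_from("<Q", pm, 0)

print(value)
os.remove(PATH)
```

On real persistent-memory hardware the flush step would be a CPU cache flush rather than a file sync, but the programming model — pointers into durable memory instead of read/write calls — is the same idea the new tools are built around.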
During the design phase of the prototype, simulations predicted the speed of this architecture would improve current computing by multiple orders of magnitude. The company has run new software programming tools on existing products, demonstrating execution speeds improved by up to 8,000 times on a variety of workloads. HPE expects to achieve similar results as it expands the capacity of the prototype with more nodes and memory.
In addition to bringing added capacity online, The Machine research project will increase its focus on exascale computing. Exascale is a developing area of High Performance Computing (HPC) that aims to create computers several orders of magnitude more powerful than any system online today. HPE's Memory-Driven Computing architecture scales from tiny IoT devices to the exascale, making it an ideal foundation for a wide range of emerging compute- and data-intensive workloads, including big data analytics.