Intel® Enterprise Edition for Lustre* Software white paper

Not so long ago, storage for high-performance computing (HPC) meant complexity and massive data sets, and was the concern of only a small group of computer users. Superscale computing was the province of government-sponsored research in national labs, data-intensive simulations for weather forecasting and climate modeling, and a few information-intensive industries such as defense, aeronautics, and oil and gas. Today, HPC is undergoing a democratization: extracting knowledge from ever-expanding flows of data is now seen as a key source of competitive advantage for businesses of any size. Enterprises of all kinds generate huge volumes of data and rely on high-performance applications to analyze and derive value from those data flows. They require a storage infrastructure that can scale without practical limit and deliver the high-volume I/O that high-throughput data processing demands.
