Articles and news about RedLine Performance Solutions!
While public clouds have been used for HPC for at least the past decade, several key challenges have traditionally slowed the adoption of cloud computing for HPC: slow, high-latency interconnects; high virtualization overhead and poor isolation from other workloads; and data staging challenges and costs (getting large data sets into and out of the cloud). Moving […]
InfiniBand is among the most common and well-known cluster interconnect technologies. However, the complexities of an InfiniBand (IB) network can frustrate the most experienced cluster administrators. Maintaining a balanced fabric topology, dealing with underperforming hosts and links, and chasing potential improvements keeps all of us on our toes. Sometimes, though, a little research and experimentation […]
Scaling HPC applications on your hardware is a vital task in supercomputing. If an application can’t use all of the hardware devoted to it, then you have expensive equipment sitting idle and not doing anyone any good. When looking at scalability, it is important to first ensure that the problem can scale. In other words, […]
Most organizations need some level and type of archive, whether small or large. Sometimes it’s a legal requirement, as for companies that keep medical or financial records. Firms utilizing HPC systems often need to store large data sets long-term in order to analyze them later. Organizations with small archive requirements, say less than a […]
Scientists and researchers increasingly use high-performance computers, among other tools, to acquire, organize, and process data. Because they are experts in their domains, it’s natural to expect them to be more productive and successful when their primary focus is on the frontiers of their fields of study rather than on deeply understanding their digital and computational tools. […]
IBM Spectrum Scale (formerly known as GPFS) is a high-performance, highly scalable global file system that provides a single namespace to data. Given its long history with high-performance computing (HPC) and data-intensive media and data stream serving applications, Spectrum Scale has traditionally been viewed as a niche data solution: complex to install, optimize, and maintain, […]
Editor’s Note: RedLine Senior Program Analyst Kit Menlove contributed to this post. For industry and academia, managing workflows for large-scale, data-intensive computational processes is a constant challenge. As every industry and scientific discipline becomes more reliant on increasingly complex and robust computational solutions, the ability to create repeatable and agile end-to-end processes becomes an […]
There are many build systems available for software development, and if your software is still using the venerable Make system, it can certainly pay to investigate an upgrade. Kitware’s cross-platform CMake system is one such framework, and it offers a considerable improvement. At its base level, CMake offers a single approach to building your software […]
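To give a sense of what that single approach looks like, here is a minimal sketch of a CMakeLists.txt for a simple C project; the project name and source file (hello, main.c) are illustrative, not taken from the article.

```cmake
# Minimal CMakeLists.txt for a hypothetical C project.
# Project and file names (hello, main.c) are illustrative.
cmake_minimum_required(VERSION 3.13)
project(hello C)

# One cross-platform build definition; from it CMake generates
# Makefiles, Ninja files, or IDE project files as requested.
add_executable(hello main.c)
```

With this file in the source directory, `cmake -S . -B build && cmake --build build` configures and compiles out-of-tree, keeping generated artifacts separate from the sources.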
While the annual SC conference is always a worthy experience, SC16 may have been the most rewarding ever – filled, as it was, with some fantastic insights from fellow HPC professionals, news on the latest technologies that will advance our industry, and the chance to catch up with colleagues and friends old and new. At […]
3D XPoint is a dramatic new memory technology developed jointly by Intel and Micron. Intel claims that 3D XPoint is 1,000 times faster than NAND, has 1,000 times its endurance, and is 10 times denser than DRAM. Thus, this non-volatile computer storage medium could significantly disrupt current HPC technologies. This technology is more than just […]
Keep connected—subscribe to our blog by email.