Articles and news about RedLine Performance Solutions!
Most organizations need an archive of some type and scale, whether small or large. Sometimes it’s a legal requirement, as for companies that keep medical or financial records. Firms using HPC systems often need to store large data sets long-term so they can analyze them later. Organizations with small archive requirements, say less than a […]
Scientists and researchers increasingly use high-performance computers, among other tools, to acquire, organize, and process data. As experts in their domains, it’s natural to expect them to be more productive and successful if their primary focus is on the frontiers of their fields of study rather than on deeply understanding their digital and computational tools. […]
IBM Spectrum Scale (formerly known as GPFS) is a high-performance, highly scalable global file system that provides a single namespace to data. Given its long history with high-performance computing (HPC) and data-intensive media and data stream serving applications, Spectrum Scale has traditionally been viewed as a niche data solution: complex to install, optimize, and maintain, […]
Editor’s Note: RedLine Senior Program Analyst Kit Menlove contributed to this post. For industry and academia alike, managing workflows for large-scale, data-intensive computational processes is a constant challenge. As every industry and scientific discipline becomes more reliant on increasingly complex and robust computational solutions, the ability to create repeatable and agile end-to-end processes becomes an […]
There are many build systems available for software development, and if your software is still using the venerable Make system, it can certainly pay to investigate an upgrade. Kitware’s cross-platform CMake system is one such framework, and it offers a considerable improvement. At its base level, CMake offers a single approach to building your software […]
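To give a flavor of that single approach, here is a minimal CMakeLists.txt sketch. The project name `solver` and source file `main.c` are hypothetical placeholders, not taken from the post:

```cmake
# Minimal CMakeLists.txt for a hypothetical C project named "solver",
# built from a single source file, main.c.
cmake_minimum_required(VERSION 3.10)
project(solver C)

add_executable(solver main.c)
```

The same two commands then configure and build it out-of-source on any supported platform and generator, e.g. `cmake -S . -B build` followed by `cmake --build build` — one description of the build, many native toolchains.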
While the annual SC conference is always a worthy experience, SC16 may have been the most rewarding ever – filled, as it was, with some fantastic insights from fellow HPC professionals, news on the latest technologies that will advance our industry, and the chance to catch up with colleagues and friends old and new. At […]
3D XPoint is a dramatic new memory technology developed jointly by Intel and Micron. Intel claims that 3D XPoint is 1,000 times faster than NAND, has 1,000 times its endurance, and is 10 times denser than DRAM. Thus, this non-volatile computer storage medium could significantly disrupt current HPC technologies. This technology is more than just […]
The programming language Fortran has been in use since 1957, and a large percentage of the software that runs on HPC systems worldwide is written, at least in part, in Fortran. Because it has been around for so long, and so many scientists and engineers learned it early in their careers and have utilized it […]
There’s a big difference between basic system monitoring and performance monitoring. In the world of HPC, this distinction is greatly magnified. In the former case, monitoring often boils down to checking binary indicators to make sure system components are up or down, on or off, available or not. Red light/green light monitoring is certainly a […]
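To make that distinction concrete, here is a small Python sketch (the function names, thresholds, and probes are illustrative assumptions, not from the post): a binary up/down check alongside a performance check that samples a metric against a threshold, showing how a component can be "up" yet unhealthy.

```python
def availability_check(ping) -> bool:
    """Red light/green light monitoring: is the component reachable at all?"""
    return bool(ping())

def performance_check(sample_latency_ms, threshold_ms=50.0, samples=5):
    """Performance monitoring: the component may be 'up' yet still too slow.

    Averages several latency samples and compares the mean to a threshold.
    Returns (healthy, mean_latency_ms).
    """
    readings = [sample_latency_ms() for _ in range(samples)]
    mean = sum(readings) / len(readings)
    return mean <= threshold_ms, mean

# A node can pass the binary check while failing the performance check:
up = availability_check(lambda: True)               # the node responds
healthy, mean = performance_check(lambda: 120.0)    # but responses are slow
```

In a real deployment the probe callables would wrap actual measurements (ICMP pings, job-scheduler queries, interconnect latency samples); the point is only that the two checks answer different questions.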
As HPC architectures continue to evolve and offer ever-increasing performance, it has become imperative to adapt existing software in order to fully harness that power. As discussed in an earlier post, the architectural approach to parallelism has come full circle in many ways. Nonetheless, MPI remains a fixture in HPC software design, and finding ways […]