Rocks 6.1 – IPoIB

Disclaimer

The instructions/steps given below worked for me (and Michigan Technological University) running Rocks 6.1 (with Service Pack 1, CentOS 6.3, and GE 2011.11p1); as has been common practice for several years now, a full version of the operating system was installed. The HPC cluster (wigner) used to prepare this documentation has Mellanox 56 Gb/s FDR InfiniBand switches and ports. Further, it is assumed that the eth0 interface is used for the private Ethernet network and that ib0 is the InfiniBand interface. These instructions may very well work for you (or your institution) on Rocks-like or other Linux clusters. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.

Continue reading … “Rocks 6.1 – IPoIB”

Rocks 5.4.2 – Scheduling GPU jobs via SGE

Disclaimer

The instructions/steps given below worked for me (and Michigan Technological University) running Rocks 5.4.2 (with CentOS 5.5 and SGE 6.2u5); as has been common practice for several years now, a full version of the operating system was installed. These instructions may very well work for you (or your institution) on Rocks-like or other Linux clusters. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.

Continue reading … “Rocks 5.4.2 – Scheduling GPU jobs via SGE”

SC12 in Salt Lake City

The description of the conference didn’t change – SC12 is still The International Conference for High Performance Computing, Networking, Storage and Analysis – but the location did: Salt Lake City, Utah, known to the historically inclined as the Crossroads of the West. I had never been to SLC before, but the last time I was in Utah was August 2005: a momentary step across the state border during a Glen Canyon Dam tour, itself part of a three-day trip to the Grand Canyon.

Continue reading … “SC12 in Salt Lake City”

Rocks 5.4.2 – HPCC 1.4.1 benchmark with GCC 4.1.2

Disclaimer

The instructions/steps given below worked for me (and Michigan Technological University) running Rocks 5.4.2 (with CentOS 5.5); as has been common practice for several years now, a full version of the operating system was installed. These instructions may very well work for you (or your institution) on Rocks-like or other Linux clusters. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.

Continue reading … “Rocks 5.4.2 – HPCC 1.4.1 benchmark with GCC 4.1.2”

Rocks 5.4.2 – HPL 2.0 benchmark with GCC 4.1.2

Disclaimer

The instructions/steps given below worked for me (and Michigan Technological University) running Rocks 5.4.2 (with CentOS 5.5); as has been common practice for several years now, a full version of the operating system was installed. These instructions may very well work for you (or your institution) on Rocks-like or other Linux clusters. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.

Continue reading … “Rocks 5.4.2 – HPL 2.0 benchmark with GCC 4.1.2”

Rocks 5.4.2 – Ganglia’s gmond Python module for monitoring NVIDIA GPU

Disclaimer

The instructions/steps given below worked for me (and Michigan Technological University) running Rocks 5.4.2 (with CentOS 5.5); as has been common practice for several years now, a full version of the operating system was installed. These instructions may very well work for you (or your institution) on Rocks-like or other Linux clusters. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.
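
The post itself walks through exposing NVIDIA GPU data to Ganglia through gmond’s Python module interface. As a rough orientation before following the link, the skeleton below shows the shape such a module typically takes (metric_init, one call-back per metric, and metric_cleanup); the metric name gpu0_temp, the descriptor values, and the placeholder call-back are illustrative assumptions, not the code from the post.

# Minimal sketch of a gmond Python metric module, loosely modeled on
# Ganglia's modpython interface; values below are illustrative placeholders.

descriptors = []

def temp_handler(name):
    """Call-back invoked by gmond; should return the current GPU temperature."""
    # Placeholder: a real module would query the driver here (e.g. via NVML).
    return 0

def metric_init(params):
    """Called once by gmond when the module is loaded; returns metric descriptors."""
    global descriptors
    descriptors = [{
        'name':        'gpu0_temp',
        'call_back':   temp_handler,
        'time_max':    90,
        'value_type':  'uint',
        'units':       'C',
        'slope':       'both',
        'format':      '%u',
        'description': 'GPU 0 temperature',
        'groups':      'gpu',
    }]
    return descriptors

def metric_cleanup():
    """Called once by gmond at shutdown."""
    pass

# Stand-alone test harness, mirroring the convention used by gmond modules.
if __name__ == '__main__':
    for d in metric_init({}):
        print('%s = %s' % (d['name'], d['call_back'](d['name'])))

In an actual deployment, gmond typically loads such a module through its modpython plugin and a matching .pyconf file under its conf.d directory; the stand-alone harness at the bottom merely exercises the call-backs from the command line.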

Continue reading … “Rocks 5.4.2 – Ganglia’s gmond Python module for monitoring NVIDIA GPU”

CUDA/C – Hello, World!

Disclaimer

The instructions/steps/programs given below worked for me (and Michigan Technological University) running site-licensed Red Hat Enterprise Linux 6.2, with NVIDIA CUDA SDK 4.1.28, NVIDIA GPU driver v290.10, and two NVIDIA GeForce GTX 570 cards; as has been common practice for several years now, a full version of the operating system was installed and all necessary patches/upgrades applied. These instructions may very well work for you (or your institution) on Red Hat-like or other Linux distributions. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.

Continue reading … “CUDA/C – Hello, World!”

RHEL 6.2 – Ganglia’s gmond Python module for monitoring NVIDIA GPU

Disclaimer

The instructions/steps given below worked for me (and Michigan Technological University) running site-licensed Red Hat Enterprise Linux 6.2; as has been common practice for several years now, a full version of the operating system was installed and all necessary patches/upgrades applied. These instructions may very well work for you (or your institution) on Red Hat-like or other Linux distributions. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion, and that neither this site, sgowtham.com, nor its author (or Michigan Technological University) is responsible for any damage, intellectual or otherwise.
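
For readers who want to poke at the GPU numbers interactively before wiring them into gmond, the short sketch below uses the NVML Python bindings (the nvidia-ml-py package) to read per-device temperature and utilization; the package choice, the selected metrics, and the helper name gpu_snapshot are assumptions made for illustration and are not necessarily what the post’s module does.

# Minimal sketch: query NVIDIA GPU metrics through the NVML Python bindings.
# Assumes the nvidia-ml-py package and a working NVIDIA driver are installed.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetTemperature,
                    nvmlDeviceGetUtilizationRates, NVML_TEMPERATURE_GPU)

def gpu_snapshot():
    """Return a list of (device index, temperature in C, GPU utilization in %)."""
    nvmlInit()
    try:
        stats = []
        for i in range(nvmlDeviceGetCount()):
            handle = nvmlDeviceGetHandleByIndex(i)
            stats.append((i,
                          nvmlDeviceGetTemperature(handle, NVML_TEMPERATURE_GPU),
                          nvmlDeviceGetUtilizationRates(handle).gpu))
        return stats
    finally:
        nvmlShutdown()

if __name__ == '__main__':
    for idx, temp, util in gpu_snapshot():
        print('GPU %d: %d C, %d%% utilization' % (idx, temp, util))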

Continue reading … “RHEL 6.2 – Ganglia’s gmond Python module for monitoring NVIDIA GPU”

SC11, fond memories from a fantastic nerd fest

Apart from what I could infer from the title – SC11, The International Conference for High Performance Computing, Networking, Storage and Analysis – I knew little about the event and what I wanted from it. I did know that it was being held in a city I had already visited once before (and had gained some familiarity with, at least in parts) during The Great American Road Trip with dear friend Nils Stenvig (@UPBeaches), that the city had more than its fair share of friendly and familiar faces, and that the event would offer the first opportunity to put a face to many a name I had known and interacted with for years.

Continue reading … “SC11, fond memories from a fantastic nerd fest”

Subversion – Changing Repository Location

Subversion

Subversion (SVN) is a version control system initiated in 2000 by CollabNet Inc. It is used to maintain current and historical versions of files such as source code, web pages, and documentation. Its goal is to be a mostly compatible successor to the widely used Concurrent Versions System (CVS). Subversion is well known in the open source community and is used on many open source projects.

Subversion was started in 2000 as an effort to write a free version control system that operated much like CVS, but with the bugs and misfeatures of CVS fixed. By 2001, Subversion was sufficiently developed to host its own source code. More information, including the paragraph above, is available here.

Continue reading … “Subversion – Changing Repository Location”