BASH – Filename Toggle Case

Disclaimer

The instructions/steps/scripts/methods given below worked for me running CentOS. They may very well work for you on other Linux distributions, Red Hat-like or otherwise. Please note that if you decide to use these instructions on your machine, you are doing so entirely at your own discretion and that neither this site, sgowtham.com, nor its author is responsible for any/all damage – intellectual or otherwise.


What is BASH?

BASH is a free software Unix shell written for the GNU Project. Its name is an acronym which stands for Bourne-again shell. The name is a pun on the name of the Bourne shell (sh), an early and important UNIX shell written by Stephen Bourne and distributed with Version 7 UNIX circa 1978, and the concept of being born again. BASH was created in 1987 by Brian Fox. In 1990 Chet Ramey became the primary maintainer. BASH is the default shell on most GNU/Linux systems as well as on Mac OS X and it can be run on most UNIX-like operating systems. It has also been ported to Microsoft Windows using the POSIX emulation provided by Cygwin, to MS-DOS by the DJGPP project and to Novell NetWare.


The Script

#! /bin/bash

# BASH script to convert upper (lower) case filenames to lower (upper) case.
# Operates on all files in the current directory.
#
# Wed, 17 May 2006 22:09:15 -0400

# Uppercase to lowercase
for file in *.*
do
  lfile=$(echo "$file" | tr '[:upper:]' '[:lower:]')
  if [ "$file" != "$lfile" ]
  then
    echo "  Converting ${file} to ${lfile}"
    mv -- "$file" "$lfile"
  fi
done

# Lowercase to uppercase
# for file in *.*
# do
#   ufile=$(echo "$file" | tr '[:lower:]' '[:upper:]')
#   if [ "$file" != "$ufile" ]
#   then
#     echo "  Converting ${file} to ${ufile}"
#     mv -- "$file" "$ufile"
#   fi
# done
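As a side note, Bash 4 and newer can do the same case conversion with parameter expansion alone, without spawning tr. A minimal sketch (the filename is just an example):

```shell
#!/bin/bash
# Requires Bash 4+ for the ${var,,} and ${var^^} expansions.
file="README.TXT"
lfile="${file,,}"   # lowercase the whole string
ufile="${file^^}"   # uppercase the whole string
echo "$lfile"   # readme.txt
echo "$ufile"   # README.TXT
```

Inside the loop above, `lfile="${file,,}"` would replace the tr pipeline.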

PenguiCon 4.0

Day 0

As planned, I woke up around 4 am and Kyle (along with Steve and Peter) was at my house around 6 am to pick me up. The journey started on schedule, and it had been a long time since I last sat in the backseat. This (I mean sitting in the backseat) has its own advantages: one, I don't have to wear a seat-belt; two, I don't have to drive; three (very important), I can keep napping 😀 We stopped at several different places, and somewhere on I-75/US-23 I learnt about booting a Mac in Target mode. What this does is show one Mac's hard drive as external storage on another machine. This technique was used to transfer very important data from Kyle's Mac to mine. We reached Livonia around 4 pm or so, and this is only the second conference/meeting where my name-tag had just my name.

To start things off on a lighter note, I attended Got Filk? and then moved on to more serious things, Writing Techniques and such. After the opening ceremony, I happened to talk to Chris DiBona (Google!'s Open Source Director) for quite a while and then attended How's Linux Doing? – a panel discussion chaired by Chris DiBona and Kathy Raymond. Among the things that surfaced during my discussion with Chris was the Google! AdSense issue. Chris asked me to write to him and said he would look into it. Hopefully, I will get to hear something positive this time around…

So far, I have been having fun at this conference – I see geeky nerds (or nerdy geeks, whatever you want to call them) and people dressed up in science-fiction costumes in every direction. The organizers have put in enormous effort to make elaborate arrangements, and what has impressed me the most is the continuous supply of food (bread, cheese, fruits, coke, coffee, …). What more can any geek/nerd ask? Upon browsing through the schedule a bit more carefully, I realized that this is a 24/7 conference! Though Holiday Inn (or Wayport, to be precise; I overheard that Wayport demanded $30,000 – thirty thousand dollars – per day to make it free for all!) should have provided free WiFi access, it would be hard to manage 1000+ hackers over a three-day period!! On a sleepy note, please note that this is one conference/meeting where you never want to be seen (by anybody) working on Windows. Three such people were hung (to death) in the Holiday Inn lobby earlier in the night and it wasn't funny at all 🙁 Just kidding – though the treatment one receives for working on Windows is nothing short of being hung to death. Guess I have written a bit too much for the day; need to get some sleep now – I have some very interesting talks to attend tomorrow (or later in the day).


Day 1

As I didn't have any talks early in the morning (give them a break – it's a geek conference; nothing starts till about 10, mostly because many people went to sleep around 8 am!), I had just enough time to transfer my pictures to my computer. I discussed certain issues with Aaron (ConChair for the event), for example having a PlanMyCon type of thing. Let me explain what this means. Since the schedule is prepared and finalized at least a month in advance, it can be put online (this has been done) with a facility to select the talks/presentations one would like to attend, and the system would then produce a page with just those events. This is what's popularly known as the Shopping Cart approach and has been used to great effect in APS March Meetings.

The Coffee Ritual was probably the best way to start the day. Like the name suggests, it's a ritual, something that's done at every PenguiCon at least once, to honor the significance of Coffee in geeks'/nerds' lives. The ritual started with a nice song about Cappuccino, sung in a melodious tune by Andrea Dale, and that was followed by the actual ritual. I don't have enough words in my vocabulary to explain how hilarious the proceedings were. Let me give you some hints: it was a church-sermon-like ceremony with people walking in with their pillows, the coffee-making procedure was explained in a sermon-like tone, coffee prepared earlier was distributed to the audience (coffee by one and sugar+cream by another), then everybody rises, faces east, sips the Holiest of the Holies and yells in unison, God, I needed this.

The next event worth mentioning was about Technical (or Technological) Writing. I already knew much of the presentation material, but I did learn some tricks – especially about making end-users read the manual (popularly known as the RTFM approach, which will reduce the SysAdmin's Popularity Quotient, PQ, drastically) and README files, reverse debugging mechanisms, and making some $$$ on the technical material I have written so far. I am not quite sure how easily I can adopt the last one, owing to VISA-related issues, but it's still an idea worth keeping in mind.

While volunteering at the registration desk for close to three hours, I met a few interesting people – an elderly gentleman well acquainted with the UP, as well as with the 2002 computing facility at Tech, was one of them. It's a much smaller world than one imagines… The next session that I attended was about Physics – Racing Light: Real Physics. The talk dealt with methods (fair and unfair alike) to beat light (photons) in a race. I thought it was a very well structured presentation and, surprisingly, well attended too, by a well-informed audience. If only the music from the adjacent room had been at a lower volume, people would have enjoyed this talk a lot better. After this one, I roamed around the convention center for a while, bought a few geeky/nerdy T-shirts and started searching for a food court in the vicinity when hunger made its presence felt. Not being able to find any familiar food courts only added to my despair, and it wasn't long before my eyes caught a not-so-familiar (but always wanted-to-eat-at) place – Panera Bread. There is one more reason I like this shop now – apart from the very tasty veggie soup and sandwiches, they also provide free WiFi access 😀 What more can I ask?

I guess I have got more than I hoped for by attending this convention – I met a lot of new people, most of them very interesting, learned new and efficient tricks and tips for solving problems, and experienced first-hand the interaction of science and technology – all while having plenty of fun. Though we are starting our return journey early tomorrow morning (owing to the remoteness of Houghton), I definitely plan on attending subsequent versions of PenguiCon 🙂


Day 2

Thanks to the Ars geeks (and Kyle), I found WiFi in the Holiday Inn and Kyle helped me fix a WebDAV-related issue. Let me explain what this is and what this does: some of you might have accidentally/intentionally run into this page. Until recently (as recently as 40 hours ago), I used to manually enter events (and edit them if need be) in all the ICS (Internet Calendaring & Scheduling) files. Trust me, this is a very boring and tedious job. Kyle had explained that iCal on the Mac can automatically update such a calendar, and the best part of it was using a GUI. So, over the last 40 hours, I managed to sneak in some time and entered most events in an iCal-compatible way. But I hadn't done all the necessary configuration so that my Mac would talk to my webserver. Aping Kyle's conf-files, I edited some parts of $SERVER_ROOT/conf/httpd.conf and, upon entering the proper URL in iCal, it started working like a charm.
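For the curious, a WebDAV-enabled calendar directory in Apache's httpd.conf looks something like the sketch below. The paths, realm and user file here are assumptions for illustration, not the actual configuration from that day:

```apache
# Hypothetical sketch of a WebDAV stanza for publishing iCal calendars.
# All paths and names below are made-up placeholders.
DavLockDB /var/lib/dav/lockdb

<Directory "/var/www/html/calendars">
    Dav On
    AuthType Basic
    AuthName "Calendars"
    AuthUserFile /etc/httpd/conf/dav.passwd
    <LimitExcept GET HEAD OPTIONS>
        Require valid-user
    </LimitExcept>
</Directory>
```

With a stanza of this shape in place, iCal can publish to the corresponding URL over WebDAV.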

Soon after this was done, we (LUGgers) started our journey back to remoteness. The first stop, as planned, was at Tim's (parents') place in Montrose. Driving directions to this place are probably the easiest ones anybody will ever find – 1. Last house in Montrose on this street. I loved the interiors as well as the backyard, and it goes without saying that I loved the coffee, strawberries, banana and pancakes Tim's mom prepared. Chong (a.k.a. Tim) had just returned from Japan (after the Study Abroad program) and looked a lot like a Japanese. Amidst a lot of catching up (with some almost-successful attempts to kidnap the chongPOD) and some jovial fighting, I shot a video that can be used as ransom to get certain things done from Chong. (BTW, if Chong's mom is reading this, your son didn't do anything bad.) After having made sure that the house was intact, we started again towards remoteness. Driving in rain, mist and haze on much of I-75 and parts of US-41 felt like flying an aircraft, only at very low altitude and with slightly less turbulence. We all made it back to Houghton in one (individual) piece around 8.30 pm.

Adding to the goodness of the day, the Pistons won game #1 of the play-offs. I don't think I have enough energy left in me to do anything more tonight and, like I had hoped before, the week ahead will be much better than the last one 🙂

BASH – Center Align A File



What is sed?

sed (Stream EDitor) refers to a UNIX utility which parses text files and implements a programming language which can apply textual transformations to such files. It reads input files line by line (sequentially), applying the operation which has been specified via the command line (or a sed script), and then outputs the line. It was developed from 1973 to 1974 as a UNIX utility by Lee E. McMahon of Bell Labs, and is available for most operating systems.
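As a minimal illustration of that line-by-line model (the substitution here is just an example):

```shell
# sed applies the s/o/0/ substitution to each input line in turn;
# without the g flag, only the first match on each line is replaced.
printf 'hello\nworld\n' | sed 's/o/0/'
# prints:
#   hell0
#   w0rld
```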


The Script

#! /bin/bash

# BASH script to center all text within a 79-column width.
# Leading white space is insignificant and no trailing spaces appear at the end.
#
# Sun, 09 Apr 2006 08:47:20 -0400

if [ "$1" != "" ]
then
  echo
  echo "Center aligning the text in $1"
  FILENAME=$(echo "$1" | awk -F '.' '{print $1}')
  EXTN=$(echo "$1" | awk -F '.' '{print $2}')
  # The braces in ${FILENAME}_calign matter: $FILENAME_calign would be
  # parsed as the (undefined) variable FILENAME_calign
  sed -e :a -e 's/^.\{1,77\}$/ &/;ta' -e 's/\( *\)\1/\1/' "$1" > "${FILENAME}_calign.${EXTN}"
  echo "Open ${FILENAME}_calign.${EXTN}"
  echo
fi
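The sed expression deserves a word: the `:a` ... `ta` loop keeps prepending a space while the line is 77 characters or shorter, and the final `s/\( *\)\1/\1/` halves the leading run of spaces, leaving the text roughly centered. A quick way to try it on a single line (the input string is arbitrary):

```shell
# Pad "Hello" toward 78 columns, then halve the leading spaces.
line=$(echo "Hello" | sed -e :a -e 's/^.\{1,77\}$/ &/;ta' -e 's/\( *\)\1/\1/')
echo "[$line]"
```

The brackets make the (otherwise invisible) leading padding visible; the result fits within 79 columns with no trailing spaces.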

PERL – Disk Usage Reminder



What is PERL?

In computer programming, PERL is a high-level, general-purpose, interpreted, dynamic programming language. PERL was originally developed by Larry Wall, a linguist working as a systems administrator for NASA, in 1987, as a general purpose UNIX scripting language to make report processing easier. Since then, it has undergone many changes and revisions and became widely popular among programmers. Larry Wall continues to oversee development of the core language. PERL borrows features from other programming languages including C, shell scripting (sh), AWK, sed and Lisp. The language provides powerful text processing facilities without the arbitrary data length limits of many contemporary UNIX tools, making it the ideal language for manipulating text files. It is also used for graphics programming, system administration, network programming, applications that require database access and CGI programming on the Web. PERL is nicknamed the Swiss Army knife of programming languages due to its flexibility and adaptability.


The Script

#! /usr/bin/perl -wT
 
# Perl script to check disk usage and send emails to users hogging more than a specified limit. 
# Needs to be run as root and a good way to do it is via a cron job.
# 
# First written : Pat Krogel, Fri Feb 24 11:51:35 EST 2006
# Last modified : Gowtham, Fri Feb 24 15:37:40 EST 2006
 
# Variable initializations
# Untaint PATH: this script runs with -T (taint mode) and opens a pipe to mail
$ENV{'PATH'} = '/bin:/usr/bin';
$date    = localtime;
$dulimit = 20;   # Disk Usage Limit, in GB
 
# Get the disk usage, in GB, and separate out space and uids 
open PIPE, "du -s -B 1024M /home/* |" or die "Can't get disk usage.\n";
while (<PIPE>)
{
   chomp;
   my ($space, $user) = split /\s+/;
   $user =~ s/^.*\/(\w+)$/$1/;
   if ($space > $dulimit)
   {
      # Debugging purposes only! 
      # print "$user is taking up $space GB\n";
      open MAIL, "| mail -s 'Disk Usage Notification' $user\@domain.name"
        or die "Can't send email.\n";
      print MAIL "$user,\n\n";
      print MAIL "Your home folder,\n\n";
      print MAIL "/home/$user ($space GB)\n\n";
      print MAIL "in 'host.domain.name' has exceeded the present limit.\n";
      print MAIL "Please back it up elsewhere (some other machine or\n";
      print MAIL "external hard drives) and clean up the home folder.\n\n";
      print MAIL "root\@host.domain.name\n";
      print MAIL "$date\n";
      close MAIL;
   }
}
close PIPE;

BASH – Router’s Public IP Address

AWK and sed

AWK is a general purpose programming language that is designed for processing text-based data, either in files or data streams, and was created at Bell Labs in the 1970s. The name AWK is derived from the family names of its authors: Alfred Aho, Peter Weinberger, and Brian Kernighan; however, it is not commonly pronounced as a string of separate letters but rather to sound the same as the name of the bird, auk. awk, when written in all lowercase letters, refers to the UNIX or Plan 9 program that runs other programs written in the AWK programming language. AWK is an example of a programming language that extensively uses the string data type, associative arrays (that is, arrays indexed by key strings), and regular expressions. The power, terseness, and limitations of AWK programs and sed scripts inspired Larry Wall to write PERL. Because of their dense notation, all these languages are often used for writing one-liner programs. AWK is one of the early tools to appear in Version 7 UNIX and gained popularity as a way to add computational features to a UNIX pipeline. A version of the AWK language is a standard feature of nearly every modern UNIX-like operating system.

sed (stream editor) is a Unix utility that parses text files and implements a programming language which can apply textual transformations to such files. It reads input files line by line, applying the operation which has been specified via the command line (or a sed script), and then outputs the line. It was developed in 1973-74 as a Unix utility by Lee McMahon of Bell Labs, and is available for UNIX and most flavors of Linux operating system.


The Script

#! /bin/bash
 
# BASH script to display the IP address that the machine has,
# as well as the router's public IP address, if any.
 
# First written : Gowtham,  Wed Dec 28 22:05:35 EST 2005
# Last modified : Gowtham,  Fri Dec 30 06:57:50 EST 2005
#                 Gowtham,  Tue Jan  3 09:52:47 EST 2006
#                 Jon DeVree, Tue Jan  3 22:28:15 EST 2006 
 
IFACE=`netstat -rn | grep ^0.0.0.0 | awk '{print $8}'`
 
if [ "$IFACE" = "" ];
then
  echo
  echo " You are not connected to the internet"
  echo
  exit 0
fi
 
IP=`/sbin/ifconfig $IFACE | grep Bcast | awk '{print $2}' | awk -F ':' '{print $2}'`
NSIP=`curl -s http://checkip.dyndns.org | awk '{print $6}' | \
            awk -F '<' '{print $1}'`
 
if [ "$IP" = "$NSIP" ];
then
  echo
  echo " Your machine is directly connected to the internet"
  echo " IP Address : $IP"
  echo
else
  echo
  echo " Your machine is configured behind a router"
  echo " Machine's static IP Address : $IP"
  echo " Router's  public IP Address : $NSIP"
  echo
fi
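The NSIP pipeline is easier to follow against a sample of what checkip.dyndns.org used to return. The HTML below is a hedged reconstruction of that response, not a captured one, and 192.0.2.1 is a placeholder documentation address:

```shell
# Field 6 (whitespace-split) is "192.0.2.1</body></html>";
# splitting that on '<' and taking field 1 leaves the bare address.
html='<html><head><title>Current IP Check</title></head><body>Current IP Address: 192.0.2.1</body></html>'
echo "$html" | awk '{print $6}' | awk -F '<' '{print $1}'
# prints 192.0.2.1
```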

C – Translate The Coordinates Along Any Given Axis



What is C?

In computing, C is a general-purpose, cross-platform, block structured, procedural, imperative computer programming language developed in 1972 by Dennis Ritchie at the Bell Telephone Laboratories for use with the Unix operating system. Although C was designed for implementing system software, it is also used for developing application software – on a great many different software platforms and computer architectures, and several popular compilers exist. C has greatly influenced many other popular programming languages, most notably C++, which originally began as an extension to C. C is an imperative (procedural) systems implementation language. It was designed to be compiled using a relatively straightforward compiler, to provide low-level access to memory, to provide language constructs that map efficiently to machine instructions, and to require minimal run-time support. C was therefore useful for many applications that had formerly been coded in assembly language. Despite its low-level capabilities, the language was designed to encourage machine-independent programming. A standards-compliant and portably written C program can be compiled for a very wide variety of computer platforms and operating systems with little or no change to its source code. The language has become available on a very wide range of platforms, from embedded microcontrollers to supercomputers.


The Program

/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
 
This program shifts x, y, z co-ordinates of any system based on
the value of dx, dy, dz and writes the new set of (x,y,z) in a new
file. 
 
INPUT :
 
Input file must have the name 'shift-xyz-input.xyz' and (x,y,z) 
co-ordinates must be in the following format (space or tab separated)
 
First Line   : Number of atoms, N
Next N Lines : x, y, z of each atom
 
DESCRIPTION OF VARIABLES :
 
Nbuf     First line of the input file, stored as string
Abuf     Atomic Number of the atom, stored as string
xbuf     x co-ordinate of the atom, stored as string
ybuf     y co-ordinate of the atom, stored as string
zbuf     z co-ordinate of the atom, stored as string
 
N        Nbuf converted to integer, using atoi()
         Number of atoms
 
Atom[i]  Abuf converted to integer, using atoi()
         One dimensional array for atomic number of N atoms  
 
x[i]     xbuf converted to double precision number, using atof()
         One dimensional array for storing x co-ordinate of N atoms  
 
y[i]     ybuf converted to double precision number, using atof()
         One dimensional array for storing y co-ordinate of N atoms   
 
z[i]     zbuf converted to double precision number, using atof()
         One dimensional array for storing z co-ordinate of N atoms
 
 
COMPILATION/EXECUTION PROCEDURE :
 
gcc -o shift-xyz.x shift-xyz.c -lm
./shift-xyz.x
 
OUTPUT :
 
If the compilation and execution is successful, the program writes an output
file named 'shift-xyz-output.xyz' with the shifted (x,y,z) co-ordinates
 
Thu Dec 22 21:22:51 EST 2005
 
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */
 
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <math.h>
 
int main(void)
{
 
/* Defines input and output files */
  FILE *input;
  FILE *output;
 
/* Declaration and initialization of variables */
  char   Nbuf[100]={'\0'};
  char   Abuf[100]={'\0'};
  char   xbuf[100]={'\0'};
  char   ybuf[100]={'\0'};
  char   zbuf[100]={'\0'};
 
  int      Atom[100]={0};
  double   x[100]={0};
  double   y[100]={0};
  double   z[100]={0};
 
  int     i, N;
  double  dx, dy, dz;
 
  dx=0.500;  /* Change this value to indicate dx */
  dy=0.500;  /* Change this value to indicate dy */
  dz=0.500;  /* Change this value to indicate dz */
 
/* Opens the input file for reading */
  input = fopen("shift-xyz-input.xyz", "r");
 
  if (input == NULL) {
    printf("\nInput File empty or corrupted! Exiting...\n");
    exit (1);
  }
 
/* Creates the output file and opens it for writing */
  output = fopen("shift-xyz-output.xyz", "w");
 
/* Reads the first line of the input file and stores it as N */
  fscanf(input, "%s", Nbuf);
  N = atoi(Nbuf);
 
/* Reads and stores the (x, y, z) for each atom, first as a string (buffer) 
  and later stores it as an array element using appropriate conversion 
  mechanism */
  for(i=0; i<N; i++) {
    fscanf(input, "%s", Abuf);
    Atom[i]= atoi(Abuf);
 
    fscanf(input, "%s", xbuf);
    x[i]= atof(xbuf);
 
    fscanf(input, "%s", ybuf);
    y[i]= atof(ybuf);
 
    fscanf(input, "%s", zbuf);
    z[i]= atof(zbuf);
  }
 
/* Calculates the new x, y and z co-ordinates of atoms based on the value of
  dx, dy and dz */
  for(i=0; i<N; i++)  {
    x[i] = x[i] + dx;
    y[i] = y[i] + dy;
    z[i] = z[i] + dz;
  }
 
/* Final values are written in the output file */
  fprintf(output, "%d\n", N);
 
  for(i=0; i<N; i++){
    fprintf(output, "%d",       Atom[i]);
    fprintf(output, "\t%lf",    x[i]);
    fprintf(output, "\t%lf",    y[i]);
    fprintf(output, "\t%lf\n",  z[i]);
  }
 
/* Closes the input and output files */
  fclose(input);
  fclose(output);
 
/* Confirmation to the user */
  printf("Please open shift-xyz-output.xyz for results\n\n");

/* Program Ends */
  return 0;
}


The Input File Format

6
6   1.000000   0.000000   0.000000
6   2.000000   0.000000   0.000000
1   3.000000   0.000000   0.000000
1   4.000000   0.000000   0.000000
8   5.000000   0.000000   0.000000
8   6.000000   0.000000   0.000000

VASP, MPICH, Intel Compilers and Rocks

Much of the weekend was spent in some relaxation, shoveling snow (I have come to realize that this is very good exercise – it can make one sweat even when it's +10F outside!) and trying to debug the errors associated with the execution of the parallel version of VASP 4.6.28. However, the same errors persisted and I didn't get too far with it.

I got in touch with a few of Intel's authorized resellers (one in Bloomington, MN – who transferred the call to someone else in Boston, MA) to see if I could get a copy of version 8.x of the FORTRAN and C compilers. Though the latter person promised to get back to me with some favorable information soon, I got directly in touch with an Intel Support Technician – hoping that he/she wouldn't be from a call center in some other country. Fortunately, it was somebody in California; unfortunately, he redirected me back to their resellers. The last call, made pretty much out of desperation, did what I needed: I just had to register the non-commercially downloaded products for Premier Support and I would be entitled to previous versions too. If only Intel explicitly mentioned what Premier Support actually is, my worries would have ended a long time ago. By the way, if you are now wondering what the error message was (running on 2 processors), here it is:


running on    2 nodes
distr:  one band on    1 nodes,    2 groups
vasp.4.6.28 25Jul05 complex
POSCAR found :  4 types and   18 ions
LDA part: xc-table for Ceperly-Alder, standard interpolation
found WAVECAR, reading the header
POSCAR, INCAR and KPOINTS ok, starting setup
WARNING: wrap around errors must be expected
FFT: planning ...           1
reading WAVECAR
the WAVECAR file was read sucessfully
LAPACK: Routine ZPOTRF failed!           8
LAPACK: Routine ZPOTRF failed!           8

Having managed to get versions 8.x and 7.x of the Intel compilers, the situation only got worse as the error message remained the same. At this point, I must acknowledge the help offered by VASP Tech Support and Andri Arnaldsson (from the University of Washington) – they were pretty quick in their responses, and sent their copies of Makefiles along with several tips and tricks. Changing compiler versions, using a previous version of MPICH (1.2.7p1 to be precise), repeating the compilation many times with different BLAS and LAPACK libraries — nothing helped.

Taking a break for an hour and watching an episode of South Park seemed to have helped. After modifying the key words used for the Google! search and reading a discussion forum a lot more carefully, I found that by adding three lines at the end of the VASP Makefile (what this does is reduce the level of optimization for mpi.F), the error vanished and the calculations started running smoothly.
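For the record, the three added lines were of the following shape: an override rule that compiles mpi.F alone at a lower optimization level than the rest of the code. This is a hedged sketch, not the verbatim fix; the macro names and flags must match the Makefile actually in use:

```makefile
# Sketch only: force mpi.F to be built with optimization turned down.
# $(CPP), $(FC) and $(SUFFIX) are assumed to be defined earlier in the Makefile.
mpi.o : mpi.F
	$(CPP)
	$(FC) -FR -lowercase -O0 -c $*$(SUFFIX)
```

Because an explicit rule for a target takes precedence over the generic .F rule, only this one file loses optimization; the rest of VASP is built as before.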

I repeated the same calculation using 2, 4, 6 and 8 processors and noticed slightly strange behavior – when the number of processors was 2, 4 or 8 (that is, a power of two), the energy optimization was exactly the same as in a serial calculation, but when the number of processors was 6, the energy optimization route was different – the final result was still exactly the same. Though I have to do more trials (say 3, 5, 7, 9, 10 processors) to completely convince myself, it appears to me that using 2^N processors does the trick. Like Dave Kraus mentioned once before – knowing what trick works is certainly important, but knowing why that trick works is even more important.

300+ compilation attempts spanning six (yes, SIX) months of day-in, day-out, 14+ hour days to get one software suite (and the Makefile with the necessary flags and libraries) compiled and tested successfully. The timing of this successful attempt can't be just a coincidence — I sure do believe in Santa Claus and Christmas miracles, and am forever grateful for my advisor's endless patience throughout these six months.

BASH – Login Counter





The Script

#! /bin/bash
# 
# Displays the number of login attempts by users
# Gowtham, 2005.09.23
 
echo
echo " `hostname` login attempts by users in this month"
echo
last | awk '{print $1}' | sort | uniq -c | sort -nr
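The same pipeline generalizes to counting any column of repeated values. Here it is on canned input instead of last output (the user names are made up):

```shell
# sort groups identical lines together, uniq -c counts each group,
# and sort -nr puts the most frequent user first.
printf 'alice\nbob\nalice\nalice\nbob\ncarol\n' | sort | uniq -c | sort -nr
# lists alice (3) first, then bob (2), then carol (1)
```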