For as long as I can remember, I've had an interest in computer programming. I started tinkering as a kid back in the early 1980s with a TI-99/4A, the Atari XL series (remember those membrane keyboards?), Atari STs and Apple IIs. Most of that was just goofy kid stuff: sorting baseball and hockey cards, and projects that were Star Trek related.
It was in college, though, in 1984, that I began to learn to program. I attended the University of Victoria in British Columbia (that's in Canada, eh), and at the time the school had an IBM mainframe running UNIX. I took Fortran, COBOL, Pascal and, everyone's favorite, assembly language, where you program right at the machine level. While all of these languages were different, the courses shared a few similarities.
First, the editor was vi, a command-line editor driven by cryptic key commands. Second, every student was allocated a weekly "budget" to program with, which was really a proxy for CPU cycles. If you wrote a program with an infinite loop, you would burn through all of your cycles and be unable to complete a project. There was a catch, though: once you hit zero, you could stay logged in and go into the negative as long as you didn't log out. So if you wrote some crap code, you had to stay in the lab all night until you finished, or wait until the next week for your new allocation of CPU cycles. The third commonality was that the debugger was crap. The printout just gave you the last line of code that was executed, and you had to figure out why it failed.
What this meant for programming was that you had to do an immense amount of pre-work before you typed anything into the mainframe. All the code had to be documented, flow charts had to be created, and mock "runs" had to be done to see how the program flowed. The program would then run, invariably fail somewhere, and the whole process would be repeated. I would say that for every hour of actual computer time, I spent about four to five hours working on paper.
But then, in my last year, the school's attitude toward programming changed. The university invested in a Mac lab, complete with brand-new Motorola 68000-based Macs, the crème de la crème of personal computers, if you will. My attitude toward programming changed with it.
With the Macs, there was no CPU allocation, because each machine had its own dedicated processor. That meant you could write a subroutine or two, run it, see how it performed, tweak it, and then move on to the next task. The debugger was fantastic as well. Programming in 68000 assembler was dead easy compared to the IBM mainframe, since the debugger gave you an actual view of the contents of the registers and let you walk through the code line by line.
In a sense, the Macs enabled me to "hack" a lot more and write less elegant code, because I didn't have to worry about burning extra cycles here and there. I could code for convenience rather than efficiency, which significantly reduced the time I spent developing code away from the computer. I mentioned earlier that I spent about four to five hours on paper for every hour on the computer. That dynamic flipped: very little time with paper and almost all of it at the machine. Did I pick up some bad habits and write less efficient code? Sure, but it didn't matter anymore, and I finished my assignments much faster.
Looking back, the introduction of the Mac lab at my school brought out the hacker in me. I was able to tinker, play and try stuff that I never could have done on the CPU-restricted mainframe. This actually helped me when I got into the workforce, as my first job was programming on Solaris and HP-UX boxes, where I wrote plenty of Perl and C shell scripts.
So I'll wish the Mac a happy 30th birthday and tip my hat to the box that changed the way I look at programming. I'll toss the rest of my 80-column paper in the recycle bin and go back to hacking away at source code.