I posted this on my blog just now, but I thought it'd make an interesting discussion so I've duplicated it here. It'd be interesting to hear other people's views - especially if you're a student or working in IT at present.
http://mikkle.co.uk/content/shell-being-shelled
In this day and age it strikes me that, for the most part, computer science students are pretty good at pointing and clicking. Creating websites in Dreamweaver? Knocking together bad GUIs with a GUI designer, without ever understanding a line of the underlying code? Even starting and stopping servers or services with a few point-and-click operations? No problem at all.
Some try to make themselves stand out by installing various flavours of Linux, then pointing and clicking around KDE with varying degrees of success. This of course allows them to honestly slap "avid Linux user" on their CV, in the hope that a future interviewer will be impressed by their knowledge that there's more to the world of OSes than Microsoft's latest offering.
However, what happens if you take away all that nicety? What happens if you leave just the bare essentials - the shell underneath? Then how many stereotypical "geeky" computer science students would be able to navigate their way around?
I suspect not many.
This, in my mind, is a bad situation to be in. Sure, learning how to use KDE is useful. But at the end of the day it isn't hard, and if you're even moderately comfortable with Windows you'll be able to work out how to click your way around in next to no time at all.
Thing is though, for better or for worse, Linux is still relatively rare on the desktop in the real world. There are companies out there that'll offer it as an option, which I think is fantastic - it'd certainly be my choice. But where Linux / Unix has really found its home is at the server end of the market. The security, stability and reliability on offer here are frankly amazing (especially with the likes of ZFS), and companies have caught on to this, often using such machines to drive enterprise-scale applications. The bonus, of course - and what's at least partly driven this success - is that the end user neither knows nor cares what OS the servers are running. As long as they get their content in the form they're after, they're happy.
However - servers don't need X, they don't need KDE, and they definitely don't need the likes of Beryl-style effects everywhere. Running these things is often a waste of resources that could be put to much better use elsewhere. So, for very good reasons, a lot of these servers don't run X at all. And if you haven't got a graphical environment, what are you left with?
The same thing that seems to be fast becoming alien to computer science graduates - the shell.
Note here that I'm not saying every man and his dog should have to learn to use one. If you're a casual PC user who has no interest in working in IT, there's no reason to bother. But students looking for a career in IT? Knowing the basics of a shell should surely be taught as a fundamental skill, not presented as an interesting piece of history. The shell is still very much alive today - it'd be a serious mistake to assume otherwise.
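For anyone wondering what "the basics" actually means here, a minimal sketch - nothing exotic, just the sort of commands you'd lean on every day on a headless box (the process name and log paths below are made up for illustration):

    # where am I, and what's here?
    pwd
    ls -l
    cd /var/log

    # read and search files without a GUI in sight
    less messages
    grep -i "error" messages | tail -n 20

    # what's the machine actually doing?
    ps aux | grep httpd                 # httpd is just an example process name
    df -h
    tail -f /var/log/httpd/error_log    # hypothetical log path

If you can do that much comfortably over an SSH session, you're already well ahead of the point-and-click crowd I'm describing.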