That assumes that all the computers do the same job,
e.g. serving web pages or searching databases. A webserver can only hold so many connections before it can't accept any more; that's the point of load balancing, to share the connections over several servers and make the experience better for everyone...
(load balancing is built into Windows Server 2003). But a load-balanced web farm is still not a supercomputer.
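To make the "share the connections over several servers" idea concrete, here's a minimal round-robin sketch (the server names are made up, and a real balancer would also do health checks):

```python
# Minimal round-robin load balancing sketch -- backend names are
# hypothetical; real load balancers also health-check their pool.
from itertools import cycle

backends = ["web1", "web2", "web3"]  # hypothetical server pool
next_backend = cycle(backends)

def route(request_id):
    """Hand each incoming connection to the next server in turn."""
    server = next(next_backend)
    return (request_id, server)

# ten connections get spread evenly over the three servers
assignments = [route(i) for i in range(10)]
```

Each server ends up with roughly a third of the connections instead of one box taking all ten.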
It's a similar affair with clustering (though this is usually specific to database applications): a number of machines work together performing the same type of task. Usually, although the machines are physically separate, they share a common storage source such as a SCSI drive array or a network SAN.
The clusters (in this case) perform not only load balancing but also node redundancy, meaning one cluster node takes over when another fails.
Clustered services are still not supercomputers though.
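The failover part of that is simple in principle: requests go to the primary node until it dies, then a standby takes over. A toy sketch (node names made up; real cluster managers use heartbeats and shared storage for this):

```python
# Toy failover sketch -- node names are hypothetical; a real cluster
# detects failure via heartbeats rather than a flag we flip by hand.
nodes = {"node-a": True, "node-b": True}  # node -> is it healthy?

def active_node():
    """Return the first healthy node, simulating simple failover."""
    for name, healthy in nodes.items():
        if healthy:
            return name
    raise RuntimeError("no healthy nodes left")

primary = active_node()    # node-a serves while it's up
nodes["node-a"] = False    # node-a fails...
failover = active_node()   # ...and node-b takes over
```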
Grid computing is the way to make a supercomputer from smaller computers.
Basically, a supercomputer is defined as such because of the amount of calculations it can perform in a given time.
With grid computing you take many lesser computers and sum their power using software: you take a common task, divide it into smaller units, and send a small unit to each lesser computer; the lesser computers return their results in a format that's easy to put back together, and the last computer (or central computer) displays or stores the combined result.
Ergo you've completed many cycles in a shorter time by 'off-shoring' the work, and you have a supercomputer.
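The split/distribute/recombine idea looks like this in miniature. Here the "lesser computers" are simulated by a plain loop on one box; in a real grid each unit would be shipped over the network to a different machine:

```python
# Sketch of grid-style work splitting. The workers are simulated
# locally; in a real grid each chunk goes to a separate machine.
def work_unit(chunk):
    """One lesser computer processes its small unit of the task."""
    return sum(chunk)

def grid_sum(numbers, workers=4):
    # divide the common task into smaller units
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    # send a unit to each worker (simulated) and collect partial results
    partials = [work_unit(c) for c in chunks]
    # the central computer puts the partial results back together
    return sum(partials)

total = grid_sum(list(range(1000)))  # matches sum(range(1000)) == 499500
```

The partial results are trivially recombined because each unit is independent; that independence is exactly what makes a task a good fit for grid computing.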
Supercomputers of this nature are used in applications like protein-folding projects and the search for extraterrestrial life.
The SETI project (although not an official supercomputer) is the world's largest/fastest, due to the many hundreds of thousands of machines connected to the project...
The point is that with your handful of machines, you could...
Set up a web farm (but you'd still see the same speed limitations with a lesser number of connections, since you already said they weren't great machines; I'm assuming you aren't going to get a large amount of traffic).
Set up a clustered database, but to do this efficiently you really need a fast external drive with dedicated connections from each machine to the drive, and you also have to look at software that takes advantage of this; applications like Oracle are not only a right bitch to set up in cluster-ready mode, they're also very expensive.
Go with the last option (which is the only real way to build a supercomputer as the sum of all the processors, since they work in parallel), but it requires specialist software, either custom written (such as the protein folders or the SETI project) or adhering to a framework that hasn't been fully standardised yet.
TBH the best use for an old PC is either...
a small personal web server/FTP server to take work to and from school/college/uni/work
a file server to help you clear up space on your current machine.
a machine to share the internet with the rest of your house, possibly running firewall/proxy software to help protect you and your family.
a machine dedicated to illegally downloading stuff that's on 24/7 -see how long it takes to get in trouble
a personal media server for your house or even just for your room.
a portable MP3 player to go in your car (search the net, plenty of people have done this crazy idea!)