You have awoken me from my posting slumber, so thanks.
What you are asking about could be extremely complicated depending on how robust you want the solution to be.
The core functionality you are looking for is referred to as "screen scraping". There is a right way and a wrong way to do this (the wrong way — hammering a site with rapid-fire requests — will get the IP address you run it from blocked in some cases).
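The "right way" mostly comes down to being polite: identify your client, respect robots.txt, and space your requests out instead of firing them as fast as possible. Here is a minimal sketch of that last part in Java — the class name and the 2-second interval are my own illustrative choices, not any standard:

```java
// Minimal politeness helper: enforces a minimum delay between HTTP requests.
// The 2-second interval in main() is an illustrative assumption; check each
// site's terms of service and robots.txt for what it actually tolerates.
public class PoliteFetcher {
    private final long minIntervalMillis;
    private long lastRequestAt = 0;

    public PoliteFetcher(long minIntervalMillis) {
        this.minIntervalMillis = minIntervalMillis;
    }

    // Call this immediately before every request you send to the site.
    // If the previous request was too recent, it sleeps until enough
    // time has passed.
    public synchronized void waitYourTurn() throws InterruptedException {
        long now = System.currentTimeMillis();
        long wait = lastRequestAt + minIntervalMillis - now;
        if (wait > 0) {
            Thread.sleep(wait);
        }
        lastRequestAt = System.currentTimeMillis();
    }

    public static void main(String[] args) throws InterruptedException {
        PoliteFetcher fetcher = new PoliteFetcher(2000);
        long start = System.currentTimeMillis();
        fetcher.waitYourTurn(); // first call: returns immediately
        fetcher.waitYourTurn(); // second call: sleeps roughly 2 seconds
        System.out.println("Elapsed ms: " + (System.currentTimeMillis() - start));
    }
}
```

Wrapping every fetch in something like this keeps your crawler from looking like an attack, which is usually what gets an IP blocked.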
Essentially what you are asking about is a rudimentary web crawler.
Here is my advice to you:
1.) Don't make the script that searches the web pages a web page itself (in other words, don't trigger the crawl from a browser request).
2.) Instead, make that part of your "project" a standalone program (executable) that runs on a computer, visits those websites, grabs the info, and updates a database (Java is free, and there are many powerful IDEs that will help you get started quickly).
Other possible alternatives include C++, .NET (the IDE is not free), PHP, and Ruby. There are others, but these are the languages I would recommend, not necessarily in any order.
3.) Once you have the database populated, you SHOULD use a web interface to display the findings that have been placed in the database.
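To make step 2 concrete, here is a rough Java sketch of the extract-and-store part. Everything in it is invented for illustration — the HTML snippet, the `item`/`price` markup, and the class name — and in a real run you would download the page (e.g. via `java.net.URL`) and write rows to your database with JDBC instead of printing them:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of the "grab the info and update a database" step.
// The sample HTML and field names are made up; a real crawler would
// fetch live pages and replace the println loop with JDBC inserts.
public class ScrapeSketch {

    // Pull item -> price pairs out of a chunk of HTML with a regex.
    // Regexes are fragile against real-world HTML; for anything serious,
    // use a proper HTML parser instead.
    public static Map<String, String> extractPrices(String html) {
        Map<String, String> rows = new LinkedHashMap<>();
        Pattern p = Pattern.compile(
            "<span class=\"item\">([^<]+)</span>\\s*<span class=\"price\">([^<]+)</span>");
        Matcher m = p.matcher(html);
        while (m.find()) {
            rows.put(m.group(1), m.group(2));
        }
        return rows;
    }

    public static void main(String[] args) {
        String sampleHtml =
            "<span class=\"item\">Widget</span> <span class=\"price\">$9.99</span>" +
            "<span class=\"item\">Gadget</span> <span class=\"price\">$4.50</span>";
        Map<String, String> rows = extractPrices(sampleHtml);
        // In the real program this loop would be an INSERT/UPDATE per row.
        for (Map.Entry<String, String> e : rows.entrySet()) {
            System.out.println(e.getKey() + " -> " + e.getValue());
        }
    }
}
```

Once rows are landing in the database, the web interface in step 3 just reads from the database — it never touches the scraped sites directly.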
As for HOW to do it: that is far beyond the scope of my post, but here are some links to get you pointed in the right direction.
Keep in mind that screen scraping is NOT a basic task to perform.
Here is a list of already-written screen scrapers that you could potentially use to populate your database.
Hope this helps.
Please keep in mind that when it is done incorrectly or for malicious reasons, screen scraping can be considered an "unfavorable" activity at best.