Agree with pretty much all that's said above - just one thing I'd offer a slightly different perspective on:
Will it seriously change how programming is done?
It may well do. People (including me) are now seriously looking at writing their programs to work concurrently, taking advantage of multiple cores - and after a lot of research, no-one's found a way to say "here you are compiler, take this single-threaded program and safely optimise it for x cores." The prevailing view at the moment is that it just can't be done automatically.
If that's the case then people do need to change how they program. There are languages that require a completely different style of coding - it's a steep learning curve, but you get the advantage that these languages really do use all the cores available as effectively as they can. Check out occam-pi for instance: it works by wiring together networks of parallel processes that all communicate over channels. It's surprisingly intuitive once you get your head around the concepts, but it still feels like a very old / clunky language to me.
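To give a flavour of that process-and-channel model without actually writing occam-pi, here's a rough sketch of the same idea in Java (my own toy example, not occam-pi code): two independent "processes" that share no state and talk only over a channel, approximated with a `BlockingQueue`.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// A rough Java approximation of occam-pi's model: independent
// processes that share nothing and communicate only over channels.
// A producer sends the numbers 1..n down a channel; a consumer sums them.
public class ChannelSketch {
    static final int DONE = -1; // sentinel marking the end of the stream

    public static int runPipeline(int n) throws InterruptedException {
        BlockingQueue<Integer> channel = new ArrayBlockingQueue<>(4);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= n; i++) channel.put(i);
                channel.put(DONE);
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        final int[] total = {0}; // only the consumer touches this
        Thread consumer = new Thread(() -> {
            try {
                int v;
                while ((v = channel.take()) != DONE) total[0] += v;
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        return total[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline(10)); // sum of 1..10
    }
}
```

In occam-pi the channels and parallel composition are first-class language constructs rather than library calls, which is exactly why it feels so different to code in.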
The other possibility is that we see a shift to functional languages becoming the norm, or at least languages with more of a functional element to them (just like we saw a big shift from procedural languages being the norm to OO languages being the norm). See Scala and F#, which are essentially functional counterparts to Java and C# respectively. (This is probably why Microsoft got in on the game with F# - they didn't want to be left behind!) Functional languages are inherently much better suited to concurrency because they avoid shared mutable state and lean on recursion instead of loops (the purest functional languages like Haskell don't even have loops!). They do however require a complete shift in mindset from OO (most OO programmers avoid recursion like the plague unless there's an obvious case for it) - this may well happen, but a lot of people will get upset in the process!
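To make the mindset shift concrete, here's a tiny example of my own in Java (the names are mine, not from any library): the same sum written the imperative way and the functional way. The functional version mutates nothing, which is precisely what makes that style safe to evaluate concurrently.

```java
// Imperative vs functional style: same result, different mindset.
public class RecursionSketch {
    // Imperative: a counter mutated in place inside a loop.
    static int sumLoop(int[] xs) {
        int total = 0;
        for (int x : xs) total += x;
        return total;
    }

    // Functional: no mutation at all - the answer is built purely
    // from recursive calls. Because nothing is shared or mutated,
    // code in this style can be run in parallel without locks.
    static int sumRec(int[] xs, int i) {
        return i == xs.length ? 0 : xs[i] + sumRec(xs, i + 1);
    }

    public static void main(String[] args) {
        int[] xs = {1, 2, 3, 4};
        System.out.println(sumLoop(xs)); // 10
        System.out.println(sumRec(xs, 0)); // 10
    }
}
```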
Personally I believe that if multi-core processors keep growing in core count (sounds like a stupid point, but at this rate of growth it's going to be a while until we see 128 / 256 core home processors!) then the above shift in paradigms will happen. Until then, the change we're most likely to see is the one happening right now: easier-to-use concurrency libraries being added to existing languages and being used more often. Take Java for instance: we'll see a new fork-join framework with Java 7, allowing us to run very fine-grained concurrent tasks. That's being added to the java.util.concurrent package, which arrived in Java 5, and though I can't remember what they were I'm sure 6 had some additions as well. You have to code differently to use these, but it's generally not too hard to grasp while still offering the performance benefits you need.
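As a taste of what that fork-join style looks like, here's a minimal sketch using `ForkJoinPool` and `RecursiveTask` (the class name and threshold are my own choices): recursively split a sum over an array until the chunks are small, compute the halves in parallel, then combine.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// A minimal fork/join example: divide-and-conquer summing of an array.
public class ForkJoinSum extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000; // below this, just compute directly
    private final long[] data;
    private final int lo, hi;

    ForkJoinSum(long[] data, int lo, int hi) {
        this.data = data; this.lo = lo; this.hi = hi;
    }

    @Override
    protected Long compute() {
        if (hi - lo <= THRESHOLD) { // small enough: sum sequentially
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += data[i];
            return sum;
        }
        int mid = (lo + hi) >>> 1;
        ForkJoinSum left = new ForkJoinSum(data, lo, mid);
        ForkJoinSum right = new ForkJoinSum(data, mid, hi);
        left.fork();                          // run the left half asynchronously
        return right.compute() + left.join(); // do the right half here, then combine
    }

    public static long sum(long[] data) {
        return new ForkJoinPool().invoke(new ForkJoinSum(data, 0, data.length));
    }

    public static void main(String[] args) {
        long[] xs = new long[10_000];
        for (int i = 0; i < xs.length; i++) xs[i] = i + 1;
        System.out.println(sum(xs)); // sum of 1..10000
    }
}
```

The nice bit is the work-stealing scheduler underneath: idle worker threads pinch queued subtasks from busy ones, so you get decent load balancing without managing threads yourself.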
That said, the reason a lot of applications are coded to use just one core is that they aren't power-hungry enough to warrant using more. Take Quelea for instance (I released it yesterday, so it's in my head at the moment!). I wrote it from scratch in Java, and there's not a single concurrency library in sight. Yet. But the worst it's doing is a bit of Image IO (literally reading an image from a file), and the killer factor there is likely to be disk speed rather than anything else. So quite frankly, what's the point of spending twice as long writing it just to split the work over 4 cores? In the future though I'm planning to have it support video, and if that happens then yes, I'll be seriously looking at multithreading it.
However, an application I wrote last year for a company was a server-side beast that ran on a server with 4 rather powerful cores and involved processing huge amounts of text to extract useful information. In that case, yes, I did multithread it - and doing so reduced the time it took dramatically.
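That's obviously not the company's code, but the shape of the approach was roughly this (class and method names here are made up for illustration): split the text into independent chunks, hand each chunk to a thread pool sized to the core count, and combine the per-chunk results at the end.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Embarrassingly parallel text processing: each chunk is independent,
// so we can farm the chunks out to a pool of worker threads and just
// add up the answers. Here the "processing" is a simple word count.
public class TextCounter {
    public static long countWords(List<String> chunks) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        try {
            List<Future<Long>> results = new ArrayList<>();
            for (String chunk : chunks) {
                // Each task sees only its own chunk - no shared mutable state.
                results.add(pool.submit(() -> (long) chunk.trim().split("\\s+").length));
            }
            long total = 0;
            for (Future<Long> f : results) total += f.get(); // combine results
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<String> chunks = Arrays.asList("one two three", "four five");
        System.out.println(countWords(chunks));
    }
}
```

The speedup comes for free precisely because the chunks don't interact - if they did, you'd be into locks and shared state, and the job would be a lot less pleasant.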
So the answer? Yes, programming will change, and to a certain extent it already has. How it changes though is the big question - will people continue to use languages they know and love and just use concurrency libraries as best they can, or will they go out of their way to learn a new language better suited to the task and take on the potentially steep learning curve? As a side note, here's a tip for anyone looking to graduate in the next few years (but only once you've learnt another language to a decent level already!) - learn a functional language like F#, Erlang or Scala (it's the concepts that matter, so don't worry too much about which one you pick). At worst you've got another language under your belt that people might want, and at best you'll be in high demand in a few years when no-one else has the skills companies are after.