What prodigy said: the PCI Express bus is extremely fast. If you integrated the GPU onto the motherboard, there is (I would think) no way it would "talk" to the rest of the system any faster than it does now, at least with current GPUs. One issue with putting a high-end GPU onboard is board space. Look at a high-end GPU (well, really any GPU): they are basically small motherboards with small CPUs. They have VRMs (voltage regulator modules), a processor, and RAM. Where do we have room on a Micro ATX motherboard to put all of that? Maybe we could fit a Micro ATX motherboard and a GPU onto a board the size of the ATX form factor, but the way technology goes, no one is going to want to reverse course and get bigger motherboards. I mean, motherboards keep getting smaller and smaller; who is going to want to go back to a bigger motherboard just to have an integrated GPU?
However, I think we definitely need to move to something better than what we have right now. Think of the first graphics cards, just small thin cards... and look at what they have morphed into: cards that take up two slots, are long as hell (the highest-end ones), have heatsinks with fans, and even need their own power connectors. If you look at the progression of graphics cards, they just get bigger and bigger. Same thing with CPUs. The first CPUs didn't need a heatsink and used far less power than today's CPUs. Now? They need gigantic masses of aluminum or copper to keep them cool and draw a lot of power. Have you ever looked at graphics card temperatures? They are a travesty! Some can run at 90-110 C; that's roughly 194-230 F. And why are the good ones so long? Well, they need more board space, and you can't go out (that would hit the side of your case), so the only place to go is back.
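To keep myself honest on those temperatures, here's the conversion as a tiny Python sketch (the 90-110 C figures are the ones quoted above, not measurements):

```python
# Convert the GPU temperatures claimed above from Celsius to Fahrenheit.
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

for temp_c in (90, 110):
    print(f"{temp_c} C = {c_to_f(temp_c):.0f} F")
# 90 C = 194 F, 110 C = 230 F -- so "200+ F" is about right at the top end.
```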
Something needs to be done.
If motherboards had a GPU socket, you could have a REAL cooling solution for the GPU, and it would remove the need for a PCI Express slot (unless you need some other high-speed system bus for something that plain PCI is too slow for). Think about it: we're headed for bigger and bigger graphics cards unless 32 nm and smaller chips start coming onto the market, which would cut heat and power considerably. But 32 nm? You know how small that is? Correct me if I'm wrong, but a silicon atom is only about 0.2 nm across, so a 32 nm feature is already only on the order of a hundred atoms wide; you can't, even in theory, shrink a feature below about one atom, which puts the floor well under a nanometer. Sometime soon we'll hit that limit on a processor, and eventually silicon-based PCs will be obsolete. (Can you say quantum computer?) PC power supplies are also reaching the limit of what you can pull out of a wall socket (or so I've heard; 1200 W is closing in on the practical limit for a standard circuit), so that is our "limit" on PC power requirements. What makes CPUs so different from GPUs? Maybe one day all processing power will live on the CPU die; the architecture of GPUs (shader processors and such) could be emulated on a CPU die or just built in. The only thing stopping companies from embedding high-end GPUs on motherboards... is that people want to pick and choose. Cheap motherboards have embedded video for budget systems, and that's cool, but for gaming or CAD or something, you want some choice. Think of buying a motherboard and not being able to choose the CPU on it... that would suck, big time, and it's the same if you were paying for a high-end GPU embedded onboard that might be too fast or too slow for what you need.
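The two back-of-the-envelope numbers above (atoms per 32 nm feature, and wall-socket wattage) are easy to check. Here's a rough Python sketch; the 0.2 nm silicon atom diameter and the standard North American 15 A / 120 V circuit derated to 80% for continuous load are my assumptions for illustration, not exact figures:

```python
# Rough sanity-check numbers only; the constants below are assumptions.
SILICON_ATOM_DIAMETER_NM = 0.2  # silicon atom is roughly 0.2 nm across
FEATURE_SIZE_NM = 32

atoms_across = FEATURE_SIZE_NM / SILICON_ATOM_DIAMETER_NM
print(f"A {FEATURE_SIZE_NM} nm feature spans about {atoms_across:.0f} silicon atoms")

# Assumed: a 15 A / 120 V branch circuit, derated to 80% for continuous load.
AMPS, VOLTS, DERATE = 15, 120, 0.8
continuous_watts = AMPS * VOLTS * DERATE
print(f"Continuous draw limit from one outlet: about {continuous_watts:.0f} W")
```

So 32 nm is around 160 atoms wide (not a couple), and under those circuit assumptions the continuous limit is about 1440 W, which is why 1200 W supplies really are closing in on it.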
Last thought: as for graphics RAM, we could definitely fit that on the motherboard. I mean, shit, graphics cards average about 512 MB - 1 GB of RAM; you can easily fit that on a laptop RAM stick, and probably on a module even smaller than that. So put a small RAM slot somewhere on the motherboard (maybe parallel to the system RAM slots) and we could drop a GPU RAM stick in there. That's not the problem; the main problems are the power consumption and the board space needed for a GPU socket. And before anyone asks, yes, I'm thinking of getting this book published.