Why aren't there GPU sockets?

Well... there are motherboards with GPUs onboard or integrated, but there's one main reason why they don't just put a high-powered GPU on the motherboard.

There's not enough space!

- Look at the heatsinks on today's video cards! If it weren't for those, GPU temperatures would exceed 100 °C even at idle! A CPU heatsink takes up enough room on the motherboard as it is, so where would you add a second one for the GPU?

- Memory! Where the hell are we going to put all that video memory? You'd need about two or three square inches free on the motherboard to fit around 1 GB of video RAM (rough math below). Where's that going to go?
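As a quick sanity check of that "two or three square inches" estimate, here's a back-of-envelope sketch. The package footprint (~14 x 12 mm) and density (1 Gbit per chip) are assumed, typical-ish GDDR3 numbers, not figures from this thread; real parts vary.

```python
# Rough check: board area needed for 1 GB of video RAM.
# Assumptions (not from the thread): one GDDR3 chip in a ~14 x 12 mm
# FBGA package holding 1 Gbit (128 MB).

MM2_PER_IN2 = 25.4 ** 2           # square millimetres per square inch

chip_area_mm2 = 14 * 12           # assumed footprint of one memory package
chip_capacity_mb = 128            # assumed density: 1 Gbit = 128 MB

chips_needed = 1024 // chip_capacity_mb            # chips for 1 GB -> 8
total_area_in2 = chips_needed * chip_area_mm2 / MM2_PER_IN2

print(f"{chips_needed} chips, about {total_area_in2:.1f} sq in of board space")
# -> 8 chips, about 2.1 sq in (before trace routing, so 2-3 sq in is fair)
```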

The point is, such boards exist, but nobody puts flagship GPUs on the motherboard. It would just take up too much space.

Now... laptops are a completely different story...

And why aren't GPU cores interchangeable? Because the architecture changes significantly with each new GPU generation.
 
I think they should make a GPU socket; it would make life so much easier.
Just add two more slots for video memory.
Surprised NVIDIA never did that...
 
I don't think it's a problem having what we have now.

Too much space would be used on the motherboard if it had to be wired up with separate connections for a video chip alone, unless the chip shared standard system RAM the way integrated graphics do, and that isn't very effective.

It would put a lot more strain on the CPU in the end, I'm guessing.
 
I dunno if making a GPU socket would be a good idea, because then you'd have manufacturers changing sockets every year or so whenever they found a way to get better bandwidth or something. Then you work CPU socket changes in there too, and you have design departments being worked to the bone, unable to come up with better boards because they're just trying to get stuff to work. A universal interface like PCI-E makes more sense to me as long as it has enough bandwidth. Not to mention the other drawbacks listed, like not enough space for a heatsink, etc.
 

If they had two slots next to the regular RAM that were GPU-only, that would be cool, and it would save space in the end. You see how big those friggin' boards are on the 8800s and above. The only problem is it would rely on the motherboard chipset, and that could put a big strain on it.
 
If they ever did what you guys are going on about, it would be foolish. You'd be stuck with a fixed socket configuration and fixed support electronics; if a new generation of GPU chips came out with a different pin layout or more pins, you'd be so screwed. You wouldn't be able to use it.

But if you have a slot on the board like we have now, you just swap out the card and go on about your business.

If the GPU were on the motherboard, then when a next-gen chip came out with more pins or a different layout, you'd have to BUY A NEW MOTHERBOARD. So what would you rather do: buy a nice video card for a couple hundred dollars, or replace your motherboard for five or six hundred every time a new generation of chips came out? It's not practical.
 
OK, but that wasn't my point.

My point is that something needs to be implemented that takes more advantage of the bandwidth these $400 GPUs can put out. I don't know, I just don't get why it takes the industry so long to increase the bandwidth of the PCI Express bus, and even then it's not that much more:


2004: PCIe 1.x, data rate 250 MB/s per lane

2007: PCIe 2.0, data rate 500 MB/s per lane

Expected 2010: PCIe 3.0, data rate ~1 GB/s per lane
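Worth noting: those figures are per lane, and a x16 graphics slot carries 16 lanes, so the slot bandwidth is 16x higher (per direction). A quick calculation:

```python
# Per-lane PCIe data rates (after encoding overhead), as listed above.
per_lane_mb_s = {
    "PCIe 1.x (2004)": 250,
    "PCIe 2.0 (2007)": 500,
    "PCIe 3.0 (~2010)": 1000,
}

for gen, lane_mb_s in per_lane_mb_s.items():
    slot_gb_s = lane_mb_s * 16 / 1000   # a x16 slot has 16 lanes
    print(f"{gen}: {lane_mb_s} MB/s per lane -> {slot_gb_s:.0f} GB/s at x16")
# PCIe 1.x: 4 GB/s, 2.0: 8 GB/s, 3.0: ~16 GB/s, each direction
```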
 
What prodigy said: the PCI Express bus is extremely fast. If you integrate the GPU onto the motherboard there is (I would think) no way the GPU would "talk" to the rest of the system faster than it can now, at least with current GPUs. One issue with putting a high-end GPU onboard is board space. Look at a high-end GPU (well, really any GPU): they are basically small motherboards with small CPUs. They have VRMs (voltage regulator modules), a processor, and RAM. Where do we have room on a micro-ATX motherboard to put all of that? Maybe we could fit a micro-ATX motherboard and a GPU onto a board matching the ATX form factor, but the way technology goes, no one is going to want to reverse course and get bigger motherboards. I mean, motherboards get smaller and smaller; who is going to want to revert to a bigger motherboard just to have an integrated GPU?

However, I think we definitely need to move to something better than what we have right now. Think of the first graphics cards: just small, thin cards. Think of what they have morphed into: cards that take up two slots, are long as hell (the highest-end ones), have heatsinks with fans, and even need their own power connector. If you look at the progression of graphics cards, they just get bigger and bigger. Same thing with CPUs: the first CPUs didn't need a heatsink and used far less power than today's. Now? They need gigantic masses of aluminum or copper to keep them cool, and they use a lot of power. Have you ever looked at graphics card temperatures? They are a travesty! Some can run at like 90-110 °C; that's 200+ °F (quick conversion below). And why are the good ones so long? They need more board space, and you can't go out, or you'd hit the side of your case; the only way to go is back.
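Checking that temperature claim with the standard Celsius-to-Fahrenheit formula:

```python
def c_to_f(celsius):
    """Convert Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

for c in (90, 110):
    print(f"{c} C = {c_to_f(c):.0f} F")
# 90 C = 194 F, 110 C = 230 F -- so "200+ F" is about right
```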

Something needs to be done.

If motherboards had a GPU socket you could have a REAL cooling solution, and it would remove the need for a PCI Express slot (unless you need some other high-speed system bus for something PCI is too slow for). Think about it: we're headed for bigger and bigger graphics cards unless 32 nm and smaller chips come onto the market, thereby needing a lot less heat and power. But 32 nm? You know how small that is? Correct me if I'm wrong, but that's only on the order of a hundred silicon atoms wide (rough math below), and in theory a feature can't be smaller than a single atom, which is only a fraction of a nanometer. Sometime soon we'll hit the limit of how few nanometers a processor can use, and eventually silicon-based PCs will be obsolete. (Can you say quantum computer?)

PC power supplies are reaching the limit of power you can pull out of an AC wall socket (or so I've heard; 1200 W is closing in on the limit), so that is our "limit" on PC power requirements.

What makes CPUs so different from GPUs? Maybe one day all processing power will live on the CPU die; the architecture of GPUs (shader processors and such) could be emulated on a CPU die or just built in. The only thing stopping companies from embedding high-end GPUs on motherboards is that people want to pick and choose. Budget motherboards have embedded video for budget systems, and that's cool, but for gaming or CAD or something, you want some choice. Think of buying a motherboard and not being able to choose the CPU on board: that would suck, big time, and it's the same if you were paying for a high-end GPU embedded onboard that might be too fast or too slow for what you need.
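Here's the rough math behind "a hundred atoms." Using the silicon-silicon bond length (~0.235 nm) as a crude per-atom spacing is an assumption; real lattice geometry is more complicated, but it gives the right order of magnitude:

```python
# How many silicon atoms fit across a 32 nm feature?
feature_nm = 32
si_spacing_nm = 0.235   # approx Si-Si bond length; a rough assumption

atoms_across = feature_nm / si_spacing_nm
print(f"~{atoms_across:.0f} atoms across a {feature_nm} nm feature")
# -> ~136 atoms: tiny, but closer to a hundred atoms than to "a couple"
```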

Last thought: as for graphics RAM, we could definitely fit that on the motherboard. I mean, shit, graphics cards average about 512 MB - 1 GB of RAM; you can easily fit that on a laptop RAM stick, and probably on a module even smaller than that. So put a small RAM slot somewhere on the motherboard (maybe parallel to the system RAM slots) and we could put a GPU RAM stick in there. That's not the problem; the main problems are the power consumption and board space needed for a GPU socket. And before anyone asks, I'm thinking of getting this book published :)
 
It just isn't cost-efficient for the companies. As you said, a GPU requires a lot of power, and it would take a lot more resources to build the motherboard. You'd need more transistors and voltage regulators, plus a separate slot on the motherboard just for graphics RAM. And what about people who want to run SLI or Crossfire? I want a laptop. Also, you have it the opposite way: processors today are using less power and putting out less heat per core. Compare a 90 W single-core P4 to a 95 W triple-core (per-core math below). We have these gigantic heatsinks because we have the ability to overclock like hell now; processors have always used heatsinks. A graphics chip is designed very differently from a processor, and they operate completely differently. The GPU takes a huge load off the CPU, giving you performance that integrated graphics can't match. These days the GPU even takes care of PhysX.
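The per-core arithmetic behind that comparison, using the TDP figures as quoted in the post (TDP is a rough design ceiling, not measured draw):

```python
# Per-core power for the two chips compared above.
p4_tdp_w, p4_cores = 90, 1      # single-core Pentium 4, as quoted
tri_tdp_w, tri_cores = 95, 3    # triple-core chip, as quoted

print(f"P4: {p4_tdp_w / p4_cores:.0f} W per core")           # 90 W
print(f"Triple-core: {tri_tdp_w / tri_cores:.1f} W per core")  # ~31.7 W
# Similar total power budget, roughly a third of the power per core.
```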
 
I've asked this question before here. See, I believed in the idea before, but there are a few issues that MAY render it impossible.

GDDR speed support, stream processor expectations, power consumption (number of power plugs), and much more. Those three alone could explain why. But I might be wrong...
 