Why aren't there GPU sockets?

Yeah, I think maybe I'm a dumbass for asking this, but why don't motherboards have GPU sockets instead of a PCIe x16 bus? I just think PCIe x16 doesn't take full advantage of a graphics card's bandwidth, and I don't get why we're still using add-in cards for GPUs. Or maybe I'm thinking about this the wrong way?
 
Maybe it's got some cooling difficulties.
Or maybe it's just the way the motherboard is designed.

I thought there already is a GPU socket; it's on the graphics card, is it not? :confused:
 
Then the processor would have extra work to do. The GPU sits on a dedicated video board that has its own RAM and chip; the processor only has to send and retrieve information from it. When it's integrated, more system resources get used.
 
I thought there already is a GPU socket; it's on the graphics card, is it not? :confused:

I don't see why you're confused. The GPU socket is on the card, not on the motherboard. Remember when there were add-in cards for processors, mainly for laptops? People saw almost a 70% drop in performance compared to the same processor in an actual socket. That's what I'm thinking with GPUs, though maybe we wouldn't see such a performance hit because a GPU is vastly different from a CPU. But really, I don't see why it would be that hard to put a GPU socket on a motherboard. Manufacturers like BFG and EVGA could just release their own heatsink + fan combo with the Nvidia/ATI GPU.


I just think a PCIe x16 card is limiting the performance of a GPU. Maybe companies actually know this but keep it going so that GPUs always need to be updated and always stay expensive.
 
Does a graphics card's performance improve when you move it from a PCI-E 1.0 slot to a 2.0 slot? I don't think so...

I think ATi and Nvidia know what they are doing, and besides, I don't think it would be practical to implement.
 
I just think a PCIe x16 card is limiting the performance of a GPU. Maybe companies actually know this but keep it going so that GPUs always need to be updated and always stay expensive.

It doesn't; if it did, there would be a different interface. That's why PCIe x16 2.0 was developed for newer, faster cards.
 
I think most cards wouldn't even be very negatively affected by using AGP. Graphics interface bandwidth has never been a big issue with graphics cards. A 6600GT was tested on AGP, and there was no difference between 4x and 8x AGP speeds.
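
As a rough back-of-the-envelope sketch (these are the published theoretical peak rates in one direction, not measured throughput, and exact figures vary a little by source):

# Rough theoretical peak bandwidth per slot type, one direction.
# AGP: ~266 MB/s base rate times the speed multiplier.
# PCIe 1.0: 250 MB/s per lane (2.5 GT/s with 8b/10b encoding);
# PCIe 2.0 doubles that to 500 MB/s per lane.
slots = {
    "AGP 4x":       266 * 4,    # ~1.1 GB/s
    "AGP 8x":       266 * 8,    # ~2.1 GB/s
    "PCIe 1.0 x16": 250 * 16,   # 4 GB/s
    "PCIe 2.0 x16": 500 * 16,   # 8 GB/s
}
for name, mb_s in slots.items():
    print(f"{name}: {mb_s} MB/s (~{mb_s / 1000:.1f} GB/s)")

If a 6600GT can't tell AGP 4x from 8x apart, it isn't even saturating ~1 GB/s, so the 4-8 GB/s of a PCIe x16 slot is mostly headroom.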

And there are such things as "GPU sockets" on motherboards too. It's called onboard (not integrated) graphics, where a graphics chip is attached to the motherboard. It's rare nowadays, but it used to be quite common.
 
It's not all that rare; Nvidia has numerous 8-series boards out that support SLI between the onboard chip and an additional card, and ATI does as well.
 
Then the processor would have extra work to do. The GPU sits on a dedicated video board that has its own RAM and chip; the processor only has to send and retrieve information from it. When it's integrated, more system resources get used.

Let me expand on that.
When you have a socket attached to a board, you are stuck with whatever configuration that socket was set up with: how many pins, what they do.
You have to design the chip that is going to be used in that socket to mate up with the functions and connections of the board the socket is attached to. If you go with a fixed pin/function layout, you stifle the design of the chips that can go in that socket on that board.
However,
With a separate card (a video card), if you want to alter, add to, reconfigure, or do a new generation of that chip, you simply have the socket makers make a new socket to your specs and the PCB makers make a new board. With the socket on your motherboard, you're stuck with what you have, with little to no chance of upgrading to the next generation.
 