ATI Reveals Radeon X1900!!!

Xeno? Are you on drugs...?

The X1800XT, 7800GT/X and maybe even the X1600XT can outperform that thing. I've seen the XBOX 360 graphics, my friend, and they weren't all that much better than the original XBOX's.

But that's beside the point...

The thing is ATi will probably dominate the first and second quarters of '06, then nVidia will probably launch an 8xxx series card and dominate the last two quarters, and then it will just fluctuate back and forth like it always does. Or perhaps ATi can dominate the majority of this year and still have a better card than nVidia. But that's up for debate. Good news is that the video card industry just took another price drop! Which means I can get X8xx series cards for cheaper than they were before. Bwahahaha! :)
 
You do know that it takes time for developers to fully utilize the Xbox 360's hardware, right? The Xenos chip is extremely powerful. Its full potential hasn't been tapped yet, that's all.

This is how the Xenos chip really works.

Alvino- From thread "The Xbox 360 Report" said:
The ATI Xenos chip is one of the most unique GPUs ever made. With the PS3's RSX and the PC's normal graphics cards, the pixel pipelines and vertex shaders are all separate, so there will be a fixed number of each (like the 7800 GTX's 24 pixel pipelines and 8 vertex shaders). The disadvantage to this is that sometimes some of the pipelines or shaders won't be completely used, which means they just sit idle. This makes it inefficient and a waste.

ATI realized that separating pixel pipelines and vertex shaders is a relatively inefficient process. Obviously you need both shader types, but why fix how many of each you need? Thus the Unified Shader Architecture was conceived. Instead, ATI created generic shader engines that can be dynamically assigned to either pixel or vertex functions as needed (48 of them in the Xenos' case). You can devote all your engines to vertex processing when there's a lot of triangles, or if there's only a few large triangles but lots of pixels, you can devote all the shader resources to pixel processing. ATI claims this unified shader model yields the best performance (120 billion operations per second) with maximum efficiency. This flexibility allows the GPU to efficiently utilize its pipelines instead of leaving some of them hanging like the RSX (which uses the traditional independent shaders).
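To visualize why the unified approach helps, here's a toy Python sketch (my own simplified model for illustration, not how the actual hardware scheduler works) comparing a fixed 24-pixel/8-vertex split against a 48-unit unified pool on a geometry-heavy frame:

```python
# Toy model of fixed vs. unified shader allocation. The unit counts come
# from the post above (7800 GTX: 24 pixel + 8 vertex; Xenos: 48 unified);
# everything else is a simplifying assumption.

def fixed_pipeline_time(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
    """Fixed split: each unit type can only work on its own kind of job,
    so the frame takes as long as the most overloaded unit type."""
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_pipeline_time(vertex_work, pixel_work, total_units=48):
    """Unified pool: every unit can take either kind of job, so the
    combined workload is spread across all of them."""
    return (vertex_work + pixel_work) / total_units

# Geometry-heavy frame: lots of vertex work, little pixel work.
v, p = 800, 100
print(fixed_pipeline_time(v, p))    # vertex units bottleneck; pixel units idle
print(unified_pipeline_time(v, p))  # all 48 units share the total load
```

In the fixed design the 24 pixel pipelines finish almost immediately and then sit idle while the 8 vertex shaders grind away; the unified pool finishes the same frame several times faster because nothing goes unused.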

gpu_sops_xbox360vsps3.jpg

The PS3's bar is an estimate of the RSX's GPU shader operations per second. It is based upon the currently released specs of the RSX.

According to ATI, the Xenos is actually TWO chips. The GPU is the "parent" die, but there is also a "daughter" die, which has embedded memory, and inside that memory is intelligence that does a lot of the graphics processing from within the memory itself. Instead of relying on the main system memory for anti-aliasing, Z, alpha processing and stencil processing, all of that is done within this embedded memory. With the intelligence built into the memory, the data never has to leave it. This pretty much gives you infinite bandwidth.

GPU_foto.jpg


To be precise, it gives you 2TB per second of memory bandwidth. The big payoff from all this headroom is in anti-aliasing, which is a major factor in the Xenos' HDTV output. If you compare it with the R520 (better known as the X1800), there is a lot of heavy lifting with Multi-Sampling and Super-Sampling, which entails a lot of extra drawing and blending. With the Xenos, all the anti-aliasing happens within the graphics memory, so there's no impact on surrounding system resources and you're not bogging down the main system memory with samples that never get used. This helps everything become more efficient and resourceful.
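To put rough numbers on why anti-aliasing is so bandwidth-hungry in the first place, here's some back-of-the-envelope Python (the sample sizes, overdraw factor, and frame rate are my own illustrative assumptions, not official ATI figures):

```python
# Rough framebuffer traffic for 4x multisample anti-aliasing at 720p.
# All numbers are simplified assumptions for illustration only.

width, height = 1280, 720
samples = 4            # 4x MSAA stores 4 samples per pixel
bytes_per_sample = 8   # assume 4 bytes color + 4 bytes depth per sample

frame_bytes = width * height * samples * bytes_per_sample
print(frame_bytes / 2**20)   # MiB of color+Z sample data per frame (~28 MiB)

fps = 60
overdraw = 3                 # assume each pixel is drawn ~3 times per frame
traffic_gbps = frame_bytes * overdraw * fps / 1e9
print(traffic_gbps)          # GB/s of framebuffer traffic -- a sizable slice
                             # of main memory bandwidth if the embedded
                             # memory didn't absorb it
```

Even with these conservative assumptions you get several gigabytes per second of pure framebuffer shuffling, which is exactly the traffic the daughter die keeps off the main memory bus.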

In my opinion, the Xenos is one of the best and most unique GPUs to date. ATI has really done a great job designing this chip and making it possible. Yes, the PS3's RSX may have 50MHz more clock speed (the Xenos runs at 500MHz while the RSX runs at 550MHz), but it's going to be inherently less efficient and will run on pure brute force instead of grace. Also, the RSX doesn't have the Xenos' intelligent memory, which means the PS3 (which splits its memory into 256MB for the CPU and 256MB for the GPU) will have the unfortunate honor of doing all those processes with far slower memory bandwidth than the Xenos.

I'm sorry for the long read, but I thought I'd just explain the Xbox 360's GPU, because not many people actually understand how it really works.
 
That's too long to read. My eyes are seeing dots. I'll take your word for it that it's good. I'm not much of an ATI fan anyways. Good luck saving, but now you have another reason to keep saving longer knowing that the X1900XT is coming out, lol.
 
Well, for me, I say screw the X1K series, because they're WAY too expensive at the moment. But as long as they keep pumping out X1K series cards, that just means my future video card, the X850XT PE, will get cheaper and cheaper. And besides, today's video cards are still amazing compared to years ago. I mean, the Radeon 9500 series cards can still play today's most demanding games, maybe not at the greatest settings, but they still play them nonetheless.

What I'm saying is that the more nVidia and ATi challenge each other, the better off us consumers are, because it will just keep lowering card prices, like on the 7 series cards. Another interesting thought: video cards may be getting more and more pixel pipelines and higher clock speeds, but soon enough that may not need to happen, because Physics Processing Units are coming out. Since a video card currently has to process the shaders, the 2D images, the 3D images, as well as physics and their animations, the new PPU chips will take a lot of that load off on their own. Which is good, because from what I can tell you'll still be able to use a mid-range card and get high-quality performance! :)


The XBOX 360... well, I knew that ATi was doing something to the pixel pipelines and whatnot, and that was it. Glad to know how it works and what it can do. But I don't think the XBOX 360 will have better graphics than the PS3. But this thread isn't about XBOX 360 vs. PS3, it's about the latest and greatest video card, the X1900 series. So let's just end the XBOX vs. PS3 debate and move on to what this topic is really about: the X1900 series video card.


The X1900 will hopefully be able to use that GDDR4 RAM and Shader Model 4.0, because I'm tired of this GDDR3 and Shader Model 3.0. We need new stuff out. Hopefully this card can deliver. :)
 
Oh, you can mentally block out the stuff about the PS3. I was too lazy to edit everything. :p

The R580 sounds promising...I can't wait to see the benchmarks and reviews for it. Oh also, the X850 XT PE is a pretty good card, but if you're lucky enough to get a X800 GTO2 with a R480 core, then you can easily unlock the X800 GTO2's pixel pipelines and overclock it to X850 XT PE speeds for way cheaper. Just thought I'd let you know. ;)
 
I prefer ATi over nVidia because when you upgrade with CrossFire you don't have to get two of the exact same card; with SLI you have to buy two identical cards. =/
 
Yeah, that is true. But you should wait for CrossFire to completely mature...it's still a bit buggy right now.
 
One thing I'm confused about, since I've never used an ATI card before besides the integrated ATI Xpress in my laptop: can you use CrossFire cards in any motherboard that supports PCI-E x16? From what I can see, all motherboards (if their chipset supports PCI Express) can support either ATI CrossFire or nVidia SLI mode, right?

There's also an ATI All-In-Wonder PCI Express Edition already out now. It's powered by the Radeon X1300 and costs around $200.
 
TRDCorolla said:
One thing I'm confused about, since I've never used an ATI card before besides the integrated ATI Xpress in my laptop: can you use CrossFire cards in any motherboard that supports PCI-E x16? From what I can see, all motherboards (if their chipset supports PCI Express) can support either ATI CrossFire or nVidia SLI mode, right?

There's also an ATI All-In-Wonder PCI Express Edition already out now. It's powered by the Radeon X1300 and costs around $200.

Well, technically, SLI and CrossFire are basically the same thing, in that they both use a link interface to connect two GPUs together via a cable or some other device. I do wonder, though, if you can use CrossFire in an SLI board. But the more I think about it, the more it seems like a no, simply because I think you have to have a certain chipset on the motherboard. But then again, if an SLI chipset can power two SLI video cards, why can't it power two CrossFire cards, or vice versa?

Anywho. I definitely like seeing new and better graphics cards out on the market. It means that we, the consumers, get to experience even more realism in our games, video editing, designing, or whatever else requires a graphics card. :)
 