There was an API war back in the mid-90s, when Microsoft was using its "Embrace, Extend, Extinguish" tactics to try to kill OpenGL. SGI responded by optimizing OpenGL for Windows themselves, since Microsoft had crippled it, and proved that games and graphics programs ran just as well, if not better, than on Direct3D. But that round of the API war left neither side in a dominant position, until Microsoft persuaded game developers to take the API choice away.
This time around, with the Vista release, we saw Microsoft back at its old tricks with a fresh round of "Embrace, Extend, Extinguish." All the way back in 2004, Nvidia, IBM, Sony, and many others began working to clean up OpenGL for everyone. Sony wrote Collada FX and donated it to OpenGL founder SGI. Soon a united effort by almost every hardware manufacturer rose not only to rewrite and clean up OpenGL, but to extend it with scalable, cross-platform abilities far surpassing everyone's wildest dreams.
So it's not that Microsoft wouldn't share so much as that the rest of the world, locked out of Microsoft's private club, decided to make its own club. The result of the combined man-years of labor from all these contributing corporations, enough to write a whole operating system, is that they have exceeded their own expectations.
Microsoft DirectX? What's that? LOLz. That's what people will be asking in 10 years!
========================================================
But yes, the Unified Shader Model is in RSX, and it has many secrets that are going to be revealed soon. Notice that Sony never said RSX was less than its near-2TFLOPS performance by itself; it was everyone else insisting it was a G70. Sony never said it was or wasn't. How do you think they got that number, though? Made it up? No, and near 2TFLOPS for RSX alone isn't NVFLOPS either!
Even the Xbox 360 was not able to take full advantage of the Unified Shader Model architecture until DX10 was finalized, which is why Gears was their first great-looking game, and why they could then display 1080p. But Xenos still does not support the full DX10 feature set, because Microsoft had only just begun DX10 development before the GPU was finished.
But yes, the teams behind MGS4 and Killzone II are the only developers involved in beta testing MT Evans' DX10 features. With Killzone II it should be obvious that RSX is DX10 capable. People should have realized that when they announced they were doing deferred rendering in combination with MSAA for the demo last summer. Only PC GPUs like the G80 series cards are capable of doing that, and the capability to do it is required by Microsoft's own DX10 spec. Xenos is not capable of running this game or meeting that spec.
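For anyone unclear on what "deferred rendering in combination with MSAA" actually means: the geometry pass writes surface attributes into a G-buffer, and lighting is applied later in a separate pass over that buffer, which is part of why combining it with MSAA is usually tied to DX10-class hardware that can read individual samples back out of a multisampled target. Here is a minimal CPU-side sketch of the two-pass idea in plain C. It is purely illustrative, not taken from any actual engine, RSX, or Killzone code:

/* Minimal sketch of deferred shading: a geometry pass fills a G-buffer
 * (position, normal, albedo per pixel), then a separate lighting pass
 * reads it back and accumulates light. Illustrative only -- real engines
 * do this on the GPU with render targets, not plain C arrays. */
#include <stdio.h>

#define W 4
#define H 4

typedef struct { float x, y, z; } Vec3;

typedef struct {
    Vec3 position;   /* world-space position of the surface at this pixel */
    Vec3 normal;     /* surface normal */
    Vec3 albedo;     /* diffuse colour */
} GBufferTexel;

static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

int main(void) {
    GBufferTexel gbuf[W * H];
    Vec3 framebuffer[W * H];

    /* Geometry pass (stand-in): write surface attributes, no lighting yet. */
    for (int i = 0; i < W * H; ++i) {
        gbuf[i].position = (Vec3){ (float)(i % W), (float)(i / W), 0.0f };
        gbuf[i].normal   = (Vec3){ 0.0f, 0.0f, 1.0f };
        gbuf[i].albedo   = (Vec3){ 0.8f, 0.8f, 0.8f };
    }

    /* Lighting pass: shade every pixel from the G-buffer, independent of
     * how much geometry was drawn -- the property deferred shading buys. */
    Vec3 light_dir = { 0.0f, 0.0f, 1.0f };   /* simple directional light */
    for (int i = 0; i < W * H; ++i) {
        float ndotl = dot3(gbuf[i].normal, light_dir);
        if (ndotl < 0.0f) ndotl = 0.0f;
        framebuffer[i] = (Vec3){ gbuf[i].albedo.x * ndotl,
                                 gbuf[i].albedo.y * ndotl,
                                 gbuf[i].albedo.z * ndotl };
    }

    printf("pixel 0 shaded: %.2f %.2f %.2f\n",
           framebuffer[0].x, framebuffer[0].y, framebuffer[0].z);
    return 0;
}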
Results? MGS4 and Killzone will set the standard for PS3 games to follow in the years to come!
=======================================================
Now just one last tidbit to set your tail wagging if you are a PS3 fan!
Here's an excerpt from a deferred rendering study on Cell, running on only 5 of Cell's SPEs, which were clocked higher than the PPE's 3.2GHz. Now, they conveniently left out the GHz, but it doesn't take a genius to see that's what it is. Especially since I'll post the graph showing Cell running at 4GHz in first-pass lab tests back in 2005. That was a six-pass test with an Fmax of 4.4GHz. Now for this study done last summer!
Published by Dr. Mallinson of SCE Research and Development and the University of Utah, August 03, 2007
Deferred Pixel Shading on the PLAYSTATION®3
Our initial results are encouraging and we find benefits from the higher clock rate of the Cell/B.E. and the more flexible programming model. We chose an extreme test case that stresses the memory subsystem and generates a significant amount of DMA waiting. Despite this waiting the algorithm scaled efficiently with speedup of 4.33 on 5 SPEs. This indicates the Cell/B.E. can be effective in speeding up this sort of irregular fine-grained shader. These results would carry over to less extreme shaders that have more regular data access patterns.

The next two sections of this paper introduce the graphical problems we are solving and describe related work. We next describe the architecture of the computer entertainment system under study and performance measurements of the pixel shader. We study the performance of that shader on a test image and compare it to the performance of a high-end state of the art desktop GPU, the NVIDIA GeForce 7800 GTX. Our results show the delivered performance of the Cell/B.E. and GPU were similar even though we were only using a subset of the Cell/B.E. SPEs. We finish with some concluding remarks.
The Link to this PDF document:
http://www.box.net/shared/static/3s1lel0o4s.pdf
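As a quick back-of-the-envelope check on the scaling figure quoted in that excerpt (my own illustration, not something taken from the paper): a speedup of 4.33 measured on 5 SPEs works out to roughly 87% parallel efficiency.

/* Back-of-the-envelope check of the scaling figure quoted above:
 * efficiency = speedup / number of processors. */
#include <stdio.h>

int main(void) {
    double speedup = 4.33; /* speedup reported on 5 SPEs in the excerpt */
    int spes = 5;          /* SPEs used in the study */
    printf("parallel efficiency: %.1f%%\n", 100.0 * speedup / spes); /* ~86.6% */
    return 0;
}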
IBM is touting a 45nm Cell BE, destined for the PS3, capable of running at 6 to 7GHz at the same power usage as the present Cell BE running at 4GHz!
http://www.xbitlabs.com/news/cpu/di...IBM_Touts_New_Generation_Cell_Processors.html
So Cell by itself, with just 5 SPEs running at a higher clock rate of 4.33GHz, can produce deferred rendering results comparable to an Nvidia 7800 GTX GPU.
This means Cell BE is running DPM (Dynamic Power Management) like IBM's "EnergyScale" system, which, coincidentally, was developed at the STI research labs in Austin, Texas, where Cell BE was born.
The new 45nm Cell BE for PlayStation 3 will be capable of running as high as 6 to 7GHz. This will be the third-generation Cell, and it could spawn a Slim version!
If Cell can logically outgun a 7800 GTX using all its cores, isn't it obvious that RSX is not one? When Killzone II runs all its deferred rendering and MSAA on RSX, that's a feat only an Nvidia 8800 GTX could pull off. Plus, has anyone ever attempted to overclock a 7800 by over 20% to reach RSX's 550MHz? No! It would cook itself in no time. Maybe 10 to 12%, but let's be real.
RSX is no off-the-shelf 7800 part, with its write interface to XDR memory running at speeds of up to 3.2GHz through the EIB. Those interfaces inside RSX have to be Rambus parts. Have you ever seen a PC GPU that can hook up to FlexIO or FlexPhase memory? None exist today!
Both of them are Rambus interfaces, one for FlexIO and one for GDDR3. RSX must have some type of EIB or direct bus to the display engine for these studies to have even taken place, in order to bypass RSX's shaders and other components.
Here's the discovery I made just a few weeks ago, and it's been staring us in the face: the only memory controller interface that can do this is Rambus's own GDDR3 memory controller. Read this PDF to see why:
http://www.box.net/shared/static/ufs42dnsw4.pdf
@Death Dealer
Up until now, yes, RSX has been running, but it is not showing its full power, and the lack of a proper API also means the drivers can't properly take advantage of RSX's full power either. These are called layers of abstraction. One of the layers that separates high-level code (games and programs) from the low-level hardware code (which, until MT Evans is released here soon, would be the improper API) is the virtualization layer, which is part of the hypervisor. That hypervisor is capable of making RSX look like anything they want anyone to see. It is even capable of keeping developers from seeing the truth of it, maybe not on purpose, but rather just to get a driver working on Game OS with access to the chip until the proper API can be applied through a firmware upgrade! See the rough sketch below for the general idea.
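For anyone wondering what a "virtualization layer" sitting between the driver and the chip looks like in the abstract, here is a loose conceptual sketch. Every name in it is made up for illustration; this is not actual PS3, Game OS, or hypervisor code, just the general idea of a hypervisor deciding what the layers above it get to see:

/* Rough conceptual sketch of the "layers of abstraction" idea described
 * above. None of these names are real PS3 interfaces -- they are invented
 * purely to illustrate how a hypervisor layer can sit between a driver
 * and the GPU and control what the layers above it get to see. */
#include <stdint.h>
#include <stdio.h>

/* Lowest layer: hypervisor-mediated access to the device. The layers above
 * never touch hardware registers directly; they only see whatever the
 * hypervisor chooses to expose. */
static uint32_t hypervisor_read_gpu_register(uint32_t reg) {
    /* A real hypervisor could translate, filter, or virtualize this value. */
    printf("[hypervisor] read of register 0x%04x\n", reg);
    return 0x7800;  /* placeholder value -- whatever the hypervisor reports */
}

/* Middle layer: graphics driver, built on top of the hypervisor interface. */
static uint32_t driver_query_gpu_id(void) {
    return hypervisor_read_gpu_register(0x0000 /* hypothetical ID register */);
}

/* Top layer: the high-level API the game links against. */
static void game_print_gpu_id(void) {
    printf("[game] GPU reports id 0x%04x\n", driver_query_gpu_id());
}

int main(void) {
    game_print_gpu_id();  /* the game only ever sees the hypervisor's answer */
    return 0;
}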
Hope that answers that for you!