akakuroiK
Freak
No, because he's talking about the multiplayer there, not the Wii U version ^^ So will we FINALLY see something on the Wii U at PAX next week? That would be... unexpected, but neat!
No, because he's talking about the multiplayer there, not the Wii U version ^^
Well, reading it like that... I think Nintendo will put on a nice striptease at E3 ;-) By now I don't believe we'll see anything before then, though it would be nice of course. Yes, yes... that's what he says in the last sentence! But before that he does say that Nintendo should soon lift its kimono of silence regarding the Wii U a little. And they are represented at PAX, and Gearbox is holding a panel discussion there or something along those lines...
Yeah, yeah... I'm clutching at every little straw! JUST LET ME!! >:3
I've been thinking about the GPU a bit recently, and in particular what we can infer from the decision to use an R700 series (Radeon HD4xxx) chipset in the development kits. Firstly, we know that the Wii U's GPU is, to some extent, a custom chip. It may well be based around an existing chip, but at the very least it has 32MB of eDRAM onboard, and quite possibly some other extra stuff we don't know about. We also know that it began development in 2009. We can expect that in 2009 and early 2010 Nintendo and AMD settled on the basic specifications for the chip, i.e. the number of SPUs, TMUs and ROPs, the use of VLIW5, VLIW4 or GCN architecture, and the intended manufacturing node. Now, sometime in late 2010 or early 2011, when Nintendo were putting together the first dev kits to be sent out to third parties, the GPU quite obviously wasn't ready, so they had to go with one of AMD's off-the-shelf cards as a stand-in, and they chose one from the R700 line (I've heard the HD4830, but I don't know if we've got confirmation of this).
Why did they do this?
We can pretty safely say that whatever GPU ends up in the Wii U, it will be manufactured at a 40nm or smaller process. Why then go with an older 55nm card when there were plenty of 40nm HD5xxx and HD6xxx cards available which could provide pretty much identical performance with a lower power draw? What characteristic does the Wii U's GPU share with the HD4xxx series that it doesn't with any card in the HD5xxx or HD6xxx lines? There's only one aspect that I can think of:
The HD4xxx series were the only 640 SPU cards available at the time the dev-kits were being put together.
This is actually a fairly sensible reason for putting a R700 series card in the dev kit; Nintendo had settled on a core configuration with 640 SPUs (and perhaps 32 TMUs and 16 ROPs), so a HD4830 would naturally have been the best fit for a development kit. I don't think it would be a stretch to say that this is good evidence for the final GPU being a 640 SPU part.
Now comes the real speculation. Early this year, we started to get reports that developers were getting new development kits with a performance boost over previous kits. That's the sort of thing you'd expect to hear if Nintendo replaced the R700 stand-in card with an early production version of the actual Wii U GPU. This lines up exactly with AMD's new 28nm HD7xxx series coming off the production line, and in particular the HD7770 (Cape Verde), their first 640 SPU part since the HD4xxx series. The HD7770, clocked down to about 600-700MHz, would fit pretty much perfectly into Nintendo's requirements as far as performance, size and heat are concerned.
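As a rough sanity check on those numbers (a sketch only; the 640 SPU count and the 600-700MHz range are this thread's speculation, not confirmed Wii U specs), the theoretical single-precision throughput of such a part works out to SPUs × 2 FLOPs per cycle (multiply-add) × clock:

```python
# Back-of-the-envelope peak FP32 throughput for a VLIW/GCN-style AMD part.
# NOTE: the 640-SPU count and 600-700 MHz clock range are rumored figures
# from this thread, not confirmed Wii U specifications.

def gflops(spus: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS: SPUs x 2 FLOPs (multiply-add) x clock."""
    return spus * 2 * clock_mhz / 1000.0

# Stock HD7770 (Cape Verde) at its 1000 MHz reference clock:
print(gflops(640, 1000))  # 1280.0 GFLOPS

# Downclocked to the rumored 600-700 MHz range:
print(gflops(640, 600))   # 768.0 GFLOPS
print(gflops(640, 700))   # 896.0 GFLOPS
```

So even downclocked for heat and power reasons, a 640 SPU part in that clock range would still be a substantial step up from current-generation consoles.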
Nintendo approached AMD in 2009 looking for a reasonably powerful, but low-wattage GPU to put in their mid-2012 console. It's not unreasonable to speculate that AMD said "we've got a 640 SPU part on a 28nm process planned for late 2011, how about we customise something around that?". It explains why they went with a HD4xxx card in the dev kits, it explains why the dev kit power boost came when it did, and it fits very neatly to what we've heard about performance and power consumption.
And to the inevitable "Nintendo would never do 28nm" responses, keep in mind that Nintendo have always used the smallest available node in manufacturing their hardware, right back to the 350nm chips in the N64. Also this would have been decided back in 2009/2010, when it would have been reasonable to expect the 28nm node to be ready for a 2012 reasonably-priced console. In fact, the push back of the release date from the summer could well be due in part to a desire to wait until the yields on 28nm chips increase.
We also have to consider whether NEC (now Renesas), who manufactured the Gamecube and Wii GPUs, and we can expect are first in line to manufacture the Wii U GPU, are capable of manufacturing at 28nm. As it happens, NEC announced a deal back in 2009 (when Nintendo would have been making the decision) with none other than IBM, to manufacture 28nm chips at East Fishkill, New York, in the very same facility which the Wii U's CPU is being manufactured. How's that for a coincidence?
Also, if Sony is really using a Southern Islands chip, I can't imagine them using a chip bigger than an HD7870. If our speculation about the 7700 range is right, that means the GPUs are architecturally comparable, but the 7870 is twice what the 7770 is.
These will likely be customized cards, and in Wii U's case, likely won't resemble the HD7770 but should have performance somewhere around it.
TL;DR: PS4 at best about twice the graphical power of where we think the Wii U's GPU will be. Also, the Wii U's and Xbox 3's architectures will have much more in common than the PS4's (confirmed if the rumored AMD CPU in the PS4 is true), which leaves less room for easy ports to the PS4.
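The "twice the 7770" claim above can be checked with the same back-of-the-envelope math, using AMD's public shader counts and stock clocks for the two cards (1280 SPUs for the HD7870, 640 for the HD7770, both at 1000MHz):

```python
# Compare peak FP32 throughput of HD7870 (Pitcairn) vs HD7770 (Cape Verde)
# at their stock 1000 MHz clocks; shader counts are AMD's published specs.

def gflops(spus: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS: SPUs x 2 FLOPs (multiply-add) x clock."""
    return spus * 2 * clock_mhz / 1000.0

hd7770 = gflops(640, 1000)   # 1280.0 GFLOPS
hd7870 = gflops(1280, 1000)  # 2560.0 GFLOPS
print(hd7870 / hd7770)       # 2.0 -- exactly "twice the 7770"
```

Of course, a downclocked custom Wii U part would widen that gap somewhat beyond 2x.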
What are you all trying to explain to me? I accepted Nintendo's decision what feels like five pages ago, and I've also conceded that the resistive screen has its advantages.
My point was only about multitouch, which, according to statements in this thread, would supposedly be possible on a resistive screen too.
And I'm not going to give any more examples. Why? I already gave some, and no matter what I name now, the first comment from you in a reply post will be "fail!"
I'm leaving this thread now.
There are some people here you can really have a great discussion with about this stuff, but you, my dear roylet, are not one of them.
Please name me one idea that doesn't amount to a little mini-game for which the touchscreen is actually good.
On a different note, something interesting from GAF:
Speculation from user 1
Speculation from user 2
Well, great!
Guess I'll have to streak naked through town after all, huh? :v:
Oh, please don't. :v:
It's not much longer until E3, dear KrateroZ
But rumors will still keep popping up
Since when do the Greeks have anything to wear?
Bring them on!
May Nintendo punish my disbelief!
Since the EU started donating money to Greece :v:!
Since when does the EU have money to donate to the Greeks? :v: