Looking back on my post, "inept" was too strong a word to use regarding developers. Perhaps my view was tainted by a biased source. (this particular one was developing a proprietary engine) My statement regarding middleware optimization was correct, however, & still is. Progress is being made rapidly, even since my last post in fact. A rather recent March update to the SDK (ver. 2.03) arrived, which in all probability is linked to the upcoming V5 development kit. As it stands, afaik the most popular middleware engines are running optimally with all features intact. (too few shaders would render this impossible)
What did not occur to me was why Nintendo was, & still is, testing & optimizing these various engines, adjusting & tweaking hw specifications, etc. From its inception Nintendo was touting the Wii U's ease of portability, so why the change? The most obvious answer was 3rd party engine compatibility, though it had to be more than that. I spoke of the Nintendo "footprint," & it could indeed have been larger than I initially thought.
Were 3rd parties perhaps having trouble initially with the CPU? Instruction sets are typically tailored, or highly customized, by the console vendor based upon their particular performance needs. IIRC, the GC's Gekko added a set of roughly 50 instructions to the PowerPC 750 on which it was based. (as well as stripping away non-essential features) Nintendo is again going with IBM's PowerPC architecture. The rumored OoOE tri-core CPU with 2-way SMT appears legitimate. Why this seems unbelievable has me bewildered, & questioning posters' incredulity. Is this based upon the very cheaply produced, severely underpowered Wii? (which btw was a first on the Nintendo home console front) I would have to assume so. The 32MB of embedded RAM is not as cost prohibitive as many of you might believe.
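To make that vendor tailoring concrete: most of Gekko's added instructions were the "paired single" ops, which pack two 32-bit floats into one FP register & process both per instruction. Here's a rough C model of the idea (ps_t & ps_madd are names I invented for illustration, not real Gekko intrinsics):

#include <stdio.h>

/* Illustrative model only: Gekko-style "paired single" math in portable C.
   One paired op does the work of two scalar float ops. */
typedef struct { float hi, lo; } ps_t;

/* Paired multiply-add: one "instruction" worth of work across two lanes. */
static ps_t ps_madd(ps_t a, ps_t b, ps_t c)
{
    ps_t r;
    r.hi = a.hi * b.hi + c.hi;  /* lane 0 */
    r.lo = a.lo * b.lo + c.lo;  /* lane 1 */
    return r;
}

int main(void)
{
    ps_t a = {1.0f, 2.0f}, b = {3.0f, 4.0f}, c = {0.5f, 0.5f};
    ps_t r = ps_madd(a, b, c);
    printf("%f %f\n", r.hi, r.lo);  /* prints 3.5 8.5 */
    return 0;
}

The point being: code written around ops like these simply doesn't exist on a stock 750, which is exactly the kind of per-console customization I mean.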
Every console CPU & GPU obviously behaves differently; the Xenon, for instance, carries a highly customised variant of the VMX AltiVec unit. Alas, I think that the PS3's PPE & SPEs are irrelevant in this discussion, except as an example of notably custom platform architecture. Why? Because Nintendo used the 360's development environment model as a baseline of sorts: the 360's ease of development, ease of PC portability, generally superior versions of multi-platform software, ease of middleware engine adaptability, etc. (as well as seeking developer input from close 3rd parties) Thus we could assume that the Wii U's CPU may incorporate a more modern VSX AltiVec unit. Though due to its customisation, instruction sets & data formatting would also have to change. Optimized code for the 360 may run sub-par on the Wii U (or stall), & this problem is compounded by its communication/relationship with the GPU. (which is also heavily customized) While by no means a quantum leap over Xenon computationally, efficiency will be where the CPU differentiates itself. (a penchant of Nintendo's)
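To illustrate why "optimized" vector code doesn't simply carry over, here's a minimal snippet using the baseline AltiVec/VMX intrinsics (compile with gcc -maltivec; this is the standard ISA, not whatever custom unit either console actually carries). Classic VMX loads like vec_ld silently truncate the address down to a 16-byte boundary, so code tuned around one chip's alignment & cache behaviour can misbehave or stall on another:

#include <altivec.h>
#include <stdio.h>

int main(void)
{
    /* 16-byte alignment is mandatory for vec_ld to read what you expect;
       VMX ignores the low 4 bits of the address. */
    float in[4]  __attribute__((aligned(16))) = {1.f, 2.f, 3.f, 4.f};
    float out[4] __attribute__((aligned(16)));

    vector float v    = vec_ld(0, in);                /* aligned VMX load   */
    vector float two  = (vector float){2.f, 2.f, 2.f, 2.f};
    vector float half = (vector float){.5f, .5f, .5f, .5f};
    vector float r    = vec_madd(v, two, half);       /* fused multiply-add */

    vec_st(r, 0, out);                                /* aligned VMX store  */
    printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
    return 0;
}

Whether the Wii U's unit is VMX, VSX, or something custom is speculation on my part; the snippet just shows how tightly this kind of code is bound to data layout & alignment.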
It's a developer learning curve, that's all, one which will benefit native engines much more so than ported ones. The performance will scale up considerably on proprietary engines, I've been told. (much like the PS3, though even more capable) Nintendo did design this platform to be extremely port friendly, though its own software still drove & dictated the initial design & feature set. I've known of the target system specs for some time, & they are indeed accurate. I just do not know what alterations have been made, though some have. The "target" specifications are what Nintendo told 3rd parties to expect from finalized hardware.
As far as the lighting is concerned, I tried to describe having some fixed functionality in parallel to programmable shaders. What this yields is stable, predictable performance. This is especially important when rendering a separate viewpoint on the DRC (display remote controller) from what is shown on the main screen. (differing geometry, lighting, shader effects, etc.; I've sketched what that two-view frame loop might look like at the bottom of this post) I attempted to describe what I was told; I hope I didn't lose any aspects or understanding in translation. I referenced the GC's architecture because there seemed to be parallels. There is definitely a DSP; I read a page back where there was some confusion regarding its inclusion. Why is anyone bringing up the iPad? I do not want to even enter into that nonsensical debate; the two occupy completely different hemispheres.
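To close, here's what the two-viewpoint requirement implies for a frame loop, sketched in C. Every function name here (begin_target, set_camera, draw_scene, end_target) is invented for illustration; none of this is Nintendo's actual API. The point is simply that the TV & the DRC each get their own camera, geometry, & lighting per frame:

#include <stdio.h>

typedef enum { TARGET_TV, TARGET_DRC } Target;
typedef struct { float view[16], proj[16]; } Camera;  /* matrices, etc. */

/* Stubs so the sketch compiles; real code would bind render targets
   & submit command buffers here. All names are hypothetical. */
static void begin_target(Target t) { printf("begin target %d\n", (int)t); }
static void set_camera(const Camera *c) { (void)c; }
static void draw_scene(void) { puts("  draw scene"); }
static void end_target(void)  { puts("end target"); }

static void render_frame(const Camera *tv_cam, const Camera *drc_cam)
{
    begin_target(TARGET_TV);    /* pass 1: main screen */
    set_camera(tv_cam);
    draw_scene();               /* TV-specific geometry, lighting, shaders */
    end_target();

    begin_target(TARGET_DRC);   /* pass 2: DRC, its own view & lighting */
    set_camera(drc_cam);
    draw_scene();
    end_target();
}

int main(void)
{
    Camera tv = {{0}, {0}}, drc = {{0}, {0}};
    render_frame(&tv, &drc);    /* two full passes, every frame */
    return 0;
}

Rendering both views each frame means effectively paying for the scene twice, which is where a stable, fixed-cost lighting path alongside the shaders earns its keep.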