Wii U GPU

The Wii U GPU is made by AMD and is based on a modern Radeon HD design. Nintendo has worked with AMD/ATI for two console generations already, and this will be the third time the company supplies the graphics processor in a Nintendo console. The Wii U CPU will be made and supplied by IBM. According to reports, the Wii U GPU is based on the Radeon HD 5000 series, which was introduced to the PC gaming market in 2009. However, while the base architecture will be the same, the Wii U GPU will be custom-made with additional features. For more on the console hardware, check out the complete Wii U system specs.

Wii U GPU specs

The Wii U GPU specs have yet to be announced, but based on developer comments and on AMD's publicly available data sheets, we can estimate the following features for the graphics chip:

  • Based on Radeon HD 5000 “Evergreen” series
  • Built on 40nm manufacturing technology
  • Unified shader architecture
  • GDDR5 memory support (memory is likely to be shared with CPU and system)
  • Shader count: 400 unified shaders (rumored)
  • 75 GB/s memory bandwidth
  • Low power design, 50W TDP
  • Full 1080p, 60 frames per second support

The Wii U GPU has been in development since 2010, according to sources, and it's likely that AMD created a highly specialized chip for Nintendo, with many modern features incorporated into the three-year-old Evergreen design.

Wii U GPU power and performance

While the specs above don't sound like much compared to current-generation graphics chips from AMD, where up to 2,000 unified shaders are possible, they are feasible for a console. Due to pricing, power, and resource constraints, the GPU is estimated to have 400 unified shaders with extra features added. This would still make the Wii U GPU many times faster than the chip found in the Xbox 360, which has only 48 shader units. It has been rumored that the Wii U GPU includes a small amount of RAM (32 MB) embedded with the graphics processor, separate from the main system memory, although this report hasn't been confirmed. This RAM is supposedly used as a framebuffer and to assist with some of the other GPU features.
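The rumored 32 MB figure is easy to sanity-check against the 1080p target. The buffer layout below (double-buffered 32-bit color plus a 32-bit depth/stencil buffer) is an assumption for illustration, not a confirmed detail of the hardware:

```python
# Does a 1080p framebuffer fit in the rumored 32 MB of embedded RAM?
def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one render buffer in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = buffer_mib(1920, 1080)   # one 32-bit color buffer: ~7.9 MiB
depth = buffer_mib(1920, 1080)   # 32-bit depth/stencil: ~7.9 MiB
total = 2 * color + depth        # double-buffered color + depth

print(f"per buffer: {color:.2f} MiB, total: {total:.2f} MiB")  # ~23.7 of 32 MiB
```

So a full 1080p render target set would fit with room to spare, which is consistent with the rumor that the embedded RAM serves as a framebuffer.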

  • Smurfman256

    Maybe it’s actually running on a 28nm process. Back in 2009 (when production on the Wii U GPU was being planned) NEC (who makes the GPU for Nintendo and is a subsidiary of AMD) and IBM (who are making the GPU for the Wii U) signed an agreement to make a 28nm chip in *gasp* East Fishkill, NY (where the chipset is being made).

    • Smurfman256

      AND the Radeon HD 7750 has a 50W TDP WITH 512 unified shaders. Or it could be an underclocked 7770 @ 800MHz with 640 unified shaders.

      • Totoro

        No way, that would make the console WAY too expensive. Ninti would probably go with a 400-450 unified shader option MAX, and that would probably put it in the Radeon HD 57xx line. Still, just being in that line would give it a definite advantage over the competition, and directly make it a sub-next-gen console, which is completely fine; the Wii wasn’t even close to next gen and it held up fine with 1st party games as its stronghold. So with third party support and Nintendo’s awesome 1st party software, it’s going to be a beast.

  • Oebcodndi

    I don’t understand why you’re worrying about graphics power. The way I see it, if you want cutting-edge tech, build a PC. If you want interesting games and/or are a huge fan of Nintendo IPs, get a Nintendo console. I’ll be buying the thing just for the next Zelda and Pikmin 3.

    • addressmebetty

      Thing is, I want to play a beautiful Zelda and Mario and Metroid with cutting edge graphics. Too much to ask?

    • DAn

      Yeah, but think of it. What if you always want the best games and you don’t want a PC, because with a PC you always need to update it? Consoles are always ready to go, and you don’t have to care about updating your graphics card. It’s nice to do that if you’re a PC player, but it’s better if you’d rather just jump on with a console and play.

      • Andrew

        If you’re smart about which components to put in a PC you won’t need to worry about ‘updating’ it, not for several years. There are PCs from around or even before the time the PS3 and Xbox 360 came out which can still play every game released for PC, on higher settings than those consoles. This whole ‘updating a PC is expensive’ rumor is a huge farce.

        You can still play most games (for instance, BF3, Crysis 2, Metro 2033) on an ancient Pentium 4 if you have a newer midrange GPU and a bit of extra RAM. The P4 came out around 2004; the Xbox came out almost two years later.

        • deSSy2724

          That’s not true… you can’t do anything today with a high-end PC from 2005-07… you can’t even play on low settings at 30 FPS.

          • John

            Actually, I have a mid-range 2006/07 PC that still plays any game on high/medium, so a high-end ’05/’06 machine can still play games today.


        …You never NEED to update your graphics card… The PC can also be “ready to go”.

      • RayZfox

        The Wii U ships with a 5 GB update on day 0. Graphics are important for a video game system because it is a video game system.

  • exicon632

    Some people say that the Wii U is weaker than the Xbox and that it’s not a next-gen console.

    • Zuppermati

      Sure, and my anonymous source said that the PS4 and Xbox 720 are weaker than the GameCube.

    • revolution5268

      Well, tell them to get their heads out of their asses.

  • Totoro

    Very cool if they’re true, but sadly we might never know the specifics until after the launch of the console. I’m getting one on launch day; from experience I’ve seen that the later shipments of gaming consoles always get cheaper parts. Except the 360, that shit came out of an osteoporosis-ridden old person’s ass from day one.

  • Amokaro

    The AMD 5000 series is good; it has modern features and can blow the 360 away.
    It’s important for a console to find the sweet spot between computing power and cost + power consumption.

  • Lundqvist

    The difference in computing power between the Wii U and the 720/PS4 will be pretty much the same as the one between the 360/PS3 and the old Wii.
    So it will be impossible to port games between the systems; the Wii U will only get PS3 ports.
    I can’t see any reason to buy one. I like Nintendo games, but not enough to buy a console that will have 0% third-party support after the release of the 720/PS4, like Nintendo had with the Wii.

    • random guy

      The only reason you’re saying that is because the Wii didn’t get third-party titles. But look at the PS2: it was not as powerful as the GameCube and Xbox, yet it still got ports from the Xbox and GameCube that looked almost identical.

      • Even more random guy

        Well, yes. But take a good look at the PS2/GC specs. The GC had the most powerful CPU of them all; the Gekko was a lot more powerful than the P3 in the Xbox. As for the GPU, the Nvidia chip inside the Xbox was superior to everyone else’s, but the GC GPU was still a little more powerful than the PS2’s. The PS2’s had a higher polygon rate and more memory bandwidth, while the GC GPU had more memory and a more powerful architecture.

        However, some “unknown” guy at Sony stated that the PS4 would carry an HD 7000 series GPU. We all know that this means nothing if it’s a lower-end part than the Wii U’s, but let’s look at the past: Sony has never been cheap, so I don’t find that possibility very likely.
        I think, and this is just me, that Sony will indeed invest in something more like a… 7850, which draws 120W. But hey, it’s a monster card.

        That said, the Wii U and the PS4 will not even be equal, just like the PS3 and the Wii are absolutely different.

        Also, you guys can look this up at Kotaku or Anandtech. I read about this PS4 thing somewhere between June and September, I don’t know exactly when, but it was very recent.

  • TheIdiot

    Crytek told people they had Crysis 2 running maxed out at 60 FPS, which could be 720p or 1080p, and I’ve heard some sources say Valve is interested in the graphics. If the GPU has 768MB, there’s a chance 128MB is set aside for the system and CPU. We may need to wait until someone tears one apart, or until we get some of the software devs seem to use with the PC port ._. I can’t wait for the system to come out! We won’t see the PS4 and the new Xbox until past 2014, since Microsoft confirmed the Xbox has a 10-year life span, so Xbox and PS fanboys are gonna hate until 2013-2015.

  • TheAssWhiper

    In my opinion, the Wii U GPU will be between 400 and 500 MHz. That still might not defeat the Xbox 360 or PS3, or it could be that it will barely beat the PS3 or 360, the same way the Wii barely beat the first Xbox…

  • Astennu

    If it’s built on Evergreen, it’s way more powerful than the Xbox 360 and PS3.
    The Xbox 360 features the Xenos C1, which has 64 shaders that are pre-HD 2900 in capability.

    64 Evergreen shaders are more powerful and have more features. If you have 400 of those at 400-500 MHz, it will be 6x-7x more powerful than the Xbox 360 in terms of shader power, and 5-6x the PS3, which has GeForce 7800-like tech.

    But I do think the Xbox 720 and PS4 will feature even more powerful GPUs. Compared to the current Xbox 360 and PS3, though, the GPU of the Wii U is super powerful.

  • Astennu

    Correction: I dove into the Xenos C1 specs:
    The Xenos C1 features 48 VLIW5 shaders, which is 240 ALUs.
    In the case of the HD 5870, 1600 ALUs = 320 VLIW5 shaders.

    Since the HD 2900 generation, AMD has been calling each ALU a shader.

    Still, the Wii U should be 2.5-4x faster because of the new architecture (when running at the same clock speeds).
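The ALU bookkeeping in this correction can be sketched in a few lines. The counts come straight from the comment (48 VLIW5 units for Xenos, 1600 ALUs for the HD 5870); the helper names are just for illustration.

```python
# One VLIW5 shader unit bundles 5 ALUs; since the HD 2900 generation
# AMD has counted each individual ALU as a "shader".
ALUS_PER_VLIW5 = 5

def vliw5_to_alus(units: int) -> int:
    """Convert a VLIW5 unit count to an ALU ('shader') count."""
    return units * ALUS_PER_VLIW5

def alus_to_vliw5(alus: int) -> int:
    """Convert an ALU count back to VLIW5 units."""
    return alus // ALUS_PER_VLIW5

print(vliw5_to_alus(48))     # Xenos C1: 48 VLIW5 units -> 240 ALUs
print(alus_to_vliw5(1600))   # HD 5870: 1600 ALUs -> 320 VLIW5 units
```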

    • Rindi

      Shynn, of course GPU-Z is not fake; those are the screenshots from a 590, and they tally with hundreds of websites reporting the same specs. A dual-GPU design will always clock lower than a single GPU due to the excess heat of an enclosed space, as with every dual-GPU card that has ever existed. That’s why, based on performance and overclocking, you’re better off buying two 580s, unless anyone has had issues running SLI.

    • Bryan Elias

      This is assuming that the IBM Power-Based CPU will be fast enough for this GPU.

      • Rinslowe

        It runs late-in-the-cycle current-gen games pretty well. I have AC3 on Wii U and 360, and my personal opinion is the fidelity is more apparent on Wii U… Otherwise they’re mostly identical; I did, however, have to adjust my screen for Wii U to get that sweet spot on contrast that I have become accustomed to with the 360.

        One thing to consider is that the CPU is not just a higher-clocked Broadway with more cores. It is a natural evolution and has more grunt and cache than previous versions. Sure, it shares the same heritage, but it is not, strictly speaking, effectively comparable with other 750 models, let alone the competitors’ CPUs…

        I am just as interested as the next guy in knowing for certain what it can do…

    • Rinslowe

      This is an important point for people to realise now that the clock speeds for the CPU & GPU have apparently been made public, courtesy of “Marcan”…

  • wiiU’s supporter

    I am sure that the Wii U will be a beast. But the most important thing is that the Wii U must be a true next-gen console, or else Nintendo is dead (knock on wood). The Wii U MUST succeed!!

  • Newtman

    How does it make sense that Nintendo would release a console with technology from 2012 and make it weaker than tech from 2006? It’s almost impossible for the Wii U to not be at least a few notches more powerful.

    • Rinslowe

      It is of course more powerful in terms of raw grunt than current consoles… It’s just that the novel tech, still cutting edge in some places, is used in different ways than in current-gen consoles…

      The CPU cannot be compared effectively by clock speed alone, and especially not the GPU…
      If a game is built from the ground up for Wii U, it will show more potential than what can be seen visually from late-in-the-generation titles on 360/PS3.

      I for one am interested in just how much visual potential is available to the Wii U, and look forward with much anticipation to Retro Studios’ brand new Wii U graphics engine, which I heard is making its way around other devs as we speak… (Unfortunately this cannot be confirmed right now.)

  • Isa/Ty

    Eh, that’s nice and all. I’m just more concerned with frame rate, gamepad and button recognition, as well as the entertainment level of the gameplay of each distinctive game on the Wii U and its eShop overall. Graphics, in my opinion, are just a pretty picture, the finishing touch. They don’t make the game for me like they do for some gamers.

    • Rinslowe

      I prefer to have the whole package working in synergy… So while a great gameplay experience can still keep me entertained (Tetris especially springs to mind), my most satisfying moments are when playing games like the Elder Scrolls, Zelda, Mass Effect, and Assassin’s Creed series, which manage all relevant aspects quite equally…

  • wiiU’s supporter

    The Wii U IS NOT WEAK!

    • demize

      But the name sure stinks.

      Wii U… did someone fart?!

      • Mr know it all

        shitbox360 & piss3 just saying 🙂

  • armagonde

    Hey guys. Graphics will not get a significant leap from what we have today. That’s what Sony says, and that’s what Nintendo says. Sony, Microsoft, and Nintendo are not GPU or CPU developers; they buy the tech that’s on the market. Second to that: you cannot make a console with the current high-end tech because it would get too expensive. We saw this with the PlayStation 3, which hit the market at 600 dollars or euros… and it could not outperform the Xbox 360. The message is: the Wii U will be better, but not superior, to the current consoles… but it won’t be outperformed so easily either, because the competition won’t include the pricey tech needed to clearly outperform what is already out on consoles.

  • Cpt.Crash

    Remember, the Xbox 360 only had 48 unified shaders, a 500 MHz core clock, and 32 GB/s of memory bandwidth on a 90nm chip, and the console produced quite good graphics with that type of GPU. Now the Wii U uses 400 unified shaders built on a 40nm chip with GDDR5 memory, which delivers 75 GB/s and loads a lot faster than the Xbox 360’s GDDR3, plus probably new DX11 effects. That alone tells you the Wii U has something like 10 times more horsepower than the Xbox 360’s Xenos GPU, which could barely handle 720p, so the Wii U will show its true colors as the years go by.
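Taking the raw numbers in this comment at face value (the replies below rightly caution that shader counts from different architectures aren’t directly comparable), the claimed ratios work out as follows:

```python
# Raw spec ratios from the figures quoted in the comment above.
xbox_360 = {"shaders": 48, "bandwidth_gb_s": 32.0}
wii_u    = {"shaders": 400, "bandwidth_gb_s": 75.0}

shader_ratio = wii_u["shaders"] / xbox_360["shaders"]
bw_ratio = wii_u["bandwidth_gb_s"] / xbox_360["bandwidth_gb_s"]

print(f"shader count ratio: {shader_ratio:.1f}x")   # ~8.3x
print(f"bandwidth ratio:    {bw_ratio:.2f}x")       # ~2.34x
```

The "10 times" claim only holds if you count shaders one-for-one across vendors and generations, which the follow-up replies dispute.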

    • Bryan Elias

      The Xbox 360 uses an Nvidia GPU and the Wii U uses an AMD GPU. The same number of shader cores on an Nvidia GPU is not equal to the same number of AMD shader cores. This is why a GTX 580 had 512 cores and an HD 6970 had 1536 stream processors. Furthermore, the Xbox 360 chip is no longer 90nm; that chip has seen a few die shrinks over the lifespan of the system.

      • Laby

        The Xbox 360 uses a GPU made by ATI, back before they became AMD; it’s the PS3 that uses an Nvidia GPU. The point I am trying to prove here is that they achieved a lot of visual quality with the hardware the console contains, so it doesn’t matter who has more horsepower if you don’t know how to use it. And yes, the newer models no longer use a 90nm process, but I had the old models in mind; the new models have no extra performance boost because they are dialed back to match the performance of the original Xbox 360 models. AMD and Nvidia have different architectures, so you can’t really compare their shaders to each other.

        • Bryan Elias

          Yeah, I was wrong about the GPU used in the 360, but the PS3 never came into the discussion. Also, I never made the claim that a die shrink provides a performance increase. It can provide better thermals and lower power usage.

          “AMD and nVdia has both different architectures so you can’t really compare their shaders to each other.”

          You obviously didn’t read everything I posted. This is exactly what I was saying. I was pointing out that I thought the person I commented on was making this mistake, although this person is still making a similar error by comparing a unified shader to a stream processor. This poster would need to clarify the difference between those two first.

  • toja

    The Xbox has 48 vec5 units; they’d be called 240 shaders today.
    The processor in the Wii U will be effectively 2x weaker than the one in the 360, and the GPU can be at most 2x faster.
    A system with these specs should have been released a year ago; it could have been a big success then.

  • demize

    Sadly, it seems that the Wii U will actually not run Black Ops 2 at a native 1080p resolution. Yesterday, on October 12, this information was revealed by GotGame. An e-mail response from Activision states that Black Ops 2 is not natively 1080p on the Wii U. From the looks of it, the Wii U will be up-converting Black Ops 2 to 1080p, not running it natively.

  • Gawen

    If you guys buy consoles for the graphics… I feel sad for you. I’ve been playing games since the Coleco/Intellivision era, Atari, NES and up… While MS and Sony focused on “coolness” and graphics, much like *ahem* Sega, Nintendo has always kept its strategy: the games. You can’t beat Nintendo first party. Nintendo was left stranded since the N64 (I personally have many fond memories…), then the GameCube and the Wii, still not outshining with graphics and third-party games, but still moving forward. The Wii was an example of Nintendo’s old saying, “quality over quantity.” Stop ranting about graphics, and enjoy the Nintendo (Entertainment) System. Like some people have said, if it is graphics you’re looking for, build a high-end PC. Mario, Samus, Kirby, Link, Zelda, Luigi, Donkey Kong, and Pikmin, among others, never disappoint. These are the heroes I grew up with, and even though I’m a PC guy, and eventually I end up with the other systems, because I’m a fckng gamer (meaning I’m open if the games are good)… I am a Nintendo fanboy. All hail Shigeru Miyamoto.

    • Bryan Elias

      If a person plays games for graphics, then he or she should build a gaming PC.

  • charlie

    I don’t think anyone should be worried too much about this stuff for the Wii U. As people have said, Nintendo fans have graphics fairly low on their agendas; it’s the gameplay and experience that are important. As has been said, if you want graphics and have money, build a PC. Also, taking full advantage of the PS4/720’s supposed double-1080 resolution will require displays that don’t exist without spending several thousand pounds, and I sure as shit don’t want to have to fork out for a new display just for a new console. Aside from that, if the PS4/720 are download-only/always-on internet, I will never be able to afford software for them, as at least 80% of my game library is preowned or special offers. As such I will not bother, and will continue to be more than happily entertained with my U and all the juicy 1st-party titles that will be out by then, as they will be miles better than what the competition has to offer. Over and out!

  • Tim

    I think it’s an HD 5670. All the specs are the same.

  • To me, it looks like a cross between a 5550 (the GPU core speed), a 5570 (the core count), and a 5670 (the memory bandwidth). It doesn’t have enough cores to be a 5670 straight up, though. That would put it at roughly 50% faster than the PS3’s GPU.

  • Damn, I love this console, but it won’t have Tomb Raider. I was looking forward to that game. What makes things worse is that the reason they’re not planning to put it on the Wii U is its interface; they say they’re not willing to work on it 🙁