
Nvidia is holding its traditional CES press event and has taken the opportunity to reveal some details of its next-generation Tegra chip. Nvidia has talked about its mobile plans a little in general terms recently, but now we have a name and some specs to go on. The successor to Tegra 4 will be called the Tegra K1, and it comes in two different versions.

[Screenshot: Nvidia CES press event stream, 2014-01-05]

The headlining feature Nvidia is touting in Tegra K1 (previously codenamed Logan) is its 192-core GPU based on the desktop Kepler architecture. By comparison, the current Tegra 4 limps along with only 72 older GPU cores. Nvidia had a number of fancy charts to illustrate just how superior the Tegra K1 GPU happens to be. The company says Tegra K1 will support DirectX 11.1 and CUDA, and has more raw horsepower than the PS3 or the Xbox 360. All this, and it is expected to be more power efficient than the current Tegra GPU.
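
CUDA support is worth a second look, since it means the same SIMT programming model as Nvidia's desktop Kepler parts. As a rough sketch of what that buys you (illustrative only, not Nvidia sample code, and assuming the standard CUDA runtime is exposed on the platform), each GPU thread below handles one array element, and the hardware spreads those threads across the chip's 192 cores:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One GPU thread per element; blocks of 256 threads get scheduled
// across the CUDA cores (192 of them on Tegra K1).
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Managed memory keeps the sketch short; whether a given SoC and
    // driver expose it is an assumption here.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %.1f\n", y[0]);  // expect 4.0 = 2.0 * 1.0 + 2.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```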

[Screenshots: Nvidia CES press event stream, 2014-01-05]

Perhaps even more notable than the new GPU is a shift in Nvidia's previously reported roadmap. There will be two flavors of Tegra K1: the first pairs a quad-core Cortex-A15 CPU cluster (plus a low-power helper core) with the Kepler GPU, but there will also be a version with dual-core Denver CPUs. Nvidia's Denver architecture wasn't expected to debut until a generation after Tegra K1. Denver is Nvidia's custom design for ARM-compatible CPU cores, so there won't be any more Cortex reference CPUs in Tegra – much like Qualcomm with its Krait cores. Denver is based on the next-generation ARMv8 instruction set and supports 64-bit.

[Screenshot: Nvidia CES press event stream, 2014-01-05]

Nvidia CEO Jen-Hsun Huang didn't specify any exact timeline for each version of the chip, but the A15 edition is expected to be out in the first half of the year. The Denver CPU version will come after that in the second half of 2014.

[Nvidia]

Ryan Whitwam
Ryan is a tech/science writer, skeptic, lover of all things electronic, and Android fan. In his spare time he reads golden-age sci-fi and sleeps, but rarely at the same time. His wife tolerates him as few would.

He's the author of a sci-fi novel called The Crooked City, which is available on Amazon and Google Play. http://goo.gl/WQIXBM

  • joser116

    By XBox One you mean 360?

    • Cheeseball

      Editor mistake. He really meant XBOX 360 since the other system is the PS3.

      • joser116

        Yeah I know that.

    • shitbox fangay

      Ahahaha lol, I believe he really meant the Xbox One and PS3. Since the Shitbox One is a shit.

  • Tuấn Ankh

    I don't know much about CPU stuff, but looking at the photos: mind = blown!
    Can anyone tell me if that photo comparing the K1 with the PS3 and Xbox 360 is credible? Or is that just like saying octa-core is better than quad-core?

    • joser116

      It's comparing the Xbox 360 and PS3; there's no way this chip has more horsepower than the Xbox One or PS4.

      • Tuấn Ankh

        yea I mistook it at first

      • Nick Rosas

        Xbox 360* PS3*

        • joser116

          I don't quite get why you replied that to me

    • Franco Rossel

      Pretty credible, considering that the X360 and PS3 are 2006 machines.

  • Erik

    Wow, I hope at least one top-tier manufacturer picks this up.

  • hyperbolic

    Nvidia, can't beat experience...

    • tiger

      Experience at what? Being the joke of the mobile community? Being the rag doll that everyone uses to show their superiority?

  • Bariman43

    Apparently the Tegra K1 is so awesome Nvidia decided to imprint an image of it on a field of wheat.

  • master94

    You have to love NVIDIA. While everyone else is bragging about 2 cores and 4 cores, NVIDIA just threw out 192 cores. WOW

    • TheFirstUniverseKing

      There's a difference between CPU cores and GPU cores. This processor still comes with dual-core and quad-core CPUs; it just also has a 192-core GPU (just like how the Tegra 4 had a 72-core GPU). In other words: it's a marketing gimmick.

      • Tuấn Ankh

        That's what I'm wondering. Would this chip in an Android device beat the PS3 in both performance and power efficiency? From that comparison photo, it should.

        • ssj4Gogeta

          While the chip is impressive, it isn't as impressive as it might sound at first. If anything, those charts just show how old those consoles are. By comparison, PC GPUs refresh every year, and the top PC GPUs are a few times more powerful than the new consoles (XB One and PS4).

        • danishdhanshe

          Time to emulate ps3 on my droid! :p

      • Cheeseball

        In comparison, the Adreno 330 in the Snapdragon 800 has 128 ALUs (shader cores).

        • joser116

          And unlike the Adreno ALUs, the Tegra K1's GPU cores are all fully programmable.

        • Grahaman27

          Don't even try to compare shader cores to CUDA/Kepler cores – it's just unfair.

          The reason Nvidia is calling it a 192-core chip is probably because CUDA cores CAN function as CPU cores for certain things. The 330's GPU has 4 cores, whereas Nvidia's chip has 192 separate cores. Shader cores only do one simple thing; they are not GPU cores like Kepler cores are.

          Oh, and the A7 is more powerful than the 330 graphically, and this is 250% more powerful than the A7! Only 3 months later. Don't discredit Nvidia for an achievement that is truly great.

          • tiger

            3 months? Is it in any tablet on the market now? No.

            BTW, what everyone is waiting on is the DENVER/K1, which is based on ARMv8 (like the A7). And as such, it is likely to be OVER ONE YEAR behind the A7. And by that time, the A8 will be out in all iPhones and iPads.

          • Cheeseball

            You'll find no argument with me on this. I know the Adreno 330's GPU architecture is still based on the older Unified Shader Model and thus not fully GPGPU-compatible.

            I wonder if NVIDIA will allow full usage of the CUDA cores on Android. This new K1 SoC is basically a GeForce GTX 650 non-Ti cut in half with a lower bandwidth memory bus.

          • tiger

            And don't forget, the Cortex-A15 (the quad-core version coming out "soon") isn't exactly known to be very mobile-friendly! Thus, Apple and Qualcomm do not use the Cortex-A15... only Samsung's Exynos 5, and that has turned out to be a battery destroyer!

          • Cheeseball

            The Tegra 4 uses four Cortex-A15s.

            But you may be right about the battery as it hasn't been used in any phone. It's currently used in the SHIELD and the Tegra Note 7.

            It does have a separate single A15 core that it falls back to when on Power Saver mode.

          • tiger

            Yeah, forgot about SHIELD. But yeah, the A15 was not meant for mobile applications. Apple and Qualcomm both saw this and went the custom route (NOT based on the A15 at all).

          • Cheeseball

            There are two phones that do have it, though they are rare as hell: the Xiaomi Mi3 and the ZTE Geek U988S

          • SetiroN

            *misinformation warning*

          • Cheeseball

            Not really misinformation, more like speculation. He's probably referencing Kepler's SMX functionality. I don't think there are any plans for the Android OS to use it. They certainly can't be used as true CPU cores.

            He's right about the A7 though. It has a PowerVR G6430 which pulls out around 250 GFLOPS at max clock. It's sad that no SoC manufacturer is combining it with A15/Krait cores, but that's probably due to licensing costs from Imagination Technologies.
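
            For context on where figures like these come from: peak FP32 throughput is conventionally estimated as ALU count x 2 FLOPs per cycle (one fused multiply-add) x clock speed. A minimal sketch, where both clock speeds are illustrative assumptions rather than announced specs:

            ```cpp
            #include <cstdio>

            // Back-of-envelope peak FP32 throughput: each ALU retires one
            // fused multiply-add (counted as 2 FLOPs) per cycle.
            double peak_gflops(int alus, double clock_ghz) {
                return alus * 2.0 * clock_ghz;
            }

            int main() {
                // Assumed clocks, for illustration only.
                printf("Tegra K1 (192 ALUs @ 0.95 GHz): ~%.0f GFLOPS\n",
                       peak_gflops(192, 0.95));  // ~365
                printf("Adreno 330 (128 ALUs @ 0.55 GHz): ~%.0f GFLOPS\n",
                       peak_gflops(128, 0.55));  // ~141
                return 0;
            }
            ```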

          • SetiroN

            Sorry, but all that "independent separate cores" marketing propaganda is just BS. People around here are trying to sell this chip as almost a 192-core CPU, something even Nvidia's marketing didn't try to do.
            By the same standard that gives the Adreno 330 4 "actual cores", Tegra 5 has 1 "actual core". Which is about as stupid a comparison as the above.
            That's not to take away from the graphical power, which is impressive, as we have known for a while now; it's just that people spreading technical misinformation as if they were knowledgeable irk me.
            Discussing complex and extremely different chips in dumbed-down "cores" terms is silly and should be avoided if you don't know what you're talking about.

          • renz

            Maybe not from Google, but Nvidia already has a solid GPGPU ecosystem in CUDA. Maybe they can work with software vendors (those already experienced with CUDA) to move it onto the mobile platform.

          • Cheeseball

            I would not want that unless it was an open platform. CUDA may be a "quasi-open" platform due to its similarity to OpenCL, but it needs to be adopted by other developers for it to take off.

            AFAIK, there hasn't been any news about OpenCL on Android, so CUDA is going to be a long shot.

          • renz

            It seems Google themselves show no interest in OpenCL; Qualcomm is already dropping OpenCL support for their upcoming Snapdragon 805. As for that CUDA stuff, maybe it's not specifically for Android but more general. Remember Kayla? It was an Nvidia initiative so developers could play around with ARM and CUDA before the Tegra K1 actually comes out.

          • Cheeseball

            Yeah, the Kayla platform was the Tegra 3 effort to bring CUDA to ARM, but not through Android. It was more of a general Linux effort in preparation for ARM-based supercomputers.

      • joser116

        The reason Nvidia is calling the Tegra K1 a "192-core Super Chip" is that they are not just regular GPU cores; they are fully programmable and massively parallel. Essentially, these GPU cores are capable of doing CPU-style computations, and applications that normally run on a CPU could be written to run on the GPU cores, in theory making them stupid fast. I'm just glad that this chip is claimed to be far superior to the Apple A7.

        • tiger

          It's over a year behind the A7. By the time this K1 is released, Apple will already have the A8 out.

          • Grahaman27

            When did 3 months turn into a year? And less than that for tablets.

          • tiger

            The quad-core version is crap... low-end... like the Qualcomm S805. Denver/K1 is Nvidia's top dog because it is based on the ARMv8 architecture.

          • DKT70

            TWAT !!

          • tiger

            Is that the best you can do? Show me where I am wrong in what I wrote.

          • DKT70

            Dude, READ your own posts. You come across as a rabid Pro-Apple anti-Nvidia prick. You aren't worth the time.

          • tiger

            Ahhh, have nothing intelligent to say?? At least bring up ONE point where I was wrong. Come on... are you just such a rabid fanboy that you can't even come up with ONE? It looks like the only TWAT around here is you. Got a corner to stand at?

        • SetiroN

          No they can't.

  • Msan

    So... the dual-core Denver CPU is designed by Nvidia, not based on ARM's Cortex-A57? And @master94, those cores are for the GPU; Tegra 4 has 72.

    • renz

      Yes, it is an Nvidia custom-designed CPU, much like Apple's SoCs and Qualcomm's Krait. They have talked about this for quite some time, but on the early roadmap their custom ARM core was only supposed to arrive with Parker, the successor to the current Tegra K1. I had heard before that Logan would get the Denver CPU, but the rumor wasn't widely spread; instead, most sites speculated that Nvidia would bring in Parker early so they would have a 64-bit CPU in 2014.

      • Msan

        Other sites say that the Cortex-A15-based chip is Logan and the Denver one is Parker.

      • Himmat Singh

        Denver is their custom ARM design.

  • Android Developer

    What about something that will compete with the Snapdragon 800 or 805?

    • Cheeseball

      The Tegra 4 already competes well with the Snapdragon 800, just not in phones.

    • TY

      The dual-core version should blow the 805 out of the water.

      • Grahaman27

        And the quad-core. They both have the same GPU. Don't forget the SD800 was only 10% faster than the T4.

        • TY

          I am more interested in the dual-core version though. UI smoothness is highly related to single-threaded performance; if one Denver core can match or even surpass the performance of two A15s, the device will be extremely smooth. Apple has been taking the same approach, and it's one of the reasons why iDevices' UIs are so smooth.

  • Fareed Ahmed

    I'm not mistaken in saying that I didn't really see the Tegra 4 featured in any big phones last year, am I?

    • Cheeseball

      None. NVIDIA concentrated on putting the Tegra 4 into their Tegra Note 7 tablets and the SHIELD. They were aiming for niche markets.

      • GraveUypo

        in other words, it flopped hard.

        • Cheeseball

          Not really. If it flopped, they wouldn't have continued with the Tegra K1.

          • GraveUypo

            Yes they would; mobile market growth is too explosive to be ignored.

            Also, I can't think of a better definition of flopping than "spending millions developing something and then not being able to sell it to anyone". There's pretty much no non-Nvidia device with that SoC, and even the Nvidia ones barely sold at all.

          • Cheeseball

            My reasoning for it not being a flop is that they're probably just aiming at niche markets, like what they're doing with their high-end GPUs on the desktop.

            They've been able to improve their technology in big strides every time they release. The jump from Tegra 2 to Tegra 3 was pretty big in the CPU department (from dual to quad A9s and proper NEON support) and the jump from Tegra 3 to Tegra 4 was a huge leap (from quad A9s to quad A15s w/ 48+24 pixel/vertex shaders). They've learned from their mistakes every time.

            Currently, the Tegra Note 7 keeps selling out at NewEgg and the online EVGA store. And I believe the SHIELD was sold out for weeks when it was released last year.

            Yes, I know it could be due to not being able to manufacture enough, or they could be limiting the amount they want to make since their priority is on desktop GPUs.

          • renz

            But the good thing is that Nvidia is not giving up like Texas Instruments did. While Tegra 4 is not as successful as Tegra 3, it is not as bad as people make it out to be. There are many HP Android products adopting Tegra 4, and a few new ones from Lenovo as well. In phones and tablets Tegra might look less successful, but in automotive they were doing just fine.

  • usaff22

    Are any apps even compatible with ARMv8?

    • Cheeseball

      Yup. ARMv8-A is user space compatible with ARMv7.
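
      In practice, "user-space compatible" means existing 32-bit ARMv7 binaries keep running unmodified in the AArch32 execution state, while a recompile targets AArch64. A minimal sketch, assuming a GCC or Clang toolchain and its standard predefined macros:

      ```cpp
      #include <cstdio>

      int main() {
          // The same source builds for either state; an existing ARMv7
          // binary simply keeps running as AArch32 on an ARMv8-A chip
          // whose kernel supports 32-bit user space.
      #if defined(__aarch64__)
          printf("built for AArch64 (64-bit ARMv8)\n");
      #elif defined(__arm__)
          printf("built for AArch32 (ARMv7-compatible user space)\n");
      #else
          printf("not an ARM build\n");
      #endif
          return 0;
      }
      ```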

  • Marcell Lévai

    Well, I'll buy a Tegra device once they don't overheat like hell.

    • Grahaman27

      I assume you are referring to the Tegra 3? Because the issue does not exist on the Tegra 4, and Kepler in the K1 is 3x as efficient.

  • Himmat Singh

    My choice of skipping getting a Tegra 4 tablet feels vindicated!!

    • tiger

      Your choice is better served going with Qualcomm-equipped devices.

      • Himmat Singh

        What I meant is that I skipped this year's generation of chipsets. The next generation of chipsets are gonna be here real soon (and then this process repeats, yearly). Had I got a new tablet late last year, I'd have been regretting my decision already!

        While I must admit Qualcomm (and Samsung) have the edge in terms of chipsets, I game extensively on my tablet, and in that regard Nvidia has the biggest clout.

        • tiger

          Clout based on past reputation in non-mobile computing. I would agree with you regarding PC/laptops etc..

  • tiger

    No release date. By the 2nd half of this year? Forget six months behind Apple; Nvidia is over a year behind with ARMv8 and 64-bit!

    It is apparently "3x" faster than the A7... well, IT SHOULD be, given that it is over a year behind!

    And by the time this DENVER/K1 comes out, Apple will be releasing the A8 chip, which should easily eclipse the K1 in performance.

    Nvidia continues to be a joke in mobile computing. Let's hope Qualcomm can do better, but it seems like they are also over 6 months behind the Apple A7. Samsung is the same.

    • renz

      So? It must be a joke to you that Nvidia is the only SoC maker with working custom 64-bit ARM silicon for Android right now.

      • tiger

        Is it out yet? No. Qualcomm and Samsung have already announced their ARMv8 chips, and they will likely be out way before DENVER/K1 comes to market in a mobile device. And with Qualcomm and Samsung being in big-name phones, DENVER/K1 will be relegated to niche markets, like it is now.

        • renz

          Did Qualcomm and Samsung already show working silicon to the public?

          • tiger

            Does it matter? The time frame of release to an ACTUAL product is what matters.

  • didibus

    Why do they mention support for DirectX? I guess these are aimed at Windows RT or something. And how is the PS3 DirectX 9? I'm pretty sure it uses OpenGL. This chart is so bogus.

    • renz

      The PS3 used a custom OpenGL, but the GPU inside the PS3 is based on the GeForce 7800 series, which is a DirectX 9 GPU. Anyway, even the Tegra 4 chip has DirectX support. And since the GPU inside the Tegra K1 is Kepler-based, that means it also has DirectX support, just like other Kepler-based GPUs.

    • Cheeseball

      DirectX (or more precisely Direct3D) is an API standard that GPU manufacturers meet so their products will work on the Windows platform. The PS3's NVIDIA RSX GPU is based on the GeForce 7800 GTX which is a Direct3D 9 part.

      The Tegra 4 is also used in the Surface 2 tablet, which runs Windows RT 8.1.

      • didibus

        So you're saying that infographic is actually geared towards Windows RT. It's possible; maybe Nvidia wants to compete in the newer Windows RT laptop segment, since they've struggled a bit to sidestep Qualcomm in mobile.

        • Cheeseball

          It's just a reference to show what API level the GPUs support. It's like how AMD and NVIDIA market that their graphics cards support both OpenGL and DirectX.

          If a manufacturer is making Windows RT tablets, this SoC will interest them because it supports DirectX. If a manufacturer is making Android tablets, this SoC will interest them because it mostly supports OpenGL ES.

          • didibus

            But it's an incorrect reference, because the PS3 does not support DX9, and the Tegra K1 supports way more than OpenGL ES 3 – it supports OpenGL 4.4. I can't even be sure that it truly supports DX9; I mean, it is an ARM chip. It seems they used DirectX only as a kind of estimate of the GPU's features, as in it will support more or less the stuff DirectX 11 supports. It's very unclear.

          • Cheeseball

            The PS3 itself doesn't support D3D9, but the GPU (~7800 GTX) that it's equipped with does. It's just there to signify that it is D3D9 capable, but doesn't necessarily mean it will use it.

          • didibus

            Ya, I guess it makes sense. They meant it more like: the features of DirectX 9 and 11 are supported by the hardware. So hopefully OpenGL drivers that support all those features are also planned.

          • Cheeseball

            "They meant it more as like, the features of DirectX 9 and 11 are supported by the hardware."

            You got it.

    • GraveUypo

      It's a more dummy-friendly way of saying it supports Shader Model 3.0+.

      • didibus

        It's just kind of irrelevant though; I had to dig into the details to learn that it supports OpenGL 4.4. The Tegra 4 does not even support OpenGL ES 3.0. I feel that if this was geared towards Android, OpenGL should have been mentioned instead. Anyway, this looks pretty sweet, and OpenGL 4.4 support is awesome; even though Android does not support that version yet, maybe we shall see it in the future because this GPU does.
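
        For what it's worth, that digging can be done at runtime. A minimal sketch, assuming an EGL context is already current; what version and extension strings a K1 driver would actually report is unknown here:

        ```cpp
        #include <cstdio>
        #include <GLES3/gl3.h>  // OpenGL ES 3.0 headers in the Android NDK

        // Assumes an EGL context has been created and made current.
        void print_gl_caps(void) {
            // e.g. "OpenGL ES 3.0 ..." on a driver exposing ES 3.0
            printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
            printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
            // Features beyond the ES core spec, if offered at all, show
            // up here as vendor/EXT extension strings.
            printf("EXTENSIONS:  %s\n", (const char *)glGetString(GL_EXTENSIONS));
        }
        ```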

    • enoch861

      Mentioning DirectX is a nice way of saying it supports Windows.

    • Abram Carroll

      The PS3 used LibGCM. It's based on GL, but with lower-level access. The feature set is somewhat comparable to DX9. The 360 was in between DX9 and DX10; the closest comparable GPU would be the ATI 2900 GT. The 360 had tessellation, for example, but lacked the geometry shaders to go with it, so tessellation was only an after-effect and its issues couldn't be fixed. The PS3 was never intended to have anything close to a good GPU. When the 360 dropped with nothing on the shelf touching it, Sony needed more, but had spent a bundle on Cell with the idea of it handling the GPU shader work. They used a cut-down 7800 GTX (DX9) that was up against the ATI 1000 series. It was a gen behind, and you needed to use the Cell to beat the 360.

      Chart:
      GPU: PS3 = 192 GFLOPS vs. XB360 = 240 GFLOPS
      CPU horsepower: PS3 = 1200(1) vs. XB360 = 3600

      CPU horsepower: SPECint numbers estimated for consoles, +/- 20%
      (1) PS3 GPU figure excludes Cell's 154 GFLOPS FP32; PS3 CPU figure excludes the SPUs

  • Stacey Liu

    A 5W GPU? Clearly this isn't intended for phones.

    • Sean Lumly

      Agreed. Its performance claims are thus dubious when they compare it to the A7, an SoC that was built primarily for phones. Tegra 4 was announced in a very similar manner, and while its performance was great in the Shield, where active cooling is available and power consumption is less of a concern, it could not compete in the power envelopes demanded by mobile phones and tablets.

      • deppman

        My Tegra Note, Asus TF701T, and Xiaomi's Mi3 would beg to disagree. What made the S800 sell much better than the T4 is the integrated modem. The rest of the SoC specs, TDP included, are neck and neck.

        Ironically, the last I heard, the S805 comes *sans* modem. Yes, it provides 40% higher graphical performance than the 330, but at what power cost? Arguably both the S805 and TK1 have "too-high" TDPs for phones (as do the T4 and S800), but they will probably both be shipped anyway. And, doing a little math, it looks like the S805 will provide *at best* 50% of the graphical performance of the TK1.

        • Sean Lumly

          I'm talking about competitive graphics performance. It seems that Tegra 4 is behind the pack. To be fair, there is little data corroborating that claim (outside of GFXBench).

          • deppman

            Real life tests show the T4 is competitive. The S800 has tended lower than its pre-launch numbers, while the T4 has tended higher.

            http://www.tomshardware.com/reviews/nvidia-tegra-note-7-evga-tablet-review,3668-9.html

            http://hothardware.com/Reviews/EVGA-Tegra-Note-7-Android-43-Tablet-Review/

            My $199 TN7 with pressure-sensitive stylus outperforms the $650 SG Note 8 in Antutu - and a lot of other benchmarks.

          • Sean Lumly

            Thanks for the links. The Tegra Note seems to be behind in onscreen tests (i.e. not resolution-normalized tests); tests where its 720p-class resolution may be competing with 1080p devices moving twice the pixels. This puts a significantly larger demand on computation and bandwidth.

            Antutu is not a credible graphics benchmark.

            Now, these are benchmarks, and not games. However, they agree with the premise that the Tegra Note tablet is not performing as well as top-end chipsets in mobile phones.

          • Cheeseball

            Where do you see that? All the results show that they perform equal to or better than the Snapdragon 800.

          • Sean Lumly

            Yes, for onscreen tests. The Tegra Note tablet is 720p (1 million pixels) vs. the phones that are running 1080p (2 million pixels). With the exception of the Basemark GUI, offscreen tests at a fixed resolution generally perform better on the Snapdragon. However, the Tegra Note has the advantage in onscreen tests that render at the native resolution.

            It still performs well, though. It's just at the back of the pack, despite being a higher-power chip designed for tablets, when compared to those designed for phones (AFAIK the Tegra 4i is the phone variant -- I could be wrong about this).
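
            The pixel math behind that onscreen/offscreen distinction, using the Tegra Note's actual 1280x800 panel (noted further down):

            ```cpp
            #include <cstdio>

            int main() {
                // Onscreen benchmark load scales with native resolution.
                const long note7 = 1280L * 800;   // Tegra Note 7 panel
                const long phone = 1920L * 1080;  // a typical 1080p phone
                printf("Tegra Note 7: %ld pixels\n", note7);             // 1,024,000
                printf("1080p phone:  %ld pixels\n", phone);             // 2,073,600
                printf("pixel ratio:  %.2fx\n", (double)phone / note7);  // ~2.03x
                return 0;
            }
            ```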

          • Abram Carroll

            The Tegra 4 wins most offscreen benchmarks vs. the S800.

            Tegra 4 devices also seem to have considerably longer battery life -- often 7 hours vs. 3 hours real-world.

            A good chip in a poorly thought-out device will have issues.

          • Sean Lumly

            Such as?

          • Abram Carroll
          • Sean Lumly

            @abramcarroll The page you linked to highlighted CPU benchmarks. The GPU section compared the performance of the Tegra 4 to that of older chipsets/implementations.

            But I will not respond to this conversation any longer. Take care.

          • tiger

            Don't forget that the chassis size matters too...more heatsink etc. = better performance.

          • deppman

            To be fair, the Note is 1280x800 vs 1920x1080.

          • deppman

            The *only* (non-apple) chip that outperforms the T4 in graphics is the S800 with the Adreno 330, and *only* in certain tests and *only* at top frequencies, *only* when it isn't throttling (which apparently is almost always). It easily beats all other "top-end" chipsets such as the S600, S4 Pro, etc.

          • Sean Lumly

            No. Again, I'm comparing benchmarks designed to test the GPU or the computational facilities used for gaming, and am taking data directly from the sites of the credible benchmarks: Futuremark's 3DMark, GFXBench, and Rightware's Basemark. It should also be mentioned that in this case we're comparing an SoC clearly tuned for tablets against phone SoCs, and the phone SoCs are winning.

            It should also be mentioned that you listed several generations of Qualcomm APs. It is prudent to compare it against leading chips released around the same time: the S800, the Exynos 5420, and the A7.

            In the majority of cases, it is bested by these phone SoCs in these benchmarks. This is true even when circumventing the benchmark fixing in and around the Exynos 5420 in the Galaxy Note 3 -- its results are higher.

            The Tegra 4 performs well, and I'm sure makes for a fabulous mobile experience -- we should ask for little more. It's just not a leader and it shouldn't be labelled as such.

            But this is where I end the conversation. Please feel free to follow up with a rebuttal if you would like to have the last word. I respect your opinion, and I hope that I haven't offended. Best of luck.

        • Stacey Liu

          The S800's TDP isn't too bad... it's a little above 4W under load.

          In most phone use cases, two of the four CPU cores are off anyway, so it's quite manageable.

          Tegra 4 could hit about 8W, and the TK1 doesn't look like it's any different.

          • deppman

            Stacey, could you please provide the source for this information? I really would like to know!

            As for my estimates of actual TDP, I look back at the Laptop Mag review of the Shield, where they tested the device under load for hours and saw a 4.3W average for the entire system. Subtracting out the screen, WiFi, system losses, etc., that seems like an excellent number.

            Spiking to 8W in itself may not be a problem... as long as it doesn't stay there :-)

        • tiger
  • GraveUypo

    The funny part is that it's probably also more powerful than the Wii U.
    But I guess it would be a little too inconsiderate to mention that :P

  • Twilightlicous

    Why are they showing DirectX? I have a feeling the first K1 device is going to be running Windows RT.

  • Sean Lumly

    I'm sorry, but I'm not terribly impressed. Managing 2.7x the performance of the A7 (and by implication, the current-gen Mali and Adreno) is certainly a great feat, until you consider that this chip is rated at 5 Watts (under load), which is WAY too power hungry for a mobile phone (though OK for a tablet). I have my doubts that the K1's performance per Watt will be competitive with current mobile devices (@ an assumed 28nm); see the back-of-envelope numbers after this comment.

    Based on the die diagram, the GPU is also HUGE -- it looks to take up almost half of the chip (I will have to measure this). This is in stark contrast to most modern mobile SoCs, where the GPU accounts for around 20% of die area yet attains incredible performance. If you triple the size of a current-day mobile GPU, you may end up with something similar in size to the GPU in the K1, and almost certainly better performance. And next year's mobile GPU archs will be introducing yet more efficiency-boosting technologies to improve their performance even more. So I suspect that the K1 will also perform worse when considering performance per mm^2 (@ an assumed 28nm).

    Also, pushing OpenGL 4.4 as a major selling point is misleading at best. What systems will plausibly have access to the API? Android officially supports OpenGL ES 3, and while Windows could use the DirectX 11 support, the ARM CPU cores of the Tegra K1 limit it to Windows RT, which has minuscule penetration. Sure, you could *possibly* use this chip in a low-cost Steam machine (if they support the ARM arch), but there are likely more convenient Intel options. The only machine that could plausibly target GL 4.4 is Nvidia's own Shield, and I have serious doubts that many devs will go out of their way to exclusively release games for that system given its tiny penetration. GL 4.4 or even DirectX 11 as major selling features is quite silly.

    My guess is that the Kepler architecture has been developed primarily with the PC in mind, and performance per mm^2 and performance per Watt were not primary concerns. I have little doubt that by the time this core is released into the market, competing next-generation mobile GPUs will perform similarly with better power consumption and better die utilization, and will have an ace feature that is missing from the K1 announcement: ASTC texture compression, which is MONUMENTAL for doing away with the texture-format fragmentation that is very real for Android game devs.
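
    A quick sketch of the back-of-envelope check referenced above, with the A7 GPU's power draw as an assumed round number (Apple publishes no figure):

    ```cpp
    #include <cstdio>

    int main() {
        // Nvidia's claim: ~2.7x the A7's GPU performance at a ~5 W rating.
        const double k1_perf  = 2.7;  // relative to A7 = 1.0
        const double k1_watts = 5.0;  // rated GPU load figure cited above
        const double a7_watts = 2.0;  // ASSUMPTION: rough A7 GPU draw

        // (2.7 / 5.0) / (1.0 / 2.0) = 1.08
        printf("K1 perf/W relative to A7: %.2fx\n",
               (k1_perf / k1_watts) / (1.0 / a7_watts));
        return 0;
    }
    ```

    If that assumption is even roughly right, the K1 comes out only marginally ahead on perf/W, which is the point being made above.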

    • GraveUypo

      "have little doubt that by the time that this core is released into the
      market, competing next generation mobile GPUs they will perform
      similarly with better power consumption, better die utilization (etc)"

      Well, what can one say? Tegra will be Tegra. History repeats itself.

      • Sean Lumly

        Agreed. It seems to be a common theme with Nvidia. I was hoping that the Kepler architecture would turn the tide for Tegra, but it seems as if they are still behind competitors that have kept a steadfast eye on perf/watt and perf/mm2, which are permanent citizens of mobile hardware architecture.

        I genuinely hope that I am wrong, but I can't see OEMs buying into the K1.

        • tiger

          You're right... all the big players are using either Qualcomm or Samsung chips, and Apple uses its own design. What is left for Nvidia?

    • tiger
      • Sean Lumly

        Thanks! I have read the articles about the K1 at AnandTech. The one interesting takeaway they speculate on is Nvidia GPUs being used for Windows Phone. Outside of that, they agree that the K1 will have a hard time gaining traction, even setting aside perf relative to next-gen counterparts and the questionable API relevance.

        • tiger

          Agree there. It will be a niche chip in niche devices.

    • deppman

      Your analysis suffers from a lack of vision. The mobile ecosystem is not stagnant. For example, before the K1, Google had no reason to support anything beyond ES 3.0 in Android because the hardware didn't exist. Now it does. Don't you think they would *love* to ship an OpenGL 4.4 Nexus 7 this year? This is visionary design. Henry Ford once said, "If I had asked people what they wanted, they would have said faster horses."

      • Sean Lumly

        A fair point, but what compelling reasons are there to support the designed-for-desktop feature set of GL 4.4? After all, Khronos specifically pruned back the feature set of the desktop standard to be more appropriate for a mobile power footprint. The result was OpenGL ES 3.0.

        And Google is dictating hardware at this point. Many vendors must support APIs like Renderscript simply because they are a required part of Android. Even the existing OpenCL desktop 'standard' has recently been dropped for mobile by its last supporting proponent, Qualcomm.

        I don't claim to see the future, but I see little play in general support of the desktop API. I don't think that GL 4.4 constitutes a visionary design, rather it includes a feature set not necessarily appropriate for the constraints of mobile.

        • Abram Carroll

          Didn't Apple take out 27 patent claims on OpenCL? I'd guess with their patent trolling everybody backed away from OpenCL as they didn't want to end up in court for using it.

      • Sean Lumly

        I wrote an entire rebuttal, but it seems to have disappeared. I will try to respond again if I have time.

      • Guest

        I have now replied twice to this comment, and they are not showing up. I'm not sure why this is the case.

      • Sean Lumly

        I will say this: the use of GL 4.4 could be significant for Nvidia's Shield platform if Nvidia features some high-profile ports that help sell the system. And who knows? If this takes off (e.g. PC games being run on the platform), we may see a larger industry push for the full desktop spec in mobile in the coming years.

        Their characterization of OpenGL ES 3.0 was misleading, though; we'll see amazing things coming out of it, many of which I think will challenge desktop titles running under comparable performance limits. Many of the effects featured in the video that Nvidia claims are out of reach are directly possible on GLES3: tonemapping, ambient occlusion, global illumination, ray tracing, ray marching, soft shadows, particles, compute, physically based lighting, etc. Tessellation is the one that will likely not be done -- it can be done in GLES3 by deforming instanced patches and modifying their vertices in the vertex shader, but this is too exotic to be commonly used realistically.

    • Sean Lumly

      The thing that I like about the Tegra K1 is that it is clearly a tablet SoC. This is something of a rarity in the mobile space: only a few tablets ship an SoC that isn't a phone SoC, one that can take advantage of a larger battery. This should ensure that the K1 chip has great performance compared to its competition (my argument above was about the architecture's efficiency, which I suspect will be a few paces behind the competition).

      Many modern mobile GPUs can scale their number of cores or clusters. The PowerVR G6430 in the Apple A7 can scale from 2 clusters to 6. The Mali T628 in the Exynos 5420 can scale from 6 cores to 8 as well, and the upcoming Mali T760 can scale to 16 cores. There is also the ability to clock things higher for added performance. But we are unlikely to see these large-core chip variants in tablets, as many tablets use phone chips, which are not well suited for these large GPUs.

      Nvidia has the potential to shake up the tablet industry with a chip that may beat the performance of phone SoCs, and at prices that are competitive. Because it is a true tablet SoC, tablet OEMs may choose it for cost/performance benefits over a phone SoC.

  • tiger
  • rmeden

    In the Audi CES keynote, it was mentioned that Audi's new CPU boards for auto-piloted cars will be based on the K1. Nvidia's CEO was on stage too.

  • Ferdinand

    I know this is a bit off-topic, but why does everyone compare the Galaxy Note 10.1 2014 3G with the Exynos processor in all benchmarks? Somehow it seems nobody knows there is actually an LTE version with a Qualcomm Snapdragon 800 SoC that outperforms the Exynos in benchmarks (well, most of them). I know this because I have one, and I'm using it right now to write this. I live in SA; here we get only the LTE variant, with no carriers messing with devices etc. (except Vodacom, but their services are beyond poor and they are expensive compared to MTN, Cell C, or 8ta).
