April 7

When Qualcomm announces a new class-leading mobile chip, even the less technical among us tend to take notice. So, meet the 64-bit Snapdragon 808 and 810 processors - Qualcomm's most powerful mobile chips ever.

The 810 is an octa-core setup that will be utilized in a fashion similar to ARM's big.LITTLE architecture (as will the 808), though Qualcomm is using its own technology to manage how the cores interact rather than an off-the-shelf solution. The quad-core Cortex-A57 cluster is a 20nm part, as is the lower-power quad-core A53 cluster, though Qualcomm hasn't disclosed clock speeds for either. The 810 also marks the debut of a couple of new pieces of silicon, including the Adreno 430 GPU (30% faster than the Adreno 420) and a new dual-channel LPDDR4-1600 RAM interface, replacing LPDDR3.
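
For a sense of how a 4+4 layout like this actually surfaces to software, here's a minimal sketch (generic Linux/Android, nothing Qualcomm-specific, and the helper name is just illustrative) that groups cores into clusters by the maximum frequency each one reports through the standard cpufreq sysfs nodes - assuming the device exposes those nodes and the cores are online:

```python
# Minimal sketch: group cores into clusters by their reported maximum frequency.
# Assumes a Linux/Android device exposing the standard cpufreq sysfs interface;
# cores that are hot-plugged offline may not show up.
import glob
import re
from collections import defaultdict

def cpu_clusters():
    """Map each distinct max frequency (kHz) to the list of CPU IDs reporting it."""
    clusters = defaultdict(list)
    for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/cpuinfo_max_freq"):
        cpu_id = int(re.search(r"cpu(\d+)", path).group(1))
        with open(path) as f:
            clusters[int(f.read().strip())].append(cpu_id)
    return dict(clusters)

if __name__ == "__main__":
    clusters = cpu_clusters()
    fastest = max(clusters) if clusters else None
    for max_khz in sorted(clusters, reverse=True):
        label = "big" if max_khz == fastest else "LITTLE"
        print(f"{label:6} cluster @ {max_khz // 1000} MHz: CPUs {sorted(clusters[max_khz])}")
```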

The 808 is a down-market version of the 810, using two A57 cores instead of four, though with the same quad-core A53 cluster for power conservation. The GPU is downgraded to the newly announced Adreno 418 (only 20% faster than the Adreno 330, making it seem like an amped-up version of that GPU), and the RAM is LPDDR3.

Both chips also feature Category 6 LTE, offering speeds of up to 300Mbps using tri-band carrier aggregation. You probably shouldn't expect those kinds of speeds on your carrier any time soon, of course, unless you're in Korea.
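
The 300Mbps figure is mostly simple addition: each 20MHz LTE carrier peaks at roughly 150Mbps (with 2x2 MIMO and 64-QAM), and Category 6 lets the modem aggregate up to about 40MHz of downlink spectrum, whether as two full-width carriers or a tri-band split. A toy back-of-the-envelope sketch, with the per-carrier figure as a rough assumption and an illustrative helper name:

```python
# Toy carrier-aggregation arithmetic. Assumes ~150 Mbps peak per 20 MHz carrier
# (2x2 MIMO, 64-QAM), i.e. ~7.5 Mbps/MHz; real-world throughput is far lower.
PEAK_MBPS_PER_MHZ = 150 / 20

def aggregated_peak_mbps(carrier_widths_mhz):
    """Sum the rough per-carrier peaks for a carrier-aggregation combination."""
    return sum(width * PEAK_MBPS_PER_MHZ for width in carrier_widths_mhz)

print(aggregated_peak_mbps([20, 20]))      # 300.0 - two full-width carriers
print(aggregated_peak_mbps([20, 10, 10]))  # 300.0 - a tri-band split, same total
```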

Unfortunately, while the hexa-core and octa-core Snapdragon 808 and 810 do represent Qualcomm's beefiest offerings to date, they're also probably aimed squarely at Asia, especially given their planned market debuts: not until the first half of 2015.

Qualcomm has regularly spoken in recent months of high consumer demand in Asian markets for phones with an ever-increasing number of cores. That is, people buying phones in Asia actually care a lot about this specification - hexa and octa core devices are simply perceived as superior, regardless of the technical merit behind that perception. Qualcomm is also quick to point out that western markets, like Europe and the US, simply do not mirror Asia in this way - consumers are far more concerned with other device performance factors, like battery life.

This makes it seem likely, even inevitable, that Qualcomm will push more efficient designs, or at least cheaper ones, in the western world for the time being. The Snapdragon 805 seems the likely candidate for high-end devices released through the rest of this year. After all, it's still a pretty incredible chip - with Krait 450 CPUs clocked at up to 2.7GHz and the Adreno 420 GPU (40% faster than the Adreno 330), it's probably not all that much slower than the new 810. In fact, its GPU is actually a higher-end part than the Adreno 418 featured in the Snapdragon 808, which I think is fairly telling.
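
Chaining Qualcomm's own relative claims together, normalized to the Adreno 330, makes that gap concrete (a rough composition of marketing multipliers, not benchmark results):

```python
# Chain Qualcomm's stated GPU uplifts, normalized to Adreno 330 = 1.0.
# These are marketing multipliers composed together, not measured benchmarks.
adreno_330 = 1.0
adreno_418 = adreno_330 * 1.2  # Snapdragon 808: "20% faster than Adreno 330"
adreno_420 = adreno_330 * 1.4  # Snapdragon 805: "40% faster than Adreno 330"
adreno_430 = adreno_420 * 1.3  # Snapdragon 810: "30% faster than Adreno 420", ~1.82x the 330

for name, score in (("Adreno 418", adreno_418), ("Adreno 420", adreno_420),
                    ("Adreno 430", adreno_430)):
    print(f"{name}: {score:.2f}x Adreno 330")
```

By that arithmetic the 805's Adreno 420 sits comfortably above the 808's Adreno 418, and only the 810's Adreno 430 is meaningfully ahead of both.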

For more, check out Qualcomm's blog post on the announcement, below.

Qualcomm

David Ruddock
David's phone is an HTC One. He is an avid writer, and enjoys playing devil's advocate in editorials, imparting a legal perspective on tech news, and reviewing the latest phones and gadgets. He also doesn't usually write such boring sentences.

  • andy_o

    Dunno, the new parts are 20nm, and probably the RAM will also be on a smaller node? And isn't the lower-power quad-core cluster aimed at improving battery life? Seems to me like the best for both markets, Asian and "western".

    • http://www.androidpolice.com/ David Ruddock

      Qualcomm is undoubtedly working on the next generation of Krait, which I'm guessing we'll see announced either at CES 2015 or maybe even a bit before that. These seem like stopgap chips designed to respond to market pressure - mostly in Asia - that is demanding technical specifications simply for their own sake. After all, Qualcomm has never given banner titles to non-Krait chips (these are just ARM reference design cores, not Kraits). Snapdragon 850, or 900 (or maybe first we'll see 650 / 700, like we did in 2013) or whatever they're going to call it, is going to have that courtesy reserved for it.

      We might, maybe, see a quarter or two of 808 / 810 devices next year. But personally, I think OEMs will avoid them - they sound expensive (more transistors = more $), and considering how the 805 stacks up against the 808, I just seriously doubt it. Even with the smaller 20nm process, the 810's faster RAM, A57 cores, and screaming GPU have to push power consumption up considerably.

      Qualcomm's strategy in the US has generally been finding a few chips that really work and pushing them in as high a volume as possible, while less popular designs fall to the wayside. I don't see the 808 / 810 as the chips to achieve that goal.

      • kpkp

        I think they scrapped their 32-bit Krait evolution and decided to go with ARM cores since they are 64-bit. All because Apple caught them with their pants down.

        • http://www.androidpolice.com/ David Ruddock

          Yeah, plans undoubtedly changed, but a 64-bit Krait is inevitable. Like I said, these chips still seem like a stopgap - Qualcomm can't be proud of running to ARM just to get on the 64-bit bandwagon.

          • duck hairs

            Probably won't be called Krait though, will it? I believe on their ARMv6 chips they called it Scorpion, ARMv7 was Krait, and now for ARMv8 I'm sure they'll come up with a new name.

  • noname

    64 bit?

    • http://www.androidpolice.com/ David Ruddock

      Yes, I missed that point. Whoops.

  • someone755

    "not until the first half of 2015."
    What. The fuck. Are they thinking?
    So we're supposed to wait one entire year for a truly better chipset from Qualcomm to come out?
    Any data on when the Tegra K1 models come out? I'll gladly buy a phone with one in it if it's available by Q4 2014. (please be 2014 please be 2014 please be 2014)

    • renz

      The 64-bit K1 is expected to be ready by the second half of this year. Nvidia did demo the first working chip running Android at CES 2014. Maybe the chip will be ready in time, but actual devices based on it will come much later, as usual. Personally I'd expect late this year or early next year.

      • someone755

        That's standard practice with Nvidia, isn't it? Hype the chipset for about a year, then release it, then wait for OEMs to actually build a device xD

        • renz

          The 32-bit chip has been ready for quite some time. It depends on how many OEMs pick it up and how long it takes them to actually make a consumer product out of it. And from what I see, it's better for Nvidia to make their own device rather than let the OEMs build one lol.

          • someone755

            Aren't they both equally powerful?
            The quad-core obviously earns a few more points in benchmarks, but isn't it the same really?
            True about your statement. Why can't we see EVGA make another tablet or even a phone though? It wouldn't be faster than Nvidia making their own, but it would be tons cheaper (and not that much slower either, since EVGA and Nvidia are pretty close in the GPU market).

          • renz

            We will know for sure when actual devices get benchmarked, but we might see numbers from the 32-bit K1 soon.

            http://www.phoronix.com/scan.php?page=news_item&px=MTY1MzI

      • David Hart

        I think the K1 will be in very few phones.

    • thartist

      Problem with the 801? Anything you need to improve on it?...

      • someone755

        It's a decent chipset, but nothing really new from last year. And don't give me the "20 points more in antutu and 1 minute longer battery" as an excuse.
        Qualcomm messed this one up.

      • abobobilly

        Not really a huge jump. Just minor improvements. We have yet to see a breakthrough in this technology.

  • Roberto Giunta

    So, that means we won't get anything newer/higher-end than the 805 until the first half of 2015 (probably April-June 2015)? Sounds like there will be quite some time for Intel (and Nvidia and Exynos, if they can finally include an LTE modem) to play catch-up.

    • Walkop

      I expect soon enough Intel is going to obliterate all other chip makers in the performance area.

      I mean, seriously: they have more experience designing and manufacturing chips than anyone, they have the most advanced fab tech in the world, and they came up from having the most trashy mobile chips you could imagine to where they are now. Extremely competitive with even Apple's best CPU offerings. I use Apple because the A7 -just- beats the Snapdragon 800 in single-threaded performance at just over *half* the clock speed. For Intel, in only a few years, they caught up to what took ARM makers nearly a decade.

      Even in laptops, look how far Haswell has pushed the envelope.

      • เกรียนเทพ ดี อันลิมิเตด

        Qualcomm is very very very slow on development after the release of their 400, 600, 800 series.

      • duck hairs

        You use an Apple A7 device just because it beats Android phones in performance? What the hell are you going to need that much performance on a phone for, especially in Apple's walled garden?

        • Walkop

          I use the A7 because it's one of the best SoCs out there at the moment, and in some ways better than the 800 (which is considered the best chip in the Android space by the majority, if I can make that presumption).

          Android makes use of the 800's multithreaded capabilities way more than iOS ever could with the A7 being dual-core. It's about designing for a function (or a set of functions). Apple can use more single-threaded performance (as can Android devices), but we get more use out of more cores. Generally. Make sense?

          • duck hairs

            OK, I agree the A7 is great, but what I'm saying is you're moving to a more walled-off operating system in which you can do a lot less (fewer things to do with that performance, too) just for a little extra oomph? Seems like you value computing performance too much; there's so much more that goes into a phone.

          • Walkop

            Oh, no way. Lol, I worded that wrong. I'll never give up my Nexus 5 for an iOS device. I just chose the A7 as my EXAMPLE for a high-end and well-designed SoC.

    • AOSPrevails

      There was a rumor that Exynos will include an Intel LTE modem.

      • ProductFRED

        I believe the Note 2 already does. It's just separate from the Exynos SOC.

  • a.d.AM

    So what phone is going to have an octa-core 64-bit chip, 4GB of RAM, and a 55MP camera? I feel like I need that phone now...

    • Matthew Fry

      I was watching an anime last night. One character had a flip phone with a giant lens on the back. I want something like that but mounted on the back of a slab phone. It's ridiculous to have these big beautiful screens on these phones with tiny spy camera lenses.

      • Cheeseball

        LOL. Nobunagun.

      • andy_o

        Anime, it seems, doesn't have to follow physics.

        • Matthew Fry

          Ha. Well... You are probably right. I know almost nothing about how cameras work and whether aperture size and depth are related. My point is, we should be able to get something larger than a 3mm wide lens.

          • abobobilly

            This is why I envy Nokia's advancement in camera tech ... and resent the fact that they've limited it only to Windows Phone and the now-redundant Symbian platform.

          • andy_o

            It's actually simple enough, at least the basics. Generally speaking, what you want is to capture more light, and that requires a big aperture (physically, in mm, not the f-number). You can project all that light onto any size of sensor, but the reason big-sensor cameras can capture more light is that lenses with the same physical aperture are more feasible to make for the bigger sensor (there seems to be a price/feasibility sweet spot at about the 35mm format size).

            So, for instance, a 50mm f/2 lens has a 25mm aperture. This is a common lens for the 35mm or "full frame" format. To make an equivalent lens for, say, a "2x" format (like FourThirds), it is well known by now that you need a 25mm focal length. But what is not said often enough is that a 25mm f/2 lens is not truly equivalent - it only has a 12.5mm aperture. A true equivalent would be a 25mm f/1.0 lens (25mm aperture), which would be prohibitively expensive. When you get to very small sensors, it's just physically impossible to get the big apertures that capture more light.

            In fact, comparisons have been made with existing lenses on a 5D and on FourThirds cameras (the FourThirds 35-100 f/2 and the Canon 70-200 f/4), and once you account for both lens and body prices, the whole kits were comparable in cost. The advantage of full frame is that it allows for bigger apertures without breaking engineering or physical possibilities.

          • andy_o

            Anyway, more directly to your point: the N5's lens has a 4mm focal length, so a 4mm aperture would already put it at f/1.0. If you make the sensor bigger, the focal length also has to grow, making the lens longer and the phone thicker.
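
            To put rough numbers on the equivalence argument above - a toy calculation only, using the 50mm f/2 and ~4mm phone-lens figures already mentioned, with physical aperture taken as focal length divided by f-number (the helper name is just illustrative):

            ```python
            # Toy equivalent-aperture arithmetic: physical aperture (mm) = focal length / f-number.
            def aperture_mm(focal_length_mm, f_number):
                return focal_length_mm / f_number

            print(aperture_mm(50, 2.0))  # 25.0 mm - full-frame 50mm f/2
            print(aperture_mm(25, 2.0))  # 12.5 mm - a "2x"-format 25mm f/2 gathers far less light
            print(aperture_mm(25, 1.0))  # 25.0 mm - the true equivalent would need f/1.0
            print(aperture_mm(4, 1.0))   #  4.0 mm - a ~4mm phone lens maxes out around here
            ```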

  • Matthew Fry

    We've now officially delved into crazy numbering territory. I'm not saying there's an easy way out of it, though. Higher numbers should equal beefier CPUs. They should just do 4xx for quad-core, 8xx for octa-core, 6xx for hexa-core, with the xx indicating relative beefiness. I also wouldn't mind the GPU being part of the model, like 805-420 or 805x420.

    • abobobilly

      One more reason for them to squeeze more money out of consumers ... or rather, from ones who "can pay".

      Personally, the first 2 or 3 times ... it was awesome to see even the minor improvements. Now it's just getting boring, and ridiculous.

  • Alexis Urena

    Mr. Ruddock, you forgot to mention that Sprint Spark is using 20x2 CA spectrum this year (AT&T is also using CA, to a lesser degree), and 20x3 by next year. I'm already achieving over 70Mbps on Sprint Spark. With the additional 20MHz, it will effectively double the max throughput this year, then triple it next year.

  • thartist

    I'm seriously starting to think that these guys don't know what else to charge us for. Throwing 8 cores into a stupid smartphone is way beyond reason, as is 4K video, AND I LOOOVE THE HIGH END, but nonetheless... (Clear your mind of all the hype and think about it for a second...)

    If you were trying to save battery you would use 1 or 2 mildly quick cores, not 4! After all, they kick in for the average stuff that doesn't require performance. Same with 4K: would you, in your sane mind, record 4K video today for your 1080p TV and monitor, with a sensor that can barely take pictures at night or even get daylight right? No. But they tell us what to desire.

    Uhff, rant off.

    • someone755

      Also those 300+ PPI displays. Gtfo with that; it only eats battery.

      • David Hart

        WAY agree, nothing but a battery/performance LOSS.

        The Nvidia shield having only a 1280x720 screen at 5" is perfect.

    • abobobilly

      You are actually onto something here, and i do share your sentiments. Sadly, if we start such a rant, people will run us down giving this absurd reason "we've reached the breaking point of tech for now".

    • Lars Jeppesen

      You seem to miss the part where it said "2nd half of 2015"

      Are you saying 4k tvs won't be widely available by then?

      • JonJJon

        They might be more widely available by then but you can bet that the uptake will still be tiny and price of them will still be far too much for most consumers. A lot of 720p TVs are still bought and used. 4K may seem like it's going to explode on the scene in techy conversation places but in reality it will be several more years before 4K TVs make a significant dent in the television market.

      • Richard Markert

        Widely available or not, most people won't have one. I have an awesome 1080p LED smart TV from 2012 and I'm not upgrading it till it dies.

    • didibus

      Did you read the article? It says the Asian market prioritises core counts and specs over everything else, including battery life, whereas the western world prioritises battery life over specs. That is why these new SoCs are aimed at the Asian market first.

    • duck hairs

      I think the reason they have 4 low-power cores is that 2 just wouldn't be enough to do much of anything, so the chip would spend a lot more time with the high-performance cores on.

  • Nicholas Polydor

    Quotes from http://www.anandtech.com/show/7925/qualcomms-snapdragon-808810-20nm-highend-64bit-socs-with-lte-category-67-support-in-2015

    "Only the Snapdragon 810 has a hardware HEVC encoder however."

    This is good. Unfortunately, the Qualcomm press release ( http://www.qualcomm.com/media/releases/2014/04/07/qualcomm-announces-ultimate-connected-computing-next-generation-snapdragon ) states, "... 4K video at 30 frames per second... The combined 14-bit dual Image Signal Processors (ISPs) are capable of supporting 1.2GP/s throughput and image sensors up to 55MP." This is a shame: I was hoping for 4K HEVC recording at 60fps.

    "The 810 can support up to two 4Kx2K displays (1 x 60Hz + 1 x 30Hz)... "

    This is good. However, the press release states, "... external 4K display support via HDMI1.4." How can HDMI 1.4 support 4K at 60Hz?

    Still, this is not good enough from Qualcomm. By the time the Snapdragon 615 and 32-bit Snapdragon 805 - let alone Snapdragon 810 - are released, Apple will have released the A8. By the time Qualcomm release their Krait-based 64-bit Snapdragon, Apple will have released the A9.

  • Fatal1ty_93_RUS

    Seems like new Nexi won't be seeing the 64bit jump yet

    • Mehmed

      It seems the new Nexus will have either the Snapdragon 801 or the Snapdragon 805. Good, but not what I wanted. I wanted a 64-bit Snapdragon based on Krait at 20nm. I like how the Snapdragon 810 has 1. 64-bit, 2. 20nm, 3. LPDDR4.
      It's not just smartphones and tablets that use ARM SoCs - Chromebooks, Chromeboxes, the Amazon Fire TV, and so on.

      • David Hart

        I might pass on this next Nexus if it isn't a big enough upgrade :/
        Only real reason I upgraded from the Nexus 4 was LTE/design

    • 64 bit android googleio14

      You're still forgetting about Intel's and Nvidia's chips for the Nexus. I wouldn't be surprised :)

  • briarwood

    I've always wondered how Apple's A7 would run Android. I've read a lot of glowing posts about the A7 running iOS. How would it handle KitKat?

    I ask because I don't put much faith in benchmarks, as both my Nexus 5 and Nexus 4 are very, very fast - in the case of the Nexus 5, blisteringly quick - and both are running PLSX. Even the Nexus 4 with that old chipset is blazing, yet its benchmarks are relatively low.

    • duck hairs

      I'm sure the A7 could easily handle KitKat. Yes, it doesn't have the highest clock speed or the most cores, but it's running Apple's beast of a custom ARMv8 architecture.

  • Bob Hart

    So when can we expect a 64-bit Android OS?

    Without a 64-bit OS, who needs a 64-bit processor?

    More RAM for 64-bit, too.

  • Luke Kallam

    I'm betting... the Note 4 and G3 will have the 805, and maybe the next Nexus, if it's released in December, might feature the 808? Maybe not, but it's nice to hope. Either way, 64-bit actually being useful on a phone will be nice, kind of. I think we're at a point where all of these phones are overpowered and we don't really need much more for a while.

  • Ricardo Lemus

    I'm also drooling over newer technology, but battery life comes first. Don't get me wrong, better performance is great, but OEMs need to optimize their software. My GS4 sometimes takes like 4 seconds to multitask; I used to have a GS3 running stock Android and I could easily say it was way smoother than my GS4 running TouchWiz.

  • Andrew T Roach

    The 805 is still rehashed three-year-old Krait, and the stock ARM cores on Qualcomm's planned new offerings are nothing to brag about either. Chinese SoC makers already have these same configurations in the works.