A little over a year ago, Nvidia decided to change the game, quite literally. The Nvidia SHIELD Portable was announced, released, and very well received. Now, shortly after the announcement of an epic new generation of mobile processors, Nvidia has officially released its next SHIELD installment: the Nvidia SHIELD Tablet.
One of the chief complaints we saw with the original SHIELD Portable concerned its screen size and resolution. A 5” screen with 720p resolution was usable for most tasks, but could become a bit of a strain on the eyes after a while. Nvidia has attempted to address this with an 8”, 1920×1200 display (which is, by the way, quite nice).
Check out Jordan’s Video Review:
|Processor||NVIDIA® Tegra® K1: 192-core Kepler GPU, 2.2 GHz ARM Cortex A15 CPU|
|Display||8-inch 1920×1200 multi-touch Full HD display|
|Audio||Front-facing stereo speakers, dual bass reflex ports with built-in microphone|
|Storage||32 GB (WiFi + 4G LTE) / 16 GB (WiFi-only)|
|Wireless||802.11n 2×2 MIMO 2.4 GHz and 5 GHz Wi-Fi, Bluetooth 4.0 LE, GPS / GLONASS|
|Connectivity||WiFi + 4G LTE or WiFi-only, Mini-HDMI, Micro-USB 2.0, MicroSD slot, 3.5 mm stereo headphone jack with microphone|
|Camera||Front: 5MP HDR; Back: 5MP auto-focus HDR|
|Stylus||DirectStylus 2 with 3D Paint (included)|
|Battery||19.75 Watt Hours|
As you can probably imagine, with the Tegra K1 and 2GB of RAM, this thing eats up games for breakfast.
As this latest SHIELD is a standalone tablet, if you want to interact with your games like you did on the SHIELD portable, you’ll need a controller. With most other devices, this means pairing a Bluetooth controller. This usually introduces a bit of latency, which could mean the difference between getting a headshot and BEING headshot.
With the SHIELD Tablet, Nvidia released the SHIELD controller, a WiFi-direct solution that promises lower latency and easier pairing. In practice, both of these claims appear to be true.
Additionally, a magnetic tablet cover is available that makes it simple to stand the tablet up on a flat surface so you can keep right on gaming with the wireless controller.
As with the SHIELD Portable, the tablet comes with a version of Android KitKat (specifically, version 4.4.2) that is only minimally customized, adding in pieces and parts to make the controller and stylus work appropriately, as well as whatever’s necessary for game streaming and recording. This means that updates can, and should, come frequently, as they have with the original SHIELD.
This also means that rooting the device is quite painless, as you can see in the following video:
Sound is one place where the SHIELD Tablet really shines. With most Android devices, and especially most tablets, speakers come in the form of one or two small, tinny speakers at the bottom, or the back, of the device.
The SHIELD Tablet has front facing stereo speakers as well as bass reflex ports on the side, which makes for some truly decent sound quality. I rarely found myself bumping the volume over about 50%, because the speakers were just that loud, clear, and crisp sounding.
This is another area where the SHIELD Tablet shines. It’s easy to throw around numbers like 8” and 1920×1200, but numbers alone don’t do this display justice. The colors are vivid, and the viewing angles are excellent.
One new feature introduced with the SHIELD Tablet is the ability to record screencasts directly from the device, including the built-in camera and microphone. This really makes this device a unique experience, as far as I’m concerned.
I’ve only tested this functionality a few times, and it seems to be a bit hit-or-miss. It records at a strange resolution, 1728×1080, presumably because the native screen resolution is 1920×1200 instead of 1920×1080. Additionally, the audio can sometimes go wildly out of sync from the video. Rebooting the device seems to take care of that issue, but you don’t know about it until after the recording, so it’s safest to just reboot before you’re going to record anything.
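That odd resolution is consistent with the aspect-ratio explanation above: scale the 16:10 panel down to a 1080-line recording while preserving its proportions, and the width works out to exactly 1728. A quick sketch of the arithmetic (my own back-of-the-envelope check, not anything Nvidia documents):

```python
# The SHIELD Tablet's panel is 1920x1200 (16:10). If the recorder
# targets 1080 lines but keeps the panel's aspect ratio, the
# width comes out to 1728 rather than the usual 1920.
panel_w, panel_h = 1920, 1200
target_h = 1080

aspect = panel_w / panel_h            # 1.6, i.e. 16:10
record_w = round(target_h * aspect)   # 1080 * 1.6 = 1728

print(f"{record_w}x{target_h}")       # 1728x1080
```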
The built-in microphone really isn’t all that bad. My initial tests made me think it might be, but as it turns out, if you’re using the wireless controller, it attempts to use the microphone in it instead, which IS a pretty rough microphone.
Built-in streaming to Twitch.tv is also supported, which is absolutely awesome. You have to turn the quality down before attempting it, but still, it’s an all-in-one game streaming solution.
The downside of all of this, as I hinted earlier, is some glitchiness in the software. I attempted to record gameplay of games like Half-Life 2, but if I tried to leave the camera turned on while doing so, the game would immediately crash. I believe most of these things will be fixed, in due time, with software upgrades.
I’ve said it before, and I’ll say it again. This is a tablet. Please don’t use it as a camera.
That said, the pictures I took with the rear-facing camera were… well, not great. 5MP doesn’t go quite as far as it used to, so they were blurry and grainy. However, for the front-facing camera, while it’s still a bit grainy, it’s leaps and bounds better than a lot of other front-facing cameras, and given that its primary intention is to be used while streaming or recording games, it works extremely well for that!
A stylus is not something you’d normally talk about with a tablet, but this is a bit of an exception. The stylus of the original Tegra Note has been revamped a bit for the SHIELD tablet, with excellent results. I’m no artist, but the stylus has been extremely easy to use and feels very sturdy and solid in the hand, allowing for fine-grained control.
Unfortunately, the stylus doesn’t appear to work with all other devices, though it DID work with the HP Slate 7 Extreme, which also uses Nvidia DirectStylus technology.
According to Nvidia, the battery in the tablet is 19.75 Watt hours. That should equate to about 5200 mAh, which is just above average for a tablet of this size. In practice, I usually don’t find myself sitting down with a device like this for more than a couple of hours at a time, so I regularly saw several days of battery life, but my gaming was probably lighter than average. With heavier usage, of course you’d be able to drain the battery in just a few hours, but that can be said of just about any device with any battery size.
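For the curious, the Wh-to-mAh conversion behind that 5200 mAh figure is simple division by the cell voltage. Nvidia doesn’t publish the cell voltage, so the 3.7–3.8 V used here is an assumption based on typical lithium-ion tablet batteries:

```python
# Convert battery capacity from watt-hours to milliamp-hours:
# mAh = (Wh / V) * 1000. The nominal voltage is an assumption;
# lithium-ion tablet cells typically sit around 3.7-3.8 V.
def wh_to_mah(watt_hours: float, nominal_voltage: float = 3.8) -> int:
    return round(watt_hours / nominal_voltage * 1000)

print(wh_to_mah(19.75))       # 5197 mAh at 3.8 V
print(wh_to_mah(19.75, 3.7))  # 5338 mAh at 3.7 V
```

Either voltage assumption lands in the ballpark of the roughly 5200 mAh cited above.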
As a “next step” in the SHIELD family, the new SHIELD Tablet is definitely a very worthwhile addition. Excellent performance, interesting software additions, and amazing sound quality make it a powerful combo, not just for gaming, but for everyday tasks, media consumption, and even a bit of artistry. With a price tag of $299/$399 (and even more if you want the wireless controller and magnetic cover), it’s a bit on the steep side, but if you’re looking for a good all-around tablet, and an especially good gaming tablet, this is the one.
June 6, 2014 By: Jimmy McGee
Android KitKat 4.4.3 has been released for the Nexus 5 and many more Nexus devices! That and much more is covered by Jordan as he reviews all the important stories from this week. Included in this week’s news is the announcement that OmniROM nightlies are now based on Android 4.4.3, and Google announced a new Project Tango tablet! That’s not all that’s covered in today’s video!
Jordan talks about the other videos released this week on XDA Developer TV. XDA Developer TV Producer TK released an Xposed Tuesday video about OK Google for 3rd Party Launchers. Then, Jordan talked about Portal and Half-Life 2 on the Nvidia Shield. Finally, TK gave us an Android App Review of Gallery Plus. Pull up a chair and check out this video.
February 4, 2014 By: egzthunder1
Many of you may recall that back in June of 2012, we talked about how NVIDIA was given a rather direct message courtesy of none other than Dr. Linus Torvalds himself. Basically, the article written by XDA Recognized Developer AdamOutler went on about the closed nature of both NVIDIA and Qualcomm as chipset manufacturers, and how it was shameful and really inexplicable that two companies with such closed-minded ideals could possibly be the paramount chipset providers for a large number of Android device manufacturers. Adam went on to wrap up the article with a brief (but very powerful) video on what the father of the Linux kernel thought about their lack of support for the open source world. Needless to say, a shocked and appalled NVIDIA released a statement not long after in an attempt to address some of the finer points of the rather graphic complaint. Without going into the nitty gritty of the response itself, Adam basically dismantled their apology/explanation piece by piece. Their words, not being backed up by their actions, meant little to nothing.
Fast forward a year and a half, and we see something coming out of left field: a completely unexpected move by NVIDIA, which left a whole lot of people thinking back to what Dr. Torvalds said in the past. In a nutshell, you may recall that earlier this year, NVIDIA announced at CES the arrival of a “192-core” processor, the Tegra K1. This monster of a computerized brain was believed to be, much like its predecessors, as closed source as feasibly possible. So while it was exciting to see such next generation hardware on the verge of hitting the market, it was a mixed bag of emotions, as no one could predict what kind of tricks the chip maker had up its sleeve to keep those pesky devs away from their trade secrets. Well, as per a post made by Alexandre Courbot, the Japanese division of NVIDIA has been making progress in helping the Nouveau project along. For the unaware, the Nouveau project is essentially a group trying to create open source drivers for NVIDIA graphics hardware, whose closed nature has been the ultimate roadblock for most open source operating systems utilizing the Linux kernel (Android, Linux distros for PC, etc.).
It seems that NVIDIA finally saw the benefit of contributing to the Open Source community, as they decided to share some of their work with the rest of the world by beginning testing and coding open source drivers for the K1 chipset. The work is indeed endorsed by NVIDIA, so rest assured that it is not a leak to be taken down or redacted. As stated, it is in very early stages of development and very few tests have been run. These will then expand into user space testing, if successful, of course. The road is a lengthy one, but certainly not impossible. And now that NVIDIA is helping out, it is a far more tangible reality. The post was very well received by the entire Open Source community, so much so that even Linus himself gave NVIDIA yet another piece of his mind, but this time using a different finger. As the saying goes, you can’t make an omelette without breaking some eggs, and Linus’s original message seems to have broken just enough eggs for this wonderful, currently being cooked, omelette to finally happen.
Who knows? Maybe one day in the not-so-distant future, when the manufacturers remove their heads out of their posteriors and realize that without open source they would not exist today, they will start sharing what they know so that they can do what they once (likely) embarked out to do when their companies opened doors for the first time: Make the world a better place through education and sharing of knowledge through technology and innovation.
You can find more information in the original Google Plus post by Dr. Linus Torvalds.
[Thanks to OEM Relations Manager jerdog for the tip!]
January 13, 2014 By: Will Verduzco
About a week ago, we talked about Nvidia’s Tegra K1 announcement and what it could mean for the future of mobile gaming. Then, our own XDA Developer TV Producer Jordan got a hands-on look at the chipset and its reference platform, and he took a look at some of what it can do. The chip, which merges Nvidia’s GeForce architecture with their mobile line, is based on the same Kepler architecture that powers their desktop GPUs.
Despite all of this, one question still remained: Just how fast will this thing actually be? We all knew it was poised to be near or at the top of the pack. After all, it’s the first in a new generation of high-end mobile SoCs. And on top of that, Nvidia has made some spectacular claims by billing the K1 as having 192 cores.
Now, we all know that these 192 “cores” aren’t cores in the same respect as how your quad-core Snapdragon 800 has 4 CPU cores. In fact, they aren’t even really discrete GPU cores like the 2 GPU cores in the PowerVR SGX 543MP2. Rather, a better way of thinking of these Cuda “cores” is to consider them unified shaders, much in the way that desktop GPUs are categorized. The Tegra K1 happens to have a single Kepler SMX (Next Generation Streaming Multiprocessor) unit, and each Kepler SMX comprises 192 Cuda Cores. Thus, a more accurate comparison would be not to the 4 CPU cores in the Tegra 4, but rather to its 72 unified shaders—or to the 32 Cuda Cores per desktop Fermi SM unit. But even after dispelling some of the marketing buzzword-induced misconceptions surrounding the K1 and its 192 cores, we are still left wondering about this SoC’s performance.
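A rough way to keep the marketing numbers straight is to tally unified shaders per scheduling unit, using the counts cited above. This is my own illustrative bookkeeping, not an official Nvidia breakdown:

```python
# "Cores" across these chips aren't comparable units. Counting
# unified shaders (Cuda cores) per scheduling unit makes the
# Tegra K1's headline 192 figure less mysterious.
shaders_per_unit = {
    "Fermi SM (desktop)": 32,
    "Kepler SMX (desktop / Tegra K1)": 192,
}
tegra4_shaders = 72   # Tegra 4's unified shader count, per the text
k1_smx_units = 1      # the Tegra K1 carries a single SMX

k1_total = k1_smx_units * shaders_per_unit["Kepler SMX (desktop / Tegra K1)"]
print(k1_total)                                  # 192
print(round(k1_total / tegra4_shaders, 2))       # 2.67x the Tegra 4's shaders
```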
Now, thanks to some benchmarks run by the folks over at TomsHardware, we have a bit of a better idea of how the K1 will perform. And as demonstrated on the Lenovo ThinkVision 28, the Tegra K1 makes the GPUs in our current generation of SoCs look like yesterday’s news.
When controlling for resolution by using an offscreen 1080p buffer in the industry-standard GFXBench, the Tegra K1 manages 48 frames per second in the T-Rex HD test. This is vastly superior to what’s seen from the Nexus 5’s Snapdragon 800 (23 fps), the Tegra Note 7’s Tegra 4 (20 fps), and the iPhone 5s’s Apple A7 (27 fps). A similar spread (but with a smaller percent difference) can be seen in Futuremark’s 3DMark benchmark.
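To put those GFXBench numbers in perspective, here’s a quick sketch of the K1’s relative advantage, using the T-Rex HD offscreen figures quoted above:

```python
# GFXBench T-Rex HD offscreen (1080p) results, in frames per second,
# as reported in the text above.
trex_fps = {
    "Tegra K1 (ThinkVision 28)": 48,
    "Apple A7 (iPhone 5s)": 27,
    "Snapdragon 800 (Nexus 5)": 23,
    "Tegra 4 (Tegra Note 7)": 20,
}

k1 = trex_fps["Tegra K1 (ThinkVision 28)"]
for chip, fps in trex_fps.items():
    if not chip.startswith("Tegra K1"):
        # e.g. the K1 roughly doubles the Snapdragon 800's result
        print(f"K1 is {k1 / fps:.2f}x the {chip}")
```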
On the CPU front, the quad-core ARM Cortex A15 in the initial version of the Tegra K1 looks to be fast, but not revolutionary. This is to be expected, as there is no reason for the four 2.0 GHz A15 cores in the K1 to be much faster than the four 2.3 GHz Krait-400 cores in the Snapdragon 800. After all, Krait-400 is Qualcomm’s own custom core that incorporates many of the performance-enhancing features found in the A15. As such, the K1-powered ThinkVision 28 bested the quad-core Nexus 5 and Tegra Note 7 by less than 10%. And strangely, the AnTuTu results for the ThinkVision 28 actually proved worse than those of the Tegra Note 7.
Now, there are still a few caveats with these results. These devices all run different OS versions, so direct comparisons are difficult to make. The ThinkVision is running Android 4.2, whereas the Tegra Note 7 is on 4.3 and the Nexus 5 is naturally on Google’s latest and greatest. Furthermore, the 64-bit ARMv8-based Denver version of the Tegra K1 has yet to be seen, and it could potentially bring some major increases in CPU performance, all while remaining pin-compatible with the A15 version.
There’s still a lot to be seen regarding how the Tegra K1 will perform in the devices of tomorrow, but at least we know that its GPU performance does not fail to impress. What are your thoughts on the K1? Will it be at the heart of your next mobile device? Let us know in the comments below!
January 9, 2014 By: Jimmy McGee
Nvidia has officially brought an end to the “cores race” in two ways with the release of the Tegra K1. By releasing a chip with “192 cores,” Nvidia not only wins the marketing game, but also makes the numbers game a joke. Ultimately, it’s not about the number of cores. Rather, it’s about the power of the processor. But marketing numbers aside, the Tegra K1 is very powerful.
XDA Developer TV Producer Jordan was on site at International CES 2014 and got a chance to catch a demonstration of the Tegra K1’s power. Jordan sat down and talked with the folks at Nvidia. And in this video, we get to see the power of the K1 through various applications on the Nvidia Tegra K1 reference platform. Check out this video to learn more about the revolutionary Nvidia K1.
January 6, 2014 By: Will Verduzco
We’ve been talking a lot about Android-powered gaming devices recently. Heck, we even gave them a place here on the XDA forums not too long ago. All of this is possible thanks to the increasingly powerful Systems-on-a-Chip in modern Android-powered devices. Now, Nvidia wishes to up the ante in the low-power SoC world with its Kepler-based Tegra K1 SoC.
The K1 breaks away from previous Tegra devices by merging Nvidia’s GeForce architecture with its mobile architecture. The company accomplishes this by making the K1 (previously codenamed Project Logan) the first mobile chip based on their Kepler architecture rather than the previous GeForce ULP.
For those who have been keeping up with the desktop GPU world, the Nvidia GeForce GTX 600-series marked the introduction of the Kepler architecture, which brought many key improvements over the previous Fermi SM (Streaming Multiprocessor) architecture. Fermi, which was featured in the GTX 400- and 500-series products, was widely ridiculed for its power consumption and heat output. The Kepler SMX (Next Generation Streaming Multiprocessor) architecture, on the other hand, was redesigned from the ground up to provide greatly increased energy efficiency through the use of a unified clock.
Just like its desktop counterpart, the Kepler SMX in the Tegra K1 features 192 Cuda Cores per SMX unit. And since there is only one SMX in the K1, this equates to 192 total Cuda Cores. This is a marked increase both over the current Tegra 4 (initially featured in the Nvidia Shield), which features 72 shader cores, and over the Fermi SM, which offers 32 Cuda Cores per SM unit. All of these additional parallel shader units equate to increased shader and compute power, which can be harnessed for greater gaming visuals.
Below is an Nvidia tech demo showing what these 192 cores are capable of:
And speaking of harnessing the increased power, it’s certainly not going to waste. Nvidia chose to demonstrate the power of their Tegra K1 using Unreal Engine 4 by Epic Games.
In addition to raising the bar on the mobile GPU front, the Tegra K1 brings interesting choices when it comes to its CPU component. Initially, the K1 will only be available with a quad-core ARM Cortex A15, using Nvidia’s patented 4-Plus-1 architecture that offers a low power core for light loads. Eventually, however, Nvidia will release another version of the Tegra K1 with Nvidia’s dual-core Denver CPU. This second version will be based on ARM’s 64-bit V8 instruction set.
The initial quad-core ARM Cortex A15 version is compatible with the 32-bit ARMv7 instruction set. It is a 3-way superscalar architecture, and it will run at up to 2.3 GHz and feature two banks of 32 KB L1 cache. The dual-core Denver variant is compatible with the 64-bit ARMv8 instruction set. It is based on a 7-way superscalar architecture, and it will run at up to 2.5 GHz and feature 64 KB + 128 KB of L1 cache.
Most interestingly, these two versions will be pin-compatible. This means that OEMs will be able to easily switch between the two SoCs and offer different versions of their products for different markets. Thanks to the economies of scale that this allows for, we could possibly see this translate to lower overall costs—or higher profit margins, depending on how cynical you are.
So when will all of this be available? There’s no firm release date, and when these chips make their way to actual devices is anybody’s guess, but the quad-core Cortex A15 version will be available some time in the first half of 2014. The dual-core 64-bit Denver version will make its debut in the second half of the year.
Are you excited for the future of mobile gaming made possible by the next generation of mobile SoCs like the Tegra K1? From what we can see today, it seems like the bar has clearly been raised, and it’ll be exciting to see how Qualcomm and Samsung respond with the next generations of their Snapdragon and Exynos lines.
More information on the K1 and some basic specs can be found on Nvidia’s own site and Tegra K1 press release. There, you can also learn more about UE4 on the Tegra K1, including a video that we have posted above. And if you want to look at some of Nvidia’s strange ideas for marketing, head over here and watch the video below.