GamersNexus Tears Down a Nintendo Switch

Subject: Systems, Mobile | March 4, 2017 - 07:01 AM |
Tagged: Tegra X1, teardown, switch, nvidia, Nintendo

Here at PC Perspective, videos of Ryan and Ken dismantling consoles on their launch date were some of our most popular... ever. While we didn’t do one for the Nintendo Switch, GamersNexus did, and I’m guessing that a segment of our audience would be interested in seeing what the device looks like when dismantled.

Credit: GamersNexus

As he works through the board, he notes what, if anything, is special about each chip based on its part number. For instance, the NVIDIA SoC is marked A2, which is apparently different from previous Maxwell-based Tegra X1 SoCs, but it’s unclear how. From my perspective, I can think of three possibilities: NVIDIA made some customizations (albeit still on the Maxwell architecture) for Nintendo, NVIDIA had two revisions for their own purposes and Nintendo bought the A2, or the A2 shipped with NVIDIA's Maxwell-based Shield and my Google-fu is terrible.

Regardless, if you’re curious, it should be an interesting twenty-or-so minutes.

Source: GamersNexus

Nintendo Switches out spinach green and almost black for 720p

Subject: General Tech | March 1, 2017 - 03:46 PM |
Tagged: Tegra X1, Nintendo Switch, Joy-Con, gaming

The Nintendo Switch has arrived for those who feel that mobile gaming is lacking in analog joysticks and buttons.  The product sits in an interesting place: the 720p screen is nowhere near the resolution of modern phones, but those phones lack a dock that triggers an overclocked mode to send 1080p to a TV.  Nintendo's programming teams also have far more resources than most mobile app developers, and they can incorporate tricks that a phone simply will not be able to replicate.  Ars Technica took the Switch, its two Joy-Cons, and the limited number of released games on a tour to see just how well Nintendo did on their new portable gaming system.  There are some improvements that could be made, but the Joy-Cons do sound more interesting than the Game Boy Advance.

newswitch-1-1440x960.jpg

"With the Switch, Nintendo seems to be betting that the continued drum beat of Moore's Law and miniaturization has made that dichotomy moot. The Switch is an attempt to drag the portable gaming market kicking and screaming to a point where it's literally indistinguishable from the experience you'd get playing on a 1080p HDTV."

Source: Ars Technica
Subject: General Tech
Manufacturer: Nintendo

Price and Other Official Information

Since our last Nintendo Switch post, the company held their full reveal event, which confirmed the two most critical details: the console will launch on March 3rd for $299.99 USD ($399.99 CDN). This is basically what the rumors have pointed to for a little while, and it makes sense. That was last week, but this week gave rise to a lot more information, mostly from either an interview with Nintendo of America’s President and COO, Reggie Fils-Aimé, or from footage that was recorded and analyzed by third parties, like Digital Foundry.

In the GameSpot interview, Reggie was asked about the launch bundle and why it didn’t include any game, like 1-2-Switch. His response was blunt and honest: they wanted to hit $299 USD, and the game fell below the cut-off point. While I can respect that, I cannot see enough people bothering with the title at full price for it to have been green-lit in the first place. Even if Nintendo wasn’t willing to simply eat the cost of that game’s development to shape public (and developer) perception, and they might end up taking that loss anyway if the game doesn’t sell, at least its cost wasn’t baked into the system’s price.

nintendo-2017-switchhero.jpg

Speaking of price, we are also seeing what the accessories will sell for.

nintendo-2017-joy-con_pro_controller.jpg

On the controller side of things, the more conventional option, the Nintendo Switch Pro Controller, has an MSRP of $69.99 USD. That is notably higher than its competitors: the DualShock 4 for the PlayStation 4 and the Xbox Wireless Controller for the Xbox One both sit at $49. While the Pro Controller has a bunch of interesting features, like “HD rumble”, motion sensing, and some amiibo support, its competitors offer similar feature sets for $20 less.

nintendo-2017-joy-con_pair_gray.jpg

The Switch-specific controllers, called “Joy-Con”, are $10 more expensive than the Pro Controller at $79.99 USD for the pair, or $49.99 USD for just the left or right half. (Some multiplayer titles only require a half, so Nintendo doesn’t force you to buy the whole pair, at the expense of extra SKUs; that is also probably helpful if you lose one.) This seems high, and could be a significant problem going forward.

As for availability? Nintendo has disclosed that they are pushing 2 million units into the channel, so they are not expecting shortages like the NES Classic saw. They do admit that demand is a bit up in the air, though.

Read on to find out about their online component and new performance info!

Subject: Systems, Mobile

Vulkan 1.0, OpenGL 4.5, and OpenGL ES 3.2 on a console

A few days ago, sharp eyes across the internet noticed that Nintendo’s Switch console had been added to the lists of conformant hardware at The Khronos Group. Vulkan 1.0 was the eye-catcher, although the other tabs also claim conformance with OpenGL 4.5 and OpenGL ES 3.2. The device is not listed as compatible with OpenCL, although that does not really surprise me for a single-GPU gaming system; the other three APIs have compute shaders designed around the needs of game developers. So the Nintendo Switch conforms to the latest standards of the three most important graphics APIs that a gaming device should use -- awesome.

But what about performance?

In other news, Eurogamer / Digital Foundry and VentureBeat uncovered information about the hardware. It will apparently use a Tegra X1, based around second-generation Maxwell, under-clocked relative to what we see on the Shield TV. When docked, the GPU will be able to reach 768 MHz on its 256 CUDA cores. When undocked, this will drop to 307.2 MHz (although the system can utilize this mode while docked, too). This puts the performance at ~315 GFLOPS in mobile mode, pushing up to ~785 GFLOPS when docked.

You might compare this to the Xbox One, which runs at ~1310 GFLOPS, and the PlayStation 4, which runs at ~1840 GFLOPS. This puts the Nintendo Switch somewhat behind both, although the gap is even larger than those numbers suggest. The FLOP rating of Sony’s and Microsoft’s consoles is calculated as 2 x Shader Count x Frequency, but the calculation for Nintendo’s Switch is 4 x Shader Count x Frequency. FMA accounts for one factor of two, but the extra factor of two in Nintendo’s case...

Yup, the Switch’s performance rating is calculated as FP16, not FP32.
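To make the arithmetic concrete, here is a minimal sketch in plain Python that reproduces the figures above. The Xbox One and PS4 shader counts and clocks (768 at 853 MHz and 1152 at 800 MHz) are the commonly cited public figures, not anything from this leak:

```python
def gflops(shaders, mhz, ops_per_clock):
    """Theoretical throughput: shaders x clock x operations per shader per clock."""
    return shaders * mhz * ops_per_clock / 1000.0

# Switch (Tegra X1, 256 CUDA cores): FMA gives 2 ops/clock per core at FP32,
# 4 ops/clock with packed FP16
print(gflops(256, 307.2, 4))   # ~314.6 GFLOPS, undocked, FP16
print(gflops(256, 768.0, 4))   # ~786.4 GFLOPS, docked, FP16
print(gflops(256, 768.0, 2))   # ~393.2 GFLOPS, docked, FP32

# Xbox One and PS4 ratings are quoted at FP32 (2 ops/clock per shader)
print(gflops(768, 853.0, 2))   # ~1310 GFLOPS
print(gflops(1152, 800.0, 2))  # ~1843 GFLOPS
```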

nintendo-2016-switch-gpu.png

Snippet from an alleged leak of what Nintendo is telling developers.
If true, it's very interesting that FP16 values are being discussed as canonical.

Reducing shader precision down to 16-bit is common for mobile devices. It takes fewer transistors to store and operate on half-precision values, and accumulated error will be muted by the fact that you’re viewing the result on a mobile screen. The Switch isn’t always a mobile device, though, so it will be interesting to see how this reduction in lighting and shading precision affects games on your home TV, especially in titles that don’t follow Nintendo’s art styles. That said, shaders can still use 32-bit values, but then you are cutting your performance for those instructions in half, when you are already somewhat behind your competitors.
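As a rough illustration of the accumulated-error point (a toy example, not anything from Nintendo's developer documentation), summing many small lighting-style contributions in half precision drifts badly compared to single precision, because FP16 runs out of mantissa bits as the running total grows:

```python
import numpy as np

# Accumulate 10,000 small contributions (think light samples) at two precisions.
contributions = np.full(10_000, 0.001)

total_fp32 = np.float32(0.0)
total_fp16 = np.float16(0.0)
for c in contributions:
    total_fp32 = np.float32(total_fp32 + np.float32(c))
    total_fp16 = np.float16(total_fp16 + np.float16(c))

print(total_fp32)  # ~10.0, close to the exact answer
print(total_fp16)  # stalls around 4.0 once 0.001 falls below half an FP16 ulp
```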

As for the loss of performance when undocked, it shouldn’t be too much of an issue if Nintendo pressures developers to hit 1080p when docked. If that’s the case, the drop to the lower-resolution 720p mobile screen roughly matches the drop in clock speed.
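A quick back-of-the-envelope check of that claim, using only the clocks and resolutions above:

```python
# Pixel counts for the two target resolutions
pixels_docked   = 1920 * 1080   # 1080p on the TV
pixels_portable = 1280 * 720    # 720p on the built-in screen

print(pixels_docked / pixels_portable)  # 2.25x the pixels when docked
print(768 / 307.2)                      # 2.5x the GPU clock when docked
```

The docked clock bump is slightly larger than the jump in pixel count, so a game that hits 1080p on the TV should have a little headroom to spare at 720p on the go.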

Lastly, there are a bunch of questions surrounding Nintendo’s choice of operating system: basically, all of the questions. It’s being developed by Nintendo, but we have no idea what they forked it from. NVIDIA supports the Tegra SoC on both Android and Linux, it would be legal for Nintendo to fork either one, and Nintendo could have just asked for drivers even if NVIDIA didn’t already support the platform in question. Basically, anything is possible from the outside, and I haven’t seen any solid leaks from the inside.

The Nintendo Switch launches in March.

Rumor: Nintendo NX Uses NVIDIA Tegra... Something

Subject: Graphics Cards, Systems, Mobile | July 27, 2016 - 07:58 PM |
Tagged: nvidia, Nintendo, nintendo nx, tegra, Tegra X1, tegra x2, pascal, maxwell

Okay so there's a few rumors going around, mostly from Eurogamer / DigitalFoundry, that claim the Nintendo NX is going to be powered by an NVIDIA Tegra system on a chip (SoC). DigitalFoundry, specifically, cites multiple sources who claim that their Nintendo NX development kits integrate the Tegra X1 design, as seen in the Google Pixel C. That said, the Nintendo NX release date, March 2017, does provide enough time for them to switch to NVIDIA's upcoming Pascal Tegra design, rumored to be called the Tegra X2, which uses NVIDIA's custom-designed Denver CPU cores.

Preamble aside, here's what I think about the whole situation.

First, the Tegra X1 would be quite a small jump in performance over the WiiU. The WiiU's GPU, “Latte”, has 320 shaders clocked at 550 MHz and is based on AMD's TeraScale 1 architecture. Because these stream processors have single-cycle multiply-add for floating point values, you can get the GPU's FLOP rating by multiplying 320 shaders, 550,000,000 cycles per second, and 2 operations per clock (one multiply and one add). This yields 352 GFLOPS. The Tegra X1 is rated at 512 GFLOPS, which is just 45% more than the previous generation.
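A quick check of those numbers in plain Python (just re-doing the multiplication described above):

```python
# Wii U "Latte": 320 shaders, 550 MHz, 2 ops per clock (single-cycle multiply-add)
latte_gflops = 320 * 550e6 * 2 / 1e9
print(latte_gflops)                     # 352.0 GFLOPS

tegra_x1_gflops = 512.0                 # NVIDIA's rated FP32 figure
print(tegra_x1_gflops / latte_gflops)   # ~1.45, i.e. roughly 45% faster
```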

This is a very tiny jump, unless they indeed move to Pascal-based graphics. If they stick with the X1, you will likely see a launch selection of games ported from the WiiU plus a few games that use whatever new feature Nintendo has. One rumor is that the console will be kind of like the WiiU controller, with detachable controllers. If this is true, it's a bit unclear how this will affect games in a revolutionary way, but we might be missing a key bit of info that ties it all together.

nvidia-2016-shieldx1consoles.png

As for the choice of ARM over x86... well. First, this obviously allows Nintendo to choose from a wider selection of manufacturers than AMD, Intel, and VIA, and certainly more than IBM with their previous, Power-based chips. It also jibes with Nintendo's interest in the mobile market: they joined The Khronos Group, and I'm pretty sure they've said they are interested in Vulkan, which is becoming the high-end graphics API for Android, supported by Google and others. That said, I'm not sure how many engineers specialize in ARM optimization, as most mobile platforms try to abstract the instruction set away as much as possible. Still, this could be Nintendo's attempt to settle on a standardized instruction set, and they opted for mobile over PC (versus Sony and especially Microsoft, who want consoles to follow high-end gaming on the desktop).

Why? Well that would just be speculating on speculation about speculation. I'll stop here.

NVIDIA Jetson TX1 Will Power Autonomous Embedded Devices With Machine Learning

Subject: General Tech | November 12, 2015 - 02:46 AM |
Tagged: Tegra X1, nvidia, maxwell, machine learning, jetson, deep neural network, CUDA, computer vision

Nearly two years ago, NVIDIA unleashed the Jetson TK1, a tiny module for embedded systems based around the company's Tegra K1 "super chip." That chip was the company's first foray into CUDA-powered embedded systems capable of machine learning tasks such as object recognition and 3D scene processing, enabling things like accident avoidance and self-parking cars.

Now, NVIDIA is releasing an even more powerful kit called the Jetson TX1. This new development platform covers two pieces of hardware: the credit-card-sized Jetson TX1 module and a larger Jetson TX1 Development Kit that the module plugs into, which provides plenty of I/O options and pin-outs. The dev kit can be used by software developers or for prototyping, while the module alone can be used in finalized embedded products.

JX08_JetsonTX1_topBlack_04_v001_jw_wht.jpg

NVIDIA foresees the Jetson TX1 being used in drones, autonomous vehicles, security systems, medical devices, and IoT devices coupled with deep neural networks, machine learning, and computer vision software. Devices would be able to learn from the environment in order to navigate safely, identify and classify objects of interest, and perform 3D mapping and scene modeling. NVIDIA partnered with several companies for proof-of-concepts including Kespry and Stereolabs.

Using the TX1, Kespry was able to have drones classify and track, in real time, construction equipment moving around a construction site. The drone was not programmed for that exact site (sites and weather conditions vary); instead, machine learning and computer vision allowed it to navigate the construction site, and a deep neural network identified and classified the type of equipment it saw through its cameras. Meanwhile, Stereolabs used high-resolution cameras and depth sensors to capture photos of buildings and then used software to reconstruct the 3D scene virtually for editing and modeling. You can find other proof-of-concept videos, including upgrading existing drones to be more autonomous, posted here.

From the press release:

"Jetson TX1 will enable a new generation of incredibly capable autonomous devices," said Deepu Talla, vice president and general manager of the Tegra business at NVIDIA. "They will navigate on their own, recognize objects and faces, and become increasingly intelligent through machine learning. It will enable developers to create industry-changing products."

But what about the hardware side of things? Well, the TX1 is a respectable leap in hardware and compute performance. Rated at 1 TFLOPS of (FP16) compute performance, the TX1 pairs four ARM Cortex A57 and four ARM Cortex A53 64-bit CPU cores with a 256-core Maxwell-based GPU. That is definitely respectable for its size and low power consumption, especially considering NVIDIA claims the SoC can best the Intel Skylake Core i7-6700K in certain workloads (thanks to the GPU portion). The module further contains 4GB of LPDDR4 memory and 16GB of eMMC flash storage.

In short, while on-module storage has not increased, RAM has been doubled, FP16 compute performance has roughly tripled, and FP32 compute performance has jumped by roughly 57% (512 vs. 326 GFLOPS) versus the Jetson TK1's 2GB of DDR3 and 192-core Kepler GPU. The TX1 also uses a smaller process node at 20nm (versus 28nm), and the chip is said to use "very little power." Networking support includes 802.11ac and Gigabit Ethernet. The chart below outlines the major differences between the two platforms.

                             | Jetson TX1                   | Jetson TK1
GPU (Architecture)           | 256-core (Maxwell)           | 192-core (Kepler)
CPU                          | 4x ARM Cortex A57 + 4x A53   | "4+1" ARM Cortex A15 "r3"
RAM                          | 4 GB LPDDR4                  | 2 GB LPDDR3
eMMC                         | 16 GB                        | 16 GB
Compute Performance (FP16)   | 1 TFLOPS                     | 326 GFLOPS
Compute Performance (FP32)   | 512 GFLOPS (AnandTech est.)  | 326 GFLOPS (NVIDIA's number)
Manufacturing Node           | 20nm                         | 28nm
Launch Pricing               | $299                         | $192
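Working backwards from those ratings gives a rough idea of the GPU clock (an estimate, not an official figure): each Maxwell core does one FMA per clock, which counts as two FP32 operations, and the Tegra X1 doubles that with packed FP16, so both numbers in the chart point at roughly a 1 GHz GPU clock.

```python
cores = 256

# FP32: 2 ops per core per clock (FMA), using AnandTech's 512 GFLOPS estimate
clock_from_fp32 = 512e9 / (cores * 2)
# FP16: 4 ops per core per clock (packed FMA), using NVIDIA's 1 TFLOPS figure
clock_from_fp16 = 1e12 / (cores * 4)

print(clock_from_fp32 / 1e9)  # 1.00 GHz
print(clock_from_fp16 / 1e9)  # ~0.98 GHz, consistent with the same ~1 GHz clock
```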

The TX1 will run the Linux for Tegra operating system and supports the usual suspects of CUDA 7.0, cuDNN, and VisionWorks development software, as well as the latest graphics APIs (OpenGL 4.5, OpenGL ES 3.1, and Vulkan).

NVIDIA is continuing to push for CUDA Everywhere, and the Jetson TX1 looks to be a more mature product that builds on the TK1. The huge leap in compute performance should enable even more interesting projects and bring more sophisticated automation and machine learning to smaller and more intelligent devices.

For those interested, the Jetson TX1 Development Kit (the full I/O development board with bundled module) will be available for pre-order today at $599 while the TX1 module itself will be available soon for approximately $299 each in orders of 1,000 or more (like Intel's tray pricing).

With CUDA 7, it is apparently possible for the GPU to be used for general-purpose processing as well, which may open up some doors that were not possible before in such a small device. I am interested to see what happens with NVIDIA's embedded device play and what kinds of automated hardware end up powered by the tiny SoC and its beefy graphics.

Source: NVIDIA

Google's Pixel C Is A Powerful Convertible Tablet Running Android 6.0

Subject: Mobile | October 2, 2015 - 04:09 PM |
Tagged: Tegra X1, tablet, pixel, nvidia, google, android 6.0, Android

During its latest keynote event, Google unveiled the Pixel C, a powerful tablet with optional keyboard that uses NVIDIA’s Tegra X1 SoC and runs the Android 6.0 “Marshmallow” operating system.

The Pixel C was designed by the team behind the Chromebook Pixel. Pixel C features an anodized aluminum body that looks (and reportedly feels) smooth with clean lines and rounded corners. The tablet itself is 7mm thick and weighs approximately one pound. The front of the Pixel C is dominated by a 10.2” display with a resolution of 2560 x 1800 (308 PPI, 500 nits brightness), wide sRGB color gamut, and 1:√2 aspect ratio (which Google likened to the size and aspect ratio of an A4 sheet of paper). A 2MP front camera sits above the display while four microphones sit along the bottom edge and a single USB Type-C port and two stereo speakers sit on the sides of the tablet. Around back, there is an 8MP rear camera and a bar of LED lights that will light up to indicate the battery charge level after double tapping it.
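For the curious, both the quoted pixel density and the unusual aspect ratio check out with simple arithmetic based on the published specs (the half-split figure below is my own illustration of why 1:√2 is handy for side-by-side apps):

```python
import math

w_px, h_px, diag_in = 2560, 1800, 10.2

# Pixel density: diagonal pixel count divided by diagonal size in inches
ppi = math.hypot(w_px, h_px) / diag_in
print(round(ppi))          # ~307, in line with the quoted ~308 PPI

# 1:sqrt(2) aspect ratio: halving the long side gives back (almost) the same shape,
# just like folding an A4 sheet of paper in half
print(w_px / h_px)         # ~1.42, close to sqrt(2) ~1.414
print(h_px / (w_px / 2))   # ~1.41 after a half-split
```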

Google Pixel C Tegra X1 Tablet.jpg

The keyboard is an important part of the Pixel C, and Google has given it special attention to make it part of the package. The keyboard attaches to the tablet using self-aligning magnets that are powerful enough to keep the display attached while holding it upside down and shaking it (not that you'd want to do that, mind you). It can be attached to the bottom of the tablet for storage and used like a slate or you can attach the tablet to the back of the keyboard and lift the built-in hinge to use the Pixel C in laptop mode (the hinge can hold the display at anywhere from 100 to 135-degrees). The internal keyboard battery is good for two months of use, and can be simply recharged by closing the Pixel C like a laptop and allowing it to inductively charge from the tablet portion. The keyboard is around 2mm thick and is nearly full size at 18.85mm pitch and the chiclet keys have a 1.4mm travel that is similar to that of the Chromebook Pixel. There is no track pad, but it does offer a padded palm rest which is nice to see.

Google Pixel C with Keyboard.jpg

Internally, the Pixel C is powered by the NVIDIA Tegra X1 SoC, 3GB of RAM, and 32GB or 64GB of storage (depending on model). The 20nm Tegra X1 consists of four ARM Cortex A57 and four Cortex A53 CPU cores paired with a 256-core Maxwell GPU. The Pixel C is a major design win for NVIDIA, and the built in GPU will be great for gaming on the go.

The Pixel C will be available in December ("in time for the holidays") for $499 for the base 32 GB model, $599 for the 64 GB model, and $149 for the keyboard.

First impressions, such as this hands-on by Engadget, seem to be very positive, stating that it is sturdy yet sleek hardware that is comfortable to type on. While the hardware looks more than up to the task, the operating system of choice is a concern for me. Android is not the most productivity- and multitasking-friendly software. There are some versions of Android that enable multiple windows or side-by-side apps, but it has always felt rather clunky and limited in its usefulness. With that said, Computerworld's JR Raphael seems hopeful. He points out that the Pixel C is, in Batman fashion, not the hardware Android wants, but the hardware that Android needs (to move forward), and it is primed for a future version of Android that is more friendly to such productive endeavors. Development versions of Android 6.0 included support for multiple apps running simultaneously side-by-side, and while that feature will not make the initial production code cut, it does show that it is something Google is looking into pursuing and possibly enabling at some point. The Pixel C has an excellent aspect ratio to take advantage of app splitting, with the ability to display four windows that each keep the same aspect ratio.

I am not sure how well received the Pixel C will be by business users who have several convertible tablet options running Windows and Chrome OS. It certainly gives the iPad-and-keyboard combination a run for its money and is a premium alternative to devices like the Asus Transformers.

What do you think about the Pixel C, and in particular, it running Android?

Even if I end up being less-than-productive using it, I think I'd still want the sleek-looking hardware as a second machine, heh.

Source: Google
Manufacturer: NVIDIA

SHIELD Specifications

Announced at last June’s Google I/O event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. The platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.

NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.

DSC01740.jpg

Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.

And though this review will focus on the NVIDIA SHIELD, it’s impossible not to marry the success of SHIELD with the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve over other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.

But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.

Continue reading our review of the new NVIDIA SHIELD with Android TV!!

NVIDIA SHIELD and SHIELD Pro Show up on Amazon

Subject: Mobile | May 16, 2015 - 01:00 PM |
Tagged: Tegra X1, tegra, shield pro, shield console, shield, nvidia

UPDATE: Whoops! It appears that Amazon took the listing down... No surprise there. I'm sure we'll be seeing them again VERY SOON. :)

Looks like the release of the new NVIDIA SHIELD console device, first revealed back at GDC in March, is nearly here. A listing for "NVIDIA SHIELD" as well as the new "NVIDIA SHIELD Pro" showed up on Amazon.com today.

shield1.jpg

Though we don't officially know the difference between the SHIELD and SHIELD Pro, according to Amazon at least, it appears to be internal storage. The Pro model will ship with 500GB of internal storage, while the non-Pro model will only have 16GB. It seems you'll have to get an SD card for more storage on the base model if you plan on doing anything other than streaming games through NVIDIA GRID.

shield2.jpg

No pricing is listed yet and there is no release date on the Amazon pages either, but we have always been told this was to be a May or June on-sale date. Both models of the NVIDIA SHIELD will include an HDMI cable, a micro-USB cable and a SHIELD Controller. If you want the remote or stand, you're going to have to pay out a bit more.

shield3.jpg

For those of you that missed out on the original SHIELD announcement from March, here is a quick table detailing the specs, as we knew them at that time. NVIDIA's own Tegra X1 SoC featuring 256 Maxwell GPU cores powers this device using the Android TV operating system, promising 4K video playback, the best performing Android gaming experience and NVIDIA GRID streaming games.

  NVIDIA SHIELD Specifications

Processor: NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM
Video Features: 4K Ultra-HD ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
Audio: 7.1 and 5.1 surround sound pass-through over HDMI
       High-resolution audio playback up to 24-bit/192kHz over HDMI and USB
       High-resolution audio upsampling to 24-bit/192kHz over USB
Storage: 16 GB
Wireless: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.1/BLE
Interfaces: Gigabit Ethernet
            HDMI 2.0
            Two USB 3.0 (Type A)
            Micro-USB 2.0
            MicroSD slot (supports 128GB cards)
            IR receiver (compatible with Logitech Harmony)
Gaming Features: NVIDIA GRID™ streaming service; NVIDIA GameStream™
SW Updates: SHIELD software upgrades directly from NVIDIA
Power: 40W power adapter
Weight and Size: Weight: 23oz / 654g; Height: 5.1in / 130mm; Width: 8.3in / 210mm; Depth: 1.0in / 25mm
OS: Android TV™, Google Cast™ Ready
Bundled Apps: PLEX
In the Box: NVIDIA SHIELD; NVIDIA SHIELD controller; HDMI cable (High Speed); USB cable (Micro-USB to USB); power adapter (includes plugs for North America, Europe, UK)
Requirements: TV with HDMI input, Internet access
Options: SHIELD controller, SHIELD remote, SHIELD stand

 

Source: Amazon.com

Podcast #343 - DX12 Performance, Dissecting G-SYNC and FreeSync, Intel 3D NAND and more!

Subject: General Tech | April 2, 2015 - 01:16 PM |
Tagged: podcast, video, dx12, 3dmark, freesync, g-sync, Intel, 3d nand, 20nm, 28nm, micron, nvidia, shield, Tegra X1, raptr, 850 EVO, msata, M.2

PC Perspective Podcast #343 - 04/02/2015

Join us this week as we discuss DX12 Performance, Dissecting G-SYNC and FreeSync, Intel 3D NAND and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!