
GDC 15: AMD Mantle Might Be Dead as We Know It: No Public SDK Planned

Subject: Graphics Cards | March 2, 2015 - 02:31 PM |
Tagged: sdk, Mantle, dx12, API, amd

The Game Developers Conference in San Francisco starts today, and you can expect to see more information about DirectX 12 than you could ever possibly want, so be prepared. But what about the original low-level API, AMD Mantle? Utilized in Battlefield 4 and Thief, integrated into Crytek's engine (announced last year), and introduced alongside the release of the Radeon R9 290X/290, Mantle was truly the instigator that pushed Microsoft into moving DX12's development along at a faster pace.

Since DX12's announcement, AMD has claimed that Mantle would live on, bringing performance advantages to AMD GPUs and would act as the sounding board for new API features for AMD and game development partners. And, as was always trumpeted since the very beginning of Mantle, it would become an open API, available for all once it outgrew the beta phase that it (still) resides in.

mantle1.jpg

Something might have changed there.

A post over on the AMD Gaming blog from Robert Hallock has some news about Mantle to share as GDC begins. First, the good news:

AMD is a company that fundamentally believes in technologies unfettered by restrictive contracts, licensing fees, vendor lock-ins or other arbitrary hurdles to solving the big challenges in graphics and computing. Mantle was destined to follow suit, and it does so today as we proudly announce that the 450-page programming guide and API reference for Mantle will be available this month (March, 2015) at www.amd.com/mantle.
 
This documentation will provide developers with a detailed look at the capabilities we’ve implemented and the design decisions we made, and we hope it will stimulate more discussion that leads to even better graphics API standards in the months and years ahead.

That's great! We will finally be able to read about the API and how it functions, getting access to the detailed information we have wanted from the beginning. But then there is this portion:

AMD’s game development partners have similarly started to shift their focus, so it follows that 2015 will be a transitional year for Mantle. Our loyal customers are naturally curious what this transition might entail, and we wanted to share some thoughts with you on where we will be taking Mantle next:

AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.

  1. Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
     
  2. Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
     
  3. The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.

Essentially, AMD's Mantle API in its "1.0" form is at the end of its life: it will be supported only for current partners, and the public SDK will never be released. Honestly, at this point, this isn't so much a letdown as it is a necessity. DX12 and GLnext have already superseded Mantle in market share and mind share with developers, and any further work AMD put into getting devs on board with Mantle would be wasted effort.

mantle-2.jpg

Battlefield 4 is likely to be the only major title to use AMD Mantle

AMD claims to have future plans for Mantle, though it will continue to be available only to select partners with "custom needs." I would imagine this could expand outside the world of games, but it could also mean game consoles are the target, where developers are only concerned with AMD GPU hardware.

So - from our perspective, Mantle as we know it is pretty much gone. It served its purpose, making NVIDIA and Microsoft pay attention to the CPU bottlenecks in DX11, but it appears the dream was a bit bigger than the product could become. AMD shouldn't be chastised for this shift, nor for its lofty goals that we kind-of-always knew were too steep a hill to climb. Just revel in the news that pours out of GDC this week about DX12.

Source: AMD
Author:
Manufacturer: NVIDIA

Finally, a SHIELD Console

NVIDIA is filling out the family of the SHIELD brand today with the announcement of SHIELD, a set-top box powered by the Tegra X1 processor. SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming device. Selling for $199 and available in May of this year, there is a lot to discuss.

Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater or desk and bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media including music, movies and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.

10.jpg

Here is a full breakdown of the device's specifications.

  NVIDIA SHIELD Specifications
Processor NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU with 3GB RAM
Video Features 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
Audio 7.1 and 5.1 surround sound pass through over HDMI
High-resolution audio playback up to 24-bit/192kHz over HDMI and USB
High-resolution audio upsample to 24-bit/192kHz over USB
Storage 16 GB
Wireless 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi
Bluetooth 4.1/BLE
Interfaces Gigabit Ethernet
HDMI 2.0
Two USB 3.0 (Type A)
Micro-USB 2.0
MicroSD slot (supports 128GB cards)
IR Receiver (compatible with Logitech Harmony)
Gaming Features NVIDIA GRID™ streaming service
NVIDIA GameStream™
SW Updates SHIELD software upgrades directly from NVIDIA
Power 40W power adapter
Weight and Size Weight: 23oz / 654g
Height: 5.1in / 130mm
Width: 8.3in / 210mm
Depth: 1.0in / 25mm
OS Android TV™, Google Cast™ Ready
Bundled Apps PLEX
In the box NVIDIA SHIELD
NVIDIA SHIELD controller
HDMI cable (High Speed), USB cable (Micro-USB to USB)
Power adapter (Includes plugs for North America, Europe, UK)
Requirements TV with HDMI input, Internet access
Options SHIELD controller, SHIELD remote, SHIELD stand

Obviously the most important feature is the Tegra X1 SoC, built around an 8-core 64-bit ARM processor and a 256 CUDA core Maxwell architecture GPU. This gives the SHIELD set-top more performance than basically any other mobile part on the market, and demos showing Doom 3 and Crysis 3 running natively on the hardware drive the point home. With integrated HEVC decode support, the console is the first Android TV device to offer support for 4K video content at 60 FPS.

Even though storage comes in at only 16GB, the inclusion of a MicroSD card slot enables expansion by as much as 128GB more for content and local games.

11.jpg

The first choice for networking will be the Gigabit Ethernet port, but the 2x2 dual-band 802.11ac wireless controller means that even those of us that don't have hardwired Internet going to our TV will be able to utilize all the performance and features of SHIELD.

Continue reading our preview of the NVIDIA SHIELD set-top box!!

Imagination Launches PowerVR GT7900, "Super-GPU" Targeting Consoles

Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM |
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900

As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in the Series7XT family, targeting a growing category called "affordable game consoles." Think of Android-powered set-top devices like the Ouya or Amazon's Fire TV.

gt7900-1.png

PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how its designs compare to other GPUs on the market. Imagination claims that the GT7900 will offer "PC-class gaming experiences", though that is as ambiguous as the notion of a "console-level game" workload. But with rated peak performance of over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half precision), this GPU does have significant theoretical capability.

               PowerVR GT7900            Tegra X1
Vendor         Imagination Technologies  NVIDIA
FP32 ALUs      512                       256
FP32 GFLOPS    800                       512
FP16 GFLOPS    1600                      1024
GPU Clock      800 MHz                   1000 MHz
Process Tech   16nm FinFET+              20nm TSMC
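
The peak numbers in the table follow directly from ALU count and clock speed. A quick sketch of the arithmetic, assuming the standard convention that each ALU retires one fused multiply-add (two FLOPs) per cycle and that FP16 runs at twice the FP32 rate on these parts:

```python
def peak_gflops(alus: int, clock_mhz: int, flops_per_cycle: int = 2) -> float:
    """Theoretical peak throughput: ALUs x FLOPs-per-cycle x clock.

    flops_per_cycle=2 assumes one fused multiply-add (FMA) per ALU per
    cycle, the usual convention behind vendor peak-FLOPS figures.
    """
    return alus * flops_per_cycle * clock_mhz / 1000.0

gt7900_fp32 = peak_gflops(512, 800)     # 819.2 GFLOPS ("over 800")
gt7900_fp16 = 2 * gt7900_fp32           # 1638.4 GFLOPS (~1.6 TFLOPS)
tegra_x1_fp32 = peak_gflops(256, 1000)  # 512.0 GFLOPS
```

Real workloads rarely sustain back-to-back FMAs, which is why these figures are best treated as theoretical ceilings rather than expected game performance.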

Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."

gt7900-2.png

The FP16 performance number listed above matters as an extreme power-savings option, since half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually use the FP16 data type, but supporting it in the GT7900 allows developers to target it.
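
To make the precision trade-off concrete: IEEE-754 half precision keeps only an 11-bit significand (10 stored bits), so it resolves roughly 1 part in 2048. Python's standard struct module supports the half-precision 'e' format, so we can round-trip values through FP16 to show where data gets lost (a general sketch, not tied to any particular GPU):

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE-754 half precision (struct format 'e')."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_fp16(2048.0))       # 2048.0 -- integers up to 2048 are exact
print(to_fp16(2049.0))       # 2048.0 -- beyond that, values round away
print(to_fp16(1.0 + 1e-4))   # 1.0    -- increments below ~0.0005 vanish near 1.0
```

That loss is harmless for many shading operations (color blending, for instance) but quickly becomes a problem for position data or long-running accumulations, which is why developers must opt into FP16 deliberately.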

Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.

gt7900-3.png

Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.

I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experiences to report back with soon. I am continually curious about the market for these types of high-end "mobile" GPUs, given the limited market that Android consoles currently address. Imagination does claim that the GT7900 beats products with performance levels as high as the GeForce GT 730M discrete GPU - no small feat.

Windows 10 Technical Preview Build 10022 Spotted

Subject: General Tech | February 26, 2015 - 07:00 AM |
Tagged: windows 10, windows, microsoft

WZor, a group in Russia that somehow acquires many Windows leaks, has just published screenshots of Windows 10 Build 10022 and Windows Server Build 9926. As far as we can tell, not much has changed. We see neither an upgraded Cortana nor a look at the Spartan browser. The build is not labeled “Microsoft Confidential” though, which makes people believe that it is (or was) intended for public release -- maybe as early as this week.

microsoft-windows10-10022-leak.jpg

Image Credit: WZor Twitter

Honestly, I do not see anything different from the provided screenshots apart from the incremented version number. It is possible that this build addresses back-end issues, leaving the major new features for BUILD in late April. Leaked notes (also by WZor) for build 10014, called an “Early Partner Drop”, suggest that version was designed for hardware and software vendors. Perhaps the upcoming preview build is designed to give a platform for third-parties to develop updates ahead of Microsoft releasing the next (or second-next) big build?

Either way, it seems like we will get it very soon.

Source: WZor

Podcast #338 - More USB 3.1 Devices, Broadwell NUC, another 840 Evo fix and more!

Subject: General Tech | February 26, 2015 - 02:07 PM |
Tagged: pcper, podcast, video, usb 3.1, Broadwell, Intel, nuc, Samsung, 840 evo, asus, Strix Tactic Pro, GTX 970, directx12, dx12

PC Perspective Podcast #338 - 02/26/2015

Join us this week as we discuss more USB 3.1 Devices, Broadwell NUC, another 840 Evo fix and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Allyn Malventano, and Sebastian Peak

Program length: 1:46:04

  1. EVGA Contest Winner!
  2. Week in Review:
  3. News item of interest:
  4. Question: Alex from Sydney
    1. Just a quick question regarding DirectX 12. I’m planning to buy a new graphics card soon, and I want a DirectX 12 card for all the fancy new features, so I’m considering either the GTX 970 or 980. The question I have is: are these real DirectX 12 cards? Since DirectX 12 development is still ongoing, how can these cards be fully DirectX 12 compliant?
  5. Hardware/Software Picks of the Week:
    1. Ryan: Prime95
    2. Jeremy: Not SSL anyways; old become new is much more pleasant
    3. Allyn: Lenovo Superfish removal tool (once their site is back online, that is)
  6. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at the Epic Games keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X based on the upcoming Maxwell GM200 GPU.

titanx3.jpg

JHH stated that it would have a 12GB frame buffer and would include 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.

titanx2.jpg

Any guesses on performance or price?

titanx1.jpg

titanx4.jpg

Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

Intel Revamps Atom Branding, Next Generation Atoms Will Come in x3, x5, and x7 Tiers

Subject: General Tech | February 26, 2015 - 02:02 AM |
Tagged: SoFIA, moorefield, Intel, Cherry Trail, branding, atom

Intel is updating its Atom processor branding to better communicate the performance and experience customers can expect from their Intel powered mobile devices. In fact, the new branding specifies three tiers: Atom processors will soon come in Atom x3, x5, and x7 flavors. This scheme is similar to the Core processor branding with its i3, i5, and i7 labels.

The Atom x3, x5, and x7 chips are low power, efficient processors for battery powered devices and sit below the Core M series which in turn are below the Core i3, i5, and i7 processors. The following infographic shows off the new branding though Intel does not reveal any specific details about these new Atom chips (we will hopefully know more after Mobile World Congress). Of course, Atom x3 chips will reside in smartphones with x5 and x7 chips powering tablets and budget convertibles. The x7 brand represents the flagship processors of the Atom line.

The new branding will begin with the next generation of Atom chips, which should include Cherry Trail, the 14nm successor to Bay Trail featuring four x86 Airmont cores and Gen 8 Intel graphics. Cherry Trail (the Cherryview SoC) will be used in all manner of mobile devices, from entry level 8"+ tablets to larger notebooks and convertibles. It appears that Intel will use Moorefield (a quad core refresh of Merrifield) through 2015 for smartphones, though road maps seem to indicate that Intel's budget SoFIA SoC will also launch this year. SoFIA and Moorefield processors should fall under the Atom x3 brand, while the higher powered and higher clocked Cherry Trail chips will use the Atom x5 and x7 monikers.

What are your thoughts on Intel's new Atom x3/x5/x7 brands?

Source: Intel

Who Should Care? Thankfully, Many People

The Khronos Group has made three announcements today: Vulkan (their competitor to DirectX 12), OpenCL 2.1, and SPIR-V. Because there is actually significant overlap, we will discuss them in a single post rather than splitting them up. Each has a role in the overall goal to access and utilize graphics and compute devices.

khronos-Vulkan-700px-eventpage.png

Before we get into what everything is and does, let's give you a little tease to keep you reading. First, Khronos designs their technologies to be self-reliant. As such, while there will be some minimum hardware requirements, the OS pretty much just needs to have a driver model. Vulkan will not be limited to Windows 10 and similar operating systems. If a graphics vendor wants to go through the trouble, which is a gigantic if, Vulkan can be shimmed into Windows 8.x, Windows 7, possibly Windows Vista despite its quirks, and maybe even Windows XP. The words “and beyond” came up after Windows XP, but don't hold your breath for Windows ME or anything. Again, the further back in Windows versions you get, the larger the “if” becomes but at least the API will not have any “artificial limitations”.

Outside of Windows, the Khronos Group is the dominant API curator. Expect Vulkan on Linux, Mac, mobile operating systems, embedded operating systems, and probably a few toasters somewhere.

On that topic: there will not be a “Vulkan ES”. Vulkan is Vulkan, and it will run on desktop, mobile, VR, consoles that are open enough, and even cars and robotics. From a hardware side, the API requires a minimum of OpenGL ES 3.1 support. This is fairly high-end for mobile GPUs, but it is the first mobile spec to require compute shaders, which are an essential component of Vulkan. The presenter did not state a minimum hardware requirement for desktop GPUs, but he treated it like a non-issue. Graphics vendors will need to be the ones making the announcements in the end, though.

Before we go further, some background is necessary. Read on for that and lots more!

GDC 15: ZOTAC Announces the SN970 Steam Machine - Powered by a GTX 970M and Intel Skylake CPU

Subject: Systems | March 4, 2015 - 12:11 AM |
Tagged: Skylake, zotac, valve, SteamOS, Steam Machine, steam, gdc 2015, gdc 15, GDC, GTX 970M

Favor a steamier TV gaming experience? ZOTAC has announced a new Steam Machine on the eve of Valve’s presentation at GDC on Wednesday.

SN970-03.jpg

The SN970 presumably gets its name from the GTX 970M mobile GPU within, which does the heavy lifting along with an unspecified 6th-generation Intel (Skylake) CPU. The massive number of HDMI outputs (there are 4 HDMI 2.0 ports!) is pretty impressive for a small device like this, and dual Gigabit Ethernet ports are a premium feature as well.

SN970-05.jpg

There's a lot going on back here - the rear I/O of the ZOTAC SN970

Here's the rundown of features and specs from ZOTAC:

Key Features

  • SteamOS preloaded
  • NVIDIA GeForce® GTX 970M MXM graphics
  • 4 x HDMI 2.0, supports 4K UHD @ 60Hz

Specifications

  • 6th Gen Intel Processor
  • NVIDIA GeForce® GTX 970M 3GB GDDR5
  • 8GB DDR3 SODIMM
  • 64GB M.2 SSD
  • 1 x HDMI in
  • 2D/3D NVIDIA Surround
  • Dual Gigabit Ethernet
  • 4 x USB 3.0, 2 x USB 2.0
  • 1 x 2.5” 1TB HDD
  • 802.11ac WiFi, Bluetooth 4.0
  • Mic-In, Stereo Out
  • SD/SDHC/SDXC Card Reader

The release date for this new Steam Box isn't specified, but we will doubtless be hearing more from Valve and their partners tomorrow, so stay tuned!

Source: ZOTAC

So Long Adware, and Thanks for All the Fish!

Subject: Graphics Cards | March 1, 2015 - 07:30 AM |
Tagged: superfish, Lenovo, bloatware, adware

Obviously, this does not erase the controversy that Lenovo got themselves into, but it is certainly the correct response (if they act as they imply). Adware and bloatware are common on consumer PCs, making the slowest of devices even more sluggish as demos and sometimes straight-up advertisements claim their share of your resources. That does not even begin to touch on the security issues that some of these hitchhikers drag in. Again, I refer you to the aforementioned controversy.

lenovo-do.png

In response, albeit a delayed one, Lenovo has announced that, by the launch of Windows 10, they will only pre-install the OS and “related software”. Lenovo classifies this related software as drivers, security software, Lenovo applications, and applications for “unique hardware” (ex: software for an embedded 3D camera).

It looks to be a great step, but I need to call out “security software”. Windows 10 should ship with Microsoft's security applications in many regions, which raises the question of why a laptop provider would include an alternative. If the problem is that people expect McAfee or Symantec, then advertise pre-loaded Microsoft anti-malware and keep it clean. Otherwise, it feels like keeping a single finger in the adware take-a-penny dish.

At least it is not as bad as trying to install McAfee every time you update Flash Player. I consider Adobe's tactic the greater of two evils on that one. I mean, unless Adobe just thinks that Flash Player is so insecure that you would be crazy to install it without a metaphorical guard watching over your shoulder.

And then, of course, we reach the divide between “saying” and “doing”. We will need to see Lenovo's actual Windows 10 devices to find out whether they kept their word and followed its implications to a tee.

Source: Lenovo

GDC 15: NVIDIA Announces SHIELD, Tegra X1 Powered Set-top with Android TV

Subject: General Tech, Mobile | March 3, 2015 - 10:21 PM |
Tagged: Tegra X1, tegra, shield, gdc 15, GDC, android tv

NVIDIA just announced a new member of its family of hardware devices: SHIELD. Just SHIELD. Powered by NVIDIA's latest Tegra X1 SoC, with its 8-core CPU and Maxwell GPU, SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming set-top box.

04.jpg

Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater or desk, bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media including music, movies and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.

01.jpg

Speaking of the Tegra X1, the SHIELD will include the power of 256 Maxwell architecture CUDA cores and will easily provide the best Android gaming performance of any tablet or set-top box on the market. This means gaming, and lots of it, will be possible on SHIELD. Remember our many discussions about Tegra-specific gaming ports from the past? That trend will continue and more developers are realizing the power that NVIDIA is putting into this tiny chip.

02.jpg

In the box you'll get the SHIELD set-top unit and a SHIELD Controller, the same one released with the SHIELD Tablet last year. A smaller remote that looks similar to the one used with the Amazon Fire TV will cost a little extra, as will the stand that sets the SHIELD upright.

Pricing on the new SHIELD set-top will be $199, shipping in May.

BitTorrent Sync 2.0 Available Now For PC, NAS, and Mobile With Pro Version For $39.95/year

Subject: General Tech | March 4, 2015 - 02:29 AM |
Tagged: sync 2.0, folder sync, file sharing, bittorrent sync, bittorrent, backup

BitTorrent Sync has officially taken the beta tag off and launched Sync 2.0. Sync 2.0 is the latest iteration of the company’s file and folder synchronization application. It uses certificate-based security and the torrent protocol to securely share files and folders with no file size or transfer limits. Sync 2.0 is available for PCs as well as NAS and mobile devices, and it can be used to roll your own cloud storage.

Sync 2.0 Main.png

Sync 2.0 contains numerous bug fixes and three major new features over Sync 1.4 (which I detailed here): selective sync, ownership and permission controls, and private identities. Additionally, the question of how BitTorrent will monetize Sync has been answered with the introduction of a paid Sync Pro subscription service that grants access to all the new Sync features.

BitTorrent continues to offer a free version that Sync 1.4.3 users can upgrade to in order to take advantage of the bug fixes, with one big caveat: the free version of Sync 2.0 is limited to synchronizing 10 folders (there are still no file/folder size or transfer limits). This is an irksome step backwards from the previous version that, in my opinion, is unwarranted (Sync Pro already unlocks a slew of useful features), but apparently BitTorrent believes it needs this to encourage enough people to ante up for the paid version to support the project.

Users can download Sync 2.0 for Windows, Mac, Linux, and FreeBSD from GetSync.com, while mobile users can pick up the Sync app from their app store of choice (it should be live today). BitTorrent now supports Sync on Network Attached Storage devices from Asustor, Drobo, Netgear, Overland SnapServer, QNAP, Seagate, and Synology. You can grab the appropriate NAS build from this page.

Downloads of Sync 2.0 include a 30-day trial of Sync Pro. Sync Pro will cost $39.95 per user per year (on unlimited devices) with volume licensing available for large organizations and teams.

I have been using Sync since the original alpha and have found it invaluable for keeping all my files in sync and my smartphone pictures backed up (especially with the number of times my S5 has needed replacing, heh). I am still deciding whether or not I will purchase the yearly Pro subscription (the 10-folder limit does not affect me, yet anyway), but the new features are compelling, as linked devices and selective sync would be welcome. The ownership and permissions features are great for collaboration and sharing with others, but that’s not something I’m using it for right now.

What are your thoughts on Sync 2.0 and the new subscription model? Now that I am allowed to talk about it, do you have any questions?

Source: BitTorrent

MWC 2015: Samsung Has Officially Announced the New Galaxy S6 and S6 Edge Smartphones

Subject: Mobile | March 1, 2015 - 02:01 PM |
Tagged: SoC, smartphones, Samsung, MWC 2015, MWC, Galaxy S6 Edge, galaxy s6, Exynos 7420, 14nm

Samsung has announced the new Galaxy S phones at MWC, and the new S6 and S6 Edge should be in line with what you were expecting if you’ve followed recent rumors.

galaxy-s6-both-versions-4-9zh2jqc.jpg

The new Samsung Galaxy S6 and S6 Edge (Image credit: Android Central)

As expected, we no longer see a Qualcomm SoC powering the new phones; as the rumors had indicated, Samsung opted instead for their own Exynos 7 Octa mobile AP. The Exynos SoCs have previously appeared in international versions of Samsung’s mobile devices, but production has apparently been ramped up to meet the demands of the US market as well. There is an interesting twist here, however.

14nmOcta.jpg

The Exynos 7420 powering both the Galaxy S6 and S6 Edge is an 8-core SoC with ARM’s big.LITTLE design, combining four ARM Cortex-A57 cores and four Cortex-A53 cores. Samsung announced 14nm FinFET mobile AP production earlier in February, which made the possibility of the S6 launching with the new part intriguing, as the current process tech for the Exynos 7 is 20nm HKMG. A switch to the new process so soon before the official announcement seemed unlikely, however, as large-scale 14nm FinFET production was only unveiled on February 16. Regardless, AnandTech is reporting that the new part will indeed be produced using the 14nm process technology, which gives Samsung an industry first for a mobile SoC with the launch of the S6/S6 Edge.

GSM Arena has specs of the Galaxy S6 posted, and here’s a brief overview:

  • Display: 5.1” Super AMOLED, QHD resolution (1440 x 2560, ~577 ppi), Gorilla Glass 4
  • OS: Android OS, v5.0 (Lollipop) - TouchWiz UI
  • Chipset: Exynos 7420
  • CPU: Quad-core 1.5 GHz Cortex-A53 & Quad-core 2.1 GHz Cortex-A57
  • GPU: Mali-T760
  • Storage/RAM: 32/64/128 GB, 3 GB RAM
  • Camera: (Primary) 16 MP, 3456 x 4608, optical image stabilization, autofocus, LED flash
  • Battery: 2550 mAh (non-removable)

The new phones both feature attractive styling with metal and glass construction and Gorilla Glass 4 sandwiching the frame, giving each phone a glass back.

galaxy-s6-back-angle-9zh2jqc.jpg

The back of the new Galaxy S6 (Image credit: Android Central)

The guys at Android Central (source) had some pre-release time with the phones and have a full preview and hands-on video up on their site. The new phones will be released worldwide on April 10, and no specifics on pricing have been announced.

Author:
Manufacturer: Asus

Quiet, Efficient Gaming

The last few weeks have been dominated by talk about the memory controller of the Maxwell based GTX 970.  There are some very strong opinions about that particular issue, and certainly NVIDIA was remiss in actually informing consumers about how that particular product handles its memory.  While that debate rages, we have somewhat lost track of other products in the Maxwell range.  The GTX 960 was released during this particular firestorm and, while it shares the outstanding power/performance qualities of the Maxwell architecture, it is considered a little overpriced for its performance when compared to other cards in its price class.

It is easy to forget that the original Maxwell based product to hit shelves was the GTX 750 series of cards.  They were released a year ago to some very interesting reviews.  The board was one of the first mainstream cards in recent memory to have a power draw under 75 watts while still playing games with good quality settings at 1080p.  Ryan covered this very well, and it turned out to be a perfect gaming card for many pre-built systems that do not have extra power connectors (or a power supply that can support 125+ watt graphics cards).  These are relatively inexpensive cards and very easy to install, producing a big jump in performance as compared to the integrated graphics of modern CPUs and APUs.

strix_01.jpg

The GTX 750 and GTX 750 Ti have proven to be popular cards due to their overall price, performance, and extremely low power consumption.  They also tend to produce a relatively low amount of heat, due to solid cooling combined with that low power consumption.  The Maxwell architecture also introduced some new features, but the major changes are to the overall design of the architecture as compared to Kepler.  Instead of 192 cores per SMX, there are now 128 cores per SMM.  NVIDIA has done a lot of work to improve performance per core as well as lower power in a fairly dramatic way.  An interesting side effect is that the CPU hit with Maxwell is a couple of percentage points higher than with Kepler.  NVIDIA does lean a bit more on the CPU to improve overall GPU performance, but most of this hit is covered up by some really good realtime compiler work in the driver.

Asus has taken the GTX 750 Ti and applied their STRIX design and branding to it.  While there are certainly faster GPUs on the market, there are none that exhibit the power characteristics of the GTX 750 Ti.  The combination of this GPU and the STRIX design should result in an extremely efficient, cool, and silent card.

Click to read the rest of the review of the Asus STRIX GTX 750 Ti!

Manufacturer: SilverStone

Introduction and First Impressions

The RV05 is the current iteration of SilverStone's Raven enclosure series, and a reinvention of their ATX enthusiast design with a revised layout that eliminates 5.25" drive bays for a smaller footprint.

RV05_Angle.jpg

Return to Form

The fifth edition of SilverStone's Raven is a return to form of sorts, as it owes more to the design of the original RV01 than to the three that followed. The exterior again has an aggressive, angular look, with the entire enclosure sitting up slightly at the rear and tilted forward. Though the overall effect, depending on taste, is perhaps less visually exciting than the original, the design's simplicity feels more refined and modern than the RV01. Some of the sharpest angles have been eliminated or softened, though the squat stance coupled with its smaller size gives the RV05 an energetic appearance - as if it's ready to strike. (OK, I know it's just a computer case, but still...)

RV05_Logo.jpg

The Raven series is important to the case market as a pioneer of the 90° motherboard layout for ATX systems, expanding on the design originally developed by Intel for the short-lived BTX form factor. In the layout implemented in the Raven series the motherboard is installed with the rear I/O panel facing up, which requires the graphics card to be installed vertically. This vertical orientation assists with heat removal by exploiting the tendency of warm air to rise, and when implemented in an enclosure like the RV05 it can create an excellent thermal environment for your components. The RV05 features large fans at the bottom of the case that push air upward and across the components on the motherboard, forcing warm air to exit through a well-ventilated top panel.

RV05_Top.jpg

And the RV05 isn't just a working example of an interesting thermal design; it's also a really cool-looking enclosure with some premium features and a surprisingly low price for a product like this: $129 on Amazon as of this writing. In our review of the RV05 we'll be taking a close look at the case and build process, and of course we'll test the thermal performance with some CPU and GPU workloads to find out just how well this design performs.

Continue reading our review of the SilverStone Raven RV05 enclosure!

EVGA and Inno3D Announce the First 4GB NVIDIA GeForce GTX 960 Cards

Subject: Graphics Cards | March 3, 2015 - 02:44 PM |
Tagged: video cards, nvidia, gtx 960, geforce, 4GB

They said it couldn't be done, but where there are higher-density chips there's always a way. Today EVGA and Inno3D have both announced new versions of their GTX 960 graphics cards with 4GB of GDDR5 memory, placing the cards in a more favorable mid-range position, depending on launch pricing.

960_evga.PNG

EVGA's new 4GB NVIDIA GTX 960 SuperSC

Along with the expanded memory capacity EVGA's card features their ACX 2.0+ cooler, which promises low noise and better cooling. The SuperSC is joined by a standard ACX and the higher-clocked FTW variant, which pushes Base/Boost clocks to 1304/1367MHz out of the box.

960_evga_2.PNG

Inno3D's press release provides fewer details, and the company appears to be launching a single new model featuring 4GB of memory which looks like a variant of their existing GTX 960 OC card.

inno3d_960.jpg

The existing Inno3D GTX 960 OC card

The current 2GB version of the GTX 960 can be found starting at $199, so expect these expanded versions to include a price bump. The GTX 960, with only 1024 CUDA cores (half the count of a GTX 980) and a 128-bit memory interface, has nonetheless been a very good performer, posting much better numbers than last year's GTX 760 and competing closely with AMD's R9 280/285. (It's a great overclocker, too.) The AMD/NVIDIA debate rages on, and NVIDIA's partners adding 4GB offerings to the mix will certainly add to the conversation, particularly as a 4GB version of the GTX 960 was originally said to be unlikely.

Source: EVGA

Dell's Venue 8 7000 continues to impress

Subject: Mobile | February 25, 2015 - 04:46 PM |
Tagged: z3580, venue 8 7000, venue, tablet, silvermont, moorefield, Intel, dell, atom z3580, Android

Dell's Venue 8 7000 tablet sports an 8.4" 2560x1600 OLED display and is powered by the Moorefield-based Atom Z3580 SoC, with 2GB of LPDDR3-1600, 16GB of internal storage, and support for microSD cards up to 512GB.  Even more impressive is that The Tech Report had no issues installing apps or moving files to the SD card with ES File Explorer, unlike many Android devices that need certain programs to reside on the internal storage media.  Like Ryan, they had a lot of fun with the RealSense camera and are looking forward to the upgrade to Lollipop.  Check out The Tech Report's opinion of this impressive Android tablet right here.

screen.jpg

"Dell's Venue 8 7000 is the thinnest tablet around, and that's not even the most exciting thing about it. This premium Android slate packs a Moorefield-based Atom processor with quad x86 cores, a RealSense camera that embeds 3D depth data into still images, and a staggeringly beautiful OLED display that steals the show. Read on for our take on a truly compelling tablet."

Here are some more Mobile articles from around the web:

Mobile

Manufacturer: Cooler Master

Introduction and Technical Specifications

Introduction

02-17_Product_RL-N24M-24PK-R1_01 - Copy.jpg

Courtesy of Cooler Master

Cooler Master is known in the enthusiast community for their innovative designs with product offerings ranging from cases to desktop and laptop cooling implements. Cooler Master also offers their own line of all-in-one (AIO) CPU liquid cooling solutions for better system performance without the noise of a typical air cooler. With their Nepton 240M cooler, they enhanced the existing design of their previous AIO products, optimizing its performance with an enhanced pump and radiator design. We measured the unit's performance against that of other high-performance liquid and air coolers to best illustrate its abilities. The Nepton 240M's premium performance comes with a premium price, at a $139.99 MSRP.

03-14_Product_RL-N24M-24PK-R1_03.jpg

Courtesy of Cooler Master

04-flyapart-view.png

Courtesy of Cooler Master

The Nepton 240M AIO liquid cooler features a 240mm aluminum-finned radiator tied to a base unit consisting of a 120 liter-per-hour pump and a micro-finned copper base plate. Unlike the Glacer model, the Nepton 240M does not offer the ability to drain and refill the unit. Cooler Master designed the Nepton 240M with a 27mm deep, 2x120mm copper radiator with brass internal channels, bundled with two of its 120mm Silencio model fans. The Silencio fans are optimized for low noise and high pressure, perfect for use with a liquid cooling radiator. The radiator and unit base are connected by ribbed FEP (Fluorinated Ethylene Propylene) tubing, allowing for high flexibility without the worry of tube kinking.

Continue reading our review of the Cooler Master Nepton 240M CPU AIO liquid cooler!

Intel Sheds Its Remaining Stake In Imagination Technologies

Subject: General Tech | February 25, 2015 - 08:56 PM |
Tagged: PowerVR, Intel, Imagination Technologies, igp, finance

Update: Currency exchange rates have been corrected. I'm sorry for any confusion!

Intel Foundation is selling off its remaining stake in UK-based Imagination Technologies (IMG.LN). According to JP Morgan, Intel is selling 13.4 million shares (4.9% of Imagination Technologies) at 245 GBp each. Once all shares are sold, Intel will gross just north of $50.57 million USD.
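For context, a quick back-of-the-envelope check shows how the share count and pence-denominated price translate into the reported dollar figure. The GBP/USD exchange rate below is an assumption on my part (~1.54, roughly the rate in late February 2015), not a figure from the announcement:

```python
# Rough sanity check of the reported sale proceeds.
SHARES_SOLD = 13_400_000   # 13.4 million shares
PRICE_PENCE = 245          # 245 GBp (pence) per share
USD_PER_GBP = 1.54         # assumed exchange rate, late Feb 2015

gross_gbp = SHARES_SOLD * PRICE_PENCE / 100  # pence -> pounds
gross_usd = gross_gbp * USD_PER_GBP

print(f"Gross proceeds: £{gross_gbp:,.0f} (~${gross_usd / 1e6:.2f}M USD)")
```

That works out to roughly £32.8 million, or just over $50.5 million USD, which lines up with the reported figure.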

PowerVR Rogue Series6XT GPU.png

Imagination Technologies' PowerVR Rogue Series 6XT GPU is used in Apple's A8-series chips.

Intel first invested in Imagination Technologies back in October of 2006 in a deal to gain access to the company’s PowerVR graphics IP portfolio. Since then, Intel has been slowly moving away from PowerVR graphics in favor of its own internal HD Graphics GPUs. (Further, Intel sold off 10% of its IMG.LN stake in June of last year.) Even Intel’s low cost Atom line of SoCs has mostly moved to Intel GPUs, with the exception of the mobile Merrifield and Moorefield smartphone/tablet SoCs.

The expansion of Intel’s own graphics IP, combined with Imagination Technologies’ acquisition of MIPS, is reportedly the “inevitable” reason for the sale. According to The Guardian, industry analysts have speculated that, as it stands, Intel is a minor customer of Imagination Technologies, accounting for less than 5% of its graphics business (and a licensing agreement signed this year doesn’t rule out PowerVR graphics permanently, despite the sale). Imagination Technologies still has a decent presence in the mobile (ARM-based) space with customers including Apple, MediaTek, Rockchip, Freescale, and Texas Instruments.

Currently, the company’s stock price sits at 258.75 GBp (~$3.99 USD), which seems to indicate that the “inevitable” Intel sell-off was already priced in, or that investors simply aren’t that concerned.

What do you think about the sale? Where does this leave Intel as far as graphics goes? Will we see Intel HD Graphics scale down to smartphones, or will the company go with a PowerVR competitor? Would Intel really work with ARM’s Mali, Qualcomm’s Adreno, or Samsung’s rumored custom GPU cores? On that note, an Intel-powered smartphone with NVIDIA Tegra graphics would be amazing. (Hint, hint, Intel!)

GDC 15: Khronos Acknowledges Mantle's Start of Vulkan

Subject: General Tech, Graphics Cards, Shows and Expos | March 3, 2015 - 03:37 PM |
Tagged: vulkan, Mantle, Khronos, glnext, gdc 15, GDC, amd

khronos-group-logo.png

Neil Trevett, the current president of Khronos Group and a vice president at NVIDIA, made an on-the-record statement to acknowledge the start of the Vulkan API. The quote came to me via Ryan, but I think it is a copy-paste of an email, so it should be verbatim.

Many companies have made great contributions to Vulkan, including AMD who contributed Mantle. Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now.

So in short, the Vulkan API definitely started with Mantle and grew from there as more stakeholders added their opinions. Vulkan is obviously different from Mantle in significant ways now, such as its use of SPIR-V for its shading language (rather than HLSL). For a bit more information, check out our article on the announcement.

Update: AMD has released a statement independently, but related to Mantle's role in Vulkan.