
PCPer Live! NVIDIA Maxwell, GTX 980, GTX 970 Discussion with Tom Petersen, Q&A

Subject: Graphics Cards | September 26, 2014 - 09:14 AM |
Tagged: vxgi, video, tom petersen, nvidia, mfaa, maxwell, livestream, live, GTX 980, GTX 970, dsr

UPDATE: If you missed the live stream yesterday, I have good news: the interview and all the information/demos provided are available to you on demand right here. Enjoy!

Last week NVIDIA launched GM204, otherwise known as Maxwell and now branded as the GeForce GTX 980 and GTX 970 graphics cards. You should, of course, have already read the PC Perspective review of these two GPUs, but undoubtedly there are going to be questions and thoughts circulating through the industry. 

IMG_9779.JPG

To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Thursday afternoon where he will run through some demonstrations and take questions from the live streaming audience.

Be sure to stop back at PC Perspective on Thursday, September 25th at 4pm ET / 1pm PT to discuss the new Maxwell GPU, the GTX 980 and GTX 970, new features like Dynamic Super Resolution, MFAA, VXGI and more!  You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!

NVIDIA Maxwell Live Stream

1pm PT / 4pm ET - September 25th

PC Perspective Live! Page

pcperlive.png

We also want your questions!!  The easiest way to get them answered is to leave them for us here in the comments of this post.  That will give us time to filter through the questions and get the answers you need from Tom.  We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but oftentimes there is a lot of noise to deal with. 

So be sure to join us on Thursday afternoon!

UPDATE: We have confirmed at least a handful of prizes for those of you that tune into the live stream today. We'll give away an NVIDIA SHIELD as well as several of the brand new SLI LED bridges that were announced for sale this week!

EVGA Live Stream! Learn about Maxwell, X99 and Win Prizes!!

Subject: General Tech, Graphics Cards, Motherboards, Cases and Cooling | September 27, 2014 - 09:25 PM |
Tagged: X99, video, maxwell, live, GTX 980, GTX 970, evga

UPDATE: If you missed the live stream with myself and Jacob, you can catch the entire event in the video below. You won't want to miss out on seeing the first ever GTX 980 water block as well as announcements on new Torq mice!

EVGA has been a busy company recently. It has continued to innovate with new coolers for the recent GTX 980 and GTX 970 card releases, newer power supplies with unique features and improved quality and power output, a new line of X99 chipset motherboards including a Micro ATX variant, and hey, the company even released a line of high-performance mice this year! PC Perspective has covered basically all of these releases (and will continue to do so with pending GPU and MB reviews) but there is a lot that needs explaining.

To help out, an industry and community favorite will be stopping by the PC Perspective offices from EVGA: Jacob Freeman. You might know him as @EVGA_JacobF on Twitter or have seen him on countless forums, but he will be making an in-person appearance on Friday, September 26th on PC Perspective Live! We plan on discussing the brand new ACX 2.0 cooler on the Maxwell GPUs released last week, going over some of the highlights of the new X99 motherboards, and even touching on power supplies and the Torq mice line as well.

pcperlive.png

EVGA GTX 980/970, X99, PSU and Torq Live Stream featuring Jacob Freeman

3pm ET / 12pm PT - September 26th

PC Perspective Live! Page

EVGA has been a supporter of PC Perspective for a long time and we asked them to give back to our community during this live stream - and they have stepped up! Look at this prize list:

How can you participate and win these awesome pieces of hardware? Just be here at 3pm ET / 12pm PT on http://www.pcper.com/live and we'll be announcing winners as we go for those that tune in. It really couldn't be more simple!

x99class.jpg

If you have questions you want to ask Jacob about EVGA, or any of its line of products, please leave them in the comments section below and we'll start compiling a list to address on the live stream Friday. Who knows, we may even save some prizes for some of our favorite questions!

To make sure you don't miss our live stream events, be sure you sign up for our spam-free PC Perspective Live! Mailing List. We email that group a couple hours before each event gets started.

NVIDIA Confirms It Has No Plans to Support Adaptive Sync

Subject: Graphics Cards | September 27, 2014 - 04:24 PM |
Tagged: nvidia, maxwell, gsync, g-sync, freesync, adaptive sync

During an interview that we streamed live with NVIDIA's Tom Petersen this past Thursday, it was confirmed that NVIDIA is not currently working on, and has no current plans to add, support for the VESA-based and AMD-pushed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:

"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."

Discussion of G-Sync begins at 1:27:14 in our interview.

To be clear, the Adaptive Sync portions of DP 1.2a and 1.3+ are optional parts of the VESA spec, required neither for future graphics processors nor for future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.

IMG_9328.JPG

The ASUS ROG Swift PG278Q G-Sync monitor

With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users with Radeon cards. (Where Intel falls into this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what it has developed with G-Sync is difficult, and not something that could be solved with a blunt instrument like Adaptive Sync. NVIDIA has a history of producing technologies and then keeping them in-house, focusing on development specifically for GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.

When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."

The future of G-Sync is still in development. Petersen stated:

"Don't think that we're done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays...G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."

tearing5.jpg

Diagram showing how G-Sync affects monitor timings

So now we wait for the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has created a lot of self-inflicted pressure for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if they can meet that goal. Despite any ill feelings that some users might have about NVIDIA and some of its policies, it typically does a good job of maintaining a high-quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more about that before we get too much further into fall.

You can check out our stories and reviews covering G-Sync here:

Microsoft Introduces Windows 10 to the Enterprise

Subject: General Tech | September 30, 2014 - 08:46 PM |
Tagged: windows 9, Windows 8.1, Windows 7, windows 10, windows, threshold, microsoft

The Windows event for the enterprise, which took place today in San Francisco, revealed the name of the upcoming OS. It is not Windows 9, or One Windows, or just Windows. It will be Windows 10. Other than the name, there is not really any new information from a feature or announcement standpoint (except the Command Prompt refresh, which I will briefly mention later). My interest comes from their mindset with this new OS -- what they are changing and what they seem to be sticking with.

If you would like Microsoft's commentary before reading mine, the keynote is embedded above.

Okay, so one thing that was shown is "Continuum". If you have not seen its prototype at the end of the above video, it is currently a small notification that appears when a keyboard and mouse is attached (or detached). If a user accepts, this will flip the user interface between tablet and desktop experiences. Joe Belfiore was clear that the video clip was not yet in code, but represents their vision. In practice, it will have options for whether to ask the user or to automatically do some chosen behavior.

windows-10-continuum.jpg

In a way, you could argue that it was necessary to go through Windows 8.x to get to this point. From the demonstrations, the interface looks like a sensible landing point for users on both the Windows 7 and Windows 8 paths. That said, I was fine with the original Windows 8 interface, barring a few glitches, like disappearing icons and snapping sidebars on PCs with multiple monitors. I always considered the "modern" Windows interface to be... acceptable.

It was the Windows Store certification that kept me from upgrading, and Microsoft's current stance is confusing at the very least. Today's announcement included the quote, "Organizations will also be able to create a customized store, curating store experiences that can include their choice of Store apps alongside company-owned apps into a separate employee store experience." Similar discussion was brought up and immediately glossed over during the keynote.

Who does that even apply to? Would a hobbyist developer be able to set up a repository for friends and family? Or is this relegated to businesses, leaving consumers to accept nothing more than what Microsoft allows? The concern is that I do not want Microsoft (or anyone) telling me what I can and cannot create and install on my devices. Once you build censorship, the crazies will come. They usually do.

windows-10.png

But onto more important things: Command Prompt had a major UX overhaul. Joe Belfiore admitted that it was mostly because most of the important changes had already leaked and been reported on, and they wanted to surprise us with something. They sure did. You can now use typical keyboard shortcuts: shift to select, Ctrl+C and Ctrl+V to copy and paste, and so forth. They even allow a transparency option, common in other OSes, to make the window's presence less jarring. Rather than covering what you're doing, it feels more like an overlay on top of it, especially for quick commands. At least, that is my opinion.

Tomorrow, October 1st, Microsoft will launch their "Windows Insider Program". This will give a very early glimpse at the OS for "most enthusiastic Windows fans" who are "comfortable running pre-release software that will be of variable quality". They "plan to share all the features (they) are experimenting with". They seem to actually want user feedback, a sharp contrast from the Windows 8 technical preview. My eye will be on relaxing certification requirements, obviously.

Source: Microsoft

The Evil Within System Requirements

Subject: General Tech, Graphics Cards | September 26, 2014 - 11:59 PM |
Tagged: rage, pc gaming, consolitis

Shinji Mikami has been developing a survival horror game, which makes sense given a good portion of his portfolio. He created Resident Evil and much of the following franchise. The Evil Within is about to release, having recently gone gold. At around this time, publishers begin to release system requirements and Bethesda does not disappoint in that regard.

bethesda-evilwithin.jpg

Are the requirements... RAGE-inducing?

A case could be made for disappointing requirements, themselves, though.

Basically, Bethesda did not release minimum requirements. Instead, they said "This is what we recommend. It will run on less. Hope it does!" This would not be so problematic if one of their requirements wasn't a "GeForce GTX 670 with 4GBs of VRAM".

They also recommend a quad-core Core i7, 4GB of system memory, 50GB of hard drive space, and a 64-bit OS (Windows 7 or Windows 8.x).

Before I go on, I would like to mention that The Evil Within is built on the RAGE engine. Our site dealt extensively with that technology when it first came out in 2011. While I personally did not have many showstopping performance problems with that game, it did have a history of texture streaming issues. Keep that in mind as you continue to read.

A typical GTX 670 does not even have 4GB of VRAM. In fact, neither does the GTX 780 Ti. Thankfully, both of the newly released Maxwell GPUs, the GTX 970 and the GTX 980, ship with 4GB of VRAM. Basically, Bethesda is saying, "I really hope you bought the custom model from your AIB vendor". They literally say:

Note: We do not have a list of minimum requirements for the game. If you’re trying to play with a rig with settings below these requirements (you should plan to have 4 GBs of VRAM regardless), we cannot guarantee optimal performance.

Each time I read "you should plan to have 4 GBs of VRAM regardless", it becomes more difficult for me to form an opinion about it. That is a lot of memory. Personally, I would wait for reviews and benchmarks, specifically for the PC, before purchasing the title. These recommended settings could be fairly loose, to suit the vision of the game developers, or the game could be a revival of RAGE, this time without the engine's original architect on staff.

The Evil Within launches on October 14th.

Source: Bethesda
Author:
Manufacturer: Apple

One Small Step

While most articles about the iPhone 6 and iPhone 6 Plus thus far have focused on user experience and the larger screen sizes, our main questions about these new phones concern performance, and in particular the effect of Apple's transition to the 20nm process node for the A8 SoC. Naturally, I decided to put my personal iPhone 6 through our usual round of benchmarks.

applea83.jpg

First, let's start with 3DMark.

3dmark-iceunlimited.png

Comparing the 3DMark scores of the new Apple A8 to even the last generation A7 provides a smaller improvement than we are used to seeing generation-to-generation with Apple's custom ARM implementations. When you compare the A8 to something like the NVIDIA Tegra K1, which utilizes desktop-class GPU cores, the overall score blows Apple out of the water. Even taking a look at the CPU-bound physics score, the K1 is still a winner.

A 78% performance advantage in overall score when compared to the A8 shows just how much of a powerhouse NVIDIA has with the K1. (Though clearly power envelopes are another matter entirely.)
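For reference, the "performance advantage" percentages in this article are simple relative differences between overall scores. A minimal sketch of that math, using hypothetical score values (the actual results are in the 3DMark chart above):

```python
# Hypothetical overall scores chosen only to illustrate the math;
# the real numbers are in the 3DMark chart above.
score_k1 = 30_000  # NVIDIA Tegra K1 (illustrative)
score_a8 = 16_854  # Apple A8 (illustrative)

# Relative advantage of the K1 over the A8, as a percentage.
advantage = (score_k1 - score_a8) / score_a8 * 100
print(f"K1 advantage over A8: {advantage:.0f}%")  # -> 78%
```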

octane.png

If we look at more CPU benchmarks, like the browser-based Google Octane and SunSpider tests, the A8 starts to shine more.

sunspider.png

While the A8 edges out the A7 to be the best performing device and 54% faster than the K1 in SunSpider, the A8 and K1 are neck and neck in the Google Octane benchmark.

gfxbench-manhattan.png

Moving back to a graphics-heavy benchmark, GFXBench's Manhattan test, the Tegra K1 has a 75% performance advantage over the A8, even though the A8 is 36% faster than the previous A7 silicon.

These early results are certainly a disappointment compared to the usual generation-to-generation performance increase we see with Apple SoCs.

However, the other aspect to look at is power efficiency. With normal use I have noticed a substantial increase in battery life on my iPhone 6 over the last-generation iPhone 5S. While this may be partly due to a small (about 1 Wh) increase in battery capacity, I think more of it can be credited to an overall more efficient device. Choices like sticking with a highly optimized dual-core CPU design and quad-core GPU, as well as the move to the 20nm process node, all contribute to increased battery life while still surpassing the performance of the last-generation Apple A7.

apple-a8-dieshot-chipworks.png

In that way, the A8 moves the bar forward for Apple and is a solid first attempt at using the 20nm silicon technology at TSMC. There is strong potential that, with further refined parts (like the expected A8X for the iPad revisions), Apple will be able to further surpass its 28nm silicon in performance and efficiency.

AMD Catalyst 14.9 for Windows

Subject: Graphics Cards | September 29, 2014 - 02:33 PM |
Tagged: whql, radeon, Catalyst 14.9, amd

AMD.jpg

The full release notes are available here or take a look at the highlights below.

The latest version of the AMD Catalyst Software Suite, AMD Catalyst 14.9, is designed to support the following Microsoft Windows platforms:

Highlights of AMD Catalyst 14.9 Windows Driver

  • Support for the AMD Radeon R9 280
  • Performance improvements (comparing AMD Catalyst 14.9 vs. AMD Catalyst 14.4)
    • 3DMark Sky Diver improvements
      • AMD A4 6300 – improves up to 4%
      • Enables AMD Dual Graphics / AMD CrossFire support
    • 3DMark Fire Strike
      • AMD Radeon R9 290 Series - improves up to 5% in Performance Preset
    • 3DMark11
      • AMD Radeon R9 290 Series / R9 270 Series - improves up to 4% in Entry and Performance Preset
    • BioShock Infinite
      • AMD Radeon R9 290 Series – 1920x1080 - improves up to 5%
    • Company of Heroes 2
      • AMD Radeon R9 290 Series - improves up to 8%
    • Crysis 3
      • AMD Radeon R9 290 Series / R9 270 Series – improves up to 10%
    • Grid Auto Sport
      • AMD CrossFire profile
    • Murdered Soul Suspect
      • AMD Radeon R9 290X (2560x1440, 4x MSAA, 16x AF) – improves up to 50%
      • AMD Radeon R9 290 Series / R9 270 Series – improves up to 6%
      • CrossFire configurations improve scaling up to 75%
    • Plants vs. Zombies (Direct3D performance improvements)
      • AMD Radeon R9 290X - 1920x1080 Ultra – improves up to 11%
      • AMD Radeon R9290X - 2560x1600 Ultra – improves up to 15%
      • AMD Radeon R9290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
    • Batman Arkham Origins:
      • AMD Radeon R9 290X (4x MSAA) – improves up to 20%
      • CrossFire configurations see up to a 70% gain in scaling
    • Wildstar
      • PowerXpress profile performance improvements for smoother application behavior
      • Performance improves up to 30% on the AMD Radeon R9 and R7 Series of products for both single GPU and Multi-GPU configurations
    • Tomb Raider
      • AMD Radeon R9 290 Series – improves up to 5%
    • Watch Dogs
      • AMD Radeon R9 290 Series / R9 270 Series – improves up to 9%
      • AMD CrossFire – Frame pacing improvement
      • Improved CrossFire performance – up to 20%
    • Assassin's Creed IV
      • Improves CrossFire scaling (3840x2160 High Settings) up to 93% (CrossFire scaling improvement of 25% compared to AMD Catalyst 14.4)
    • Lichdom
      • Improves performance for single GPU and Multi-GPU configurations
    • StarCraft II
      • AMD Radeon R9 290X (2560x1440, AA, 16x AF) – improves up to 20%

AMD Eyefinity enhancements

  • Mixed Resolution Support
    • A new architecture providing brand new capabilities
    • Display groups can be created with monitors of different resolutions (including different sizes and shapes)
    • Users have a choice of how surface is created over the display group
      • Fill – legacy mode, best for identical monitors
      • Fit – create the Eyefinity surface using best available rectangular area with attached displays
      • Expand – create a virtual Eyefinity surface using desktops as viewports onto the surface
    • Eyefinity Display Alignment
      • Enables control over alignment between adjacent monitors
      • One-Click Setup: the driver detects the layout of the extended desktop
      • Can create Eyefinity display group using this layout in one click!
      • New user controls for video color and display settings
      • Greater control over Video Color Management:
        • Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
        • Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
        • Allows users to select different color depths per resolution and display

AMD Mantle enhancements

  • Mantle now supports AMD Mobile products with Enduro technology
    • Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) – 21% gain
    • Thief: AMD Radeon HD 8970M (1920x1080; high settings) – 14% gain
    • Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) – 274% gain
  • Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)
  • AMD AM1 JPEG decoding acceleration
    • JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
    • Provides fast, power-efficient JPEG decompression

Resolved Issues

  • 60Hz SST flickering has been identified as an issue with non-standard display timings exhibited by the AOC U2868PQU panel on certain AMD Radeon graphics cards. A software workaround has been implemented in the AMD Catalyst 14.9 driver to resolve the display timing issues with this display
  • Users seeing flickering issues in 60Hz SST mode are further encouraged to obtain newer display firmware from their monitor vendor that will resolve flickering at its origin.
  • Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.
  • 4K panel flickering issues found on the AMD Radeon R9 290 Series and AMD Radeon HD 7800 Series
  • Screen tearing observed on AMD CrossFire systems with Eyefinity portrait display configurations
  • Instability issues for Grid Autosport when running in 2x1 or 1x2 Eyefinity configurations
  • Geometry corruption in State of Decay
Source: AMD

Euclideon's voxel point clouds are rather pretty

Subject: General Tech | September 25, 2014 - 10:04 AM |
Tagged: euclideon, voxels, larrabee, point cloud

Could the next Elder Scrolls game you play look like the screenshot below?  Euclideon is working to make that a reality with their new voxel engine.  The engine is strictly CPU-based, similar to the long-dead Larrabee architecture, but with one major difference: currently they are capable of rendering 2000x1000 frames at around 32 FPS on a six-core processor.  They are properly referred to as frames because this is a point cloud solution, not pixel based.  The images in the video you can see at The Tech Report were generated by rendering 3D scans of real objects and locations, but programmers will still be able to create scenes with Maya or 3ds Max.  Euclideon feels that they can still get a lot more performance out of a CPU with software refinements and is not planning on moving to the GPU at this time.  With two unannounced games using this new engine in development, it might be time to make sure your machine has at least six cores so that you can be ready for their launch.

outdoors.jpg

"We first heard about Euclideon back in 2011, when the company posted a video of a voxel-based rendering engine designed to enable environments with unlimited detail. This month, the firm made headlines again with a new video showing the latest iteration of its technology, which uses 3D scanners to capture real-world environments as point-cloud data. We spoke to Euclideon CEO Bruce Dell to find out more about these innovations—and about the first games based on them."

Here is some more Tech News from around the web:

Tech Talk

New GPUs also mean lower prices for the previous generation

Subject: Graphics Cards | September 26, 2014 - 11:22 AM |
Tagged: asus, ROG, gtx 780 ti, MATRIX Platinum, DirectCU II

With the release of the new Maxwell cards comes an opportunity for those with a smaller budget to still get a decent upgrade for their systems.  Early adopters will often sell their previous GPUs once they've upgraded, allowing you to get a better card than your budget would usually allow, though with the risk of ending up with a bum card.  The ASUS ROG GTX 780 Ti MATRIX Platinum is a good example, with a DirectCU II air cooler for general usage, while the LN2 switch allows more extreme cooling methods for those looking for something a little more impressive.  The factory overclock is not bad at 1006/1072MHz core and 7GHz effective memory, but the overclock [H]ard|OCP managed, 1155/1220MHz and 7.05GHz, pushes the performance above that of the R9 290X of the same family.  If you can find this card used at a decent price it could give you more of an upgrade than you thought you could afford.

141133148469XXJKl2RB_1_1.jpg

"In today's evaluation we are breaking down the ASUS ROG GTX 780 Ti MATRIX Platinum video card. We put this head-to-head with the ASUS ROG R9 290X MATRIX Platinum. Which provides a better gaming experience, best overclocking performance, and power and temperature? Which one provides the best value? "

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Broadwell-U (BGA) Lineup Leaked

Subject: Graphics Cards, Processors | September 30, 2014 - 12:33 AM |
Tagged: iris, Intel, core m, broadwell-y, broadwell-u, Broadwell

Intel's upcoming 14nm product line, Broadwell, is expected to have six categories of increasing performance. Broadwell-Y, later branded Core M, is part of the soldered BGA family at expected TDPs of 3.5 to 4.5W. Above this is Broadwell-U, which are also BGA packages, and thus require soldering by the system builder. VR-Zone China has a list of seemingly every 15W SKU in that category. 28W TDP "U" products are expected to be available in the following quarter, but are not listed.

intel-broadwell-u_1.png

Image Credit: VR-Zone

As for those 15W parts though, there are seventeen (17!) of them, ranging from Celeron to Core i7. While each product is dual-core, the ones that are Core i3 and up have Hyper-Threading, increasing the parallelism to four tasks simultaneously. In terms of cache, Celerons and Pentiums will have 2MB, Core i7s will have 4MB, and everything in between will have 3MB. Otherwise, the products vary on the clock frequency they were binned (bin-sorted) at, and the integrated graphics that they contain.

intel-broadwell-u_2.png

Image Credit: VR-Zone

These iGPUs range from "Intel HD Graphics" on the Celerons and Pentiums to "Intel Iris Graphics 6100" on one Core i7, two Core i5s, and one Core i3. The rest pretty much alternate between Intel HD Graphics 5500 and Intel HD Graphics 6000. The maximum frequency of any given iGPU can vary between SKUs, but only by about 100 MHz at most. The exact spread is below.

  • Intel HD Graphics: 300 MHz base clock, 800 MHz at load.
  • Intel HD Graphics 5500: 300 MHz base clock, 850-950 MHz at load (depending on SKU).
  • Intel HD Graphics 6000: 300 MHz base clock, 1000 MHz at load.
  • Intel Iris Graphics 6100: 300 MHz base clock, 1000-1100 MHz at load (depending on SKU).

Unfortunately, without the number of shader units to go along with the core clock, we cannot derive a FLOPS value yet. This is a very important metric as resolution and shader complexity increase, and it would provide a relatively fair way to compare the new parts against previous offerings at higher resolutions and quality settings, especially, I would assume, in DirectX 12.
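To show what that derivation would look like once the shader counts leak, here is a sketch of the peak-FLOPS math. The 24-EU GT2 configuration and the 16 FLOPs per EU per clock figure (two SIMD-4 ALUs with FMA, as generally assumed for Intel's recent graphics architectures) are my assumptions, not numbers from the leak:

```python
# Peak single-precision throughput: execution units x FLOPs/EU/clock x clock.
# The EU count (24 for an assumed GT2 part) and 16 FLOPs per EU per clock
# are assumptions on my part, not figures from the VR-Zone leak.
def peak_gflops(eu_count, flops_per_eu_per_clock, clock_mhz):
    return eu_count * flops_per_eu_per_clock * clock_mhz / 1000.0

# e.g. an assumed 24-EU part at the HD Graphics 5500's 950 MHz top bin:
print(peak_gflops(24, 16, 950))  # -> 364.8 (GFLOPS)
```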

intel-broadwell-iris-graphics-6100.png

Image Credit: VR-Zone

Probably the most interesting part to me is that "Intel HD Graphics" without a number meant GT1 with Haswell. Starting with Broadwell, it has (apparently) been upgraded to GT2. As we can see from even the 4.5W Core M processors, Intel is taking graphics seriously. It is unclear whether their intention is to respect gaming's influence on device purchases, or whether they believe that generalized GPU compute will be "a thing" very soon.

Source: VR-Zone

Samsung is releasing new PCIe SSDs

Subject: General Tech | September 26, 2014 - 10:26 AM |
Tagged: PCIe SSD, Samsung, NVMe, SM1715, 3d nand

Samsung's new SM1715 NVMe PCIe SSD will use their new 3D V-NAND and come in a 3.2TB card, double the capacity of the previous model and perhaps the smallest of the new line of SSDs they are working on.  The stats are fairly impressive at 750,000/130,000 random read/write IOPS, or 3GB/sec read and 2.2GB/sec write bandwidth if you prefer that measurement.  Samsung offers a nice mix of bandwidth and capacity with the new model, and you can expect the competition to start releasing models with increased capacities and speeds in the near future.  The Register was not provided the full set of specifications for the drive but those should be forthcoming.
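Those read figures are actually consistent with each other if the random IOPS are measured at 4KB transfers, which is the usual convention, though the source material does not state it. A quick sanity check:

```python
# Convert the quoted random-read IOPS to bandwidth, assuming the customary
# 4 KiB transfer size (an assumption; the source does not state the block size).
iops_read = 750_000
block_bytes = 4 * 1024
gb_per_sec = iops_read * block_bytes / 1e9
print(f"{gb_per_sec:.2f} GB/s")  # -> 3.07 GB/s, in line with the 3GB/sec figure
```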

sm1715.jpg

"Faster, fatter flash cards that speed up server applications are in demand, and Samsung has announced it is mass-producing a 3.2TB NVMe PCIe SSD using its 3D V-NAND technology. It says higher capacities are coming."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

Podcast #319 - GTX 980 and 970, Noctua NH-D15, Acer's 4K G-Sync Display and more!

Subject: General Tech | September 25, 2014 - 09:24 AM |
Tagged: podcast, video, GTX 980, GTX 970, maxwell, nvidia, amd, noctua, NH-D15, acer, 4k, 4k gsync, XB280HK, 840, 840 evo, Samsung

PC Perspective Podcast #319 - 09/25/2014

Join us this week as we discuss our GTX 980 and 970 Review, Noctua NH-D15, Acer's 4K G-Sync Display and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

HTC Making the Google Nexus 9 with NVIDIA Tegra K1?

Subject: General Tech, Mobile | September 25, 2014 - 10:45 PM |
Tagged: tablet, Nexus, google, nexus 9, nvidia, tegra k1

The Nexus line is due for an update, with each current product having been out for at least a year. They are devices which embody Google's vision... for their own platform. You can fall on either side of that debate, whether it guides OEM partners or whether it simply adds another shard to the fragmentation issue, if you even believe that fragmentation is bad, but they are easy to recommend and a good benchmark for Android.

google-nexus-logo.png

We are expecting a few new entries in the coming months, one of which being the Nexus 9. Of note, it is expected to mark the return of HTC to the Nexus brand. They were the launch partner with the Nexus One and then promptly exited stage left as LG, Samsung, and ASUS performed the main acts.

We found this out because NVIDIA spilled the beans in its lawsuit filing against Qualcomm and Samsung. Apparently, "the HTC Nexus 9, expected in the third quarter of 2014, is also expected to use the Tegra K1". The filing has since been revised to remove the reference. While the K1 has a significant GPU to back it up, it will likely be driving a very high resolution display. The Nexus 6 is expected to launch at around the same time, along with Android 5.0 itself, and the 5.2-inch phone is rumored to have a 1440p display. It seems unlikely that a larger, tablet display will be lower resolution than the phone it launches alongside -- and there's not much room above it.

The Google Nexus 9 is expected for "Q3".

Source: The Verge

Apple A8 Die Shot Released (and Debated)

Subject: Graphics Cards, Processors, Mobile | September 28, 2014 - 10:53 PM |
Tagged: apple, a8, a7, Imagination Technologies, PowerVR

First, Chipworks released a die shot of the new Apple A8 SoC (stored at archive.org). It is built on TSMC's 20nm fabrication process, for which Apple allegedly bought up the entire capacity. From there, a bit of a debate arose over what each group of transistors represents. All sources agree that it is based around a dual-core CPU, but the GPU is a bit more contentious.

apple-a8-dieshot-chipworks.png

Image Credit: Chipworks via Ars Technica

Most sources, including Chipworks, Ars Technica, Anandtech, and so forth believe that it is a quad-core graphics processor from Imagination Technologies. Specifically, they expect that it is the GX6450 from the PowerVR Series 6XT. This is a narrow upgrade over the G6430 found in the Apple A7 processor, which is in line with the initial benchmarks that we saw (and not in line with the 50% GPU performance increase that Apple claims). For programmability, the GX6450 is equivalent to a DirectX 10-level feature set, unless it was extended by Apple, which I doubt.

apple-a8-dieshot-dailytech.png

Image Source: DailyTech

DailyTech has their own theory, suggesting that it is a GX6650 that is horizontally-aligned. From my observation, their "Cluster 2" and "Cluster 5" do not look identical at all to the other four, so I doubt their claims. I expect that they heard Apple's 50% claims, expected six GPU cores as the rumors originally indicated, and saw cores that were not there.

Which brings us back to the question: what is the 50% increase in performance that Apple claims? Unless there was a significant increase in clock rate, I still wonder if Apple is claiming that its increase in graphics performance will come from the Metal API, even though Metal is not exclusive to the new hardware.

But from everything we have seen so far, it is only a few percent better.

Introducing the all new Dynamic Super Resolution Duo!

Subject: General Tech | October 1, 2014 - 10:09 AM |
Tagged: nvidia, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr

Move over Super Best Friends, the Dynamic Super Resolution Duo is here to slay the evil Jaggies!  Ryan covered NVIDIA's new DSR in his review of the new Maxwell cards: it renders a game at a resolution much higher than your monitor's native 2560x1440 (or lower) resolution, then scales the image back down, a process similar to supersampling except that the downscale is done with a 13-tap Gaussian filter.  That distinction matters because straight supersampling would face some interesting challenges rendering 2560x1440 on a 1080p monitor.  DSR gives you a much wider choice of resolutions, as you can see in the Guild Wars 2 screenshot below, letting you apply a variety of multipliers to your display's native resolution for a much smoother-looking game.  The Tech Report has assembled a variety of screenshots from games at different DSR and AA settings, which you can examine with your own eyeballs to see what you think.
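To make those multipliers concrete, here is a small Python sketch of the arithmetic. The factor list mirrors the options typically exposed in the driver (an assumption based on screenshots like the one below, not NVIDIA's exact implementation): each DSR factor scales the total pixel count, so each axis scales by the square root of the factor.

```python
import math

def dsr_resolution(native_w, native_h, factor):
    """Render resolution for a DSR factor. The factor scales total pixel
    count, so width and height each scale by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

# Illustrative walk through the driver-style factors for a 1080p panel.
for f in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    w, h = dsr_resolution(1920, 1080, f)
    print(f"{f:.2f}x -> {w}x{h}")
```

Note how the 4.00x factor on a 1080p panel lands exactly on 3840x2160, which is why DSR is pitched as "4K quality on a 2K display."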

gw2-resolutions.jpg

"One of the more intriguing capabilities Nvidia introduced with the GeForce GTX 970 and 980 is a feature called Dynamic Super Resolution, or DSR, for short. Nvidia bills it as a means of getting 4K quality on a 2K display. How good is it? We take a look."

Here is some more Tech News from around the web:

Tech Talk

HP Releases Cheaaaaaap PCs and Windows Tablets

Subject: Systems, Mobile | September 30, 2014 - 01:15 AM |
Tagged: Windows 8.1, hp, cheap tablet, cheap computer

Before I get into the devices themselves: the $149 HP Stream 8 tablet and certain models of the HP Stream 13 laptop (the ones with an optional 4G modem) include "free 4G for life" for customers in the USA. Reading the fine print, HP apparently signed a deal with T-Mobile for 200MB/mo of 4G service. Of course, 200MB will barely cover the Windows Update regimen of certain months, but you have WiFi for that. Still, it is free, and free is good. My guess is that T-Mobile is crossing its fingers that dripping a drop of water on the tongues of the thirsty will convince them to visit the fountain.

hp-stream-laptop.jpg

If it works? Great. That is just about the most honest way that I have ever seen a telecom company attract new customers.

Back to these devices. Oh right, they're cheap. They are so cheap, they barely have any technical specifications. The $199.99 HP Stream 11 laptop has an 11-inch display. The $229.99 HP Stream 13 laptop has a 13-inch display and can be configured with an optional 4G modem. Both are passively cooled (more fanless PCs...) and run on a dual-core processor. Both provide a year of Office 365 Personal subscriptions. Both are available in blueish-purple or pinkish-purple.

The two tablets (7-inch Stream 7 and 8-inch Stream 8) are a similar story. They run an x86 processor with full Windows 8.1 and a year's subscription to Office 365. Somehow, the tablets are based on Intel quad-core CPUs (rather than the laptop's passively cooled dual-cores) despite being cheaper. Then again, they could be completely different architectures.

While HP is interested in, you know, selling product, I expect that Microsoft's generous licensing terms (see also the Toshiba alternative we reported on earlier) are an attempt to push its cloud services. Microsoft knows that cheaper device categories cannot bear royalties as high as a fully-featured laptop can, and that having no presence at those prices concedes them to Google, which really means giving up on cloud services for those customers. The simple solution? Don't forfeit those markets; monetize with your own cloud service instead. I doubt it will harm their higher-end devices.

The four devices (Stream 7 - $99, Stream 8 - $149, Stream 11 - $199, Stream 13 - $229) are coming soon.

Source: HP
Manufacturer: Seasonic

Introduction and Features

2-P-Banner.jpg

Today we have a double header for your reading enjoyment. Not one, but two Platinum Series power supplies from Seasonic. The two latest additions to Seasonic’s flagship product line are the Platinum 1050W and Platinum 1200W PSUs. The power supplies feature tight voltage regulation (±1~2%), quiet operation (fanless mode), and high efficiency (80Plus Platinum certified). Both PSUs are fully modular and come backed by a 7-year warranty.

Seasonic is a well known and highly respected OEM that produces some of the best PC power supplies on the market today. In addition to supplying power supplies to many big-name companies who re-brand the units with their own name, Seasonic also sells a full line of power supplies under the Seasonic name. Both new power supplies feature an improved Hybrid Fan control circuit and upgraded copper conduction bars on the main PCB, which together help increase efficiency and performance.

3-Front-cables.jpg

Seasonic Platinum 1050W & 1200W Special Features

Ultra Tight Voltage Regulation: Improved load voltage regulation keeps voltage fluctuations on the 12V output within +2% and -0% (no negative tolerance), and on the 3.3V and 5V outputs within ±1%, which (under 80 Plus load conditions) results in smooth and stable operation.
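For a sense of what those asymmetric tolerances mean in absolute volts, here is a purely illustrative Python sketch using the figures quoted above:

```python
def tolerance_band(nominal, plus_pct, minus_pct):
    """Allowed voltage window for a rail with asymmetric percentage limits."""
    low = nominal * (1 - minus_pct / 100)
    high = nominal * (1 + plus_pct / 100)
    return low, high

# Figures from the spec above: +2%/-0% on 12V, +/-1% on 5V and 3.3V.
rails = {
    "12V":  tolerance_band(12.0, 2, 0),   # never sags below 12.0 V
    "5V":   tolerance_band(5.0, 1, 1),
    "3.3V": tolerance_band(3.3, 1, 1),
}
for name, (low, high) in rails.items():
    print(f"{name}: {low:.3f} V to {high:.3f} V")
```

The "no negative tolerance" claim on the 12V rail is the notable part: the rail is allowed to ride up to 12.24 V under load, but never dip below nominal.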

Seasonic Hybrid Silent Fan Control: This industry-first, advanced three-stage thermal control balances silence against cooling. The Hybrid Silent Fan Control provides three operational stages: Fanless, Silent, and Cooling Mode.  In addition, a selector switch allows manual selection between Seasonic S2FC (fan control without Fanless Mode) and S3FC (fan control including Fanless Mode).

Reduced Cooling Fan Hysteresis: A new fan control IC optimizes how frequently the fan switches on and off. At 25°C ambient temperature the fan turns on when the load rises above 30% (±5%) and turns off when the load drops below 20% (±5%). Because of this lag in response, the fan switches on and off less frequently, which reduces power loss in Fanless and Silent Mode.
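That on/off band is classic hysteresis control. A minimal Python sketch (the 30%/20% thresholds are taken from the figures above; everything else is illustrative, not Seasonic's actual firmware) shows why the gap prevents the fan from chattering when the load hovers near a single threshold:

```python
def fan_state(load_pct, fan_on, on_threshold=30.0, off_threshold=20.0):
    """Hysteresis band: switch on above on_threshold, off below
    off_threshold; anywhere in between, keep the previous state."""
    if load_pct > on_threshold:
        return True
    if load_pct < off_threshold:
        return False
    return fan_on  # inside the band: no change, so no chatter

# Load ramps past 30%, then falls back into the 20-30% band: the fan
# stays on until the load actually drops below 20%.
state = False
for load in (10, 25, 32, 25, 28, 18):
    state = fan_state(load, state)
```

With a single threshold instead of a band, a load oscillating around that one value would toggle the fan on every sample; the 10-point gap absorbs that noise.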

Dual Copper Conduction Bars: Conduction bars on the power supply PCB help reduce impedance and minimize voltage drop, further improving efficiency and performance.

80Plus Platinum: The Platinum 1050W & 1200W power supplies are certified to the 80 PLUS organization's Platinum standard, offering performance and energy savings with efficiency of 92% or better and a true power factor greater than 0.9.
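To put those figures in perspective, here is a short Python sketch of the wall-side arithmetic. The 92% efficiency and 0.99 PF come from the spec sheet above; the full 1050 W load is a hypothetical worst case for illustration:

```python
def wall_draw(dc_output_w, efficiency):
    """Real AC power (W) drawn at the wall for a given DC load."""
    return dc_output_w / efficiency

def apparent_power(real_power_w, power_factor):
    """Apparent power (VA) the utility sees for a given real power."""
    return real_power_w / power_factor

# Full 1050 W DC load at 92% efficiency: about 1141 W at the wall,
# and at PF 0.99 that is roughly 1153 VA of apparent power.
ac_in = wall_draw(1050, 0.92)
va = apparent_power(ac_in, 0.99)
print(f"{ac_in:.0f} W at the wall, {va:.0f} VA apparent")
```

In other words, the roughly 91 W gap between DC output and wall draw is what the Platinum rating is shaving down versus lower-tier units.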

Full Modular Design (DC to DC) The Seasonic Platinum Series power supplies feature an integrated DC connector panel with onboard VRM (Voltage Regulator Module) that enables not only near perfect DC-to-DC conversion with reduction of current loss/impedance and increase of efficiency but also a fully modular DC cabling that enables maximum flexibility of integration and forward compatibility.

Seasonic Platinum Series 1050W & 1200W PSU Key Features:

•    80Plus Platinum certified super high efficiency
•    Fully Modular Cable design with flat ribbon-style cables
•    Seasonic DC Connector Panel with integrated VRMs
•    DC to DC Converter design
•    Hybrid Silent Fan Control (3 modes of operation: Fanless, Silent and Cooling)
•    High-quality Sanyo Denki San Ace dual ball bearing fan with PWM
•    Ultra-tight voltage regulation (+2% and -0% +12V rail)
•    Dual copper conduction bars on PCB for improved efficiency and performance
•    Supports multi-GPU technologies
•    Conductive polymer aluminum solid capacitors
•    High reliability 105°C Japanese made electrolytic capacitors
•    ErP Lot 6 2013 compliant and Intel Haswell processor ready
•    High current Gold plated terminals with Easy Swap connectors
•    Active PFC (0.99 PF typical) with Universal AC input
•    7-Year manufacturer's warranty worldwide

Please continue reading our Seasonic Platinum 1050W & 1200W power supply review!

Assassin's Creed: Rogue Might Be for the PC?

Subject: General Tech | September 28, 2014 - 11:59 PM |
Tagged: assassin's creed, pc gaming, ubisoft

Ubisoft's upcoming Assassin's Creed: Rogue is currently only announced for the Xbox 360 and PlayStation 3, but it might see a PC release, too. This is particularly weird because Rogue is scheduled to launch, for the two aforementioned consoles, on the same day as Assassin's Creed: Unity is scheduled for the Xbox One, PlayStation 4, and PC. Unless they are planning a delayed PC launch, PC gamers might receive two games, in the same franchise, on the same day.

ubisoft-assassinscreed-rogue.jpg

I would have to expect that its PC release would need to be staggered, though, right? I mean, how would they market two similar games at the same time? In particular, how would they market two fairly long, similar games at the same time? Thanks, Ubisoft, but it just seems like an unnecessary risk of market cannibalization.

About that evidence, though. PC Gamer apparently found a reference to the title at the Brazilian software ratings board, and the title was mentioned on one of Ubisoft's Uplay pages for the PC. Those are pretty good pieces of evidence, although we need to take their word on it, which means implicitly trusting screenshots from NeoGAF. Also, PC Gamer really needs to link to the exact thread at NeoGAF, because it was buried several pages deep by the time I got there.

Assassin's Creed: Unity launches on November 11th, 2014 (not the 14th). Rogue -- maybe?

Need an extreme cooling fan?

Subject: Cases and Cooling | September 29, 2014 - 12:48 PM |
Tagged: noctua, water

The colours of the Noctua Industrial PPC family of fans are familiar, but these 8 fans have some serious tricks up their sleeves.  While most fans focus on how much air they can move and how quietly they can manage that feat, these fans are also rated on how harsh the conditions can be while they maintain full functionality.  For instance, the IP67 fans will operate perfectly even when submerged in water to a depth of 1m, and provide complete protection against dust when used in air.  While it is unlikely your computer will function when submerged, there are builds which could take advantage of the ability to move water around.  Generally these fans are intended for cooling systems that reside in industrial plants and other harsh environments, but it is nice to know you can pick up fans specifically designed to operate in very dusty or wet conditions.  Check out the review at Modders Inc if the idea of a ruggedized cooling system appeals to you.

nff12dpwm.jpg

"The Noctua Industrial PPC line packaging is similarly Spartan like the new Noctua Redux line, packed without extras such as adapters, splitters and alternate mounting options but comes with a 4-piece screw mounting kit."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: Modders Inc

The Internet of Things is a confusing place for manufacturers right now

Subject: General Tech | September 30, 2014 - 10:11 AM |
Tagged: arm, internet of things, Si106x, 108x, Silicon Labs, Intel, quark

While the Internet of Things is growing at an incredible pace, the chip manufacturers competing for this new market segment are running into problems when designing chips to embed in appliances.  A balance needs to be found between processing power and energy consumption: the goal is very inexpensive chips which can run on microwatts of power but still incorporate networked communication and sensors.  The new Cortex-M7 is a 32-bit processor which competes directly with 8- and 16-bit microcontrollers that provide far fewer features but also consume far less power.  Does a smart light bulb really need a 32-bit chip, or will a lower cost MCU provide everything the light needs to function?  Intel's Quark is in a similar position; its processing power could be huge overkill compared to what an IoT product actually needs.  The Register makes a good observation in this article: perhaps pairing a Cortex-M0 with an M4 or M7, for when the application requires the extra horsepower, is a good way for ARM to go.  Meanwhile, Qualcomm's Snapdragon 600 has been adopted to run an OS controlling robots, so don't expect this market to get any less confusing in the near future.

cortex-mo.png

"The Internet of Things (IoT) is growing an estimated five times more quickly than the overall embedded processing market, so it's no wonder chip suppliers are flocking to fit out connected cars, home gateways, wearables and streetlights as quickly as they can."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register