Author:
Manufacturer: AMD

High Bandwidth Memory

UPDATE: I have embedded an excerpt from our PC Perspective Podcast that discusses the HBM technology that you might want to check out in addition to the story below.

The chances are good that if you have been reading PC Perspective or almost any other website that focuses on GPU technologies for the past year, you have read the acronym HBM. You might have even seen its full name: high bandwidth memory. HBM is a new technology that aims to turn the way a processor (GPU, CPU, APU, etc.) accesses memory upside down, almost literally. AMD has already publicly stated that its next generation flagship Radeon GPU will use HBM as part of its design, but it wasn't until today that we could talk about what HBM actually offers to a high performance processor like Fiji. At its core, HBM drastically changes how the memory interface works, how much power it requires, and what metrics we will use to compare competing memory architectures. AMD and its partners started working on HBM with the rest of the industry more than 7 years ago, and with the first retail product nearly ready to ship, it's time to learn about HBM.

We got some time with AMD’s Joe Macri, Corporate Vice President and Product CTO, to talk about AMD’s move to HBM and how it will shift the direction of AMD products going forward.

The first step in understanding HBM is to understand why it's needed in the first place. Current GPUs, including the AMD Radeon R9 290X and the NVIDIA GeForce GTX 980, utilize a memory technology known as GDDR5. This architecture has scaled well over the past several GPU generations, but we are starting to enter the world of diminishing returns. Balancing memory performance and power consumption is always a tough battle; just ask ARM about it. On the desktop component side we have much larger power envelopes to work inside, but the power curve that GDDR5 is on will hit a wall if you plot it far enough into the future. The result will be either graphics cards that consume drastically more power or stalled performance improvement in the graphics market, something we have not really seen in its history.

01-gddr5powergraph.jpg

While it's clearly possible that current and maybe even next-generation GPU designs could still depend on GDDR5 as the memory interface, a move to a different solution is needed for the future; AMD is just making the jump earlier than the rest of the industry.
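
To put rough numbers on the bandwidth side of that trade-off: peak memory bandwidth is simply bus width times data rate, and a quick sketch using widely published figures for two current GDDR5 cards and a four-stack first-generation HBM configuration shows why the new interface is attractive (treat this as illustrative arithmetic rather than anything from AMD's own materials).

```typescript
// Peak memory bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * data rate in Gbps
function peakBandwidthGBps(busWidthBits: number, dataRateGbps: number): number {
  return (busWidthBits / 8) * dataRateGbps;
}

console.log(peakBandwidthGBps(256, 7));      // GeForce GTX 980: 256-bit GDDR5 at 7 Gbps -> 224 GB/s
console.log(peakBandwidthGBps(512, 5));      // Radeon R9 290X:  512-bit GDDR5 at 5 Gbps -> 320 GB/s
console.log(peakBandwidthGBps(4 * 1024, 1)); // Four 1024-bit HBM stacks at 1 Gbps       -> 512 GB/s
```

The notable part, given the power curve discussed above, is that HBM reaches its figure with a much wider but much slower (and therefore lower-voltage) interface instead of pushing per-pin clocks ever higher.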

Continue reading our look at high bandwidth memory (HBM) architecture!!

ASUS Announces the ZenFone 2 Powered by Quad-Core Intel Atom

Subject: Mobile | May 18, 2015 - 02:00 PM |
Tagged: ZenFone 2, smartphones, intel atom, atom z3580, asus

The ZenFone 2 is the new flagship smartphone from ASUS, featuring a new design powered by an Intel Atom Z3580 (Moorefield) processor with a massive 4GB of RAM.

zenfone2_main.png

The phone has a 5.5-inch 1080p IPS display with 403 PPI for crisp scaling, and the “Ergonomic Arc” design includes a volume-control key on the rear of the phone “within easy reach of the user's index finger”, with a curved profile that tapers to 0.15 inches at the edges.

The phone also features a 13 MP PixelMaster camera with an f/2.0 aperture and claimed “zero shutter-lag”. The battery weighs in at 3000mAh and features “BoostMaster” fast-charge technology, which sounds similar to Qualcomm's Quick Charge 2.0 standard.

But one of the most attractive features will be price, as ASUS will be selling these online through their retail channels as affordable unlocked smartphones:

  • 2GB RAM / 16GB storage / Atom Z3560 - $199
  • 4GB RAM / 64GB storage / Atom Z3580 / QuickCharger - $299

Here's a look at the specifications:

  • CPU: Intel Quad-Core 64-bit Atom Z3580 @ 2.3GHz (Min Clock 333MHz, Max Clock 2333MHz)
  • GPU: PowerVR Series 6 G6430 with OpenGL 3.0 Support (Min Clock 457MHz, Max Clock 533MHz)
  • Display: 5.5in IPS, 1920x1080 resolution (403 PPI), Corning Gorilla Glass 3 with Anti-Fingerprint Coating
  • Memory: 4GB 800 MHz LPDDR3
  • Storage: 64GB eMMC
  • SIM: Supports dual active micro-SIM
  • Micro-SD slot: SDXC support up to 128GB
  • Modem: Intel XMM7260 LTE-Advanced
    • FDD LTE 1/2/3/4/5/7/8/9/17/18/19/20/28/29
    • TDD LTE 38/39/40/41
    • WCDMA 850/900/1900
    • TD-SCDMA 1900/2100
    • EDGE/GPRS/GSM 850/900/1800/1900
  • Wireless: WiFi 802.11a/b/g/n/ac
  • Rear Camera: 13MP, aperture f/2.0, sensor size 1/3.2 inch
  • Front Camera: 5MP
  • Maximum Video Resolution: 1080p/30
  • Battery: 3000 mAh Lithium-Polymer (11.4 Wh), Boostmaster Fast-Charging
  • Colors: Glacier Gray, Osmium Black, Glamour Red, Sheer Gold
  • Dimensions: 152.5 mm x 77.2 mm x 10.9-3.9 mm (6.0 x 3.04 x 0.43-0.15 inches)
  • Weight: 170g

PR after the break.

Source: ASUS

ASUS Announces QHD ZenBook UX305, 4K UX501 Notebook

Subject: Mobile | May 18, 2015 - 02:00 PM |
Tagged: zenbook pro, zenbook, UX501, UX305, QHD+, notebooks, ips, asus, 4k, 2560x1440

ASUS has announced a new QHD+ version of the affordable ZenBook UX305 notebook as well as the new ZenBook Pro UX501.

ux305_1.jpg

The ZenBook UX305 was released as a disruptive notebook with specs far above its $699 price tag, and this new version goes far beyond the 1920x1080 screen resolution of the original. The new QHD+ (3200x1800) panel is IPS just like the original, but at this ultra-high resolution it boasts 276 PPI for either incredibly sharp or incredibly tiny text, depending on how well your application scales.

The new ZenBook Pro UX501 takes resolution a step further with a 4K/UHD 3840x2160 IPS panel and a powerful quad-core Intel Core i7-4720HQ processor with 16GB of RAM at its disposal. NVIDIA GeForce GTX 960M graphics power this 15.6-inch, 282 PPI UHD panel, and naturally PCIe x4 storage is available as well.
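
For reference, both PPI figures quoted above fall straight out of resolution and panel size; a quick sketch (assuming the UX305's 13.3-inch diagonal):

```typescript
// Pixels per inch = diagonal resolution in pixels / diagonal size in inches
function ppi(width: number, height: number, diagonalInches: number): number {
  return Math.sqrt(width * width + height * height) / diagonalInches;
}

console.log(ppi(3200, 1800, 13.3).toFixed(0)); // ZenBook UX305 QHD+ panel    -> ~276 PPI
console.log(ppi(3840, 2160, 15.6).toFixed(0)); // ZenBook Pro UX501 UHD panel -> ~282 PPI
```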

ux501.png

More information and specs are available in the full PR for both notebooks after the break.

Source: ASUS
Subject: General Tech
Manufacturer: ASUS

Introduction and First Impressions

The ASUS ROG Gladius mouse features sleek styling and customizable lighting effects, but the biggest aspect is the underlying technology. With socketed Omron switches designed to be easily swapped and an adjustable 6400dpi optical sensor this gaming mouse offers a lot on paper. So how does it feel? Let's find out.

gladius_box.jpg

There are a few aspects to the way a mouse feels, including the shape, surface material, and overall weight. Beyond the physical properties there is the speed and accuracy of the sensor (which also affects hand movement) and of course the mouse buttons and scroll wheel. Really, there's a lot going on with a modern gaming mouse - a far cry from the "X-Y position indicator" that the inventors had nicknamed "mouse" in the 1960s.

One of the hallmarks of the ASUS ROG (Republic of Gamers) lineup is the sheer amount of additional features the products tend to have. I use an ROG motherboard in my personal system, and even my micro-ATX board is stuffed with additional functionality (and the box is loaded with accessories). So it came as no surprise to me when I opened the Gladius mouse and began to look it over. Sure, the box contents aren't as numerous as one of the Maximus motherboards, but there's still quite a bit more than I've encountered with a mouse before.

gladius_top.jpg

Continue reading our review of the ASUS ROG Gladius Gaming Mouse!!

NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

Subject: Graphics Cards | May 17, 2015 - 12:04 PM |
Tagged: The Witcher 3, nvidia, hairworks, gameworks, amd

I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing in the enthusiast market is happening and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.

Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and to quickly and easily implement them into games. The problem of course is that GameWorks is written and developed by NVIDIA. That means that optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA to optimize its software for AMD GPUs financially, though many in the community would like NVIDIA to give a better effort - for the good of said community.

gwlogo.jpg

Specifically in regards to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.

An example of The Witcher 3: Wild Hunt with HairWorks

One of the game's developers has been quoted as saying:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.

There are at least several interpretations of this statement floating around the web. First, and most inflammatory, is that NVIDIA is preventing CD Projekt Red from optimizing the feature by not offering source code. Another is that CD Projekt is choosing not to optimize for AMD hardware due to time considerations. The last is that it simply isn't possible to optimize it because of hardware limitations with HairWorks.

I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:

We are not asking game developers do anything unethical.
 
GameWorks improves the visual quality of games running on GeForce for our customers.  It does not impair performance on competing hardware.
 
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
 
GameWorks licenses follow standard industry practice.  GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license. 
 
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
 
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
 
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.  

Interesting comments for sure. The essential takeaway from this is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture performs better than AMD's GCN at tessellation, often by a significant amount. NVIDIA developed its middleware to utilize the strength of its own GPU technology and, while it's clear that some disagree, not to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs more gracefully fall back? Maybe.

Next, I asked Burke directly if claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Projekt or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke states that in these instances NVIDIA often sends engineers to work with game developers and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
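
To make that developer-side option concrete, here is a minimal sketch of the kind of per-GPU tessellation fallback Burke describes. Every identifier in it is hypothetical; this is not code from CD Projekt's engine or from GameWorks, just an illustration of the idea.

```typescript
// Hypothetical illustration only: choose a hair tessellation density based on the detected GPU,
// so hardware with weaker tessellation throughput takes a smaller hit from the effect.
type GpuVendor = "nvidia" | "amd" | "other";

function hairTessellationFactor(vendor: GpuVendor, userDisabledHairWorks: boolean): number {
  if (userDisabledHairWorks) {
    return 0; // effect off entirely, as The Witcher 3 allows in its settings
  }
  switch (vendor) {
    case "nvidia":
      return 64; // full density where tessellation throughput is strongest
    case "amd":
      return 16; // reduced density to soften the performance penalty
    default:
      return 8;  // conservative default for unknown hardware
  }
}
```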

It would also be possible for AMD to have pushed for the implementation of TressFX in addition to HairWorks; a similar scenario played out in Grand Theft Auto V where several vendor-specific technologies were included from both NVIDIA and AMD, customized through in-game settings. 

NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well - they are not oblivious. CD Projekt knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. They were willing to accept performance penalties for some users to improve the experience for others. You can argue that is not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.

In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of work is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that style, vote with your wallet.

Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than on comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that it was a lack of outreach from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.

wicher3-1.jpg

All arguing aside, this game looks amazing. Can we all agree on that?

The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right - and you aren't necessarily wrong. There will be plenty of readers that see the comments made by NVIDIA above and continue to believe that the company is being at best disingenuous and at worst straight-up lying. As I mentioned above in my own comments, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.

Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be those individuals that immediately side with one company or the other. But it's the much larger group in the middle, that shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.

So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!

NVIDIA SHIELD and SHIELD Pro Show up on Amazon

Subject: Mobile | May 16, 2015 - 01:00 PM |
Tagged: Tegra X1, tegra, shield pro, shield console, shield, nvidia

UPDATE: Whoops! It appears that Amazon took the listing down... No surprise there. I'm sure we'll be seeing them again VERY SOON. :)

Looks like the release of the new NVIDIA SHIELD console device, first revealed back at GDC in March, is nearly here. A listing for "NVIDIA SHIELD" as well as the new "NVIDIA SHIELD Pro" showed up on Amazon.com today.

shield1.jpg

Though we don't know officially what the difference between the SHIELD and SHIELD Pro is, according to Amazon at least, the difference appears to be internal storage. The Pro model will ship with 500GB of internal storage, while the non-Pro model will only have 16GB. You'll have to get an SD card for more storage on the base model if you plan on doing anything other than streaming games through NVIDIA GRID, it seems.

shield2.jpg

No pricing is listed yet and there is no release date on the Amazon pages either, but we have always been told this was to be a May or June on-sale date. Both models of the NVIDIA SHIELD will include an HDMI cable, a micro-USB cable and a SHIELD Controller. If you want the remote or stand, you're going to have to pay out a bit more.

shield3.jpg

For those of you that missed out on the original SHIELD announcement from March, here is a quick table detailing the specs, as we knew them at that time. NVIDIA's own Tegra X1 SoC featuring 256 Maxwell GPU cores powers this device using the Android TV operating system, promising 4K video playback, the best performing Android gaming experience and NVIDIA GRID streaming games.

  NVIDIA SHIELD Specifications

  • Processor: NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM
  • Video Features: 4K Ultra-HD ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
  • Audio: 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB
  • Storage: 16 GB
  • Wireless: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.1/BLE
  • Interfaces: Gigabit Ethernet, HDMI 2.0, two USB 3.0 (Type A), Micro-USB 2.0, microSD slot (supports 128GB cards), IR receiver (compatible with Logitech Harmony)
  • Gaming Features: NVIDIA GRID™ streaming service, NVIDIA GameStream™
  • SW Updates: SHIELD software upgrades directly from NVIDIA
  • Power: 40W power adapter
  • Weight and Size: 23oz / 654g; height 5.1in / 130mm, width 8.3in / 210mm, depth 1.0in / 25mm
  • OS: Android TV™, Google Cast™ Ready
  • Bundled Apps: PLEX
  • In the box: NVIDIA SHIELD, NVIDIA SHIELD controller, HDMI cable (High Speed), USB cable (Micro-USB to USB), power adapter (includes plugs for North America, Europe, UK)
  • Requirements: TV with HDMI input, Internet access
  • Options: SHIELD controller, SHIELD remote, SHIELD stand

 

Source: Amazon.com

Raptr's Top PC Games of April 2015

Subject: General Tech | May 15, 2015 - 11:56 PM |
Tagged: raptr, pc gaming

The PC gaming utility Raptr is normally used to optimize in-game settings, chat and socialize, and record game footage. It also keeps track of game-hours and aggregates them into a list once per month, which makes it one of the few sources for this type of data on the PC. We were late covering it last month, which means that another list was posted just a week later.

caas-most_played_april-2015.jpg

April marks the release of Grand Theft Auto V for the PC. It went live on the 14th and, despite only counting for half of the month, ended up in 4th place, just 0.17% of global play time behind CS:GO. Next month's survey will tell us whether the post-release drop-off is offset by counting the game for a full month, double the window it had this time. Despite an error on the graph, it knocked DOTA 2 down to fifth and Diablo III down to sixth. In fact, just about everything below Grand Theft Auto V dropped at least one rank.

Only three games actually gained places this month: ArcheAge, Warframe, and Spider Solitaire. Yes, that game is now the 19th most played, as tracked by Raptr. You could sort-of say that Hearthstone gained a rank by not losing one, but you would be wrong. Why would you say that?

League of Legends dropped less than a percent of total play time, settling in at about 21%. This is just about on target for the game, which proves that not even Rockstar can keep people from having a Riot.

Source: Raptr

NVIDIA Releases 352.84 WHQL Drivers for Windows 10

Subject: Graphics Cards | May 15, 2015 - 11:36 PM |
Tagged: windows 10, geforce, graphics drivers, nvidia, whql

The last time NVIDIA released a graphics driver for Windows 10, the company added a download category to its website for the pre-release operating system. Since about January, graphics driver updates have been pushed through Windows Update, and before that you needed to use Windows 8.1 drivers. Receiving drivers from Windows Update also meant that add-ons, such as the PhysX runtimes and GeForce Experience, would not be bundled with them. I know that some have installed them separately, but I didn't.

nvidia-geforce.png

The 352.84 release, which is their second Windows 10 driver to be released outside of Windows Update, is also certified by WHQL. NVIDIA has recently been touting Microsoft certification for many of their drivers. Historically, they released a large number of Beta drivers that were stable, but did not wait for Microsoft to vouch for them. For one reason or another, they have put a higher priority on that label, even for “Game Ready” drivers that launch alongside a popular title.

For some reason, the driver is only available via GeForce Experience and NVIDIA.com, but not GeForce.com. I assume NVIDIA will publish it there soon, too.

Source: NVIDIA

Oculus Rift "Full Rift Experience" Specifications Released

Subject: Graphics Cards, Processors, Displays, Systems | May 15, 2015 - 03:02 PM |
Tagged: Oculus, oculus vr, nvidia, amd, geforce, radeon, Intel, core i5

Today, Oculus published a list of the hardware it believes should drive its VR headset. The Oculus Rift will obviously run on lesser hardware; the minimum specifications, published last month and focused on the Development Kit 2, did not even list a specific CPU or GPU -- just a DVI-D or HDMI output. They then went on to say that you really should use a graphics card that can handle your game at 1080p with at least 75 fps.

oculus-dk2-product.jpg

The current list is a little different:

  • NVIDIA GeForce GTX 970 / AMD Radeon R9 290 (or higher)
  • Intel Core i5-4590 (or higher)
  • 8GB RAM (or higher)
  • A compatible HDMI 1.3 output
  • 2x USB 3.0 ports
  • Windows 7 SP1 (or newer).

I am guessing that, unlike the previous list, Oculus now has a clearer vision for a development target. The company was a little unclear about whether this refers to the consumer version or the current needs of developers. In either case, it would likely serve as a guide for what Oculus believes developers should target when the consumer version launches.

This post also coincides with the release of the Oculus PC SDK 0.6.0. This version pushes distortion rendering to the Oculus Server process, rather than the application. It also allows multiple canvases to be sent to the SDK, which means developers can render text and other noticeable content at full resolution, but scale back in places that the user is less likely to notice. They can also be updated at different frequencies, such as sleeping the HUD redraw unless a value changes.

The Oculus PC SDK (0.6.0) is now available at the Oculus Developer Center.

Source: Oculus

Budget headphones that are just good enough, Ozone Rage ST

Subject: General Tech | May 15, 2015 - 02:01 PM |
Tagged: audio, ozone, Rage ST, gaming headset

With a price tag of $40, many may be a bit leery of purchasing the Ozone Rage ST Headset, as it is significantly lower in price than most gaming headsets, which implies lower quality too. It does use the 40mm drivers common in most headsets, with a response range of 20Hz-20kHz, but the microphone is omnidirectional as opposed to unidirectional, which means you will transmit background noise along with your voice. Modders-Inc tried it out and was pleasantly surprised; while it has none of the extra features that $100+ headsets do, the overall quality was worth the price of admission. If you are in need of a headset but strapped for cash, these are a good choice.

RAGEST_colors.jpg

"Despite the stereotype, gamers are social creatures too. Competitive games after all requires another person to play with, but as expressive as some gestures may be such as virtual teabagging, it is not nearly as effective in conveying what you really feel when you shout out expletives through a headset. It feels very natural in fact that one almost feels …"

Here is some more Tech News from around the web:

Audio Corner

Source: Modders Inc

What Makes a Mobile GPU Tick? Interview with ARM's Jem Davies

Subject: Mobile | May 15, 2015 - 01:56 PM |
Tagged: video, mali, jem davies, interview, arm

Have you ever wondered how a mobile GPU is born? Or how the architecture of a mobile GPU like ARM Mali differs from the technology in your discrete PC graphics card? Perhaps you just want to know if ideas like HBM (high bandwidth memory) are going to find their way into the mobile ecosystem any time soon?

armicon.jpg

Josh and I sat down (virtually) with ARM's VP of Technology and Fellow, Jem Davies, to answer these questions and quite a bit more. The resulting interview sheds light on the design process of a mobile GPU, how you get the most out of an SoC that measures power by the milliwatt, and what the world of mobile benchmarking needs to do to clean up its act.

You'd be hard pressed to find a better way to spend the next hour of your day as you will without a doubt walk away more informed about the world of smartphones, tablets and GPUs.

Another juicy rumour; Lenovo wants MSI's gaming laptops?

Subject: General Tech | May 15, 2015 - 01:02 PM |
Tagged: rumours, msi, Lenovo

When you think of Lenovo laptops you tend to think of suits and office suites, not Cheetos and Red Bull, but DigiTimes has heard tell that this could change. With Acer, Asustek's ROG, and Dell's Alienware lineups all seeing decent profits from the niche market of high-end gaming laptops, the rumour is that Lenovo would like in on some of that filthy lucre. DigiTimes' source posits that MSI's gaming laptop subdivision would be the obvious target for Lenovo. It is possible that this is all hot air, but Lenovo is a huge company and could easily afford to buy a division of a competitor, if MSI were willing to sell.

336446-lenovo-u430-touch.jpg

"Micro-Star International (MSI) has been successful in selling gaming notebooks and Lenovo is interested in acquiring MSI's gaming notebook business unit, according to sources from supply chain makers. However, MSI has denied the reports."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes
Subject: Displays
Manufacturer: Acer

Introduction and Specifications

Displays have been a hot item as of late here at PC Perspective. Today we are looking at the new Acer XB270HU. In short, this is an IPS version of the ASUS ROG Swift. For the long version, it is a 1440P, 144Hz, G-Sync enabled 27 inch display. This is the first G-Sync display released with an IPS panel, which is what makes this release such a big deal. Acer has been pushing hard on the display front, with recent releases of the following variable refresh capable displays:

  • XB270H 27in 1080P 144Hz G-Sync
  • XB280HK 28in 4K 60Hz G-Sync
  • XG270HU 27in 1440P 40-144Hz FreeSync
  • XB270HU 27in 1440P 144Hz G-Sync < you are here

The last entry in that list is the subject of today's review, and it should look familiar to those who have been tracking Acer's previous G-Sync display releases:

DSC01299.JPG

Here's our video overview of this new display. I encourage you to flip through the review as well, as there are more comparison pictures and information to go along with it.

Continue reading our review of the Acer XB270HU 1440P 144Hz IPS G-Sync Monitor!!

Corsair's 128GB DDR4 Unbuffered Memory Kits - for the rich and famous

Subject: Memory | May 14, 2015 - 07:16 PM |
Tagged: Vengeance LPX, Dominator Platinum, ddr4, corsair, 128Gb

ram.PNG

Corsair has just released the three largest unbuffered DDR4 kits available, for enthusiasts who can afford the asking price. Two 128GB Dominator Platinum kits, one clocked at 2400MHz and one at 2666MHz, along with a 2400MHz Vengeance LPX kit, have just gone on sale. All three kits consist of eight 16GB modules, which means the number of motherboards that support them is extremely limited; the EVGA X99 Classified, ASRock's X99 Extreme4 and the ASUS X99-E WS are among the few. As you can see below, the investment is rather high, but if you want bragging rights, or an amazingly large RAM drive, Corsair has a solution for you.

128gb2.PNG

 

Corsair, a worldwide leader in high-performance PC components, today announced the availability of the world’s first available 128GB DDR4 unbuffered memory kits. Available in Corsair’s Vengeance LPX and Dominator Platinum Series lines, the new 128GB capacities give content creators an unprecedented amount of high-speed DDR4 SDRAM for memory-hungry applications.

The 128GB (8 x 16GB) DDR4 memory kits are designed for the latest Intel X99 series motherboards and support XMP 2.0 for the ultimate compatibility, reliability, and performance. The first available kits are rated at speeds of 2666MHz and 2400MHz and higher speeds will be announced soon. Like all Corsair memory, the new kits are backed by a lifetime warranty.

Dominator Platinum Series 128GB DDR4 Memory
The most advanced memory kits available, the Dominator Platinum series DDR4 modules feature a striking industrial design for good looks, patented DHX technology for cooler operation, and user-swappable colored “light pipes” for customizable LED lighting. Dominator Platinum memory is built with hand-screened ICs, undergoes rigorous performance testing, and incorporates patented DHX cooling technology for reliable performance in demanding environments.

Vengeance LPX Series 128GB DDR4 Memory
Vengeance LPX memory is designed for high-performance overclocking with aluminum heatspreaders for faster heat dissipation and eight-layer PCB for superior overclocking headroom. Each IC is individually screened for performance potential.

Pricing and Lifetime Warranty
Corsair Dominator Platinum and Vengeance LPX DDR4 memory kits are available from Corsair.com and Corsair’s worldwide network of authorized distributors and resellers. All Corsair memory is backed with a limited lifetime warranty and Corsair customer service and technical support.

 

Source: Corsair

Podcast #349 - Death of Media Center, i7 NUC, Fractal Define S and more!

Subject: General Tech | May 14, 2015 - 02:46 PM |
Tagged: podcast, video, supermicro, X99, Intel, amd, corsair, H100i GTX, H80i GT, fractal, define s, akracing, nvidia, shield, grid, epson, xeon e7 v3

PC Perspective Podcast #349 - 05/14/2015

Join us this week as we discuss the Death of Media Center, i7 NUC, Fractal Define S and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Revisiting the Corsair AX1500i

Subject: Cases and Cooling | May 14, 2015 - 01:27 PM |
Tagged: Digital Power Supply, Corsair Link, AX1500i, 80 Plus Titanium

It has been almost a year since Lee reviewed the Corsair AX1500i, one of the best high-wattage PSUs it has been his pleasure to test, so it seems appropriate to remind you of its quality with this review from [H]ard|OCP. While the PSU is not new and still costs more than the competition at $400, the 7-year warranty is better than most. The review covers two Corsair AX1500i PSUs, one provided by Corsair and one purchased from a retail outlet, for reasons you can read about in the full review. In the end [H] gave this PSU a pass; they felt that the unchanged price was a strike against Corsair, as were the claims of the marketing team, but as far as performance goes this PSU provided solid power even in their torture tests.

1428882118AOldLKbooU_2_11_l.jpg

"It is not every day that a company has the moxie to come out and say that it makes "the best" of anything, but that is exactly what Corsair is saying about its AX1500i computer power supply; "The best enthusiast power supply you can own." Of course that begs one question, "Is it, or isn't it the best enthusiast PSU you can own?" We answer that."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: [H]ard|OCP

You can teach a Wolfram Alpha new tricks

Subject: General Tech | May 14, 2015 - 12:35 PM |
Tagged: wolfram alpha, stephen wolfram, image identify

You may have had the pleasure of using the Google Goggles image identification app, not so much for its successes as for its often hilarious misses. There is now a new image identification app from Wolfram Alpha which you can try out. The Register immediately tried a random picture of Stephen Wolfram, who is apparently a podium in disguise, but Image Identify seems very fond of capybaras. Head on over to amuse yourself, and of course only use your pictures for proper training, as we wouldn't want to reclassify the podium as Daddy, now would we?

index.jpg

"search your heart ..."

"WOLFRAM ALPHA has released a new site designed to help you identify any image that you throw at it."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Rumor: AMD Radeon R9 300-series Release Dates

Subject: Graphics Cards | May 14, 2015 - 07:00 AM |
Tagged: tonga, radeon, R9, pitcairn, Fiji, bonaire, amd

Benchlife.info, via WCCFTech, believes that AMD's Radeon R9 300-series GPUs will launch in late June. Specifically, the R9 380, the R7 370, and the R7 360 will arrive on the 18th of June. These are listed as OEM parts, as we have mentioned on the podcast, which Ryan speculates could mean that the flagship Fiji XT might go by a different name. Benchlife.info seems to think that it will be called the R9 390(X), though, and that it will be released on the 24th of June.

WCCFTech is a bit more timid, calling it simply “Fiji XT”.

amd-gaming-evolved.png

In relation to industry events, this has the OEM lineup launching on the last day of E3 and Fiji XT launching in the middle of the following week. This feels a little weird, especially because AMD's E3 event with PC Gamer is on the 16th. While it makes sense for AMD to announce the launch a few days before it happens, that doesn't make sense for OEM parts unless they were going to announce a line of pre-built PCs. The most likely candidate to launch gaming PCs is Valve, and they're one of the few companies that are absent from AMD's event.

And this is where I run out of ideas. Launching a line of OEM parts at E3 is weird unless it was to open the flood gates for OEMs to make their own announcements. Unless Valve is scheduled to make an announcement earlier in the day, or a surprise appearance at the event, that seems unlikely. Something seems up, though.

Source: WCCFTech
Author:
Subject: General Tech
Manufacturer: AKracing

Or: How to fall asleep at work.

I will be the first to admit that just a couple of weeks ago I had zero need for a gaming chair. But when 4GamerGear.com offered to send us one of the AKracing AK-6014 ergonomic executive units, I agreed. I went out of town for a week and returned to find the chair assembled and already in use by Ken, our go-to video editor and engineer. Of course I had to steal it from him to take part in the "review" process, and the results were great! As I sit here now in my Ikea office chair, writing up this post, while Ken sits just feet away tapping away on some edits, I can't help but want to use my authority to take it back.

akracing1.jpg

The price is steep, but the added comfort you get from a chair like the AKracing model we tested is substantial. Every part of the design, based on a racing seat for a car, is built to keep you in place. But instead of preventing the lateral movements caused by taking corners, this one is more about keeping your butt in place and your back straight to encourage good posture. The arm rests are height adjustable (as is the seat itself, of course) and the back reclines for different desk and resting positions. You can lay it PAST flat for naps if you're into that kind of thing.

akracin2.jpg

You can find these chairs for sale on Amazon in different color combinations with a current price of $349. It's expensive, I can't deny that. But it looks and feels way cooler than what you are sitting in right now. And aren't you worth it?

Firefox 38 Launches with (and also without) DRM Support

Subject: General Tech | May 13, 2015 - 05:06 PM |
Tagged: mozilla, firefox, DRM

Mozilla has just released Firefox 38. With it comes the controversial Adobe Primetime DRM implementation through the W3C's Encrypted Media Extensions (EME). Or, maybe not. If you upgrade the browser through one of the default channels, the Adobe Primetime Content Decryption Module will appear in the Plugins tab of your Add-ons manager on Windows Vista or later (but it might take a few minutes after the upgrade).
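
If you are curious whether your particular build exposes a CDM at all, the EME entry point can be probed from script. A minimal sketch follows; the "com.adobe.primetime" key-system string and the capability details are assumptions for illustration rather than something taken from Mozilla's release notes.

```typescript
// Ask the browser whether a given EME key system (i.e. a CDM) is available.
async function hasKeySystem(keySystem: string): Promise<boolean> {
  const config: MediaKeySystemConfiguration[] = [{
    initDataTypes: ["cenc"], // assumed init data type for illustration
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  }];
  try {
    await navigator.requestMediaKeySystemAccess(keySystem, config);
    return true;
  } catch {
    return false; // no matching CDM, or the user/browser has it disabled
  }
}

hasKeySystem("com.adobe.primetime").then(found => console.log("Primetime CDM available:", found));
```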

Mozilla_Firefox_logo_2013.png

Alternatively, you can use Mozilla's EME-free installer for Firefox and avoid it altogether.

I have mentioned my concerns about DRM in the past. EME does not particularly bother me, because it is just a plugin architecture, but the fundamental concept does. Simply put, copy protection does very little good and a whole lot of bad. If your movie is leaked before it is legally available in consumers' hands, as regularly happens, then what do you expect to accomplish after the fact? It takes one instance to be copied infinitely, and that often comes from the film company's own supply chain, not their customers. Moreover, it has been found to reduce sales and hurt customer experience (above and beyond the valid ideological concerns).

Beyond the DRM inclusion, several new features were added. One of the more interesting ones is the BroadcastChannel API. This standard allows a web application to share data between “contexts” that have the same “user agent and origin”. In other words, it must be on the same browser and using the same app (even secondary instances of it). This will allow sites to do multi-monitor split screen, which is useful for games and utilities.
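
As a rough sketch of how a site might use it (the channel name and message shape here are just examples):

```typescript
// In one window or tab of the app, broadcast a state change...
const sender = new BroadcastChannel("game-hud"); // example channel name
sender.postMessage({ type: "score", value: 1200 });

// ...and in any other same-origin context of the same app, pick it up.
const receiver = new BroadcastChannel("game-hud");
receiver.onmessage = (event: MessageEvent) => {
  console.log("HUD update from another window:", event.data);
};
```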

WebRTC has also been upgraded with multistream and renegotiation. Even though the general public thinks of WebRTC as a webcam and voice chat standard, it actually allows arbitrary data channels. For example, “BananaBread” is a first person shooter that used WebRTC to synchronize multiplayer state. Character and projectile position is very much not webcam or audio data, but WebRTC doesn't care.
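
A minimal sketch of that arbitrary-data use; the signaling needed to actually connect two peers is omitted, and the channel label is just an example:

```typescript
// WebRTC data channels carry arbitrary application data, not just webcam or audio streams.
const pc = new RTCPeerConnection();
const stateChannel = pc.createDataChannel("game-state"); // example label

stateChannel.onopen = () => {
  // e.g. push a player position update each tick
  stateChannel.send(JSON.stringify({ player: 1, x: 12.5, y: 3.0 }));
};

stateChannel.onmessage = (event: MessageEvent) => {
  // the remote peer's updates arrive as ordinary messages
  console.log("remote state:", JSON.parse(event.data));
};
```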

Firefox 38 launched on May 12th with an optional, DRM-incompatible build.

Source: Mozilla