
Tech Report's SSD Endurance Test Is Down to Two

Subject: General Tech, Storage | September 21, 2014 - 05:41 PM |
Tagged: ssd, Samsung, kingston hyper x, kingston, endurance, corsair neutron gtx, corsair, 840 pro

The Tech Report has spent the last year and a bit torturing SSDs with writes until they drop, and many drives have died along the way. Three of the six drives kicked the bucket before a full petabyte of data was written. The test is now at 1500TB of total writes, and one of the three survivors, the 240GB Corsair Neutron GTX, has dropped out. This was a bit surprising, as the Neutron was reporting fairly good health when it entered "the petabyte club", aside from a dip in read speeds.

The two remaining drives are the Samsung 840 Pro (256GB) and the Kingston HyperX 3K (240GB).

techreport-deadssds.jpg

Two stand, one fell (Image Credit: Tech Report)

Between those two, the Samsung 840 Pro gets the nod, as the Kingston drive has already lived through uncorrectable errors while the Samsung has yet to report any true errors (only reallocated sectors). Since the test only counts a complete drive failure as a failure, though, the lashings will persist until the final drive gives out (or until Scott Wasson gives up in a glorious sledgehammer apocalypse -- could you imagine if one of them lasted a decade? :3).

Of course, with just one unit of each model, it is difficult to faithfully compare brands with this marathon. Each drive lasted a ridiculously long time, with even the worst of the bunch putting up with a whole 2800 full-drive writes, but it would not be fair to determine an average lifespan for a given model from a single data point. It is fair to say that your SSD probably will not die from a defrag run -- but defragmenting an SSD is still a complete waste of your time and you should never do it.
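For context, "full-drive writes" is just total data written divided by drive capacity. A quick sketch of the conversion (Python, with illustrative numbers rather than figures pulled from the test):

```python
def full_drive_writes(total_written_tb: float, capacity_gb: float) -> float:
    """Convert a total volume of writes into equivalent full-drive writes."""
    return (total_written_tb * 1000) / capacity_gb  # treating 1TB as 1000GB

# Illustrative only: a 240GB drive that absorbed roughly 672TB of writes
# works out to about 2800 full-drive writes, the figure quoted above.
print(round(full_drive_writes(672, 240)))  # -> 2800
```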

Source: Tech Report

Oculus Announces Crescent Bay (Prototype)

Subject: General Tech, Displays | September 20, 2014 - 10:55 PM |
Tagged: Oculus, VR, crescent bay, oculus connect

As they progress toward a consumer product, Oculus announced another prototype at their Oculus Connect developer conference. Dubbed Crescent Bay, the headset contains a new display, with a higher refresh rate and higher resolution, better optics, and 360-degree head tracking. It is also lighter and includes built-in speakers.

oculus-crescent-bay-prototype.png

Of course, these features were not quantified with hard specifications.

Brendan Iribe, CEO of Oculus, stressed that this is not the consumer product yet. He claims that Crescent Bay is as big an improvement over DK2 as DK2 was over the original Oculus Rift. It is not all about hardware, though. This company is engaged in hardware and software, video and audio. This should make sense considering their early hiring of John Carmack and hundreds of other engineers. They, rightly, see themselves as a platform and, while they see game engines as essential for VR (the camera can be repositioned with milliseconds of notice, something film can never do), they are not limiting themselves to just "games" (though they consider games a big part of it).

oculus-crescent-bay-prototype2.png

Honestly, months ago, I was sitting at my desk with its five monitors, each with bits of news posts, chats, reference material, and maybe a StarCraft tournament live stream, and Oculus was being discussed. I started to wonder if monitors, especially multiple displays, are just an approximation -- our current best effort -- of how to receive video cues from a PC. I could see a VR platform take on entertainment and even productivity with its infinite, virtual environments.

Currently, there is not even a hint about pricing and availability (as far as I found).

Source: Oculus

Want Haswell-EP Xeons Without Expensive DDR4 Memory?

Subject: General Tech, Motherboards, Processors | September 20, 2014 - 03:51 PM |
Tagged: xeon, Haswell-EP, ddr4, ddr3, Intel

Well this is interesting and, while not new, is news to me.

ram.jpg

The upper-tier Haswell processors ushered DDR4 into enthusiast desktops and servers, but DDR4 DIMMs are quite expensive and incompatible with the DDR3 sticks that your organization might have been stockpiling. Despite the memory controller residing on the processor, ASRock has a few motherboards which claim DDR3 support. ASRock, responding to Anandtech's inquiry, confirmed that this is not an error: Intel will launch three SKUs, one eight-core, one ten-core, and one twelve-core, with a DDR3-capable memory controller.

The three models are:

                  E5-2629 v3    E5-2649 v3    E5-2669 v3
Cores (Threads)   8 (16)        10 (20)       12 (24)
Clock Rate        2.4 GHz       2.3 GHz       2.3 GHz
L3 Cache          20MB          25MB          30MB
TDP               85W           105W          120W

The processors themselves might not be cheap or easily attainable, though. There are rumors that Intel will require customers to purchase a minimum quantity, so it might not be worth pursuing these processors unless you run a significant server farm (or are in a similar situation).

Source: Anandtech

ARChon Brings App Runtime for Chrome Outside ChromeOS

Subject: General Tech | September 20, 2014 - 11:33 AM |
Tagged: chrome os, chrome, google, Android

Last week, we reported on Google's App Runtime for Chrome (ARC) beta release. Its goal is to bring apps from the Google Play Store to ChromeOS through an Android stack built atop Native Client. They are sandboxed, but still hardware-dependent for performance. Since then, vladikoff on GitHub has published ARChon, a project which brings that initiative to desktop OSes.

archon-project.jpg

Image Credit: ARChon Project

To use ARChon, you will need an x86-64 build of Chrome 37 (or later) on Windows, Mac, or Linux. The project is not limited to the handful of ARC-compatible apps that Google officially supports. Android apps need to be converted into Chrome extensions using a companion tool, also available, called chromeos-apk. In fact, the example app is an open source version of the game 2048, rather than one of the four launch apps from Google.

Whether Google intends to offer this, officially, with their Chrome browser is the most interesting part for me. I would prefer that everything just works everywhere but, failing that, having a supported Android platform on the desktop without dual-booting or otherwise displacing the host itself could be interesting. And yes, Bluestacks exists, but it has not been something that I would recommend, at least in my experience of it.

Source: ARChon

NVIDIA GeForce 344.11 Driver and GeForce Experience 2.1.2 Released Alongside Maxwell-based GTX 980 and GTX 970

Subject: General Tech, Graphics Cards | September 20, 2014 - 09:56 AM |
Tagged: nvidia, maxwell, graphics drivers, geforce experience

Update: There is also a 344.16 driver for the GTX 970 and GTX 980, resolving an issue specific to those cards.

When they release a new graphics card, especially one based on a new architecture, NVIDIA will have software ready to support it. First and most obvious, Maxwell comes with the GeForce 344.11 drivers, which is the first driver release to support only Fermi and later GPUs. Mostly, the driver's purpose is supporting the new graphics cards and adding optimizations for Borderlands: The Pre-Sequel, The Evil Within, F1 2014, and Alien: Isolation. It also enables multi-monitor G-Sync, which was previously impossible, even with three single-DisplayPort Kepler cards.

nvidia-geforce.png

At the same time, NVIDIA launched a new GeForce Experience with more exciting features. First, and I feel least expected, it allows the SHIELD Wireless Controller to be connected to a PC, although only wired, via its provided USB cable. Because this support comes through GeForce Experience, it also means that you cannot use the controller without a GeForce graphics card.

If you have a GeForce GTX 900-series add-in board, you will be able to use Dynamic Super Resolution (DSR) and record 4K video with ShadowPlay. Performance when recording on a PC in SLI mode has also been improved, apparently even for Kepler-based cards.

Both the drivers and GeForce Experience are available now.

Source: NVIDIA

Developer's View on DirectX 12 Alongside Maxwell Launch

Subject: General Tech, Graphics Cards | September 20, 2014 - 09:06 AM |
Tagged: unreal engine 4, nvidia, microsoft, maxwell, DirectX 12, DirectX

Microsoft and NVIDIA have decided to release some information about DirectX 12 (and DirectX 11.3) alongside the launch of the Maxwell-based GeForce GTX 980 and GeForce GTX 970 graphics cards. Mostly, they announced that Microsoft has teamed up with Epic Games to bring DirectX 12 to Unreal Engine 4. They currently have two demos, Elemental and Infiltrator, up and running on DirectX 12.

epic-ue4-infiltrator.jpg

Moreover, they have provided a form for developers who are interested in "early access" to apply for it. They continually discuss it in terms of Unreal Engine 4, but they do not explicitly say that other developers cannot apply. UE4 subscribers will get access to the Elemental demo in DX12, but it does not look like Infiltrator will be available.

DirectX 12 is expected to target games for Holiday 2015.

Source: Microsoft
Subject: Editorial, Storage
Manufacturer: PC Perspective
Tagged: tlc, Samsung, bug, 840 evo, 840

Investigating the issue

Over the past week or two, there have been growing rumblings from owners of Samsung 840 and 840 EVO SSDs. A few reports scattered across internet forums gradually snowballed into lengthy threads as more and more people took a longer look at their own TLC-based Samsung SSD's performance. I've spent the past week following these threads, and the past few days evaluating this issue on the 840 and 840 EVO samples we have here at PC Perspective. This post is meant to inform you of our current 'best guess' as to just what is happening with these drives, and just what you should do about it.

The issue at hand is an apparent slow down in the reading of 'stale' data on TLC-based Samsung SSDs. Allow me to demonstrate:

840 EVO 512 test hdtach-2-.png

You might have seen what looks like similar issues before, but after much research and testing, I can say with some confidence that this is a completely different and unique issue. The old X25-M bug was the result of random writes to the drive over time, but the above result is from a drive that only ever saw a single large file written to a clean drive. That drive is the very same 500GB 840 EVO sample used in our prior review. It did just fine in that review, and afterwards I needed a quick temporary place to put a HDD image file and just happened to grab that EVO. The file was written to the drive in December of 2013 and, if it wasn't already apparent from the above HDTach pass, it was 442GB in size. This brings on some questions:

  • If random writes (i.e. flash fragmentation) are not causing the slow down, then what is?
  • How long does it take for this slow down to manifest after a file is written?
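For the curious, a rough way to check your own drive for this behavior is to time sequential reads of a file that has sat untouched for months and compare that against a freshly written copy. Below is a minimal sketch (Python, with hypothetical file paths; this is not the HDTach methodology used for the results above, and the OS cache should be cleared or bypassed between runs for the numbers to mean anything):

```python
import time

CHUNK = 16 * 1024 * 1024  # read in 16MB pieces

def read_throughput_mb_s(path: str) -> float:
    """Sequentially read a file and return the average throughput in MB/s."""
    total_bytes = 0
    start = time.time()
    with open(path, "rb", buffering=0) as f:
        while True:
            data = f.read(CHUNK)
            if not data:
                break
            total_bytes += len(data)
    elapsed = time.time() - start
    return (total_bytes / (1024 * 1024)) / elapsed

# Hypothetical paths: a months-old, untouched file vs. a fresh copy of it.
print("stale file:", round(read_throughput_mb_s("old_image.img")), "MB/s")
print("fresh file:", round(read_throughput_mb_s("fresh_copy.img")), "MB/s")
```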

Read on for the full scoop!

ASUS Announces STRIX GTX 980 and STRIX GTX 970

Subject: Graphics Cards | September 19, 2014 - 01:57 PM |
Tagged: asus, strix, STRIX GTX 970, STRIX GTX 980, maxwell

The ASUS STRIX series comes with a custom DirectCU II cooler that is capable of running at 0dB when not under full load; in fact, you can choose the temperature at which the fans activate using the included GPU Tweak application. The factory overclock is modest, but thanks to that cooler and the 10-phase power delivery you will be able to push the card even further. The best news is the price: you get all of these extras for almost the same price the reference cards are selling for!

unnamed.jpg

Fremont, CA (19th September, 2014) - ASUS today announced the STRIX GTX 980 and STRIX GTX 970, all-new gaming graphics cards packed with exclusive ASUS technologies, including DirectCU II and GPU Tweak for cooler, quieter and faster performance. The STRIX GTX 980 and STRIX GTX 970 are factory-overclocked at 1279MHz and 1253MHz respectively and are fitted with 4GB of high-speed GDDR5 video memory operating at speeds up to 7010MHz for the best gameplay experience.

Play League of Legends and StarCraft in silence!
The STRIX GTX 980 and STRIX GTX 970 both come with the ASUS-exclusive DirectCU II cooling technology. With a 10mm heatpipe to transport heat away from the GPU core, operating temperatures are 30% lower and operation is 3X quieter than reference designs. Efficient cooling and lower operating temperatures allow STRIX graphics cards to incorporate an intelligent fan-stop mode that can handle games such as League of Legends1 and StarCraft1 passively, making both cards ideal for gamers that prefer high-performance, low-noise PCs.

burger.jpg

Improved stability and reliability with Digi+ VRM technology
STRIX GTX 980 and STRIX GTX 970 graphics cards include Digi+ VRM technology. The 10-phase power design in the STRIX GTX 980 and 6-phase design in the STRIX GTX 970 use a digital voltage regulator to reduce power noise by 30% and enhance energy efficiency by 15%, increasing long-term stability and reliability. The STRIX GTX 970 is designed to use a single 8-pin power connector for clean and easy cable management.

Real-time monitoring and control with GPU Tweak software
The STRIX GTX 980 and STRIX GTX 970 come with GPU Tweak, an exclusive ASUS tool that enables users to squeeze the very best performance from their graphics card. GPU Tweak provides the ability to finely control GPU speeds, voltages and video memory clock speeds in real time, so overclocking is easy and can be carried out with high confidence.

GPU Tweak also includes a streaming tool that lets users share on-screen action over the internet in real time, meaning others can watch live as games are played. It is even possible to add a title to the streaming window along with scrolling text, pictures and webcam images.

AVAILABILITY & PRICING
ASUS STRIX GTX 980 and GTX 970 graphics cards will be available at ASUS authorized resellers and distributors starting on September 19, 2014. Suggested US MSRP pricing is $559 for the STRIX GTX 980 and $339 for the STRIX GTX 970.

Source: ASUS

The EVGA X99 Classified is expensive and impressive

Subject: Motherboards | September 19, 2014 - 12:41 PM |
Tagged: evga, X99, X99 Classified

The EVGA X99 Classified is definitely a premium board, carrying a $400 price tag, and Legit Reviews took a look at it to see if it is worth the price.  It certainly comes with a lot of extras, including 2-way, 3-way and 4-way SLI bridges in addition to an assortment of other cables and headers.  As the bundled bridges imply, this board can support 4-way SLI or CrossFire with its five PCIe x16 slots, as well as a 4x slot.  It bears two M.2 ports, one type 2 and one type 3, as well as an onboard Creative Sound Core3D codec.  The overclocking potential is good, too: Legit pushed a Core i7-5960X to 4.5GHz at only 1.33V, and they highly recommend the board to anyone who can afford it.

evga-x99-classified-layout-2-645x468.jpg

"EVGA has been known as some of the best components out there for some time now. Today I have the opportunity to look at the flagship motherboard from their Intel X99 product stack. The EVGA X99 Classified motherboard (151-HE-E999-KR) is hitting the shelves with a retail price of only $399.99! The EVGA X99 Classified isn’t geared for the casual overclockers, it’s built for those that want to push the Intel core i7-5960X to the extreme speeds using liquid nitrogen and other sub-ambient cooling methods. Read on to see how this board performs!"

Here are some more Motherboard articles from around the web:

Motherboards

NVIDIA's Maxwell offers smart performance

Subject: Graphics Cards | September 19, 2014 - 11:17 AM |
Tagged: vr direct, video, nvidia, mfaa, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr

The answers to the two most important questions are as follows: the GTX 980 will cost you around $560, compared to $500 for an R9 290X, while the GTX 970 comes in at an attractive $330, compared to $380 for an R9 290.  Availability is hard to predict, but the cards will be shipping soon and you can pre-order your choice of card by following the links on the last page of Ryan's review.  Among all the new features added to this GPU, one of the most impressive is the power draw; as you can see in [H]ard|OCP's review, this card pulls 100W less than the 290X at full load, although it did run warmer than the 290X Double Dissipation card which [H] compared it to, something that may change once 980s with custom coolers arrive.  Follow those links to see the benchmarking results of this card, both synthetic and in-game.

14110637240cPED1snfp_5_15_l.jpg

"Today NVIDIA launches its newest Maxwell GPU. There will be two new GPUs, the GeForce GTX 980 and GeForce GTX 970. These next generation GPUs usher in new features and performance that move the gaming industry forward. We discuss new features, architecture, and evaluate the gameplay performance against the competition."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

JavaScript Is Still Getting Faster...

Subject: General Tech | September 18, 2014 - 11:08 PM |
Tagged: asm.js, simd, sse, avx, neon, arm, Intel, x86

The language that drives the client-side web (and the server side with Node.js) is continually being improved. Love it or hate it, JavaScript is everywhere and approaching native execution performance. You can write it yourself or compile into it from another, LLVM-compatible language through Emscripten. In fact, initiatives like ASM.js actually prefer compiled code, because the compiler can express what you intend without accidentally stepping into slow functionality.

javascript-logo.png

Over at Microsoft's Modern.IE status page, many features are listed as being developed or considered. This includes support for the Mozilla-developed ASM.js and, expected to be included in the 7th edition of ECMAScript, SIMD instructions. The latter is the one that I wanted to touch on most. SIMD, which is implemented in hardware as SSE, AVX, NEON, and other instruction sets, performs many operations with few actual instructions. For browsers which support it, this could allow significant speed-ups in vector-based tasks, such as manipulating colors, vertices, and other data structures. Emscripten is in the process of integrating SIMD support, and the technology is designed to work with Web Workers, allowing SIMD-aware C and C++ code to be compiled into SIMD.js and scale across multiple cores, if available (and they probably are these days).

In short, it will be possible to store and process colors, positions, forces, and other data structures as packed, 32-bit 4-vectors, rather than arbitrary objects with properties that must be manipulated individually. It increases computation throughput for significantly large datasets. This should make game developers happy, in particular.
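To illustrate the data-layout idea (packed 32-bit lanes processed together instead of object properties touched one at a time), here is a small NumPy sketch; its vectorized float32 operations map onto exactly this kind of SIMD work under the hood. It is a conceptual analogue only, not the proposed SIMD.js API:

```python
import numpy as np

# 10,000 RGBA colors stored as packed 32-bit floats, four lanes per color,
# rather than 10,000 objects each carrying separate .r, .g, .b, .a properties.
colors = np.random.rand(10_000, 4).astype(np.float32)
tint = np.array([1.1, 0.9, 0.8, 1.0], dtype=np.float32)

# One vectorized multiply touches every lane of every color; looping over
# individual properties in script would issue one operation per component.
tinted = np.clip(colors * tint, 0.0, 1.0)

print(tinted.shape, tinted.dtype)  # (10000, 4) float32
```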

Apparently, some level of support has been in Firefox Nightly for the last several versions. No about:config manipulation is required; just call the appropriate function on the window's SIMD object. Internet Explorer is considering it and Chromium is currently reviewing Intel's contribution.

Source: Modern.IE
Author:
Manufacturer: NVIDIA

The GM204 Architecture

James Clerk Maxwell's equations are the foundation of our knowledge of optics and electrical circuits. It is a fitting tribute for NVIDIA to use Maxwell as the code name for a GPU architecture, and NVIDIA hopes that the features, performance, and efficiency built into the GM204 GPU would be something Maxwell himself would be impressed by. Without giving away the surprise conclusion here in the lead, I can tell you that I have never seen a GPU perform as well as this one has this week, all while changing the power efficiency discussion in such dramatic fashion.

IMG_9754.JPG

To be fair though, this isn't our first experience with the Maxwell architecture. With the release of the GeForce GTX 750 Ti and its GM107 GPU, NVIDIA put the industry on watch and let us all ponder if they could possibly bring such a design to a high end, enthusiast class market. The GTX 750 Ti brought a significantly lower power design to a market that desperately needed it, and we were even able to showcase that with some off-the-shelf PC upgrades, without the need for any kind of external power.

That was GM107, though; today's release is the GM204, indicating that not only are we seeing the larger cousin of the GTX 750 Ti, but we also have at least some moderate GPU architecture and feature changes from the first run of Maxwell. The GeForce GTX 980 and GTX 970 are going to be taking on the best of the best products from the GeForce lineup as well as the AMD Radeon family of cards, with aggressive pricing and performance levels to match. And, for those that understand the technology at a fundamental level, you will likely be surprised by how little power it requires to achieve these goals. Toss in support for things like a new AA method, Dynamic Super Resolution, and even improved SLI performance, and you can see why doing it all on the same process technology is impressive.

The NVIDIA Maxwell GM204 Architecture

The NVIDIA Maxwell GM204 graphics processor was built from the ground up with an emphasis on power efficiency. As was stated many times during the technical sessions we attended last week, the architecture team learned quite a bit while developing the Kepler-based Tegra K1 SoC, and much of that filtered its way into the larger, much more powerful product you see today. This product is fast and efficient, and it was all done while working on the same TSMC 28nm process technology used for the Kepler-based GTX 680 and even AMD's Radeon R9 series of products.

GeForce_GTX_980_Block_Diagram_FINAL.png

The fundamental structure of GM204 is set up like the GM107 product shipped as the GTX 750 Ti. There is an array of GPCs (Graphics Processing Clusters), each comprised of multiple SMs (Streaming Multiprocessors, called SMMs for this Maxwell derivative), along with external memory controllers. The GM204 chip (the full implementation of which is found on the GTX 980) consists of 4 GPCs, 16 SMMs, and four 64-bit memory controllers.
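Working out the headline numbers from that structure is simple arithmetic; the sketch below assumes the commonly quoted Maxwell figure of 128 CUDA cores per SMM:

```python
# GM204 as found on the GTX 980 (the full implementation)
gpcs = 4
smms_per_gpc = 4          # 4 GPCs x 4 SMMs each = 16 SMMs total
cores_per_smm = 128       # assumed Maxwell SMM width
memory_controllers = 4
bits_per_controller = 64

cuda_cores = gpcs * smms_per_gpc * cores_per_smm
bus_width = memory_controllers * bits_per_controller

print(cuda_cores, "CUDA cores")       # 2048
print(bus_width, "bit memory bus")    # 256
```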

Continue reading our review of the GeForce GTX 980 and GTX 970 GM204 Graphics Cards!!

Micron's M600 SSD, SLC in the front MLC in the back

Subject: Storage | September 18, 2014 - 04:10 PM |
Tagged: micron, M600, SLC, MLC, DWA

Micron's M600 SSD has a new trick up its sleeve called dynamic write acceleration. It is somewhat similar to hybrid HDDs that use a NAND cache to speed up access to frequently used data, but with a twist: here, SLC NAND acts as a cache for MLC NAND, and it does so dynamically, with the NAND able to switch between SLC and MLC modes depending on usage. There is a cost, though; NAND running in SLC mode stores 50% less than it would in MLC mode, so the larger the cache, the lower the total amount of storage available. The endurance rating is also higher than on previous drives, not because of better NAND but because of new TRIM techniques being used. This is not yet a retail product, so The Tech Report does not have benchmarks, but it goes to show there are plenty more tricks we can teach SSDs.
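The capacity trade-off falls out of the bit density: a cell running in SLC mode holds one bit instead of two, so every gigabyte reserved as SLC cache gives up two gigabytes of MLC-mode capacity. A rough sketch of that relationship (hypothetical numbers, not Micron's actual allocation policy):

```python
def remaining_mlc_capacity_gb(mlc_capacity_gb: float, slc_cache_gb: float) -> float:
    """MLC-mode capacity left after carving part of the NAND into an SLC cache.

    Cells flipped to SLC mode store half as many bits, so an SLC cache of
    slc_cache_gb consumes 2 * slc_cache_gb worth of MLC-mode capacity.
    """
    return mlc_capacity_gb - 2 * slc_cache_gb

# Hypothetical 256GB (MLC) drive reserving a 16GB SLC write cache:
print(remaining_mlc_capacity_gb(256, 16), "GB left in MLC mode")  # 224.0
```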

drives.jpg

"Micron's new M600 SSD can flip its NAND cells between SLC and MLC modes on the fly, enabling a dynamic write cache that scales with the drive's unused capacity. We've outlined how this dynamic write acceleration is supposed to impact performance, power consumption, and endurance."

Here are some more Storage reviews from around the web:

Storage

Three Display Component Vendors Support AMD FreeSync

Subject: Graphics Cards, Displays | September 18, 2014 - 03:52 PM |
Tagged: amd, freesync, DisplayPort, adaptive sync

MStar, Novatek, and Realtek, three vendors of scaler units for use in displays, have announced support for AMD's FreeSync. Specifically, for the Q1'15 line of monitors, these partners will provide scaler chips that use DisplayPort Adaptive-Sync and, when paired with a compatible AMD GPU, will support FreeSync.

amd-freesync1.jpg

The press release claims that these scaler chips will either support 1080p and 1440p monitors at up to 144Hz, or drive 4K displays at up to 60Hz. While this is promising, at least compared to the selection at G-Sync's launch a year earlier, it does not mean that this variety of monitors will be available -- just that the internal components will be available for interested display vendors. It does suggest, though, that there are probably interested display vendors.

AMD and its partners "intend to reveal" displays via a "media review program" in Q1. This is a little later than we expected from Richard Huddy's "next month" statements, but it is possible that "Sampling" and "Media Review Program" are two different events. Even if it is "late", this is the sort of thing that is forgivable to me (missing by a few months while relying on a standards body and several independent companies).

Source: AMD

Acer's two new XBO series gaming displays; G-SYNC on both, 4k on one

Subject: Displays | September 18, 2014 - 02:59 PM |
Tagged: Acer XB270H, XB280HK, 4k, g-sync

The Acer XB280HK is a 28" 4K G-SYNC display which will launch next month at an expected price of US$799 or $849.99 CDN.  The XB270H is a 27" 1080p display, also with G-SYNC support, and is currently available at US$599 or $649 CDN.  As both are rated with a 1ms response time, it is likely these are TN panels, but with the recent advances in TN panels the viewing angles should be much better than the original generation.

XB_sku_main.png

SAN JOSE, Calif., Sept. 18, 2014 – Acer America is bringing its new XBO series gaming displays featuring NVIDIA G-SYNC technology to gaming enthusiasts in North America. This cutting-edge line delivers significant performance advantages that infuse gaming with incredibly smooth, realistic and responsive visuals, elevating game play to a new level of stunning realism.

The two XBO series display models for North America include the Acer XB280HK, boasting a 28-inch 4K2K Ultra HD (3840 x 2160) display with a 60Hz refresh rate, and the Acer XB270H, with a 27-inch screen and Full HD 1080p resolution at up to 144Hz. Both models provide a quick 1ms response time, further enhancing in-game performance. They also feature revolutionary NVIDIA G-SYNC technology, comfortable ergonomics and excellent connectivity.

“We’re excited to bring these first-rate gaming displays to gamers in the United States and Canada,” said Ronald Lau, Acer America business manager. “The incredibly sharp and smooth images provided by NVIDIA G-SYNC technology are sure to thrill the most avid gamers. Combined with Acer’s highly flexibly ergonomic stand, non-glare ComfyView panel and low dimming technology, users are assured long hours of both comfortable and visually stunning game play.”  

NVIDIA G-SYNC: Picture-Perfect Visuals
NVIDIA G-SYNC technology ensures that every frame rendered by the GPU is perfectly portrayed by synchronizing the monitor’s refresh rates to the GPU in a GeForce GTX-powered PC. This breakthrough technology eliminates screen tearing and minimizes display stutter and input lag to deliver a smooth, fast and breathtaking gaming experience on the hottest PC gaming titles. Scenes appear instantly, objects look visually sharp, and gameplay is more responsive to provide faster reaction times, giving gamers a competitive edge.

“NVIDIA G-SYNC technology dramatically improves the way gamers see their games, by delivering images that are fast, sharp and stutter-free,” said Tom Petersen, distinguished engineer at NVIDIA. “This is the way games were meant to be played, and gamers will absolutely love these new Acer XBO monitors.”

Comfortable Ergonomics
By making gaming as comfortable as possible, the XBO series monitors help extend game time with three Acer innovations. Acer flicker-less technology reduces eye strain via a stable power supply that eliminates screen flicker. Its low dimming technology provides users the ability to adjust brightness down to 15 percent in low-light environments and Acer ComfyView non-glare screen reduces reflection for clearer viewing, a significant benefit for gamers.

A flexible, multi-function ErgoStand extends a wide range of options for maximum comfort and viewing perspectives. For finding the best angle, the screen tilts from -5 to -35 degrees and the height can be raised by up to 5.9 inches. In addition, the base rotates 120 degrees from left to right for easy screen sharing during game play and collaboration with others. Plus, the screen pivots from horizontal to vertical to accommodate two entirely different gaming scenarios.

Both new XBO series monitors deliver wide viewing angles up to 170 degrees horizontal and up to 160 degrees vertical. The Acer XB280HK delivers 1.07 billion colors and the Acer XB270HL provides 16.7 million colors, while both offer a native contrast ratio of 1000:1, a 300 nits brightness and a 72 percent NTSC color saturation, a combination that delivers exceptionally vibrant, detailed and high-quality imagery.

Superb Connectivity
The displays come with DisplayPort as well as high-speed USB 3.0 ports (1 up, 4 down) located on the side and bottom of the screen for easily connecting a mouse, keyboard, gaming headset, joystick and other peripherals. One of the USB ports is equipped for battery charging.

Eco-Friendly Design
EPEAT Gold registered, the highest level of EPEAT registration available, the displays meet all of EPEAT’s required criteria and at least 75 percent of EPEAT’s optional criteria. They’re also mercury-free and LED-backlit, which reduces energy costs by consuming less power than standard CCFL-backlit displays. ENERGY STAR 6.0 and TCO 6.0 qualified, they adhere to strict environmental, performance and ergonomic design standards.

Pricing and Availability
The Acer XB270H is available now at leading online retailers in the United States and Canada with a MSRP of US$599 and $649.99 CAD. The Acer XB280HK will be available next month at leading online retailers in the United States and Canada with a manufacturer’s suggested retail price (MSRP) of US$799 and $849.99 CAD.

Acer displays are backed by professional, high-quality technical support and a three-year warranty. Acer’s online community at community.acer.com provides customers discussion forums, answers to frequently asked questions and the opportunity to share ideas for new and enhanced services and products.

Source: Acer

Can a heavily overclocked ASUS STRIX GTX 750 Ti OC beat an R9 270?

Subject: Graphics Cards | September 18, 2014 - 12:56 PM |
Tagged: DirectCU II, asus, STRIX GTX 750 Ti OC, factory overclocked

The ASUS STRIX GTX 750 Ti OC sports the custom DirectCU II cooling system, which not only improves the temperatures on the card but also reduces the noise produced by the fans.  It comes out of the box with an overclocked GPU base clock of 1124MHz and a boost clock of 1202MHz, with the 2GB of VRAM set to the stock speed of 5.4GHz; [H]ard|OCP managed to increase that to an impressive 1219/1297MHz, and even 6.0GHz for the VRAM, without increasing voltages.  Unfortunately, even with that overclock it lagged behind the Sapphire Radeon R9 270 Dual-X, which happens to sell for about the same price at $170.

1410203265ODA6sXGQNR_1_12_l.jpg

"Rounding out our look at ASUS' new STRIX technology we have another STRIX capable video card on our test bench today, this time based on the GTX 750 Ti GPU. We will take the ASUS STRIX GTX 750 Ti OC Edition and test it against an AMD Radeon R9 270 and AMD Radeon R9 265 to see what reigns supreme under $200."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Podcast #318 - GTX 980 and R9 390X Rumors, Storage News from IDF, ADATA SP610 SSDs and more!

Subject: General Tech | September 18, 2014 - 10:59 AM |
Tagged: windows 9, video, TSV, supernova, raptr, r9 390x, podcast, p3700, nvidia, Intel, idf, GTX 980, evga, ECS, ddr4, amd

PC Perspective Podcast #318 - 09/18/2014

Join us this week as we discuss GTX 980 and R9 390X Rumors, Storage News from IDF, ADATA SP610 SSDs and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Upward mobility for both Linux and Windows

Subject: General Tech, Mobile | September 18, 2014 - 10:11 AM |
Tagged: Red Hat, microsoft, Feedhenry

Red Hat just acquired Feedhenry for around €63.5 million to enhance its ability to support mobile apps.  Feedhenry builds mobile apps on both the client and server side which run on Android, iOS, Windows Phone, QNX and HTML5, with integration into apps from companies such as Salesforce, SAP and Oracle.  This purchase could help Red Hat become an attractive alternative for companies wishing to serve apps across all platforms, and could increase usage of OpenShift and OpenStack.  The Inquirer also posted news on an extension to the price discount on Microsoft's licensing for mobile developers.  They are still offering lifetime Dev Center accounts for $19.99 for individuals and $99.99 for businesses, which compares favourably to the one-time Android fee of $25 and even better against Apple's $99 per year.  If they could just get their phones to play nicely with O365, this could well increase their market share in mobile phones.

index.jpg

"RED HAT HAS ACQUIRED Feedhenry, a designer of mobile apps for the enterprise market. The company sees the acquisition as a key driver to offer cross-platform support for its existing software products, including Red Hat Enterprise Linux Openstack 7, which it released earlier this year."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

ECS "Design Your Own LIVA" Competition Is Almost Over!

Subject: General Tech, Systems | September 17, 2014 - 11:27 PM |
Tagged: LIVA, ECS, case mods, case mod contest

ECS USA is holding a competition for North American users to design mods for the LIVA mini PC kit. The contest consists of three phases, and round one, whose winners will advance to the second phase, ends on September 30th. If you want to enter the contest, you will need to submit your first-phase entry before then to be eligible for the second phase. Check out Morry's post for a second opinion.

ECS-Liva-Logo.png

What are the phases?

Round 1 (Ends September 30th): You will need to publish the "soft copy" of your design draft to Facebook. This will consist of six illustrations: Front, Rear, Left Side, Right Side, Top, and 45-degree 3D illustration. See the image below for an example. The top ten participants, based on Facebook likes, will be provided with a white LIVA mini PC kit to modify in Round 2.

ECS-LIVA-design-spec.png

Round 2 (Ends October 31st): The winners of Round 1 will use the provided LIVA kits to implement their design drafts. Photographs of these modified cases will be sent to ECS (I assume via Facebook) for a team of judges to rank first, second, third, or runner-up.

Round 3 (November 7th): Sit back, relax, and wait for the judges to select the winners. The Champion will receive $1000 USD for their trouble, second place will get $500 USD, and third will get $300 USD. The honorable mentions will get various swag.

The contest is open to residents of the USA and Canada. Act fast! There are less than two weeks left and, as I understand it, the later you enter, the less time you will have to accumulate Facebook likes.

Source: ECS

Microsoft's Universal Mobile Keyboard Is Coming Soon

Subject: General Tech, Cases and Cooling | September 17, 2014 - 03:57 PM |
Tagged: windows, mobile, microsoft, keyboard, ios, Android

Let me share a story. There was a time, around the first Surface launch, that I worked in an electronics retail store (and for several years prior -- but I digress). At around that time, Microsoft was airing ads with people dancing around, snapping keyboards onto the Surface tablet with its magnetic click. One day, a customer came in looking for the keyboard from the TV spots for their iPad. I thought about it for a few seconds and realized how terrible Microsoft's branding actually was.

microsoft-mobile-keyboard-universal.jpg

Without already knowing the existence of their Windows 8 and RT tablets, which the ads were supposed to convey, it really did look like an accessory for an iPad.

Doing Microsoft's job for them, I explained the Surface Pro and Surface RT tablets along with their keyboard-cover accessories. Eventually, I told them that it was a Microsoft product for their own tablet brand and would not see an iPad release. The company felt threatened by these mobile, touch devices and was directly competing with them.

...

So Microsoft is announcing a keyboard for Windows, Android, and iOS. Sure, it is very different from the Type and Touch Covers; for instance, it does not attach to these devices magnetically. Microsoft has also been known to develop hardware, software, and services for competing platforms. So while it is not surprising that a Microsoft keyboard would work on competing devices, it does feel weird for their keyboard to have features that are specialized for those competing platforms.

There are three interesting things about this keyboard: it has a built-in stand, it has special keys for Android and iOS that are not present in Windows, and it has a built-in rechargeable battery that lasts up to six months on a charge. The peripheral pairs wirelessly with all of these devices over Bluetooth.

The Microsoft Universal Mobile Keyboard is coming soon for $79.95 (MSRP).

Source: Microsoft