
Micron Planning To Launch GDDR6 Graphics Memory In 2017

Subject: Graphics Cards | February 4, 2017 - 08:29 PM |
Tagged: micron, graphics memory, gddr6

This year is shaping up to be a good one for memory, with the promise of 3D XPoint (Intel/Micron), HBM2 (SK Hynix and Samsung), and now GDDR6 graphics memory from Micron. While GDDR6 was originally planned to launch next year, Micron recently announced its intention to start producing the memory chips in the latter half of 2017, much earlier than previously expected.

Micron Logo.png

Computer World reports that Micron cites the rise of e-sports and gaming, which it says is driving the computer market toward three-year upgrade cycles rather than five-year cycles, as the primary reason for shifting GDDR6 production into high gear and moving up the launch window (I am not sure how accurate that is, however, as it seems like PCs are actually staying relevant longer between upgrades, but I digress). The company expects the e-sports market to grow to 500 million fans by 2020, and it is a growing market that Micron wants to stay relevant in.

If you missed our previous coverage, GDDR6 is the successor to GDDR5 and offers twice the bandwidth at 16 Gb/s (gigabits per second) per die. It is also faster than GDDR5X (12 Gb/s) and uses 20% less power, which the gaming laptop market will appreciate. HBM2 still holds the bandwidth crown, though, as it offers 256 GB/s per stack and up to 1 TB/s with four stacks connected to a GPU on package.
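
For a rough sense of scale, here is a quick back-of-the-envelope sketch in Python. The per-pin data rates and HBM2 figures are the ones quoted above; the 256-bit bus width is a hypothetical mid-range card chosen purely for illustration.

```python
# Rough peak-bandwidth arithmetic for the memory types discussed above.
# Per-pin data rates (GDDR5X 12 Gb/s, GDDR6 16 Gb/s) are from the article;
# the 256-bit bus is a hypothetical mid-range card, used only for illustration.

def peak_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak transfer rate in GB/s for a GDDR-style interface."""
    return data_rate_gbps_per_pin * bus_width_bits / 8  # bits -> bytes

BUS_WIDTH = 256  # hypothetical mid-range card

for name, rate in [("GDDR5X @ 12 Gb/s", 12.0), ("GDDR6 @ 16 Gb/s", 16.0)]:
    print(f"{name} on a {BUS_WIDTH}-bit bus: "
          f"{peak_bandwidth_gb_s(rate, BUS_WIDTH):.0f} GB/s")

# HBM2 numbers quoted in the article: 256 GB/s per stack, up to four stacks.
print(f"HBM2 with four stacks: {256 * 4} GB/s (~1 TB/s)")
```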

As such, High Bandwidth Memory (HBM2 and then HBM3) will power high-end gaming and professional graphics cards, while GDDR6 will become the memory used for mid-range cards. GDDR5X (which is actually capable of going faster, but will likely not be pushed much past 12 Gbps if GDDR6 does come out this soon) will replace GDDR5 on most if not all of the lower-end products.

I am not sure whether Micron’s reasoning of e-sports, faster upgrade cycles, and VR being the motivating factors for ramping up production early is sound, but I will certainly take the faster memory coming out sooner rather than later! Depending on exactly when in 2017 the chips start rolling off the fabs, we could see graphics cards using the new memory technology as soon as early 2018 (just in time for CES announcements? oh boy, I can see the PR flooding in already).

Will Samsung change course and try for a 2017 release of its GDDR6 memory as well?

Are you ready for GDDR6?

Sony Will Add USB Hard Drive Support To PS4 and Boost Mode to PS4 Pro

Subject: General Tech | February 4, 2017 - 07:25 AM |
Tagged: usb 3.0, sony, ps4 pro, ps4, gaming, console

Sony is taking the wraps off of its latest firmware with the release of the version 4.50 “Sasuke” beta firmware for the PS4. With the new firmware, Sony is rolling out a number of UI/UX improvements, and users will finally be able to use external storage with the game console. On the PS4 Pro front, Sony will be adding a “boost mode” in a future update (it may not be ready in time for the production 4.50 release) that lets legacy games access the additional GPU horsepower of the Pro version of the console to smooth out frame rates without needing any special patches from the game developers.

Sony PS4 Pro.jpg

The new firmware adds support for USB 3.0 hard drives (or SSDs) up to 8TB. Users will be able to use the external storage to store games, downloaded applications, screenshots, and videos and have it all show up on the main system menu along with the local storage. Users will not need to shuffle game data back and forth in order to play their games either, though note that currently the actual save game data is still stored locally even if the game itself is on the external hard drive. Fans of the PlayStation VR (PS VR) also get an update with firmware 4.50 in the form of support for watching 3D Blu-rays.

Beyond those big feature updates, Sony is also changing up the interface slightly. The Quick Menu now takes up less screen space and will allow gamers to create and join parties right from there rather than going to a separate app. In the notification area, Sony has condensed the various notification types into a single unified list. Further, users will be able to set in-game screenshots as the home screen wallpaper.

Perhaps most interesting is the planned “boost mode” for the PS4 Pro, which is currently in beta. Gamers are reporting that titles such as The Evil Within and Just Cause 3 run at significantly smoother frame rates with noticeably reduced stuttering. Reportedly, the boost mode will work with most PS4 games that were programmed with unlocked frame rates, though the exact benefits will vary. Games that have a hard cap on the frame rate will still need specific patches from the game developers to get any improvements. Ars Technica speculates that the “boost mode” is simply Sony removing blocks it put in place to force compatibility with older games that were developed with the base PS4 in mind. When boost mode is off, part of the PS4 Pro GPU is disabled so that it functions exactly like the PS4’s GPU; activating boost mode removes the blocks and allows the full GPU (with its 36 CUs) to process the game data as best it can. Getting things like native higher resolutions or more detailed textures will still require patches, of course.

If you have a PS4 or PS4 Pro, keep an eye on the beta Sasuke 4.50 firmware.

Source: Ars Technica

Intel Has Started Shipping Optane Memory Modules

Subject: Memory | February 4, 2017 - 01:42 AM |
Tagged: XPoint, server, Optane, Intel Optane, Intel, big data

Last week Hexus reported that Intel has begun shipping Optane memory modules to its partners for testing. This year should see the launch of both these enterprise products designed for servers and the tiny application-accelerator M.2 solid state drives, all based on the Intel and Micron joint 3D XPoint memory venture. The modules Intel is shipping are the former type of Optane memory and will be able to replace DDR4 DIMMs (RAM) with a memory solution that is not as fast but is cheaper and has much larger storage capacities. The Optane modules are designed to slot into DDR4-type memory slots on server boards.

The benefit of such a product lies in big data and scientific workloads, where massive datasets can be held in primary memory and the processor(s) can access them at much lower latencies than if they had to reach out to mass storage on spinning rust or even SAS or PCI-E solid state drives. Being able to hold all the data being worked on in one pool of memory will be cheaper with Optane as well, as it is allegedly priced closer to NAND than RAM, and the cost of RAM adds up extremely quickly when you need many terabytes of it (or more!). Various technologies attempting to bring higher capacity non-volatile and/or flash-based storage in memory module form have been theorized or in the works for years now, but it appears that Intel will be the first to roll out actual products.

Intel Optane Memory Module.JPG

It will likely be years before the technology trickles down to consumer desktops and notebooks, so slapping what would effectively be a cheap RAM disk into your PC is still a ways out. Consumers will get a small taste of Optane memory in the form of tiny storage drives that were rumored for a first quarter 2017 release, following Intel's Kaby Lake and Z270 motherboard launch. Previous leaks suggest that the Intel Optane Memory 8000P would come in 16 GB and 32 GB capacities in an M.2 form factor. With a single 128 Gb (16 GB) die, Intel is able to hit speeds that current NAND flash based SSDs can only hit with multiple dies. Specifically, the 16GB Optane application accelerator drive is allegedly capable of 285,000 random read 4K IOPS, 70,000 random write 4K IOPS, sequential 128K reads of 1400 MB/s, and sequential 128K writes of 300 MB/s. The 32GB Optane drive is a bit faster at 300,000 read IOPS, 120,000 write IOPS, 1600 MB/s, and 500 MB/s respectively.

Unfortunately, I do not have any numbers on how fast the Optane memory that slots into DDR4 slots will be, but seeing as two dies already max out the x2 PCI-E link they use in the M.2 Optane SSD, a dual-sided memory module packed with rows of Optane dies on the significantly wider memory bus is very promising. It should land somewhere closer to (but slower than) DDR4 while remaining much faster than NAND flash and still non-volatile (it doesn't need constant power to retain data).
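
To put some rough numbers behind that claim, here is a minimal sketch, assuming a PCIe 3.0 x2 link (8 GT/s per lane with 128b/130b encoding) and the leaked sequential read figures above; the real-world ceiling is lower still once packet overhead is included.

```python
# Rough look at how close the leaked Optane M.2 drives come to a PCIe 3.0 x2 link.
# Assumes PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, ignoring packet overhead.

PCIE3_LANE_GB_S = 8 * (128 / 130) / 8   # ~0.985 GB/s usable per lane
LANES = 2
link_ceiling = PCIE3_LANE_GB_S * LANES  # ~1.97 GB/s before protocol overhead

# Sequential 128K read figures quoted above, in GB/s.
drives = {"16 GB (single die)": 1.4, "32 GB (two dies)": 1.6}

for name, seq_read in drives.items():
    print(f"{name}: {seq_read:.1f} GB/s of a ~{link_ceiling:.2f} GB/s x2 link "
          f"({seq_read / link_ceiling:.0%} of the raw ceiling)")
```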

I am interested to see what the final numbers are for Intel's Optane RAM and Optane storage drives. The company has certainly dialed down the hype for the technology as it approached fruition, though that may have more to do with what they are able to do right now versus what the 3D XPoint memory technology itself is potentially capable of enabling. I look forward to what it will enable in the HPC market and, eventually, what will be possible for the desktop and gaming markets.

What are your thoughts on Intel and Micron's 3D XPoint memory and Intel's Optane implementation (Micron's implementation is QuantX)?

Source: Hexus

Report: AMD Ryzen Performance in Ashes of the Singularity Benchmark

Subject: Processors | February 4, 2017 - 01:22 AM |
Tagged: titan x, ryzen, report, processor, nvidia, leak, cpu, benchmark, ashes of the singularity, amd

AMD's upcoming 8-core Ryzen CPU has appeared online in an apparent leak showing performance from an Ashes of the Singularity benchmark run. The benchmark result, available here on imgur and reported by TechPowerUp (among others today), shows a run featuring the unreleased CPU paired with an NVIDIA Titan X graphics card.

Ryzen_Ashes_Screenshot.jpg

It is interesting to consider that this rather unusual system configuration was also used by AMD during their New Horizon fan event in December, with an NVIDIA Titan X and Ryzen 8-core processor powering the 4K game demos of Battlefield 1 that were pitted against an Intel Core i7-6900K/Titan X combo.

It is also interesting to note that the processor listed in the screenshot above is (apparently) not an engineering sample, as TechPowerUp points out in their post:

"Unlike some previous benchmark leaks of Ryzen processors, which carried the prefix ES (Engineering Sample), this one carried the ZD Prefix, and the last characters on its string name are the most interesting to us: F4 stands for the silicon revision, while the 40_36 stands for the processor's Turbo and stock speeds respectively (4.0 GHz and 3.6 GHz)."

March is fast approaching, and we won't have to wait long to see just how powerful this new processor will be for 4K gaming (and other, less important stuff). For now, I want to find results from an AotS benchmark with a Titan X and i7-6900K to see how these numbers compare!

Source: TechPowerUp

NVIDIA Releases Vulkan Developer 376.80 Beta Drivers

Subject: Graphics Cards | February 3, 2017 - 10:58 PM |
Tagged: nvidia, graphics drivers, vulkan

On February 1st, NVIDIA released a new developer beta driver, which fixes a couple of issues with their Vulkan API implementation. Unlike what some sites have been reporting, you should not download it to play games that use the Vulkan API, like DOOM. In short, it is designed for developers, not end-users. The goal is to provide correct results when software interacts with the driver, not the best gaming performance or anything like that.

khronos-2016-vulkanlogo2.png

In a little more detail, it looks like 376.80 implements the Vulkan 1.0.39.1 SDK. This update addresses two issues with accessing devices and extensions, under certain conditions, when using the 1.0.39.0 SDK. 1.0.39.0 was released on January 23rd, and thus it will not even be a part of current video games. Even worse, like most graphics drivers for software developers, it is based on the older GeForce 376 branch, so it won’t even have NVIDIA’s most recent fixes and optimizations. NVIDIA does this so they can add or change the features that Vulkan developers require without needing to roll in patches every time they make a "Game Ready" optimization or something. There is no reason to use this driver unless you are developing Vulkan applications and want to try out the new extensions. It will eventually make it to end users... when it's time.

If you wish to develop software using Vulkan’s bleeding-edge features, then check out NVIDIA’s developer portal to pick up the latest drivers. Basically everyone else should use 378.49 or its 378.57 hotfix.

Source: NVIDIA

Learn about AMD's Vega Memory Architecture

Subject: General Tech | February 3, 2017 - 06:40 PM |
Tagged: amd, Vega, Jeffrey Cheng

Tech ARP had a chance to talk with AMD's Jeffrey Cheng about the new Vega GPU memory architecture.  He provided some interesting details, such as the fact that the new architecture can handle up to 512 TB of addressable memory.  With such a large pool it would be possible to store data sets in HBM2 memory to be passed to the GPU, as opposed to leaving them sitting in general system memory.  Utilizing the memory present on the GPU could also reduce costs and energy consumption, not to mention that it will perform far more quickly.  Pop by to watch the video to see how he feels this could change the way games and software are programmed.

AMD-Vega-Tech-Report-15.jpg

"Want to learn more about the AMD Vega memory architecture? Join our Q&A session with AMD Senior Fellow Jeffrey Cheng at the AMD Tech Summit!"


Source: TechARP

Yes, Acer was the company that released the XR382CQK bmijqphuzx Display

Subject: Displays | February 2, 2017 - 08:16 PM |
Tagged: ultra-widescreen, freesync, adaptive sync

Yes, this is the product Ryan mentioned, a curved 37.5" IPS adaptive sync display from Acer.  As opposed to yesterday's model, today "Quad HD" refers to a 3840x1600 ultra-widescreen resolution on a 2300R curve, making shopping for a monitor even easier, before you even try to type in the model number.  It supports Adaptive Sync, with a refresh rate that tops out at 75Hz; sorry, G-SYNC fans.

XR382CQK_sku_main.png

As with yesterday's model it has a slimmed-down bezel, called ZeroFrame in this case.  It supports HDMI 1.3 10-bit colour, or at least it states it offers 1.07 billion colours, as well as a 100,000,000:1 contrast ratio and 300 nit brightness.  The monitor also includes DTS Sound speakers and has a USB 3.0 Type-C port.  You can read a bit more about it here.

XR382CQK_gallery_04.png

Source: Acer

Noctua announces three new AM4 CPU coolers for AMD Ryzen

Subject: Cases and Cooling | February 2, 2017 - 07:34 PM |
Tagged: Socket AM4, ryzen, noctua, NH-U12S SE-AM4, NH-L9x65 SE-AM4, NH-D15 SE-AM4, amd

If you are already planning your new AMD Ryzen build and are wondering what gigantic brown and tan coolers might work, then Noctua has a page that will make you smile.  They have listed all of their current coolers which can be made compatible with AM4 using a free adapter that you can order from that page.  They also list some which could be made compatible but are not eligible for the free adapter, and those which will not work at all.

notua am4.PNG

Along with the compatibility list come three brand new coolers, which you can see larger than life by clicking on their names.  The NH-D15 SE-AM4 is a contender for Morry's next favourite cooler for mATX boards: 980g of metal, and that is before you add the two 140mm fans.  The NH-U12S SE-AM4 is a slimmer 580g but is still 158mm tall and uses a 120mm fan.  For those who prefer their coolers in petite sizes, the NH-L9x65 SE-AM4 is a svelte 340g and stands a mere 65mm while wearing its custom-fit 92mm fan.

You can pick them up soon: the NH-D15 SE-AM4 at $99.90, the NH-U12S SE-AM4 at $64.90, and the NH-L9x65 SE-AM4 at $52.90.  PR below the fold.

Source: Noctua

Well, forget about gifting those bundled NVIDIA games

Subject: General Tech | February 2, 2017 - 05:54 PM |
Tagged: nvidia, GFE, game bundle, geforce experience 3.0

Giving away a bonus to your customers is a nice thing to do; watching them jump through hoops to get the bonus is less so.  Many companies have realized that offering a mail-in rebate is a great way to look like you are doing something for your customers while at the same time ensuring a significant percentage of those customers never actually claim said MIR.  Strangely, this practice has not impressed consumers.

NVIDIA started to embrace something similar towards the end of 2016 with GeForce Experience 3.0, requiring a mandatory login to get at beta drivers and, more pertinently, giveaways.  That login requirement includes all the game bundles, such as the one announced yesterday.  Ars Technica reported something interesting this morning that anyone thinking of picking up one of the game bundles should be aware of: NVIDIA now intends to tie these game codes to hardware.  Currently you will need to sign into GFE, verify your code, and then allow GFE to verify you have a GTX 1070 or 1080 installed in your system, which strongly suggests you will need to install the software.  Ars speculates that this could one day be tied directly to a card via a hardware ID or serial number, making it impossible to give away.

The rationale offered references an incident with Microsoft a few months back when "some Gears 4 Windows 10 keys were obtained illegitimately via our Nvidia promotion".  This was thanks to a loophole created by Amazon's return policy.  Of course, some of those so-called illegitimate installations were caused by someone giving away or even selling a game key which they obtained legally, because they had already purchased Gears 4.  It is unclear if NVIDIA only pays for codes which are redeemed or if the money has already been invested; if the former, it at least makes financial sense, if the latter then Bird Culture has an appropriate phrase.

Once again, everyone must be punished thanks to the overreaction caused by a few smegheads.  Keep this in mind when you are shopping in the future.

nvidia-geforce-gtx-prepare-for-battle-for-honor-and-ghost-recon-wildlands-bundle-640px.jpg

"GFE then performs "a hardware verification step to ensure the coupon code is redeemed on the system with the qualifying GPU." It's not yet clear whether the codes will be tied to a specific serial number/hardware identifier, or whether they will be tied to an overall product line like a GTX 1070 or GTX 1080."


Source: Ars Technica

Podcast #435 - Qualcomm aptX, FSP Twin 500w PSU, Micron 5100 Enterprise SSDs, AMD Fiscal Results, ASUS Tinker Board, ZeniMax

Subject: Editorial | February 2, 2017 - 03:34 PM |
Tagged: podcast, zenimax, UHD Blu-Ray, toshiba, tinker board, Redundant PSU, qualcomm, micron, Laser Networking, fsp, enterprise ssd, DirectX, delidding, asus, aptX, amd

PC Perspective Podcast #435 - 02/02/17

Join us this week as we discuss Qualcomm aptX, FSP Redundant PSUs, Micron Enterprise SSDs, 5G LTE, AMD's Fiscal Year, the ZeniMax lawsuit, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Allyn Malventano, Ken Addison, Josh Walrath, Jeremy Hellstrom, Sebastian Peak

Program length: 1:46:22

Podcast topics of discussion:
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week
  4. Closing/outro
 


Radeon Software Crimson ReLive 17.1.2 Drivers Released

Subject: Graphics Cards | February 2, 2017 - 12:02 PM |
Tagged: graphics drivers, amd

A few days ago, AMD released their second graphics driver of January 2017: Radeon Software Crimson ReLive 17.1.2. The main goal of these drivers is to support the early access launch of Conan Exiles as well as tomorrow’s closed beta for Tom Clancy’s Ghost Recon Wildlands. Optimizations that AMD has been working on prior to release, for either game, are included in this version.

amd-2016-crimson-relive-logo.png

Beyond game-specific optimizations, a handful of bugs are also fixed, ranging from crashes to rendering artifacts. Among them was an issue with configuring WattMan on a system with multiple monitors, where the memory clock would drop or bounce around. The driver also has a bunch of known issues, including a couple of hangs and crashes under certain situations.

Radeon Software Crimson ReLive Edition 17.1.2 is available at AMD’s website.

Source: AMD

NVIDIA Releases GeForce 378.57 Hotfix Drivers

Subject: Graphics Cards | February 2, 2017 - 12:01 PM |
Tagged: nvidia, graphics drivers

If you were having issues with Minecraft on NVIDIA’s recent 378.49 drivers, then you probably want to try out their latest hotfix. This version, numbered 378.57, will not be pushed out through GeForce Experience, so you will need to grab it from NVIDIA’s customer support page.

nvidia-2015-bandaid.png

Beyond Minecraft, this also fixes an issue with “debug mode”. For some Pascal-based graphics cards, the option in NVIDIA Control Panel > Help > Debug Mode might be on by default. This option reduces factory-overclocked GPUs down to NVIDIA’s reference speeds, which is useful for eliminating stability issues in testing, but pointlessly slow if you’re already stable. I mean, you bought the factory overclock, right? I’m guessing someone at NVIDIA used it to test 378.49 during its development, fixed an issue, and accidentally committed the config file with the rest of the fix. Either way, someone caught it, and it’s now fixed, even though you should be able to just untick it if you have a factory-overclocked GPU.

Source: NVIDIA

ZeniMax Awarded $500 Million USD in Oculus Lawsuit

Subject: General Tech | February 2, 2017 - 01:19 AM |
Tagged: zenimax, VR, Oculus, facebook

On May 1st, 2014, ZeniMax, which owns id Software and Bethesda Softworks, sued Oculus VR, claiming that it had some ownership of Oculus’ virtual reality technology. This accusation came about a month after Facebook announced that it was acquiring Oculus for $2 billion USD. At least part of the claim was based on work that John Carmack did before he left id Software the year prior, in August 2013.

Facebook.png

Today, a North Texas jury awarded ZeniMax $500 million USD from Oculus. This figure is broken down as follows: $300 million against Oculus as a company, $50 million against Palmer Luckey as an individual, and $150 million against Brendan Iribe as an individual. The jury found John Carmack wasn’t responsible for any damages as an individual, so he’s clear of this whole issue.

Oculus and Facebook plan to appeal their judgments.

According to this decision, the jury believes that ZeniMax has some ownership over Oculus’ trademark and source code copyrights. They also found, according to the verdict (which is hosted by Ars Technica), that Oculus violated a non-disclosure agreement, causing $200,000,000 in damages to ZeniMax, but will not continue to damage the company in the future. (See the few pages before page 49, inclusive.) The personal judgments against Palmer Luckey and Brendan Iribe are due to the pair not acknowledging ZeniMax’s contributions to Oculus.

Update (February 2nd @ 12:30pm EST): As pointed out in the comments, that was an old tweet from 2014. I just came across it and somehow missed the date stamp. My mistake!

After this decision, John Carmack tweeted:

 

 

As always, lots of things can change during the appeals process. For now, it looks like both ZeniMax and John Carmack received a little vindication, though.

Source: Ars Technica

EVGA's new EVGA CLC 120 and 280 Liquid Coolers

Subject: Cases and Cooling | February 1, 2017 - 08:57 PM |
Tagged: water cooler, evga, clc 280, clc 120, all in one

EVGA have just released two new all-in-one coolers, or closed loop coolers if you prefer.  As you would expect, the CLC 120 features a single 120mm fan on its radiator, while the CLC 280 uses two 140mm fans to move heat out of your cooling system.  On both models you will find a new style of fan, with Teflon nano bearings and a curved housing which should increase airflow, although it may also increase turbulence as the air can travel out the side; testing will determine the actual effect.

CLCs.PNG

Along with the announcement of these two coolers, EVGA also hinted at the coming release of their new Flow Control software.  This software will do more than simply monitor the temperature and speeds of the cooler and fan; it will allow you to create up to ten separate cooling profiles so you can switch modes depending on what you are doing.

unnamed.jpg

As with most recent products, it has been infected with RGB features, though this particular strain of the disease can form a symbiotic relationship with certain NVIDIA GPUs from EVGA, allowing you to synchronize their colours and effects.

unnamed.png

You can purchase these two coolers as of today; the CLC 120 has an MSRP of $89.99 and the CLC 280 $129.99, and both come with a free AM4 bracket for AMD users.

rgbd.jpg

 

Source: EVGA

I can't believe it's not bezel! AOC's Frameless Q2781PQ display

Subject: Displays | February 1, 2017 - 07:25 PM |
Tagged: AOC, Q2781PQ, ips display, 1440p

Ignoring the creative marketing terms used in the PR, which describe the Q2781PQ as a Quad HD 4-sided “frameless” AH-IPS panel, the new monitor does have its good points.  It is a 27" 1440p Advanced High Performance IPS display with a 50,000,000:1 dynamic contrast ratio and a 5ms response time.  While there certainly is a bezel, it appears to be quite slim, and the stand will not block off a large portion of your desk.  The MSRP of $499 is not unreasonable for this product, although if you want to go all out you can get the Swarovski crystal encrusted Q2781PS for an extra $100.  Gamers may be less enamoured of this panel as it lacks adaptive sync technology, but for watching or creating media it is certainly worth a peek.

url.jpg

Fremont, Calif. – January 31, 2017 – AOC, a worldwide leader in monitor display technology, today announces the 27-inch Quad HD Ultra Slim Frameless IPS Monitor (Q2781PQ). AOC’s Q2781PQ sports an ultra slim design and asymmetric stand, along with QHD resolution (2560 x 1440) a 50,000,000:1 dynamic contrast ratio and a new 4-sided “frameless” AH-IPS panel. It features a modern AH-IPS panel that allows for wide viewing angles of 178°, ensuring brilliant colors and Clear Vision, the image performance engine that can upscale Standard Definition (SD) sources to High Definition (HD) for sharper, more vivid viewing. The display also features Full sRGB color for the best color uniformity from any perspective. With excellent picture quality and features such as Flicker-FREE technology, the monitor meets the needs of style conscious home users and professionals alike. The AOC Q2781PQ is available now at Amazon.com for an MSRP of $499. AOC is also launching the Q2781PS, which sports the same features as the Q2781PQ, along with a Rose Gold base and edge and a luxury back panel adorned with crystals from Swarovski. It will be available on Amazon.com in the coming months for $599.

The Q2781PQ comes with an upgraded design and improved image quality that is certain to impress design-conscious users of all types. Alongside its ultra slim appearance and stylish asymmetric stand, the display comes in a new 4-sided “frameless” design with minimal black borders around the screen. The stand is also compact and saves space on the desk. Inside its elegant design, the display boasts the latest technology providing you with a first-class viewing experience. It features a modern AH-IPS panel that allows for wide viewing angles of 178°, ensuring brilliant colors with Full sRGB consistency and best color uniformity from any perspective. The AOC Q2781PQ comes with QHD resolution (2560 x 1440 pixels) and over 3.6 million pixels. Users interested in image or video editing will benefit from crisp and vivid visuals that impress with detail.

Modern features such as Flicker-FREE technology and multiple video inputs turn the AOC Q2781PQ into a functional and pleasant companion at home or in the office. Users who frequently spend long hours in front of a display will benefit from AOC Flicker-FREE technology, which regulates the monitor’s brightness through a DC (direct current) backlight system and thus reduces the unpleasant flickering that so frequently causes eye discomfort and fatigue. A range of up-to-date inputs allow users to connect the monitor up with their gaming consoles, Blu-ray players or portable devices such as laptops. These include a DisplayPort, two HDMI inputs and D-Sub.

Capture.PNG

 

Source: AOC

You load 58GB and what do you get? A prettier settlement Garvey says is facing a threat

Subject: General Tech | February 1, 2017 - 06:50 PM |
Tagged: gaming, fallout 4

Arriving soon on Steam for Fallout 4 owners is a texture update which is rather impressive in size.  Not only do you need the space to fit a game that will triple in size, but the hardware requirements have gone up: at the least, an i7-5820K, an 8GB GTX 1080 or RX 490 8GB, and 8GB of RAM are recommended.  You will be able to add the content on Steam and easily remove it if your box is not quite up to the task. Rock, Paper, SHOTGUN were also told that next week will bring new features to mod content.  Follow the link for screenshots.

30fallout4hd.jpg

"Can your box blast bonny Boston? Is your rig ready for the roof felt? Can your hog handle HD hats? Is your silicon-snorting framecrusher pumped for 60 reps of sand a second? Will your deck deck the decking? Read on for the system requirements."


Microsoft's hot new idea for the server room; IR lasers

Subject: General Tech | February 1, 2017 - 05:51 PM |
Tagged: wireless, servers, firefly

Forget LiFi; Firefly uses infrared lasers to transmit data and torture acronyms.  Researchers out of Penn State, backed by Microsoft, are working on a way to get rid of the wiring in your server room and replace it with IR lasers and mirrors; hold the smoke.  By using multiplexed beams, they have created a proof-of-concept test which allows bi-directional data streams at 10 gigabits per second, though there is some work to be done before it is ready for a full test.

The mirrors would be MEMS controlled, ensuring that the signal should theoretically be able to reach any receiver, even ones obscured by other equipment.  Anyone sick of cable management or looking for new ways to keep people out of the server room can take a peek at the link to the research that The Register posted.  On the other hand, the simple act of walking into your server room, setting down a box, or even a leaf on the wind would be likely to cause downtime.  Could protective goggles be the newest sysadmin fashion faux pas?

500x_laser_01.jpg

"Shown off at Photonics West 2017 in San Francisco, Firefly (acronymically tortured out of Free-space optical Inter-Rack nEtwork with high FLexibilitY, we kid you not) proposes FSO to provide multiple 10 Gbps inter-rack links."


Source: The Register
Subject: Mobile
Manufacturer: Qualcomm

Introduction

In conjunction with Ericsson, Netgear, and Telstra, Qualcomm officially unveiled the first Gigabit LTE ready network. Sydney, Australia is the first city to have this new cellular spec deployed through Telstra. Gigabit LTE, dubbed 4GX by Telstra, offers up to 1Gbps download speeds and 150 Mbps upload speeds with a supported device. Gigabit LTE implementation took partnership between all four companies to become a reality with Ericsson providing the backend hardware and software infrastructure and upgrades, Qualcomm designing its next-gen Snapdragon 835 SoC and Snapdragon X16 modem for Gigabit LTE support, Netgear developing the Nighthawk M1 Mobile router which leverages the Snapdragon 835, and Telstra bringing it all together on its Australian-based cellular network. Qualcomm, Ericsson, and Telstra all see the 4GX implementation as a solid step forward in the path to 5G with 4GX acting as the foundation layer for next-gen 5G networks and providing a fallback, much the same as 3G acted as a fallback for the current 4G LTE cellular networks.

Gigabit LTE Explained

02-telstra-gigabit-lte-explained.jpg

Courtesy of Telstra

What exactly is meant by Gigabit LTE (or 4GX, as Telstra has dubbed the new cellular technology)? Gigabit LTE increases both the download and upload speeds of current generation 4G LTE to 1Gbps download and 150 Mbps upload by leveraging several technologies that optimize the signal transmission between the consumer device and the cellular network itself. Qualcomm designed the Snapdragon X16 modem to operate on dual 60MHz signals with 4x4 MIMO support or dual 80MHz signals without 4x4 MIMO. Further, they increased the modem's QAM support to 256-QAM (8 bits per symbol) from the current 64-QAM (6 bits), enabling 33% more data per stream - an increase from 75 Mbps to 100 Mbps per stream. The X16 modem leverages a total of 10 communication streams to deliver up to 1 Gbps, and it also offers access to previously inaccessible frequency bands using LAA (License Assisted Access) to help meet the increased bandwidth and speed needs of Gigabit LTE.
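
As a quick sanity check of that arithmetic, here is a minimal sketch in Python; the 75 Mbps per-stream baseline and the 10-stream total are the figures given above.

```python
# Reproducing the per-stream QAM arithmetic from the paragraph above.
from math import log2

bits_64qam = log2(64)     # 6 bits per symbol
bits_256qam = log2(256)   # 8 bits per symbol
gain = bits_256qam / bits_64qam - 1
print(f"256-QAM vs 64-QAM: {gain:.0%} more data per stream")   # ~33%

per_stream_64qam_mbps = 75
per_stream_256qam_mbps = per_stream_64qam_mbps * (bits_256qam / bits_64qam)
streams = 10
total_gbps = per_stream_256qam_mbps * streams / 1000
print(f"Per stream: {per_stream_256qam_mbps:.0f} Mbps; "
      f"{streams} streams: {total_gbps:.0f} Gbps")              # 100 Mbps, 1 Gbps
```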

Continue reading our coverage of the Gigabit LTE technology!

AMD Announces Q4 2016 and FY 2016 Results

Subject: Editorial | February 1, 2017 - 04:14 AM |
Tagged: Vega, ryzen, quarterly results, Q4 2016, Q4, FY 2016, amd, AM4

Today AMD announced their latest quarterly earnings.  There was much speculation as to how well or how poorly the company did, especially in light of Intel’s outstanding quarter and their record year.  Intel has shown that the market continues to be strong, even with the popular opinion that we are in a post-PC world.  Would AMD see a strong quarter, or would Intel take further bites out of the company?

icon.jpg

The results for AMD are somewhere in between.  It was not an overly strong quarter, but it was not weak either.  AMD saw strength in the GPU market with their latest RX series of GPUs for both desktop and mobile applications.  Their CPU sales seemingly were flat with limited new products in their CPU/APU stack.  AMD is still primarily shipping 32nm and 28nm products and will not introduce 14nm products until Ryzen in late Q1 of this year.  While AMD has improved their APU offerings at both mobile and desktop TDPs, they still rely on Carrizo and the Bristol Ridge derivative to provide new growth.  The company’s aging Piledriver based Vishera CPUs still comprise a significant portion of sales for the budget and midrange enthusiast markets.

The company had revenues of $1.11B US for Q4 with a $51M net loss.  Q3 featured revenues of $1.31B, but had a much larger loss of $293M.  The primary factor for that loss was the $340M charge for the adjusted wafer start agreement that AMD has with GLOBALFOUNDRIES.  AMD did make less this past quarter, but they were able to winnow their loss down to the $51M figure.  

While AMD stayed steady in the CPU/APU and GPU markets, their biggest decline came in semi-custom products.  This is understandable due to the longer lead times on these products as compared to AMD’s CPUs/APUs and GPUs.  The console manufacturers purchase these designs and then pay out royalties as the chips are produced.  Sony and Microsoft each had new console revisions for this holiday season that featured new SoC designs from AMD.  To hit the holiday rush, these companies made significant orders in Q2 and Q3 of 2016 to allow delivery in Q4.  Once those deliveries were made, Sony and Microsoft dramatically cut orders to allow good sell-through in Q4 and avoid massive unsold quantities in Q1 2017.  With fewer chips being delivered and royalties down, AMD obviously suffers at the hand of seasonality, typically one quarter sooner than Intel or NVIDIA does.

am4_01.jpg

For the year, AMD had nearly $300M more in revenue compared to 2015.  2016 ended at $4.27B as compared to 2015’s $3.99B.  This is generally where AMD has been for the past decade, but it is lower than in years past with successful parts like the Athlon and Athlon 64.  In 2005 AMD had $5.8B in revenue.  We see that AMD still has a way to go before matching some of their best years as a company.

One of the more interesting aspects is that even through these quarterly losses AMD has been able to increase their cash on hand.  AMD was approaching some $700M a few years back, and with the losses they were taking it would not have been many years before liquidity was non-existent.  AMD has been able to build that up to $1.26B at the end of this quarter, giving them more of a cushion to rely upon in tight times.

AMD’s year-on-year improvement is tangible, but it is made more impressive when considering how big an impact the $340M WSA charge had.  This shows that AMD has been very serious about cutting expenses and monetizing their products to the best of their ability.

This coming year should show further improvement for AMD due to a more competitive product stack in CPUs, APUs, and GPUs.  AMD announced that Ryzen will be launching sometime this March, hitting the Q1 expectations that the company set in the second half of 2016.  Prior to that, AMD thought they could push out limited amounts of Ryzen chips in late Q4 2016, but that did not turn out to be the case.  AMD has shown off multiple Ryzen samples running anywhere from a 3.2 GHz base clock to a potential engineering sample boosting up to 4 GHz.  Ryzen looks far more competitive against Intel’s current and upcoming products than anything AMD has offered in years.

am4_03.png

The GPU side will also be getting a boost in the first half of 2017.  It looks like the high-end Vega GPU will be launching in Q2 2017.  AMD has addressed the midrange and budget markets with the Polaris-based chips but has been absent at the high end with 14nm chips.  AMD still produces and sells Fury and Nano based offerings that somewhat address the area above the midrange, but they do not adequately compete with the NVIDIA GTX 1070 and 1080 products.  Vega looks to be competitive with what NVIDIA has at the high end, and there is certainly pent-up demand for an AMD card in that market.

AMD had a solid 2016 that showed that the current management team could successfully lead the company through some very challenging times.  The company continues to move forward and we shall see new products with CPUs, GPUs, and motherboards that should all materially contribute to and expand AMD’s bottom line.

Source: AMD

Razer Announces Yellow Mechanical Keyboard Switch

Subject: General Tech | February 1, 2017 - 12:49 AM |
Tagged: razer, mechanical keyboard

Since they ended their reliance upon Cherry’s MX line of switches, Razer has created (or co-created) their own line. Until this month, their desktop keyboards contained one of two color-coded entries: the Razer Green or the Razer Orange mechanical keyboard switch. The Green is designed to be similar to the Cherry MX Blue, with a 50cN activation force and a clicky response. The Orange, on the other hand, aims at the Cherry MX Brown, with a 45cN activation force and a bumpy response, without a click. As such, both of them have some sort of feedback at the point of activation.

(One cN weighs about as much as a gram on the surface of the Earth.)
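
For the curious, the conversion behind that parenthetical is just force divided by standard gravity; a minimal sketch:

```python
# Quick check of the cN-to-grams comparison above.
STANDARD_GRAVITY = 9.80665  # m/s^2

one_cn_in_newtons = 0.01
equivalent_grams = one_cn_in_newtons / STANDARD_GRAVITY * 1000
print(f"1 cN is the weight of about {equivalent_grams:.2f} g")  # ~1.02 g

# So the 45 cN and 50 cN activation forces mentioned above work out to roughly:
print(f"45 cN ~ {45 * equivalent_grams:.0f} g, 50 cN ~ {50 * equivalent_grams:.0f} g")
```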

razer-2017-switchyellow.png

This month, Razer announced the Razer Yellow switch. They claim this one is linear and silent, with an activation force of 45cN. Comparing back to my table, you would see this fits right in with the Cherry MX Red switch, although Razer has, again, changed the design slightly, mostly around travel distance. I’m personally not really a fan of linear switches on keyboards, mostly because I type and tend to bottom them out. Still, they are a beloved option for many, and now Razer provides the option.

The Razer Yellow switch is currently available only in the Razer BlackWidow Chroma V2.

Source: Razer