Star Wars Humble Bundle III; better than the movie!

Subject: General Tech | February 7, 2017 - 08:01 PM |
Tagged: gaming, Star Wars, humble bundle

Do you like Star Wars games, PCPer, and Unicef?  If so, there is a Humble Bundle perfect for you running for the next two weeks.  Depending on how much you pay you can get up to 15 games and an X-Wing versus TIE Fighter t-shirt, with a percentage of your purchase helping us to continue to provide the content you love.  There is some overlap with previous bundles you may have picked up, but for those of you missing KOTOR 1 or 2, The Force Unleashed 1 or 2, Shadows of the Empire or even the second Star Wars Battlefront game it is well worth the cost.

68ceffd08ee3ec146869d8eda9767d32eefcb7fa.png

How can you resist that t-shirt?


The new ASUS Maximus IX Formula is put through its paces

Subject: Motherboards | February 7, 2017 - 07:35 PM |
Tagged: z270 express, Maximus IX Formula, intel z270, ASUS ROG, asus

ASUS' Maximus Formula series has become familiar to high-end system builders, and the newest member looks to live up to expectations.  The list of features is comprehensive, including two M.2 slots, a U.2 slot, two USB 3.1 ports (one of them Type-C), and an ASUS 2T2R dual-band 802.11a/b/g/n/ac antenna.  [H]ard|OCP had mixed results when overclocking: some testers had a perfect experience while others ran into hurdles, which may be due to the processors they used, so do not immediately write this motherboard off.  Take a look at the full review before you decide one way or the other.

1486385016RAfE27Cjtg_1_8_l.jpg

"ASUS is nothing like Hollywood. ASUS can actually turn out sequels which not only match the originals, but surpass them. ASUS Republic of Gamers Maximus IX Formula is another sequel in the long line of Maximus motherboards. Can ASUS continue its long history of awesome sequels? One things for certain, it’s no Robocop 3."

Here are some more Motherboard articles from around the web:

Motherboards


Source: [H]ard|OCP

Intel's Atom C2xxx processors may just make like a banana and split

Subject: General Tech | February 7, 2017 - 06:31 PM |
Tagged: Intel, c2000, Avoton

"System May Experience Inability to Boot or May Cease Operation" is not the errata note you want to read, but for those running devices powered by an Intel Avoton C2xxx family Atom processor it is something to pay attention to.  The Low Pin Count bus clock may stop functioning permanently after the chip has been in service for a time, rendering the device non-functional.  Intel had little to say about the issue when approached by The Register but did state that there is a board level workaround available to resolve the issue.

The Avoton family of chips was released in 2013 and was designed to compete against ARM's new low-power server chips.  The flaw is likely responsible for the recently reported issues with Cisco routers; the chip can also be found in the Synology DS1815+ and some Dell server products.  It will be interesting to see how Intel responds to this issue, as they have a history of reluctance to discuss flaws in their products' architecture.

Avoton.png

"Intel's Atom C2000 processor family has a fault that effectively bricks devices, costing the company a significant amount of money to correct. But the semiconductor giant won't disclosed precisely how many chips are affected nor which products are at risk."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Logitech Announces BRIO Webcam: 4K and HDR

Subject: General Tech | February 7, 2017 - 09:31 AM |
Tagged: logitech, webcam, brio, 4k, hdr

Today’s announcement of the Logitech BRIO rolls in many features that have been lacking in webcams. With it, you can record in 720p30, 720p60, 1080p30, 1080p60, and, the big reveal, 4K30. It is also capable of shooting in HDR using RightLight 3, although they don’t specify color space formats, so it’s unclear what you will be able to capture with video recording software.

logitech-2017-brio-hero.png

On top of these interesting video modes, the camera also supports infrared for Windows Hello “or other facial recognition software”. Unlike Intel’s RealSense, the webcam claims support for the relatively ancient Core 2 and higher, which sounds promising for AMD users. I’m curious what open-source developers will be able to accomplish, especially if it’s general enough to do background rejection (and so forth). Obviously, this is just my speculation -- Logitech hasn’t even hinted at this in their documentation.

As you would expect for a 4K sensor, Logitech is also advertising quite a bit of digital zoom: they claim up to 5X, with a user-configurable FOV between 65 and 90 degrees.

Finally, the price is $199 USD / $249 CDN and it ships today.

Source: Logitech

Mozilla to Require Rust (and Dependencies) for Firefox

Subject: General Tech | February 7, 2017 - 07:47 AM |
Tagged: mozilla, firefox, web browser, Rust, llvm

Firefox 52 will be the company’s next Extended Support Release (ESR) branch of their popular web browser. After this release, Mozilla is planning a few changes that will break compatibility, especially if you’re building the browser from source. If you’re an end-user, the major one to look out for is Mozilla disabling NPAPI-based plugins (except Flash) unless you are using Firefox 52 ESR. This change will land in the consumer version of Firefox 52, though. It’s not really clear why they didn’t just wait until Firefox 53, rather than add a soft-kill in Firefox 52 and hard-code it in the next version, but that’s their decision. It really does not affect me in the slightest.

mozilla-rust.png

The more interesting change, however, is that Mozilla will begin requiring Rust (and LLVM) in an upcoming version. I’ve seen multiple sources claim Firefox 53, Firefox 54, and Firefox 55 as possible targets for this, but, at some point around those versions, critical components of the browser will be written in Rust. As more of the browser is migrated to this language, it should be progressively faster and more secure, as this language is designed to enforce memory safety and task concurrency.
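
If you are curious what that buys in practice, here is a minimal sketch (my example, not Mozilla's code) of the compile-time guarantee: Rust's ownership rules let the compiler reject a would-be data race before the program ever runs.

```rust
use std::thread;

fn main() {
    let pixels = vec![0u8; 1024];

    // Ownership of `pixels` moves into the spawned thread, so no other
    // thread (including this one) can touch the buffer concurrently.
    let worker = thread::spawn(move || {
        pixels.iter().map(|&p| p as u32).sum::<u32>()
    });

    // println!("{}", pixels.len()); // compile error: borrow of moved value,
    // which is how Rust rules out this class of data race at build time.

    println!("sum: {}", worker.join().unwrap());
}
```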

Firefox 52 is expected in March.

If you were going to sell Mechanical Keyboards, what name would you choose?

Subject: General Tech | February 6, 2017 - 10:26 PM |
Tagged: MK Fission, mechanical keyboard, input, Cherry MX

If you wanted MechanicalKeyboards.com then TechPowerUp has some bad news for you, as it is already taken.  When not brainstorming with Captain Obvious, they are the North American retailer for Ducky Keyboards, a name you might have heard before.  Their MK Fission comes in 18 flavours; you can only choose black or white keycaps, but you have your choice of the full range of Cherry switches.  If you have lost track of the score, that includes Red, Brown, Blue, Black, Silent Red, Speed Silver, Green, Clear and White.  The keyboard has blue backlighting, and the RGB disease has only infected the outer casing, giving it a look which might be familiar to anyone who knew someone in the '90s with questionable taste in car accessories.

mk-fission.jpg

"MechanicalKeyboards.com is a prominent retailer of mechanical keyboards, as the name would suggest, based in the USA. Today we get to take a look at their new MK Fission full size keyboard that comes in 18 possible options to choose from, Yes, there is RGB included but perhaps not the way you think."

Here is some more Tech News from around the web:

Tech Talk


Source: TechPowerUp

Lenovo Announces New ThinkPad P51s, P51 and P71 Mobile Workstations

Subject: Systems, Mobile | February 6, 2017 - 08:37 PM |
Tagged: xeon, Thinkpad, quadro, P71, P51s, P51, nvidia, notebook, mobile workstation, Lenovo, kaby lake, core i7

Lenovo has announced a trio of new ThinkPad mobile workstations, featuring updated Intel 7th-generation Core (Kaby Lake) processors and NVIDIA Quadro graphics, and among these is the thinnest and lightest ThinkPad mobile workstation to date in the P51s.

P51s.jpg

"Engineered to deliver breakthrough levels of performance, reliability and long battery life, the ThinkPad P51s features a new chassis, designed to meet customer demands for a powerful but portable machine. Developed with engineers and professional designers in mind, this mobile workstation features Intel’s 7th generation Core i7 processors and the latest NVIDIA Quadro dedicated workstation graphics, as well as a 4K UHD IPS display with optional IR camera."

Lenovo says that the ThinkPad P51s is more than a half pound lighter than the previous generation (P50s), stating that "the P51s is the lightest and thinnest mobile workstation ever developed by ThinkPad", at 14.4 x 9.95 x 0.79 inches and a starting weight of 4.3 lbs.

Specs for the P51s include:

  • Up to a 7th Generation Intel Core i7 Processor
  • NVIDIA Quadro M520M Graphics
  • Choice of standard or touchscreen FHD (1920 x 1080) IPS, or 4K UHD (3840 x 2160) IPS display
  • Up to 32 GB DDR4 2133 RAM (2x SODIMM slots)
  • Storage options including up to 1 TB (5400 rpm) HDD and 1 TB NVMe PCIe SSDs
  • USB-C with Intel Thunderbolt 3
  • 802.11ac and LTE-A wireless connectivity

Lenovo also announced the ThinkPad P51, which is slightly larger than the P51s, but brings the option of Intel Xeon E3-v6 processors (in addition to Kaby Lake Core i7 CPUs), Quadro M2200M graphics, faster 2400 MHz memory up to 64 GB (4x SODIMM slots), and up to a 4K IPS display with X-Rite Pantone color calibration.

Thinkpad_P51.jpg

Finally there is the new VR-ready P71 mobile workstation, which offers up to an NVIDIA Quadro P5000M GPU along with Oculus and HTC VR certification.

"Lenovo is also bringing virtual reality to life with the new ThinkPad P71. One of the most talked about technologies today, VR has the ability to bring a new visual perspective and immersive experience to our customers’ workflow. In our new P71, the NVIDIA Pascal-based Quadro GPUs offer a stunning level of performance never before seen in a mobile workstation, and it comes equipped with full Oculus and HTC certifications, along with NVIDIA’s VR-ready certification."

Thinkpad_P71.jpg

Pricing and availability are as follows:

  • ThinkPad P51s, starting at $1049, March
  • ThinkPad P51, starting at $1399, April
  • ThinkPad P71, starting at $1849, April
Source: Lenovo

Do you like turtles? Upgrade your online banter with the Turtle Beach Stream Mic

Subject: General Tech | February 6, 2017 - 07:46 PM |
Tagged: turtle beach, microphone, audio, Stream Mic

Upgrading your microphone from the one found on your gaming headset can make a significant difference in the way you sound online.  Being able to do so for around $50, and to use the same device on your PC, Xbox and PS4, might just convince some that the upgrade is worth it.  The Turtle Beach Multi-Format Stream Microphone can be moved between devices with a simple switch and it runs without any drivers.  It also has a built-in headphone amplifier so you can bring your headset with you without unplugging.  Drop by eTeknix for a look at it.

DSC_1916.jpg

"While many of us only need a standard headset with a simple boom mic, there’s a growing demand for higher quality microphones for both gamers and streamers, on Twitch, YouTube Live and much more. Turtle Beach are not the first to make a dedicated streaming microphone, but they are one of the more affordable options too, and their new Stream Mic comes with support for Xbox One, PlayStation 4 and PC, making it a tempting solution for the multi-format gamer and streamer."

Here is some more Tech News from around the web:

Audio Corner

Source: eTeknix

The first Cyber Grand Challenge; using AI to hunt bugs. What could go wrong?

Subject: General Tech | February 6, 2017 - 06:36 PM |
Tagged: darpa, ai, security, Usenix Enigma 2017

DARPA hosted the first Cyber Grand Challenge last summer, in which software from seven machine learning projects competed to find and patch vulnerabilities in a network, and to attack each other.  While the specific vulnerabilities discovered have not been made public, you can read a bit about what was revealed at Usenix Enigma 2017 over at The Register.  For instance, one of the programs managed to find a flaw in the OS all the machines were running on and then hack into another machine to steal data.  A different machine noticed this occurring and patched itself on the fly, making sure that it was protected from that particular attack.  Also worth noting is that the entire contest was over in 20 minutes.

enigma-logo.png

"The exact nature of these new bug types remains under wraps, although we hear that at least one involves exploitable vulnerabilities in data queues."

Here is some more Tech News from around the web:

Tech Talk


Source: The Register

Palit Introduces Fanless GeForce GTX 1050 Ti KalmX GPU

Subject: Graphics Cards | February 6, 2017 - 04:43 PM |
Tagged: video card, silent, Passive, palit, nvidia, KalmX, GTX 1050 Ti, graphics card, gpu, geforce

Palit is offering a passively-cooled GTX 1050 Ti option with their new KalmX card, which features a large heatsink and (of course) zero fan noise.

kalmx_1.jpg

"With passive cooler and the advanced powerful Pascal architecture, Palit GeForce GTX 1050 Ti KalmX - pursue the silent 0dB gaming environment. Palit GeForce GTX 1050 Ti gives you the gaming horsepower to take on today’s most demanding titles in full 1080p HD @ 60 FPS."

kalmx_3.jpg

The specs are identical to a reference GTX 1050 Ti (4GB GDDR5 @ 7 Gb/s, Base 1290/Boost 1392 MHz, etc.), so expect the full performance of this GPU - with some moderate case airflow, no doubt.
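
For context, that quoted 7 Gb/s memory speed works out as follows on the GTX 1050 Ti's 128-bit memory bus; a quick back-of-the-envelope sketch of mine, not a figure from Palit:

```rust
fn main() {
    let gbps_per_pin = 7.0;     // GDDR5 data rate quoted above
    let bus_width_bits = 128.0; // the GTX 1050 Ti's memory bus width
    // Aggregate bandwidth: per-pin rate times bus width, bits -> bytes.
    println!("{} GB/s", gbps_per_pin * bus_width_bits / 8.0); // 112 GB/s
}
```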

kalmx_2.jpg

We don't have specifics on pricing or availability just yet.

Source: Palit

Windows 10 Game Mode Gets Benchmarked, Still Needs Work

Subject: General Tech | February 5, 2017 - 04:54 AM |
Tagged: windows insider, Windows Game Mode, windows 10, pc gaming, creators update, beta

Last month Microsoft confirmed that a new "Game Mode" would be part of the upcoming Windows 10 Creators Update planned for a spring release ("early 2017"). Microsoft has recently started rolling out Game Mode to its beta testers in the latest Windows Insider preview build (for those on the fast track anyway; I am currently on the slow ring and do not have Game Mode yet). Now that it is out in preview form, gamers have naturally started benchmarking it, and PCGamesN has posted an article on their testing of the new feature, with findings on two Win32 games and one UWP game. Not to spoil the results, but at this point Game Mode does not appear to offer anything, and it can even result in fewer frames per second when turned on; its only saving grace is that in some situations it does offer increased performance when the Game DVR feature is also being used to record gameplay. They tested both an NVIDIA GTX 1060 and an AMD RX 480, and Game Mode in its current preview software on a preview OS appears to have more benefits for NVIDIA, while the AMD card PCGamesN tested mostly just did its thing regardless of whether Game Mode was turned on or off (heh, not necessarily a bad thing).

microsoft-2016-win10-event-groovemusicmaker.jpg

With Game Mode now rolling out to Windows Insiders, there is more information on how Microsoft plans to implement it. Rather than hiding it in the Xbox app, Microsoft has thankfully put it in the main Settings app under the Gaming category, and users access it by bringing up the Game Bar menu in-game for those games that support it (PCGamesN noted Doom and GTA V did not work). Game Mode is an OS-level feature that dedicates a certain number of CPU threads to the game when it is turned on and leaves the remaining threads to background processes (which themselves are reportedly minimized). Currently, this seems to work better with multi-threaded games; older games that were coded to use only one or two threads may not see any benefit from turning Game Mode on (and it may actually result in lower FPS). To Microsoft's credit, they are not over-promising with Game Mode, noting that it should be good for around 2% better performance when enabled, with a bigger impact on UWP titles.

I encourage you to check out the PCGamesN article, where they present their benchmark results in a number of bar graphs. Most of the tests saw little to no benefit from using Game Mode, but no negative effect either. Some games, like Hitman, saw a 6% increase in average frames per second on the GTX 1060. On the other side of things, Forza Horizon 3 (ironically, a UWP game) actually loses performance when Game Mode is turned on, to the tune of 13% to 23% lower FPS with the RX 480 and 9% to 15% lower with the GTX 1060. As for Tomb Raider, results are more middling: minimum frames per second stay the same or improve slightly when Game Mode and Game DVR are both turned on (though, oddly, there is a result in there that shows a performance drop with Game Mode on and Game DVR off).

It is also worth noting that, overall, the trend seems to be that Game Mode will be most beneficial at increasing minimum frame rates while the Game DVR feature is being used, more so than at raising a game's maximum or average FPS. The biggest hurdle is going to be game compatibility, especially for older games, and Microsoft tweaking things so that, at worst, Game Mode won't tank performance (like it currently does with Hitman minimum frame rates when Game Mode is on but DVR is off) and things stay the same as if Game Mode were not on at all, while at best gamers get slightly smoother gameplay.

gamemode1.jpg

Source: PCGamesN.com

Right now Game Mode is not compelling, but it is still a work in progress, and if Microsoft can get Game Mode right it could be a useful addition (and providing an incentive to upgrade to Windows 10 is probably why they are interested in pursuing this feature); it could come in handy especially on gaming laptops! I am not writing off the feature yet, and neither should you, but I do hope that compatibility is improved and the performance hits are reduced or eliminated. My guess is that the games that work well with Game Mode will be almost all newer titles, especially games developed after the Creators Update's final release with Game Mode in mind.

Hopefully we can use our Frame Rating methodology on the final product to see how well it truly works in terms of user experience and smooth gameplay. What are your thoughts on Windows 10's Game Mode?

Micron Planning To Launch GDDR6 Graphics Memory In 2017

Subject: Graphics Cards | February 4, 2017 - 08:29 PM |
Tagged: micron, graphics memory, gddr6

This year is shaping up to be a good one for memory, with the promise of 3D XPoint (Intel/Micron), HBM2 (SK Hynix and Samsung), and now GDDR6 graphics memory from Micron all launching this year. While GDDR6 was originally planned to launch next year, Micron recently announced its intention to start producing the memory chips by the latter half of 2017, which would put it much earlier than previously expected.

Micron Logo.png

Computer World reports that Micron cites the rise of e-sports and gaming driving a computer market that now sees three-year upgrade cycles rather than five-year cycles (I am not sure how accurate that is, however, as it seems like PCs are actually staying relevant longer between upgrades, but I digress) as the primary reason for shifting GDDR6 production into high gear and moving up the launch window. The company expects the e-sports market to grow to 500 million fans by 2020, and it is a growing market that Micron wants to stay relevant in.

If you missed our previous coverage, GDDR6 is the successor to GDDR5 and offers twice the bandwidth at 16 Gb/s (gigabits per second) per pin. It is also faster than GDDR5X (12 Gb/s) and uses 20% less power, which the gaming laptop market will appreciate. HBM2 still holds the bandwidth crown, though, as it offers 256 GB/s per stack and up to 1 TB/s with four stacks connected to a GPU on package.
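
Some back-of-the-envelope math on those figures; note the 256-bit bus below is a hypothetical mid-range configuration of my choosing, not anything Micron has announced:

```rust
fn main() {
    // Per-pin data rates in Gb/s: GDDR5 implied by the "twice the
    // bandwidth" claim, GDDR5X and GDDR6 as quoted above.
    for (name, gbps) in [("GDDR5", 8.0), ("GDDR5X", 12.0), ("GDDR6", 16.0)] {
        // Aggregate bandwidth on a hypothetical 256-bit bus, bits -> bytes.
        println!("{name}: {} GB/s", gbps * 256.0 / 8.0);
    }
    // HBM2 per the article: 256 GB/s per stack, four stacks on package.
    println!("HBM2 x4: {} GB/s (~1 TB/s)", 256 * 4);
}
```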

As such, High Bandwidth Memory (HBM2 and then HBM3) will power high-end gaming and professional graphics cards, while GDDR6 will become the memory used for mid-range cards, and GDDR5X (which is actually capable of going faster, but will likely not be pushed much past 12 Gbps if GDDR6 does come out this soon) will replace GDDR5 on most if not all of the lower-end products.

I am not sure if Micron’s reasoning of e-sports, faster upgrade cycles, and VR being the motivating factor(s) for ramping up production early is sound, but I will certainly take faster memory coming out sooner rather than later! Depending on exactly when in 2017 the chips start rolling off the fabs, we could see graphics cards using the new memory technology as soon as early 2018 (just in time for CES announcements? Oh boy, I can see the PR flooding in already! hehe).

Will Samsung change course and try for a 2017 release of its GDDR6 memory as well?

Are you ready for GDDR6?

Sony Will Add USB Hard Drive Support To PS4 and Boost Mode to PS4 Pro

Subject: General Tech | February 4, 2017 - 07:25 AM |
Tagged: usb 3.0, sony, ps4 pro, ps4, gaming, console

Sony is taking the wraps off of its latest firmware with the release of the version 4.50 “Sasuke” beta firmware for the PS4. With the new firmware, Sony is rolling out a number of UI/UX improvements, and users will finally be able to use external storage with the game console. On the PS4 Pro front, Sony will be adding a “boost mode” in a future update (it may not be ready in time for a production 4.50 release) that lets legacy games access the additional GPU horsepower of the Pro version of the console to smooth out frame rates without needing any special patches from the game developers.

Sony PS4 Pro.jpg

The new firmware adds support for USB 3.0 hard drives (or SSDs) up to 8TB. Users will be able to use the external storage to store games, downloaded applications, screenshots, and videos, and have it all show up on the main system menu along with the local storage. Users will not need to shuffle game data back and forth in order to play their games, either. Note that currently the actual save game data is still stored locally, even if the game itself is stored on the external hard drive. Fans of the PlayStation VR (PS VR) also get an update with firmware 4.50 in the form of support for watching 3D Blu-rays. Beyond those big feature updates, Sony is also changing up the interface slightly. The Quick Menu now takes up less screen space and will allow gamers to create and join parties right from there rather than going to a separate app. In the notification area, Sony has condensed all the various notification types into a single unified list. Further, users will be able to set in-game screenshots as the home screen wallpaper.

Perhaps most interesting is the planned “boost mode” for the PS4 Pro, which is currently in beta. Gamers are reporting that titles such as The Evil Within and Just Cause 3 see significantly smoother frame rates with noticeably reduced stuttering. Reportedly, the boost mode will work with most PS4 games that were programmed with unlocked frame rates, though the exact benefits will vary. Games that have a hard cap on the frame rate will still need specific patches from the game developers to get any improvements. Ars Technica speculates that the “boost mode” is simply Sony removing the blocks it put in place to force compatibility with older games that were developed with the base PS4 in mind. When boost mode is off, part of the PS4 Pro GPU is turned off so that it functions exactly like the PS4’s GPU; activating boost mode removes the blocks and allows the full GPU (with its 36 CUs) to process the game data as best it can. Getting things like native higher resolutions or more detailed textures will still require patches, of course.

If you have a PS4 or PS4 Pro, keep an eye on the beta Sasuke 4.50 firmware.

Also read:

Source: Ars Technica

Intel Has Started Shipping Optane Memory Modules

Subject: Memory | February 4, 2017 - 01:42 AM |
Tagged: XPoint, server, Optane, Intel Optane, Intel, big data

Last week Hexus reported that Intel has begun shipping Optane memory modules to its partners for testing. This year should see the launch of both these enterprise products designed for servers and the tiny application-accelerator M.2 solid state drives based on the Intel and Micron joint 3D memory venture. The modules that Intel is shipping are the former type of Optane memory, and will be able to replace DDR4 DIMMs (RAM) with a memory solution that is not as fast but is cheaper and has much larger storage capacities. The Optane modules are designed to slot into DDR4-type memory slots on server boards.

The benefit of such a product lies in big data and scientific workloads, where massive datasets can be held in primary memory and the processor(s) can access them at much lower latencies than if they had to reach out to mass storage on spinning rust or even SAS or PCI-E solid state drives. Holding all the data being worked on in one pool of memory will also be cheaper with Optane, as it is allegedly priced closer to NAND than RAM, and the cost of RAM adds up extremely quickly when you need many terabytes of it (or more!). Various technologies attempting to bring higher-capacity non-volatile and/or flash-based storage in memory module form have been theorized or in the works for years now, but it appears that Intel will be the first to roll out actual products.

Intel Optane Memory Module.JPG

It will likely be years before the technology trickles down to consumer desktops and notebooks, so slapping what would effectively be a cheap RAM disk into your PC is still a ways out. Consumers will get a small taste of Optane memory in the form of tiny storage drives that were rumored for a first-quarter 2017 release, following Intel's Kaby Lake Z270 motherboards. Previous leaks suggest that the Intel Optane Memory 8000P would come in 16 GB and 32 GB capacities in an M.2 form factor. With a single 128 Gb (16 GB) die, Intel is able to hit speeds that current NAND flash based SSDs can only hit with multiple dies. Specifically, the 16 GB Optane application accelerator drive is allegedly capable of 285,000 random read 4K IOPS, 70,000 random write 4K IOPS, sequential 128K reads of 1400 MB/s, and sequential 128K writes of 300 MB/s. The 32 GB Optane drive is a bit faster at 300,000 4K IOPS, 120,000 4K IOPS, 1600 MB/s, and 500 MB/s respectively.
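
To put those random-I/O figures in perspective, here is the throughput they imply; my arithmetic, using the leaked numbers above:

```rust
fn main() {
    let block = 4 * 1024u64; // 4 KiB random transfers
    let drives = [
        ("16 GB read ", 285_000u64, 1400u64),
        ("16 GB write", 70_000, 300),
        ("32 GB read ", 300_000, 1600),
        ("32 GB write", 120_000, 500),
    ];
    for (name, iops, seq_mbs) in drives {
        // Random throughput implied by IOPS x block size, in MB/s.
        println!("{name}: ~{} MB/s random vs {seq_mbs} MB/s sequential",
                 iops * block / 1_000_000);
    }
    // Random reads land near sequential speed -- the XPoint advantage
    // that NAND-based drives only approach with many dies in parallel.
}
```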

Unfortunately, I do not have any numbers on how fast the Optane memory that will slot into DDR4 slots will be, but seeing as two dies already max out the x2 PCI-E link they use in the M.2 Optane SSD, a dual-sided memory module packed with rows of Optane dies on the significantly wider memory bus is very promising. It should land somewhere closer to (but slower than) DDR4, yet much faster than NAND flash, while still being non-volatile (it doesn't need constant power to retain data).

I am interested to see what the final numbers are for Intel's Optane RAM and Optane storage drives. The company has certainly dialed down the hype for the technology as it approaches fruition, though that may have more to do with what they are able to do right now versus what the 3D XPoint memory technology itself is potentially capable of enabling. I look forward to what it will enable in the HPC market, and eventually what will be possible for the desktop and gaming markets.

What are your thoughts on Intel and Micron's 3D XPoint memory and Intel's Optane implementation (Micron's implementation is QuantX)?

Also read:

Source: Hexus

Report: AMD Ryzen Performance in Ashes of the Singularity Benchmark

Subject: Processors | February 4, 2017 - 01:22 AM |
Tagged: titan x, ryzen, report, processor, nvidia, leak, cpu, benchmark, ashes of the singularity, amd

AMD's upcoming 8-core Ryzen CPU has appeared online in an apparent leak showing performance from an Ashes of the Singularity benchmark run. The benchmark results, available here on imgur and reported by TechPowerUp (among others today), show the result of a run featuring the unreleased CPU paired with an NVIDIA Titan X graphics card.

Ryzen_Ashes_Screenshot.jpg

It is interesting to consider that this rather unusual system configuration was also used by AMD during their New Horizon fan event in December, with an NVIDIA Titan X and Ryzen 8-core processor powering the 4K game demos of Battlefield 1 that were pitted against an Intel Core i7-6900K/Titan X combo.

It is also interesting to note that the processor listed in the screenshot above is (apparently) not an engineering sample, as TechPowerUp points out in their post:

"Unlike some previous benchmark leaks of Ryzen processors, which carried the prefix ES (Engineering Sample), this one carried the ZD Prefix, and the last characters on its string name are the most interesting to us: F4 stands for the silicon revision, while the 40_36 stands for the processor's Turbo and stock speeds respectively (4.0 GHz and 3.6 GHz)."

March is fast approaching, and we won't have to wait long to see just how powerful this new processor will be for 4K gaming (and other, less important stuff). For now, I want to find results from an AotS benchmark with a Titan X and i7-6900K to see how these numbers compare!

Source: TechPowerUp

NVIDIA Releases Vulkan Developer 376.80 Beta Drivers

Subject: Graphics Cards | February 3, 2017 - 10:58 PM |
Tagged: nvidia, graphics drivers, vulkan

On February 1st, NVIDIA released a new developer beta driver, which fixes a couple of issues with their Vulkan API implementation. Unlike what some sites have been reporting, you should not download it to play games that use the Vulkan API, like DOOM. In short, it is designed for developers, not end-users. The goal is to provide correct results when software interacts with the driver, not the best gaming performance or anything like that.

khronos-2016-vulkanlogo2.png

In a little more detail, it looks like 376.80 implements the Vulkan 1.0.39.1 SDK. This update addresses two issues with accessing devices and extensions, under certain conditions, when using the 1.0.39.0 SDK. 1.0.39.0 was released on January 23rd, and thus it will not even be a part of current video games. Even worse, like most graphics drivers for software developers, it is based on the old GeForce 376 branch, so it won’t even have NVIDIA’s most recent fixes and optimizations. NVIDIA does this so they can add or change the features that Vulkan developers require without needing to roll in patches every time they make a "Game Ready" optimization or something. There is no reason to use this driver unless you are developing Vulkan applications and want to try out the new extensions. It will eventually make it to end users... when it's time.
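
For reference, Vulkan packs API versions into a single 32-bit value via the VK_MAKE_VERSION macro in vulkan.h; as I understand it, the fourth digit in "1.0.39.1" is the SDK's own revision and is not part of that packed value. A quick sketch:

```rust
// Mirrors Vulkan's VK_MAKE_VERSION macro: 10 bits major, 10 bits
// minor, 12 bits patch packed into one u32.
fn vk_make_version(major: u32, minor: u32, patch: u32) -> u32 {
    (major << 22) | (minor << 12) | patch
}

fn main() {
    let v = vk_make_version(1, 0, 39); // header behind the 1.0.39.x SDK
    println!("packed:  {v:#010x}");    // 0x00400027
    println!("decoded: {}.{}.{}", v >> 22, (v >> 12) & 0x3ff, v & 0xfff);
}
```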

If you are wishing to develop software using Vulkan’s bleeding-edge features, then check out NVIDIA’s developer portal to pick up the latest drivers. Basically everyone else should use 378.49 or its 378.57 hotfix.

Source: NVIDIA

Learn about AMD's Vega Memory Architecture

Subject: General Tech | February 3, 2017 - 06:40 PM |
Tagged: amd, Vega, Jeffrey Cheng

Tech ARP had a chance to talk with AMD's Jeffrey Cheng about the new Vega GPU memory architecture.  He provided some interesting details, such as the fact that the new architecture can handle up to 512 TB of addressable memory.  With such a large pool it would be possible to store data sets in HBM2 memory to be passed to the GPU, as opposed to having them sit in general system memory.  Utilizing the memory present on the GPU could also reduce costs and energy consumption, not to mention the fact it will perform far more quickly.  Pop by to watch the video to see how he feels this could change the way games and software are programmed.

AMD-Vega-Tech-Report-15.jpg

"Want to learn more about the AMD Vega memory architecture? Join our Q&A session with AMD Senior Fellow Jeffrey Cheng at the AMD Tech Summit!"

Here is some more Tech News from around the web:

Tech Talk

Source: TechARP

Yes, Acer was the company that released the XR382CQK bmijqphuzx Display

Subject: Displays | February 2, 2017 - 08:16 PM |
Tagged: ultra-widescreen, freesync, adaptive sync

Yes, this is the product Ryan mentioned: a curved 37.5" IPS adaptive sync display from Acer.  As opposed to yesterday, today Quad HD refers to a 3840x1600 ultra-wide screen with a 2300R curve, making shopping for a monitor even easier, before you even try to type in the model number.  It supports Adaptive Sync, with a refresh rate that tops out at 75Hz; sorry, G-SYNC fans.

XR382CQK_sku_main.png

As with yesterday's model it has a slimmed-down bezel, called ZeroFrame in this case.  It supports HDMI 1.3 10-bit colour, or at least states it offers 1.07 billion colours, as well as a 100,000,000:1 contrast ratio and 300 nit brightness.  The monitor also includes DTS Sound speakers and has a USB 3.0 Type-C port.  You can read a bit more about it here.
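
That "1.07 billion colours" figure is just the 10-bit arithmetic; a quick sanity check:

```rust
fn main() {
    // 10 bits per channel -> 1024 shades each of red, green, and blue.
    let shades: u64 = 1 << 10;
    println!("{} colours", shades.pow(3)); // 1073741824, i.e. "1.07 billion"
}
```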

XR382CQK_gallery_04.png

Source: Acer

Noctua announces three new AM4 CPU coolers for AMD Ryzen

Subject: Cases and Cooling | February 2, 2017 - 07:34 PM |
Tagged: Socket AM4, ryzen, noctua, NH-U12S SE-AM4, NH-L9x65 SE-AM4, NH-D15 SE-AM4, amd

If you are already planning your new AMD Ryzen build and are wondering which gigantic brown and tan coolers might work, Noctua has a page that will make you smile.  They have listed all of their current coolers which can be made compatible with AM4 using a free adapter that you can order from that page.  They also list some which could be made compatible but are not eligible for the free adapter, and those which will not work at all.

notua am4.PNG

Along with the compatibility list come three brand new coolers, which you can see larger than life by clicking on their names.  The NH-D15 SE-AM4 is a contender for Morry's next favourite cooler for mATX boards: 980g of metal, and that is before you add the two 140mm fans.  The NH-U12S SE-AM4 is a slimmer 580g but is still 158mm tall and will use a 120mm fan.  For those who prefer their coolers in petite sizes, the NH-L9x65 SE-AM4 is a svelte 340g and stands a mere 65mm while wearing its custom-fit 92mm fan.

You can pick them up soon: the NH-D15 SE-AM4 at $99.90, the NH-U12S SE-AM4 for $64.90, and the NH-L9x65 SE-AM4 at $52.90.  PR below the fold.

Source: Noctua

Well, forget about gifting those bundled NVIDIA games

Subject: General Tech | February 2, 2017 - 05:54 PM |
Tagged: nvidia, GFE, game bundle, geforce experience 3.0

Giving away a bonus to your customers is a nice thing to do; watching them jump through hoops to get the bonus is less so.  Many companies have realized that offering a mail-in rebate is a great way to look like you are doing something for your customers while at the same time ensuring a significant percentage of those customers never actually claim said MIR.  Strangely, this practice has not impressed consumers.

NVIDIA started to embrace something similar towards the end of 2016 with GeForce Experience 3.0, requiring a mandatory login to get at beta drivers and, more pertinently, giveaways.  That login requirement includes all the game bundles, such as the one announced yesterday.  Ars Technica reported something interesting this morning that anyone thinking of picking up one of the game bundles should be aware of: NVIDIA now intends to tie these game codes to hardware.  Currently you will need to sign into GFE, verify your code and then allow GFE to verify you have a GTX 1070 or 1080 installed in your system, which strongly suggests you will need to install software.  Ars speculates that this could one day be tied directly to a card via a hardware ID or serial number, making it impossible to give away.

The rationale offered references an incident with Microsoft a few months back, when "some Gears 4 Windows 10 keys were obtained illegitimately via our Nvidia promotion", thanks to a loophole created by Amazon's return policy.  Of course, some of those so-called illegitimate installations were caused by someone giving away or even selling a game key which they obtained legally, because they had already purchased Gears 4.  It is unclear if NVIDIA only pays for codes which are redeemed or if the money has already been invested; if the former, it at least makes financial sense, and if the latter, then Bird Culture has an appropriate phrase.

Once again, everyone must be punished thanks to the overreaction caused by a few smegheads.  Keep this in mind when you are shopping in the future.

nvidia-geforce-gtx-prepare-for-battle-for-honor-and-ghost-recon-wildlands-bundle-640px.jpg

"GFE then performs "a hardware verification step to ensure the coupon code is redeemed on the system with the qualifying GPU." It's not yet clear whether the codes will be tied to a specific serial number/hardware identifier, or whether they will be tied to an overall product line like a GTX 1070 or GTX 1080."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica