Subject: General Tech | October 25, 2015 - 07:28 AM | Scott Michaud
Tagged: microsoft, edge
One thing that will not be in the November update for Windows 10 is extension support for Microsoft Edge. The browser will still be updated in general, but that feature needs a little more time before it is ready for the public. The official statement has the feature arriving in “a future Windows 10 update in 2016”. We still don't know how frequently these updates will occur, but Mary Jo Foley has sources that say “Redstone 1” will be released in June (give or take).
To me, this means that it's either far off, or completely mundane.
Subject: General Tech | October 25, 2015 - 07:01 AM | Scott Michaud
Tagged: windows 10, microsoft
We have been expecting a relatively major update to Windows 10 in the October-ish time frame (which at some point the rumors slipped to November-ish) since the OS launched in July. We already know much of what will be in it, based on the preview builds sent to Windows Insider participants, so the contents are not really a surprise either. It will update a few user interface elements, tweak how the system manages memory, and allow clean installs using Windows 7 and 8.1 product keys that qualify for Windows 10 upgrades.
Really, the major news is how the update will be delivered. I was honestly expecting to do the in-place upgrades that each new Insider build forced upon users. This made sense to me. If you have installed Windows 7 recently, you will know that it is a several-hour updating process that involves several reboots and gigabytes of patches. The build metaphor makes sense in a “Windows as a Service” universe, where all PCs are pushed from milestone to milestone with a few incremental patches in between.
Apparently, it will just be pushed through Windows Update as an item named “Windows 10 November 2015”. That's it. Pretty much the same experience as downloading service packs over Windows Update in previous versions. Oddly familiar, especially given the effort they put into the in-place upgrade interface over the last year and a bit.
Maybe we'll see that in future feature-updates?
Subject: General Tech | October 24, 2015 - 10:54 PM | Sebastian Peak
Tagged: warner bros, steam, release, re-release, pc gaming, batman arkham knight
Four months after being pulled from sale due to performance woes, Batman: Arkham Knight is being re-released for PC (along with a new patch containing all of the fixes) on October 28.
Image credit: Warner Bros.
From the official statement:
“At 10 am PDT, Oct. 28th, Batman: Arkham Knight will be re-released for the PC platform. At the same time we’ll also be releasing a patch that brings the PC version fully up-to-date with content that has been released for console (with the exception of console exclusives).
This means that next week, all PC players will have access to Photo Mode, Big Head Mode, Batman: Arkham Asylum Batman Skin, and character selection in combat AR challenges.”
After such a rough introduction and a long absence following its unprecedented removal from sale on Steam, is there any chance Warner Bros. will still attempt to charge full price for the re-released game? Such a move might be considered controversial, but we will have to wait and see, as pricing was not announced.
Subject: Graphics Cards, Displays | October 24, 2015 - 04:16 PM | Ryan Shrout
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz
In the comments to our recent review of the ASUS ROG Swift PG279Q G-Sync monitor, a commenter by the name of Cyclops pointed me in the direction of an interesting quirk that I hadn’t considered before. According to reports, the higher refresh rates of some panels, including the 165Hz option available on this new monitor, can cause power draw to increase by as much as 100 watts on the system itself. While I did say in the review that the larger power brick ASUS provided with it (compared to last year’s PG278Q model) pointed toward higher power requirements for the display itself, I never thought to measure the system.
To set up a quick test I brought the ASUS ROG Swift PG279Q back to its rightful home in front of our graphics test bed, connected an EVGA GeForce GTX 980 Ti (with GPU driver 358.50), and hooked both the PC and the monitor up to separate power monitoring devices. While sitting at the Windows 8.1 desktop I cycled the monitor through different refresh rate options and recorded the power draw from both meters after 60-90 seconds of idle time.
The results are much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.
But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.
Interestingly, we did find that the system would repeatedly jump to as much as 200+ watts of idle power draw for 30 seconds at a time and then drop back down to the 135-140 watt range for a few minutes. It was repeatable and very measurable.
So, what the hell is going on? A look at GPU-Z clock speeds reveals the source of the power consumption increase.
When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.
Though details are sparse, it seems pretty obvious what is going on here. The pixel clock and the GPU clock are connected through the same domain and are not asynchronous. The GPU needs to maintain a certain pixel clock in order to support the required bandwidth of a particular refresh rate, and based on our testing, the idle clock speed of 135MHz doesn’t give the pixel clock enough throughput to power anything more than a 120Hz refresh rate.
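As a back-of-the-envelope check on that bandwidth argument, you can estimate the pixel clock a given mode needs by multiplying the total (active plus blanking) pixels per frame by the refresh rate. The blanking totals below are rough CVT-style assumptions of my own, not the PG279Q's actual timings, so treat the numbers as illustrative:

```python
# Rough pixel-clock estimate for a 2560x1440 panel at several refresh
# rates. The blanking values are assumed, not the monitor's real timings.

H_TOTAL = 2560 + 160   # active width + assumed horizontal blanking
V_TOTAL = 1440 + 41    # active height + assumed vertical blanking

def pixel_clock_mhz(refresh_hz):
    """Pixel clock (MHz) = total pixels per frame x frames per second."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 100, 120, 144, 165):
    print(f"{hz:3d} Hz -> ~{pixel_clock_mhz(hz):5.0f} MHz pixel clock")
```

Under these assumptions the requirement climbs from roughly 242 MHz at 60Hz to about 665 MHz at 165Hz, which illustrates why a fixed idle GPU clock can run out of headroom somewhere between 120Hz and 144Hz.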
Pushing refresh rates of 144Hz and higher causes a surprising increase in power draw
The obvious question here though is why NVIDIA would need to go all the way up to 885MHz in order to support the jump from 120Hz to 144Hz refresh rates. It seems quite extreme and the increased power draw is significant, causing the fans on the EVGA GTX 980 Ti to spin up even while sitting idle at the Windows desktop. NVIDIA is aware of the complication, though it appears that a fix won’t really be in order until an architectural shift is made down the road. With the ability to redesign the clock domains available to them, NVIDIA could design the pixel and GPU clock to be completely asynchronous, increasing one without affecting the other. It’s not a simple process though, especially in a processor this complex. We have seen Intel and AMD correctly and effectively separate clocks in recent years on newer CPU designs.
What happens to a modern AMD GPU like the R9 Fury with a similar test? To find out we connected our same GPU test bed to the ASUS MG279Q, a FreeSync enabled monitor capable of 144 Hz refresh rates, and swapped the GTX 980 Ti for an ASUS R9 Fury STRIX.
The AMD Fury does not demonstrate the same phenomenon as the GTX 980 Ti when running at high refresh rates. The Fiji GPU runs at the same static 300MHz clock rate at 60Hz, 120Hz and 144Hz, and the power draw on the system only inches up by 2 watts or so. I wasn't able to test 165Hz refresh rates on the AMD setup, so it is possible that at that threshold the AMD graphics card would behave differently. It's also true that the NVIDIA Maxwell GPU is running at less than half the clock rate of AMD Fiji in this idle state, and that may account for the difference in pixel clocks we are seeing. Still, the NVIDIA platform draws slightly more power at idle than the AMD platform, so advantage AMD here.
For today, know that if you choose to use a 144Hz or even a 165Hz refresh rate on your NVIDIA GeForce GPU you are going to be drawing a bit more power and will be less efficient than expected even just sitting in Windows. I would bet that most gamers willing to buy high end display hardware capable of those speeds won’t be overly concerned with 50-60 watts of additional power draw, but it’s an interesting data point for us to track going forward and to compare AMD and NVIDIA hardware in the future.
Subject: Graphics Cards | October 23, 2015 - 03:19 PM | Jeremy Hellstrom
Tagged: linux, amd, nvidia, steam os
Steam Machines powered by SteamOS are due to hit stores in the coming months, and in order to get the best performance you need to make sure that the GPU inside the machine plays nicely with the new OS. To that end Phoronix has tested 22 GPUs: 15 NVIDIA cards ranging from a GTX 460 straight through to a TITAN X, and seven AMD cards from an HD 6570 through to the new R9 Fury. Part of the reason they used fewer AMD cards stems from driver issues which prevented some models from functioning properly. They tested Bioshock Infinite, both Metro games, CS:GO, and one of Josh's favourites, DiRT Showdown. The performance results may not be what you expect and are worth checking out in full. Phoronix also included cost-to-performance findings for budget-conscious gamers.
"With Steam Machines set to begin shipping next month and SteamOS beginning to interest more gamers as an alternative to Windows for building a living room gaming PC, in this article I've carried out a twenty-two graphics card comparison with various NVIDIA GeForce and AMD Radeon GPUs while testing them on the Debian Linux-based SteamOS 2.0 "Brewmaster" operating system using a variety of Steam Linux games."
Here are some more Graphics Card articles from around the web:
- MSI GeForce GTX 980 Ti LIGHTNING @ [H]ard|OCP
- Gigabyte GTX 950 Xtreme @ Modders-Inc
- The EVGA GTX 980 Ti Hybrid Review @ Hardware Canucks
- MSI GTX 980 Ti Sea Hawk Review @ Hardware Canucks
- EVGA GeForce GTX 980 Ti FTW Graphics Card Review @ Techgage
- PNY GTX 950 2GB @ eTeknix
- MSI GTX 980 Ti Lightning 6GB @ Kitguru
- PowerColor Devil 13 Dual Core R9 390 @ [H]ard|OCP
Subject: Processors | October 23, 2015 - 02:21 PM | Sebastian Peak
Tagged: Xeon D, SoC, rumor, report, processor, Pentium D, Intel, cpu
Intel's Xeon D SoC lineup will soon expand to include 12-core and 16-core options, after the platform launched earlier this year with the option of 4 or 8 cores for the 14 nm chips.
The report yesterday from CPU World offers new details on the refreshed lineup which includes both Xeon D and Pentium D SoCs:
"According to our sources, Intel have made some changes to the lineup, which is now comprised of 13 Xeon D and Pentium D SKUs. Even more interesting is that Intel managed to double the maximum number of cores, and consequentially combined cache size, of Xeon D design, and the nearing Xeon D launch may include a few 12-core and 16-core models with 18 MB and 24 MB cache."
The move is not unexpected as Intel initially hinted at an expanded offering by the end of the year (emphasis added):
"...the Intel Xeon processor D-1500 product family is the first offering of a line of processors that will address a broad range of low-power, high-density infrastructure needs. Currently available with 4 or 8 cores and 128 GB of addressable memory..."
Current Xeon D Processors
The new flagship Xeon D model will be the D-1577, a 16-core processor with between 18 and 24 MB of L3 cache (exact specifications are not yet known). These SoCs feature integrated platform controller hub (PCH), I/O, and dual 10 Gigabit Ethernet, and the initial offerings had up to a 45W TDP. It would seem likely that a model with double the core count would either necessitate a higher TDP or simply target a lower clock speed. We should know more before too long.
For further information on Xeon D, please check out our previous coverage:
- New Intel Xeon D Broadwell Processors Aimed at Low Power, High Density Servers @ PC Perspective.
- Xeon D Podcast Discussion at 0:40:35 (YouTube or downloadable audio).
Subject: General Tech | October 23, 2015 - 01:00 PM | Jeremy Hellstrom
Tagged: google, youtube, youtube red
Google is taking advantage of its near monopoly on online streaming once again. Earlier this year they dropped ad revenue for content creators down to 55%, significantly lower than competitors such as Spotify. Now they are essentially repeating what they did just over a year ago with independent artists: either you sign up for YouTube Red or your content will no longer be visible to anyone. This will only affect those content contributors who make a fair amount of ad revenue; the average uploader will not need to pay the $10/month to ensure their videos are not blocked. One question that doesn't seem to be answered at either The Register or TechCrunch is the effect YouTube Red will have on ad revenue: if you sign up for the service as a viewer you will no longer see ads, so how exactly will content creators make anything from ads that no longer show up or generate revenue?
"YouTube Red is Google's ad-free subscription service, and rolls up both music and video for $9.99 a month. Google Play subscribers will be opted in, and find that Red videos will be available offline too. Amateur uploaders aren't affected: what Google wants to do is nail down producers who have drawn an audience, and who already draw a tangible quantity of shared advertising revenue."
Here is some more Tech News from around the web:
- Ubuntu 15.10 'Wily Werewolf' Released @ Slashdot
- China: OK, Seagate, you may now kiss your bride (Samsung's disk biz) @ The Register
- Microsoft will release first major Windows 10 update in early November @ The Inquirer
- Microsoft introduces Visual Studio beta bug bounty programme @ The Inquirer
- Acer establishes subsidiary to enter electric vehicle power system and battery industries @ DigiTimes
- NVIDIA GeForce Experience Quarter 4 2015 Update Analysis @ NitroWare
Subject: Graphics Cards | October 23, 2015 - 01:49 AM | Sebastian Peak
Tagged: tape out, rumor, report, Radeon 400 Series, radeon, graphics card, gpu, Ellesmere, Baffin, amd
Details are almost nonexistent, but a new report claims that AMD has reached tape out for an upcoming Radeon 400 series of graphics cards, which could be the true successor to the R9 200-series after the rebranded 3xx cards.
Image credit: WCCFtech
According to the report:
"AMD has reportedly taped out two of its next-gen GPUs, with "Ellesmere" and "Baffin" both taping out - and both part of the upcoming Radeon 400 series of video cards."
I wish there was more here to report, but if this is accurate we should start to hear some details about these new cards fairly soon. The important thing is that AMD is working on new performance mainstream cards so soon after releasing what was largely a simple rebrand across much of the 300-series lineup this year.
Subject: Storage | October 23, 2015 - 01:28 AM | Sebastian Peak
Tagged: storage, ssd, solid-state drive, Samsung, PM1725, enterprise
The 950 Pro SSD is here, (and Allyn has the full review right here) and while it's the fastest consumer SSD out there, the latest enterprise SSD demo from Samsung is absolutely insane.
Image credit: Kit Guru
The PM1725 has a PCI Express 3.0 x8 interface, and a 2.5" version will also be available (though limited to PCI Express 3.0 x4). With read speeds in excess of 6.2 GB/s the PM1725 sounds like a RAM disk, and if that wasn't enough, the drive managed a million IOPS in a demo at Dell World in Austin, Texas.
Image credit: Tom's Hardware
Tom's Hardware had hands-on time with the card and was able to run a few benchmarks verifying the outlandish speeds from this SSD, with their 6.2+ GB/s result coming from a 128k QD32 sequential test, with the IOPS test run as a 4k random read.
Image credit: Tom's Hardware
I'm sure the price will be similarly out of this world, and this of course isn't a consumer-oriented (or likely even bootable) option. For now the Samsung 950 Pro is the object of NVMe desire for many, and at $199.99 ($0.78/GB) for the 256 GB model and $349.99 ($0.68/GB) for the 512 GB model on Amazon.com the 950 Pro is pretty reasonable - even if it "only" offers up to 2.5 GB/s reads and 1.5 GB/s writes. I'd certainly take it!
Subject: Graphics Cards | October 23, 2015 - 12:29 AM | Sebastian Peak
Tagged: r9 nano, mITX, mini-itx, graphics card, gpu, asus, amd
AMD's Radeon R9 Nano is a really cool product, able to provide much of the power of the bigger R9 Fury X without needing more than a standard air cooler, and doing so at an impossibly tiny size for a full graphics card. And while mini-ITX graphics cards serve a small segment of the market, who might be buying a white one when it is released?
According to a report published first by Computer Base in Germany, ASUS is releasing an all-white AMD R9 Nano, and it looks really sharp. The stock R9 Nano is no slouch in the looks department as you can see here in our full review of AMD's newest GPU, but with this design ASUS provides a totally different look that could help unify the style of your build depending on your other component choices. White is just starting to show up for things like motherboard PCBs, but it's pretty rare in part due to the difficulty in manufacturing white parts that stay white when they are subjected to heat.
There was no mention of a specific release window for the ASUS R9 Nano White, so we'll have to wait for official word on that. It is possible that ASUS has also implemented their own custom PCB, though details are not known just yet. We should know more by the end of next month, according to the report.
Subject: Cases and Cooling | October 22, 2015 - 04:08 PM | Jeremy Hellstrom
Tagged: mini-itx, fractal design, core 500
For those building an HTPC or who prefer a tiny system to a full-sized ATX build, Fractal Design is a common choice of case maker. Their newest is the Core 500 Mini-ITX case, measuring 250x213x380mm (9.8x8.0x14.4"), with a single 5.25" bay on the front, room for up to six internal drives mixed between 3.5" and 2.5", and a front panel with two USB 3.0 ports plus headphone and microphone jacks. The Tech Report liked the spartan exterior but did have some problems when installing components: the all-in-one liquid cooler they used had trouble fitting, and larger GPUs will also prove problematic. On the other hand, with a $60 price tag the case is much less expensive than other mini-ITX cases, and if you plan your components carefully you shouldn't have issues fitting them into the Core 500.
"Fractal Design's Core 500 is the company's take on a Mini-ITX case that stays compact while making room for big radiators and graphics cards, along with plenty of storage. We poked around and put our Casewarmer test system inside to see how the Core 500 measures up."
Here are some more Cases & Cooling reviews from around the web:
- Thermaltake Suppressor F51 Mid-Tower Case Review @ Neoseeker
- Nanoxia Deep Silence 5 Rev. B @ techPowerUp
- SilverStone Fortress FTZ01 Mini-ITX @ Benchmark Reviews
- Silverstone Raven RVZ02B-W & SX500-LG @ Legion Hardware
- Noctua NH-D15S CPU Cooler Review: How the Best Got Better @ Modders-Inc
- Scythe Ninja 4 @ techPowerUp
- Noctua NH-C14S CPU Cooler Review: Balance Through Asymmetry @ Modders-Inc
Subject: General Tech | October 22, 2015 - 02:38 PM | Jeremy Hellstrom
Tagged: Merlin Falcon, Excavator, carrizo, amd
On your latest flight you may have noticed some branding on the displays powering the schedules and in-flight entertainment; or, if you were flying to Vegas, perhaps you didn't notice it until you were playing the slots. If you were paying attention you would have seen that the display was powered by AMD, as are many POS, medical, and even military displays. A new series of Excavator-based processors was announced today: the Merlin Falcon, which has four Excavator cores, a third-generation GCN Radeon GPU, and support for both DDR3 and DDR4 RAM.
Yes, that is right: the first DDR4 chip from AMD is arriving, but you won't be running it in your desktop. You should probably be jealous, as this processor will have HSA 1.0, hardware-based HEVC/H.265 video decode, DirectX 12 support, and even the ARM co-processor that provides AMD's new Secure Processor feature. There is more at The Register if you follow the link.
"AMD will today unveil Merlin Falcon, its latest R-series processor aimed at industrial systems, medical devices, gambling machines, digital signs, military hardware, and so on."
Here is some more Tech News from around the web:
- SanDisk, Toshiba to jointly make 3D flash memory @ DigiTimes
- Michael Dell berates Microsoft's Nadella about high price of Surface tablet @ The Inquirer
- Square Enix To Concentrate On Remaking Their Back Catalog @ Slashdot
- Marvell, Longsys partner to make SSDs @ DigiTimes
- IoT's sub-GHz 802.11ah Wi-Fi will be dead on arrival, warn analysts @ The Register
- Amazon Fire TV Review @ Hardware Secrets
Subject: General Tech | October 22, 2015 - 02:34 PM | Scott Michaud
Tagged: pc gaming, audio
Over the last couple of months, we have highlighted the work of The iBook Guy because it's very interesting. He recently announced a rebrand to “The 8-Bit Guy” because he hasn't published an iBook video “in quite some time”. If you have been a long-time follower of PC Perspective, you'll know that we have a history of changing our name to slightly less restrictive titles. Ryan initially named this site after the K7M motherboard, then Athlon motherboards in general, then AMD motherboards, then PC Perspective. I guess we shouldn't cover mobile or console teardowns...
Anywho... back to The 8-Bit Guy. This time, his video discusses how old PCs played (or, more frequently, synthesized) audio. He covers early, CPU-driven audio, which was quickly replaced by dedicated sound cards in the 1980s. Those cards could drive audio waves that were square, triangle, noise, or PCM (microphone-sampled), and these four types were combined to make all of the music and sound effects of the time.
This brings us to today. He notes that, with modern computers having so much storage and RAM, we end up just mixing everything into an audio file and playing that. This is where we can expand a little. Until around the Vista era, sound cards had been increasing in voice count. One of the last examples was the Creative Sound Blaster X-Fi. This card implemented Creative's EAX 5.0 standard, which allowed up to 128 voices in games like Battlefield 2, and that was about it. When Microsoft released Vista, they replaced the entire audio stack with a software-based one. They stated that sound card drivers were a major cause of bluescreen errors, and thus almost everything was moved out of the kernel.
At around this time, voice limits were removed. They don't make sense anymore because mixing is no longer being done in hardware. Nowadays, even websites through Web Audio API can play thousands of sounds simultaneously, although that probably will sound terrible in practice.
Audio processing doesn't end here, though. Now that we can play as many sounds as we like, and can do so with complete software control over the PCM waves, the problem is shifted into an algorithmic one.
This is an area that I, personally, am interested in.
Earlier this year, I created a demo in WebCL that rendered 20,000 - 30,000 sounds on an Intel HD 4600 GPU, with stereo positioning and linear distance falloff, while the system's main NVIDIA GeForce GTX 670 was busy drawing the WebGL scene. The future goal was to ray-trace (high frequency) and voxelize (low frequency) sound calls based on the environment, to simulate environmentally-accurate reverbs and echoes. Over the summer, I worked with a graduate student from Queen's University to offload audio in the Unity engine (I preferred Unreal). We have not yet introduced geometry.
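For a flavor of what “stereo positioning and linear distance falloff” means in practice, here is a minimal Python sketch of those two effects. The function names, the cutoff distance, and the equal-power pan law are my own illustrative choices, not code from the demo:

```python
import math

# Toy sketch of per-voice gain calculation: linear distance falloff
# plus equal-power stereo panning. All names and constants here are
# illustrative assumptions, not taken from the actual WebCL demo.

MAX_DISTANCE = 50.0  # assumed cutoff beyond which a source is silent

def stereo_gains(source_x, source_y, listener_x=0.0, listener_y=0.0):
    """Return (left, right) gain for a 2D source position."""
    dx = source_x - listener_x
    dy = source_y - listener_y
    distance = math.hypot(dx, dy)

    # Linear distance falloff: full volume at the listener, silent at cutoff.
    attenuation = max(0.0, 1.0 - distance / MAX_DISTANCE)

    # Equal-power panning based on the horizontal offset of the source.
    pan = 0.0 if distance == 0 else dx / distance   # -1 (left) .. +1 (right)
    angle = (pan + 1.0) * math.pi / 4.0             # 0 .. pi/2
    left = math.cos(angle) * attenuation
    right = math.sin(angle) * attenuation
    return left, right
```

Each of the thousands of voices just needs its per-sample output scaled by these two gains, which is why this kind of workload parallelizes so well on a GPU.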
At this year's Oculus Connect, Michael Abrash also mentioned that audio is interesting for VR, but that it needs to wait for more computational horsepower. A lot more. He also discussed HRTF, the current way of adding surround to stereo by measuring how an individual's ears modify sound depending on its location. It gets worse if sounds are closer than a meter away, or if the actual user's ears differ too much from the experiment subject's.
Anyway, enough about me. The 8-Bit Guy's videos are interesting. Check them out.
Subject: General Tech | October 22, 2015 - 02:12 PM | Ken Addison
Tagged: yoga 900, xr321ck, western digital, video, valve, ultrawide, steam link, Steam Controller, sandisk, podcast, Lenovo, freesync, acer, 3440x1440
PC Perspective Podcast #372 - 10/22/2015
Join us this week as we discuss the Steam Controller and Steam Link, Acer XR321CK Ultrawide Freesync Display, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Allyn Malventano, and Sebastian Peak
Program length: 1:29:18
Week in Review:
0:40:00 Learn how to add narration to your Kindle ebooks. Visit amazon.com/pcper
News item of interest:
Hardware/Software Picks of the Week:
Subject: General Tech | October 21, 2015 - 09:48 PM | Scott Michaud
Tagged: webgl, tencent, atlas, artillery games
The Chinese investment and Web company Tencent has taken an interest in many American video game companies. In a couple of installments, Tencent purchased chunks of Riot Games, developer of League of Legends, which now total over 90% of the game studio. They later grabbed a “minority” (~48%) stake in Epic Games, which creates Unreal Engine, Unreal Tournament, Fortnite, Infinity Blade, the original three Gears of War games, and a few other franchises.
This time, they purchased an undisclosed share of Artillery Games. Artillery has not released a title yet, but they are working on a WebGL-powered engine. In other words, titles created with this technology will run directly in web browsers without plug-ins or extensions. At some point, Artillery Games decided to make a native client alongside their web engine, which was announced in September. This was apparently due to latency introduced in the Pointer Lock API and networking issues until WebRTC matures. (WebRTC brings P2P network sockets to web browsers. While people mentally equate it to video conferencing, it is also used for client-to-client multiplayer. There is even a BitTorrent client that runs in a web browser using it.)
Unfortunately, the real story would be how much of Artillery they have purchased, and we don't know that yet (if ever). They are buying up quite a lot of formerly-independent studios though, considering how many are left.
Subject: General Tech | October 21, 2015 - 02:35 PM | Jeremy Hellstrom
Tagged: gaming, sword coast legends
Sword Coast Legends was just released, and you should probably take a look at this quick preview of the game at Rock, Paper, SHOTGUN if you are hoping it will fill your time until Baldur's Gate: Siege of Dragonspear is released. They have not yet had time to complete a full review including the multiplayer and Dungeon Master modes, but the overall initial impression is that this game feels more like Dungeon Siege than previous D&D games. You can still pause the game to order your party in combat, but it seems less necessary, as you are using a small pool of abilities to hack and slash your way to victory. The review is not primarily negative and it sounds like there is fun to be had, but if you were hoping for something more intricate and involved then you may be disappointed. Then again, the multiplayer mode with a DM overseeing a custom adventure may make this title worth picking up.
"We’ll have some thoughts on the multiplayer portion of just-released, latter-day Dungeons & Dragons RPG Sword Coast Legends [official site] – including the all-important DM mode – very soon, but while RPS gathers its party to sally forth, I thought I’d share some initial impressions on singleplayer."
Here is some more Tech News from around the web:
- After Burner: Sega’s jet-fighting, puke-inducing arcade marvel @ The Register
- The RPG Scrollbars: Conning Purple @ Rock, Paper, SHOTGUN
- Mad Max Review @ OCC
- Tales From The Borderlands Episode 5 Is Out @ Rock, Paper, SHOTGUN
Subject: General Tech | October 21, 2015 - 01:49 PM | Jeremy Hellstrom
Tagged: patriot, Viper V360, gaming headset, 7.1 headset, audio
Patriot has expanded into the gaming headset market with the Viper V360, which has two 40mm Neodymium drivers and two 30mm sub-drivers that use software to emulate 7.1 surround sound. The earcups carry the volume control, a button to toggle the Ultra Bass Response feature, and a switch to turn the large LED lights on and off, should you desire a glow-in-the-dark head for some reason. The frequency response matches the competition at 20Hz - 20kHz, and the two sub-drivers, enabled in UBR mode, do add some vibration along with more bass volume. At $60 it is reasonably priced, and the two-year warranty should ensure you get your money's worth. Check out the full review at Modders Inc.
"Patriot is known for its memory and mobile products, and has just recently started selling peripherals. It might seem like an unusual jump, but their new headset proves that Patriot is prepared to expand and succeed in this new market. Patriot's initial headset offering is the Viper V360, a virtual 7.1 capable gaming peripheral that plugs in via USB."
Here is some more Tech News from around the web:
- Corsair VOID Wireless Dolby 7.1 Gaming Headset Review @ Madshrimps
- Corsair Void USB Dolby 7.1 RGB Gaming Headset @ eTeknix
- Corsair Void 3.5mm Stereo Gaming Headset @ eTeknix
- AKG K845 BT Closed-Back Over Ear Bluetooth Headset Review @ NikKTech
- TDK TREK Flex A28 Wireless Speaker Review @ NikKTech
- Inateck Mercury Box wireless Bluetooth speaker @ Kitguru
Subject: General Tech | October 21, 2015 - 01:01 PM | Jeremy Hellstrom
Tagged: killer nic, bigfoot
When the Killer NIC was first released in 2006 the PC Perspective crew were not overly impressed; it seemed a solution in search of a problem, and initially it was far more expensive than it was effective. Over the years the implementation changed from running on an embedded Freescale PowerPC SoC to using part of the CPU to handle the processing, which both reduced the price and offered better overall performance. More recently the acquisition by Qualcomm has helped Bigfoot develop a far more effective product, the one seen on many Z170 boards, which has received far more positive reviews. The Tech Report recently had a chance to sit down and talk with Killer's CEO Mike Cubbage and Chief Marketing Officer Bob Grim about how the product has changed over the years. You can read what they learned, as well as more about how the current generation of Killer NIC performs its various tasks, in their article here.
"Killer-powered Gigabit Ethernet ports can be found on many gaming-focused motherboards and laptops these days. We talked to Killer Networking about the details of its latest hardware and software, and then we put those features to the test with a Killer-equipped motherboard."
Here is some more Tech News from around the web:
- Silicon quantum logic gate is a first @ Nanotechweb
- Ultrasonic Power Transfer: uBeam’s Curious Engineering @ Hack a Day
- Self-Encrypting Western Digital Hard Drives Easy To Crack @ Slashdot
- Global desktop shipments in 2015 expected to fall over 15% @ DigiTimes
- How Scientists Are Circumventing Journal Paywalls @ Slashdot
- Tips and Tricks for Using the Two Best E-Readers for Linux @ Linux.com
- ARM floats power-sipping Mali-470 GPU for Internet of Things things @ The Register
- Vertagear S-Line SL4000 Gaming Chair @ eTeknix
- Overclocking Pros and Cons @ Hardware Secrets
- D-Link DCS-2630L: 180 Degree, HD WiFi 802.11ac Camera @ Phoronix
Subject: Storage | October 21, 2015 - 09:22 AM | Sebastian Peak
Tagged: western digital, WD, sandisk, ssd, hard drives, solid-state drive
Western Digital has agreed to purchase SanDisk for $19 billion in cash and stock, a deal which values SanDisk at $86.50 per share and represents a 12% premium over yesterday's closing price. Current Western Digital CEO Steve Milligan will remain in charge of the company, which retains its headquarters in Irvine, California, while SanDisk's CEO Sanjay Mehrotra is expected to remain with Western Digital and join its board of directors.
SanDisk had reportedly been looking for a buyer, with Micron the other likely candidate according to this morning's report from The Wall Street Journal. The move should help to better position Western Digital in the SSD space, something rival Seagate appeared to be focused on when purchasing LSI's flash business last year. Neither company has any significant presence in the consumer solid-state market dominated by Samsung, and it will be interesting to see where WD goes with the SanDisk brand.
Subject: Graphics Cards | October 21, 2015 - 07:18 AM | Sebastian Peak
Tagged: water cooling, nvidia, liquid cooled, GTX 980 WATERFORCE, GTX 980, GPU Water Block, gigabyte, AIO
Gigabyte has announced the GeForce GTX 980 WATERFORCE water-cooled graphics card, and this one is ready to go out of the box thanks to an integrated closed-loop liquid cooler.
In addition to full liquid cooling, the card - model GV-N980WAOC-4GD - also features "GPU Gauntlet Sorting", meaning that each card has a binned GTX 980 core for better overclocking performance.
"The GTX 980 WATERFORCE is fitted with only the top-performing GPU core through the very own GPU Gauntlet Sorting technology that guarantees superior overclocking capabilities in terms of excellent power switching and thermal efficiency. Only the strongest processors survived can be qualified for the GTX 980 WATERFORCE, which can fulfill both gaming enthusiasts’ and overclockers’ expectations with greater overclocking headroom, and higher, stable boost clocks under heavy load."
The cooling system for the GTX 980 WATERFORCE begins with a full-coverage block that cools the GPU, RAM, and power delivery, without the need for any additional fan for board components. The tubes carrying liquid to the radiator are 45 cm SFP, which Gigabyte says "effectively prevent...leak(s) and fare a lower coolant evaporation rate", and the system is connected to a 120 mm radiator.
Gigabyte says both the fan and the pump offer low noise output, and claims that this cooling system allows the GTX 980 WATERFORCE to "perform up to 38.8% cooler than the reference cooling" for cool and quiet gaming.
The WATERFORCE card also features two DVI outputs (reference is one dual-link output) in addition to the standard three DisplayPort 1.2 and single HDMI 2.0 outputs of a GTX 980.
Pricing and availability have not been announced.