Subject: Cases and Cooling | January 21, 2016 - 10:44 PM | Tim Verry
Tagged: modular psu, gigabyte, ATX PSU
Gigabyte made an announcement teasing two new power supplies last week. The G750H and B700H are 80 PLUS rated models topping out at 750W and 700W respectively. Since the company is best known for its motherboards, it was somewhat surprising to see it tease power supplies, and even more surprising to discover that these PSUs are not the first to be sold under Gigabyte's branding.
The G750H and B700H are ATX form factor and use a semi-modular design that leaves the 24-pin ATX and 8-pin CPU power cables permanently attached, with modular cables for all other connections (see below). One neat touch is that Gigabyte is using flat, all-black, individually sleeved cables, which may make them easier to hide and route behind the motherboard tray (an especially narrow channel on some cases). Both models are rated for SLI and CrossFire multi-GPU setups, use at least some Japanese capacitors (the G750H uses all Japanese capacitors), have an MTBF of 100,000 hours, and carry five-year warranties.
In addition to the motherboard and CPU power, users can connect two 8-pin PCI-E, five SATA power, three Molex, and one floppy power connector. The modular cable configuration is the same on both PSU models.
The G750H is up to 90% efficient (80+ Gold) and uses a 140mm temperature controlled fan to keep noise levels low and the internal components cool (and efficient). Gigabyte has opted for a single rail design that sees the 12V rail rated at up to 62 amps.
On the other hand, the B700H is up to 85% efficient (80+ Bronze) at typical loads. It has a smaller 120mm temperature controlled fan for cooling. This model also uses a single 12V rail, but it tops out at 54 amps.
Several sites around the Internet (including Maximum PC) have indicated that Gigabyte has made the G750H and B700H available now, but they do not seem to be for sale yet in the US. I have tried to unearth pricing as well as the identity of the ODM Gigabyte is using for these new units, but no luck so far. From my research, it appears that Gigabyte has used a number of different ODM/OEMs of varying quality for its past power supplies, so it seems we will have to wait for reviews to know for sure how these new PSUs will perform. I hope that Gigabyte has stepped up its power supply game, as it has quite a bit of competition these days!
Subject: General Tech | January 21, 2016 - 06:39 PM | Scott Michaud
Tagged: windows 10, microsoft
I wasn't planning on reporting every Windows 10 Insider build, but I actually have something to say about this one. The last couple of builds showed Microsoft switching to a faster release cycle for preview users, although their known-issues lists were quite benign. I worried that semi-frequent upgrade cycles would shake users away from the Insider program, or at least push them to the Slow ring.
They now seem ready to start shipping builds with less QA to Fast ring users.
Five issues are known in build 11102. First, internal changes to the Windows graphics architecture cause some games to crash on launch, when entering full screen, or when changing resolutions. The Witcher 3, Fallout 4, Tomb Raider, Assassin's Creed, and Metal Gear Solid V are known to be affected by this bug, but other games (and maybe even other software) could be too. Second, screen readers and other accessibility software may crash randomly. If you require those accommodations, this build could make your device functionally unusable.
Beyond those two big issues, three more are present: an error message on login (with a workaround), a breaking change for old wireless drivers (upgrade them beforehand if you rely on wireless to download drivers), and "The Connect button does not show up in Action Center."
Microsoft is currently updating the deep insides of the OS, which means they will be poking around the source code in unusual places. Once completed, this should make Windows more maintainable, especially across multiple types of hardware. But again, if you don't want to be a part of this, switch to Slow or leave the Insider program.
Subject: Mobile | January 21, 2016 - 06:14 PM | Jeremy Hellstrom
Tagged: toshiba, Satellite Radius 12
Ah, the old Toshiba Satellite; like a Volvo it was never the best nor the prettiest, but short of a major collision nothing could kill it. Since those times Toshiba has had a rough go of it; The Inquirer states the company has predicted a $4.5bn loss, just after being caught cooking the books. That has not stopped them from improving their Satellite lineup, and the Satellite Radius 12 ultraportable is a great example of that.
The screen on this 300x209x15.4mm (11.8x8.2x0.6"), 1.32kg (2.9lb), 12.1" convertible laptop is an impressive 3840x2160 IPS display which can be fully flipped open into a tablet-like form factor. An i7-6500U, 8GB of RAM, and an unspecified 256GB SSD offer great performance, although battery life does suffer somewhat due to the screen and components. Toshiba also included a dedicated Cortana button, a cellphone-like volume rocker, a 0.9MP webcam, and an infrared camera which works with Windows Hello but is not a RealSense camera. The Inquirer found a lot to like about this laptop as well as some fairly serious shortcomings; read about them all in their review.
"This is the latest in Toshiba's rotating display convertible line, and the first of its kind to include a so-called 4K screen, making it an interesting proposition regardless of its creator's misfortunes."
Here are some more Mobile articles from around the web:
- HP Pavilion Gaming Notebook 15-ak020NB Review @ Madshrimps
- MSI GT80S 6QF Titan (SLI GTX980’s) @ Kitguru
- Microsoft Lumia 950 XL @ The Inquirer
- ASUS ZenFone Zoom Offers 3X Optical Zoom & OIS @ Tech ARP
Subject: General Tech | January 21, 2016 - 02:34 PM | Ken Addison
Tagged: x99-m, X170, X150, video, Silent Base 800, Q4 2015, Predator X34, podcast, gigabyte, g-sync, freesync, earnings, be quiet, asus, amd, acer
PC Perspective Podcast #383 - 01/21/2016
Join us this week as we discuss the Acer Predator X34, ASUS X99-M, AMD Q4 Earnings and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:25:53
Subject: General Tech | January 21, 2016 - 12:52 PM | Jeremy Hellstrom
Tagged: Intel, intel driver update utility, security
The Intel Driver Update Utility is not the most commonly found application on PCs, but someone you know may have stumbled upon it or had it installed by Geek Squad or the local equivalent. The tool has been available since Windows Vista; it checks your system for any Intel parts, from your APU to your NIC, and then looks for any applicable drivers that are available. Unfortunately it was doing so over a non-SSL URL, which left the utility wide open to a man-in-the-middle attack, and you really do not want a compromised NIC driver. The Inquirer reports today that Intel quietly updated the tool on January 19th to resolve the issue, ensuring all communication and downloads are over SSL. If you know anyone using this tool, recommend they update it immediately.
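The class of bug here is easy to illustrate. A minimal sketch in Python (hypothetical URLs, not Intel's actual endpoints): reject any update URL that isn't HTTPS, and rely on a default SSL context, which verifies the server's certificate chain and hostname, so a man-in-the-middle can't silently swap the payload.

```python
import ssl
from urllib.parse import urlparse

def is_safe_update_url(url: str) -> bool:
    """An update endpoint is only trustworthy over HTTPS."""
    return urlparse(url).scheme == "https"

# A default SSL context requires a valid certificate and checks the
# hostname, which is what defeats a man-in-the-middle attack.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname

print(is_safe_update_url("http://example.com/driver.exe"))   # plain HTTP: unsafe
print(is_safe_update_url("https://example.com/driver.exe"))  # verified TLS: OK
```

The fix Intel shipped amounts to the same idea applied server-side and client-side: every manifest and download now travels over a verified TLS connection.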
"Intel has issued a fix for a major security vulnerability in a driver utility tool that could have allowed a man-in-the-middle attack and a malware maelstrom on victims' computers."
Here is some more Tech News from around the web:
- Eighteen year old server trumped by functional 486 fleet! @ The Register
- Solus Project: No Longer Just A Chrome OS Alternative @ Linux.com
- Google Chrome is getting a speed boost of up to 25 percent - and soon @ The Inquirer
- E-Mail Spam Goes Artisanal @ Slashdot
- Facebook Messenger: All your numbers are belong to us @ The Register
- IRS 'inadvertently' wiped hard drive Microsoft demanded in audit row @ The Register
- Synaptics IronVeil Fingerprint Security Technology @ eTeknix
- Through The Looking Glass - Questions Of VR's Viability On The PC @ Techgage
Subject: General Tech | January 21, 2016 - 02:59 AM | Scott Michaud
Tagged: google, chrome
Web browsers are typically on rapid release cycles so they can push features out frequently. The Web is changing on a constant basis as it becomes an effective application platform, one that must remain cross-compatible between competing implementations. A common complaint is that the rapid cycle exists to inflate version numbers for marketing and give a false sense of maturity, but I'd argue that frequent, breaking releases are kind-of necessary to synchronize features between implementations. If Google lands a feature a month after Mozilla publishes a new version, should they really wait two years for their next one? Granted, they probably knew about it pre-release, but you get the idea. Also, even if the theory is true, artificially high version numbers are one of the most benign things a company could do.
Some versions introduce fairly interesting features, though. This one, Google Chrome 48, deprecates RC4 encryption for HTTPS, which forces web servers to use newer ciphers or their pages will fail to load.
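Server operators on the other side of this change have a straightforward fix: drop RC4 from the allowed cipher list. A rough sketch of what that looks like from Python's `ssl` module (the same `!RC4` exclusion syntax applies to any OpenSSL-based server config):

```python
import ssl

# Build a TLS context whose cipher list explicitly excludes RC4,
# mirroring what Chrome 48 now enforces on the client side.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.set_ciphers("DEFAULT:!RC4")

enabled = [c["name"] for c in ctx.get_ciphers()]
assert not any("RC4" in name for name in enabled)
print(f"{len(enabled)} ciphers enabled, none of them RC4")
```

On a modern OpenSSL build RC4 is already out of the default set, so this is belt-and-suspenders, but the explicit exclusion documents the intent.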
Another major one, and probably more interesting for our audience, is the introduction of VP9 to WebRTC. This video codec is Google's open competitor to H.265. At similar quality settings, VP9 uses about half the bandwidth (or storage) of VP8. WebRTC is mostly used for video conferencing, but it's really an open platform for webcam, microphone, audio, video, and raw peer-to-peer data connections. There are even examples of it being used to synchronize objects in multiplayer video games, which has nothing to do with audio or video streaming. I'm not sure what this support will enable, but it might even lead to web applications that can edit video.
Google Chrome 48 is available today. As a related note, Firefox 44 should release next week with its own features, like experimental offscreen, multi-threaded rendering of WebGL images. The full changelog for Google Chrome 48 from Git is about 42 MB and, ironically, tends to crash Firefox.
Subject: General Tech | January 20, 2016 - 07:06 PM | Scott Michaud
Tagged: vulkan, ue4, nvidia, Intel, gdc 2016, GDC, epic games, DirectX 12, Codemasters, arm, amd
The 30th Game Developers Conference (GDC) will take place March 14th through March 18th, with the expo itself starting on March 16th. The session list has been published, with DX12 and Vulkan prominently featured. While the technologies have not been adopted as quickly as advertised, the direction is definitely forward. In fact, NVIDIA, Khronos Group, and Valve have just finished hosting a developer day for Vulkan. It is coming.
One interesting session, hosted by Codemasters and Intel, discusses bringing the F1 2015 engine to DirectX 12. It will highlight a few features they implemented, such as voxel-based raytracing using conservative rasterization. Conservative rasterization overestimates the size of individual triangles, so pixels that are only partially covered by a triangle edge are still counted, avoiding artifacts along those edges. Sites like Game Debate (Update: Whoops, forgot the link) wonder if these features will be patched into older titles, like F1 2015, or if they're just R&D for future games.
Another keynote will discuss bringing Vulkan to mobile through Unreal Engine 4. This one will be hosted by ARM and Epic Games. Mobile processors have quite a few cores, albeit ones that are slower at single-threaded tasks, and decent GPUs. Being able to keep them loaded will bring their gaming potential up closer to the GPU's theoretical performance, which has surpassed both the Xbox 360 and PlayStation 3, sometimes by a factor of 2 or more.
Many (most?) slide decks and video recordings are available for free after the fact, but we can't really know which ones ahead of time. It should be an interesting year, though.
Subject: Graphics Cards | January 20, 2016 - 03:26 PM | Scott Michaud
Tagged: nvidia, linux, tesla, fermi, kepler, maxwell
It's nice to see long-term roundups every once in a while. They do not really provide useful information for someone looking to make a purchase, but they show how our industry is changing (or not). In this case, Phoronix tested twenty-seven NVIDIA GeForce cards across four architectures: Tesla, Fermi, Kepler, and Maxwell. In other words, from the GeForce 8 series all the way up to the GTX 980 Ti.
Image Credit: Phoronix
Nine years of advancements in ASIC design, with a doubling time-step of 18 months, should yield a 64-fold improvement. The number of transistors falls short, showing about a 12-fold improvement between the Titan X and the largest first-wave Tesla, although that means nothing for a fabless semiconductor designer. The main reason why I include this figure is to show the actual Moore's Law trend over this time span, but it also highlights the slowdown in process technology.
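The 64-fold figure is just doubling arithmetic: nine years at one doubling every 18 months is six doublings. A quick sketch:

```python
def moores_law_factor(years: float, doubling_months: float = 18.0) -> float:
    """Expected improvement after `years` of doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

print(moores_law_factor(9))       # six doublings: 64.0
print(moores_law_factor(9) / 12)  # how far a ~12x transistor gain falls short
```

By that yardstick, the observed ~12x transistor growth is roughly a factor of five behind the idealized curve, which is the slowdown in process technology the figure illustrates.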
Performance per watt does depend on NVIDIA though, and the ratio between the GTX 980 Ti and the 8500 GT is about 72:1. While this is slightly better than the target 64:1 ratio, these parts are from very different locations in their respective product stacks. Swap the 8500 GT for the following year's 9800 GTX, comparing top-of-the-line GPUs of their respective times, and the GTX 980 Ti shows only a 6.2x improvement in performance per watt. On the other hand, that part was outstanding for its era.
I should note that each of these tests takes place on Linux. It might not perfectly reflect the landscape on Windows, but again, it's interesting in its own right.
Subject: Cases and Cooling | January 20, 2016 - 02:51 PM | Jeremy Hellstrom
Tagged: azza, Nova 8000
You have to look at several pictures of the Azza Nova 8000 before you truly understand how the orange and black case looks, and then decide whether it is hot or not. The swing-out doors which allow you to access your drives are a unique feature, but arguably one of limited usage. Leaving the aesthetics behind, the 589x221x574mm (21.6x8.7x22.6") case supports up to E-ATX boards, up to four 120mm fans, a 360mm radiator at the top, and a 240mm one on the bottom. With up to 13 drives supported, the case is certainly aimed at the data pack rat, which helps explain the drive chambers somewhat, but not so much the colour scheme. Check out the full review at Overclockers Club if the picture below doesn't immediately scare you off.
"The fit and finish of this case is top notch. All the panels lined up and fit together nicely. The top I/O panel gives you two USB 3.0 and two USB 2.0 ports, which is fairly standard on a case this size, and the colorful LEDs break up the monotony you find with many cases that use single-color LEDs. And while I am talking about LEDs, the gentle orange glow from the front fan adds a nice touch."
Here are some more Cases & Cooling reviews from around the web:
- Phobya WaCoolT Black Owl PC Case Review @ NikKTech
- be quiet! Silent Base 600 ATX Case @ Benchmark Reviews
- Corsair Carbide 400Q Mid-Tower @ eTeknix
- ARCTIC Liquid Freezer 120 AIO Liquid CPU Cooler Review @ NikKTech
- Silverstone Tundra TD02-Slim AIO Cooler @ eTeknix
- Noctua NH-C14S Low-Profile CPU Cooler Review @ NikKTech
- Noctua NH-D9L Dual-Tower CPU Cooler @ eTeknix
Subject: General Tech | January 20, 2016 - 01:42 PM | Jeremy Hellstrom
Tagged: gaming, Kickstarter, consortium
That is a hell of a tagline, and it will be hard to live up to: honouring fond memories of the original Deus Ex while figuring out how to make Die Hard playable is not a trivial task. The original Consortium ($2.99 on GOG right now) had mixed reviews; some loved the way you are dumped into an immersive story with no introduction, while others were frustrated by the lack of a tutorial. It was certainly different from your usual game in 2014. The second game in the developer's planned trilogy, Consortium: The Tower, seems more focused on single-player stealth/action than the team-based experience of the first, but the trailer does have some of the same flavour when it comes to story and dialogue. You don't often see a player choose to drop his weapons and attempt to talk with the bad guys outside of cut scenes, of which there will be none in this game. Check out the trailer, read what the gang at Rock, Paper, SHOTGUN had to say about it, and see if you think it is worth tossing in on the Kickstarter campaign.
We could have talked about Far Cry Primal, but Ubisoft is too busy being itself, taking down trailers and generally making it hard to have nice things.
"As pitches go, Consortium: The Tower has a bloody good line up its sleeve. Like its predecessor, it’s a science fiction game set in a single environment. You can talk, fight or sneak your way past or through encounters, and many events will happen even if you’re not there to see or influence them."
Here is some more Tech News from around the web:
- Wot I Think: Tharsis @ Rock, Paper, SHOTGUN
- Fallout 4 amazes and annoys @ The Tech Report
- Humble Store Winter Sale
- Humble Firaxis Games Bundle offers $211 worth of games @ HEXUS
- Far Cry Primal Trailer Confirms It’s A Far Cry Game @ Rock, Paper, SHOTGUN
Subject: General Tech | January 20, 2016 - 12:19 PM | Jeremy Hellstrom
Tagged: fingerprint, synaptics, ironveil, security
Synaptics, the company most likely responsible for the trackpad on your laptop, has released a new product: a 4x10mm fingerprint sensor called IronVeil. The idea is to incorporate it into peripherals and pair it with Windows Passport, allowing you to log in by touching your mouse or keyboard, similar to the current generation of cellphones. Synaptics also suggests it could be used in eSports to ensure that the person behind the mouse is indeed who they claim to be. The Tech Report tried out a Thermaltake Black V2 mouse with the sensor embedded; their recent article talks about their experiences with the mouse, introduces the FIDO Alliance, and walks through some of the authentication process that occurs behind the scenes.
One cannot help but point out that while passwords can be hashed and salted, and replaced if a database leaks, the same cannot be said for fingerprints. That leads us back to previously mentioned concerns about the security of the online databases these prints would be stored in. The eternal battle of convenience versus security rages on.
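To make the contrast concrete, here is the standard salted-hash treatment a password gets, sketched with Python's standard library. A leaked hash is useless without the password, and a compromised password can be rotated; a fingerprint, once leaked, cannot be.

```python
import hashlib
import hmac
import os
from typing import Optional

def hash_password(password: str, salt: Optional[bytes] = None) -> tuple:
    """Salt and hash a password with PBKDF2 so the stored value is useless alone."""
    salt = salt or os.urandom(16)  # a unique salt per user defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("hunter3", salt, digest))  # False
```

The same password hashed with two different salts produces two unrelated digests, which is exactly the property a raw fingerprint template can never have.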
"Synaptics' IronVeil is a tiny fingerprint sensor module that serves as the foundation for a variety of new authentication techniques for home and business users alike. We've spent a couple weeks with a pre-production IronVeil mouse, and we've explored how it might be used in practice."
Here is some more Tech News from around the web:
- Hey, Intel and Micron: XPoint is phase-change memory, right? Or is it? Yes. No. Yes @ The Register
- Big reader? Toshiba tweaks endurance, wrings out low-write SSD @ The Register
- Microsoft plans Surface Pro charger recall due to fire risk @ The Inquirer
- Digitimes Research: Global notebook shipments in 2016 to fall 2.5% on year @ DigiTimes
- D-Link DIR-890 AC3200 Ultra Triple-Band Wi-Fi Router @ eTeknix
Subject: Graphics Cards, Processors | January 19, 2016 - 11:38 PM | Scott Michaud
Digitimes is reporting on statements allegedly made by TSMC co-CEO Mark Liu. We are currently seeing 16nm parts come out of the foundry, and that node is expected to be used in the next generation of GPUs, replacing the long-running 28nm node that launched with the GeForce GTX 680. (It's still unannounced whether AMD and NVIDIA will use 14nm FinFET from Samsung or GlobalFoundries, or 16nm FinFET from TSMC.)
Update (Jan 20th, @4pm EST): Couple minor corrections. Radeon HD 7970 launched at 28nm first by a couple of months. I just remember NVIDIA getting swamped in delays because it was a new node, so that's probably why I thought of the GTX 680. Also, AMD announced during CES that they will use GlobalFoundries to fab their upcoming GPUs, which I apparently missed. We suspect that NVIDIA will use TSMC, and have assumed that for a while, but it hasn't been officially announced yet (if ever).
According to their projections, which (again) are filtered through Digitimes, the foundry expects to have 7nm in the first half of 2018. They also expect to introduce extreme ultraviolet (EUV) lithography methods with 5nm in 2020. Given that solid silicon has a lattice spacing of ~0.54nm at room temperature, 7nm features will span about 13 atoms, and 5nm features about 9 atoms.
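The atom counts fall straight out of dividing the feature size by the lattice spacing:

```python
SI_LATTICE_NM = 0.54  # approximate silicon lattice constant at room temperature

def atoms_across(feature_nm: float) -> int:
    """Roughly how many lattice spacings (atoms) span a feature of this size."""
    return round(feature_nm / SI_LATTICE_NM)

for node in (16, 10, 7, 5):
    print(f"{node}nm -> ~{atoms_across(node)} atoms across")
```

(Marketing node names stopped tracking any single physical dimension years ago, so treat these as order-of-magnitude figures rather than literal transistor widths.)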
We continue the march toward the end of silicon lithography.
Even if the statement is correct, much can happen between then and now. It wouldn't be the first time that I've seen a major foundry believe that a node would be available, but end up having it delayed. I wouldn't hold my breath, but I might cross my fingers if my hands were free.
At the very least, we can assume that TSMC's roadmap is 16nm, 10nm, 7nm, and then 5nm.
Subject: Graphics Cards, Memory | January 19, 2016 - 11:01 PM | Scott Michaud
Tagged: Samsung, HBM2, hbm
Samsung has just announced that they have begun mass production of 4GB HBM2 memory packages. When used on GPUs, four packages can provide 16GB of video RAM with very high performance. They do this with a very wide data bus, which trades frequency for transferring huge chunks at once. Samsung's offering is rated at 256 GB/s per package, which is twice what the Fury X could do with HBM1.
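The per-package figure follows from the interface math. Assuming the published HBM parameters (a 1024-bit bus per stack, roughly 1 Gbps per pin for first-generation HBM and 2 Gbps for HBM2):

```python
def bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * pin_rate_gbps

print(bandwidth_gbs(1024, 1.0))  # HBM1 stack: 128.0 GB/s (4 stacks on Fury X)
print(bandwidth_gbs(1024, 2.0))  # HBM2 package: 256.0 GB/s
```

Compare a 256-bit GDDR5 bus, which needs a 7 GHz effective clock just to reach 224 GB/s; HBM gets there by being 4x wider at a fraction of the frequency.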
They also expect to mass produce 8GB HBM2 packages within this calendar year. I'm guessing that this means we'll see 32GB GPUs in the late-2016 or early-2017 time frame unless "within this year" means very, very soon (versus Q3/Q4). They will likely be for workstation or professional cards, but, in NVIDIA's case, those are usually based on architectures that are marketed to high-end gaming enthusiasts through some Titan offering. There's a lot of ways this could go, but a 32GB Titan seems like a bit much; I wouldn't expect that this affects the enthusiast gamer segment. It might mean that professionals looking to upgrade from the Kepler-based Tesla K-series might be waiting a little longer, maybe even GTC 2017. Alternatively, they might get new cards, just with a 16GB maximum until a refresh next year. There's not enough information to know one way or the other, but it's something to think about when more of it starts rolling in.
Samsung's HBM2 packages support ECC, although I believe that was also true for at least some HBM1 modules from SK Hynix.
Subject: Systems, Storage | January 19, 2016 - 09:44 PM | Scott Michaud
Tagged: M.2 SATA, M.2, LIVA, ECS
Back in November, Sebastian reviewed the ECS LIVA X2. While the device has always had an M.2 slot, its storage options were soldered eMMC chips with capacities of 32GB or 64GB. They were also pretty slow, with 150MB/s reads and 40MB/s writes in his testing. To exceed that, you needed to install your own M.2 SSD, which was a bit of a difficult process.
According to Links International, via FanlessTech, we are now seeing options that include M.2 SSDs without eMMC. In this case, they are using an Intel-based 120GB drive. It uses the M.2 SATA interface, though, which is slower than M.2 PCIe, but a device in this performance class will probably not miss the extra bump. You probably couldn't do much high-bandwidth data crunching with the Braswell processor, and just about every other way on or off the device is limited to a gigabit or less of bandwidth. You might be able to find a use case, but it's unlikely to affect anyone interested in this PC.
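A quick back-of-the-envelope on those ceilings (raw line rates; SATA III's 8b/10b encoding leaves roughly 80% usable):

```python
MB = 1_000_000

def link_mbs(gbps: float, efficiency: float = 1.0) -> float:
    """Line rate in MB/s for a `gbps` link, scaled by encoding efficiency."""
    return gbps * 1_000_000_000 / 8 * efficiency / MB

print(f"Gigabit Ethernet:    {link_mbs(1):.0f} MB/s")
print(f"SATA III (M.2 SATA): {link_mbs(6, 0.8):.0f} MB/s")
print("eMMC in the LIVA X2: ~150 MB/s reads (as measured in the review)")
```

Even SATA's ~600 MB/s ceiling is several times anything the LIVA can move over its network or USB ports, which is why the SATA-versus-PCIe distinction barely matters here while the jump from eMMC does.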
The jump from eMMC, on the other hand, might.
Subject: Motherboards | January 19, 2016 - 07:27 PM | Jeremy Hellstrom
Tagged: Kabini, biostar, amd, A68N-5200
We don't often think of Kabini-based systems lately, focusing on significantly more powerful hardware, but it is worth remembering that the low-powered AMD processor-and-motherboard combo still exists. The board comes with an integrated A6-5200 with HD 8400 graphics for a grand total of $68, leaving you only a DIMM and a storage device short of a fully functional system. You will not be playing Crysis on it, but if you pick up a low-cost GPU you could certainly play online games and older titles, if you wanted to go that direction. Alternatively, you could build a low-powered, low-cost Internet browsing and email system for a friend or relative for very little money, especially if you have an old disk lying around unused. It can also manage decent encoding performance for its price; check out the review at Madshrimps to see more.
"The A68N-5200 board, by incorporating one A6-5200 Kabini-based APU, is bringing to the table even more raw performance when compared to its A4-5000, while the GPU component gets a 100Mhz boost. While the increased 3D performance is minimal versus the A4, other tasks which require CPU performance will get up to 25% performance boost."
Here are some more Motherboard articles from around the web:
- Asus Maximus VIII Impact (Z170) @ Kitguru
- ASUS ROG Maximus VIII Impact Review @ OCC
- MSI Z170A Gaming M7 @ Kitguru
- GIGABYTE Z170X-Gaming 5 Motherboard Review @ Hardware Canucks
Subject: Displays | January 19, 2016 - 04:44 PM | Jeremy Hellstrom
Tagged: XR3501, mva, benq, 2560x1080, 144hz
BenQ made some interesting design choices on the XR3501 that some will love and some will absolutely despise. A 35" MVA panel at 144Hz is impressive to behold, and one with "2000R Ultra Curve Technology" even more so, as it is significantly more curved than most other monitors. The 2000R figure is an industry convention denoting the radius, in millimetres, of the circle the monitor's curve would describe, in this case 2 metres. Most other curved monitors are 4000-4500R, i.e. a 4 to 4.5 metre radius.
On the other hand, the monitor does not have adaptive sync technology and the resolution of 2560x1080 will cause some disappointment, as may the ~$1000 price tag. You can either check out Hardware Canucks' full review here or just scroll on in disgust.
"Massive curved gaming monitors seem to be the flavor of the day and BenQ's XR3501 may be one of the most insane. It boasts a 35" curved MVA panel with a 144Hz refresh rate."
Here are some more Display articles from around the web:
- Philips Brilliance BDM3490UC Ultra-wide 21:9 Curved Display @ Kitguru
- Acer Predator X34 Review @ OCC
- Acer Predator XB270HU G-Sync Display @ Kitguru
- Dell S2716DG G-SYNC Gaming Monitor Review @ Hardware Canucks
- AOC U3277PQU 32-inch 4K display @ Kitguru
Subject: General Tech | January 19, 2016 - 03:44 PM | Jeremy Hellstrom
Tagged: STT-MRAM, pcram, RRAM, memristor
With the rise of new memory technologies such as HBM for volatile memory, and new applications for non-volatile memory, there has been a lot of research into new types of memory. In the past we have discussed technologies such as Spin Transfer Torque Magnetic RAM (STT-MRAM), Phase Change RAM (PCRAM), Ferroelectric RAM (FeRAM), and Resistive RAM, known alternatively as RRAM or memristors, all being developed to push performance past the limits of the materials currently used to manufacture RAM.
These are not the only advances being worked on. For instance, two groups in Korea are using indium zinc oxide (IZO) alongside nanowires and other transparent conducting oxides to create flexible ReRAM devices with 80% transparency, perfect for flexible displays and wearable devices. It could even realize the case modder's dream: transparent SSDs. Read more about their research at Nanotechweb.
"Alternative, non-volatile memory devices are in high demand as miniaturization brings the current state-of-the art silicon-based flash systems to their scaling limit. Now researchers in Korea have used indium zinc oxide (IZO) electrodes to produce non-volatile resistive memory (ReRAM) devices that are also transparent and flexible, offering benefits for applications in displays."
Here is some more Tech News from around the web:
- AMD accuses Intel of VW-like results fudging @ The Register
- The Best Linux Distros of 2016 @ Linux.com
- Asustek Computer pushing new motherboard series for China @ DigiTimes
- Synology Router RT1900ac High-Speed Wireless Router @ eTeknix
Subject: Graphics Cards | January 19, 2016 - 10:31 AM | Sebastian Peak
Tagged: rumor, report, nvidia, GTX 980MX, GTX 980M, GTX 970MX, GTX 970M, geforce
NVIDIA is reportedly preparing faster mobile GPUs based on Maxwell, with a GTX 980MX and 970MX on the way.
The new GTX 980MX would sit between the GTX 980M and the laptop version of the full GTX 980, with 1664 CUDA cores (compared to 1536 with the 980M), 104 Texture Units (up from the 980M's 96), a 1048 MHz core clock, and up to 8 GB of GDDR5. Memory speed and bandwidth will reportedly be identical to the GTX 980M at 5000 MHz and 160 GB/s respectively, with both GPUs using a 256-bit memory bus.
The GTX 970MX represents a similar upgrade over the existing GTX 970M, with the CUDA core count increased from 1280 to 1408, texture units up from 80 to 88, and 8 additional ROPs (56 vs. 48). Both the 970M and 970MX use 192-bit GDDR5 clocked at 5000 MHz, available with the same 3 GB or 6 GB of frame buffer.
WCCFtech prepared a chart to demonstrate the differences between NVIDIA's mobile offerings:
| Model | GeForce GTX 980 Laptop Version | GeForce GTX 980MX | GeForce GTX 980M | GeForce GTX 970MX | GeForce GTX 970M | GeForce GTX 965M | GeForce GTX 960M |
|---|---|---|---|---|---|---|---|
| Clock Speed | 1218 MHz | 1048 MHz | 1038 MHz | 941 MHz | 924 MHz | 950 MHz | 1097 MHz |
| Frame Buffer | 8 GB GDDR5 | 8/4 GB GDDR5 | 8/4 GB GDDR5 | 6/3 GB GDDR5 | 6/3 GB GDDR5 | 4 GB GDDR5 | 4 GB GDDR5 |
| Memory Frequency | 7008 MHz | 5000 MHz | 5000 MHz | 5000 MHz | 5000 MHz | 5000 MHz | 5000 MHz |
| Memory Bandwidth | 224 GB/s | 160 GB/s | 160 GB/s | 120 GB/s | 120 GB/s | 80 GB/s | 80 GB/s |
These new GPUs will reportedly be based on the same Maxwell GM204 core, and TDPs are apparently unchanged at 125W for the GTX 980MX, and 100W for the 970MX.
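The bandwidth rows in the chart are simply bus width times effective memory clock:

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_width_bits / 8 * effective_mhz / 1000

print(gddr5_bandwidth_gbs(256, 5000))  # GTX 980MX / 980M: 160.0 GB/s
print(gddr5_bandwidth_gbs(192, 5000))  # GTX 970MX / 970M: 120.0 GB/s
print(gddr5_bandwidth_gbs(256, 7008))  # GTX 980 laptop version: ~224 GB/s
```

Working backwards, the 80 GB/s listed for the 965M and 960M implies a 128-bit bus at the same 5000 MHz.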
We will await any official announcement.
Subject: Graphics Cards | January 18, 2016 - 09:44 PM | Scott Michaud
Tagged: Polaris, amd
When AMD announced their Polaris architecture at CES, the focus was on mid-range applications. Their example was an add-in board that could compete against an NVIDIA GeForce GTX 950 (1080p60, medium settings in Battlefront) while drawing 39% less power than that 28nm Maxwell chip. These Polaris chips are planned for a "mid 2016" launch.
Raja Koduri, Chief Architect for the Radeon Technologies Group, spoke with VentureBeat at the show. In his conversation, he mentioned two architectures, Polaris 10 and Polaris 11, in the context of a question about their 2016 product generation. In the “high level” space, they are seeing “the most revolutionary jump in performance so far.” This doesn't explicitly state that the high-end Polaris video card will launch in 2016. That said, when combined with the November announcement, covered by us as “AMD Plans Two GPUs in 2016,” it further supports this interpretation.
We still don't know much about what the actual performance of this high-end GPU will be, though. AMD was able to push 8 TeraFLOPs of compute throughput by creating a giant 28nm die and converting the memory subsystem to HBM, which supposedly requires less die complexity than a GDDR5 memory controller (according to a conference call last year that preceded Fury X). The two-generation jump will give them more complexity to work with, but that could be partially offset by a smaller die because of the potential differences in yields (and so forth).
Also, while the performance of the 8 TeraFLOP Fury X was roughly equivalent to NVIDIA's 5.6 TeraFLOP GeForce GTX 980 Ti, we still don't know why. AMD has redesigned a lot of their IP blocks with Polaris, and you would expect that, if something unexpected was bottlenecking Fury X, AMD wouldn't overlook it the next chance they got to tweak the design. The bottleneck could have been in graphics processing or something much more mundane. Either way, upcoming benchmarks will be interesting.
And it seems like that may be this year.
Subject: General Tech | January 18, 2016 - 01:35 PM | Jeremy Hellstrom
Tagged: input, patriot, viper v560, gaming mouse
Patriot's Viper V560 is a mere $50 but offers many of the same features as mice costing half again as much, which leaves one wondering how they pulled it off. The mouse has nine buttons, five profiles you can quickly switch between, and swappable side grips and weights. The Avago 9800 laser sensor can be adjusted from 800 to 8200 DPI, with four sensitivity presets you can customize. It even has a tiltable wheel for those dexterous enough to take advantage of that feature. The software is impressive; Modders Inc liked the way you can import and export macros via the .mf file format. As for negatives, the red, green, blue, purple, and aqua LEDs associated with the various profiles cannot be disabled, and the mouse did not sit perfectly level on flat surfaces, perhaps in part because of the ceramic pads. If those issues do not concern you, head on over for a look at the full review.
"Patriot is a company known for its memory and mobile products, and has just recently started selling peripherals. The V560 is the first-ever mouse released under the Patriot Viper brand, and it continues the trend of excellent design set by its first-ever headset."
Here is some more Tech News from around the web:
- Tt eSPORTS Theron Plus Smart Gaming Mouse Review @ NikKTech
- Razer Orichi Mobile Bluetooth and Wired Hybrid Gaming Mouse @ eTeknix
- G.SKILL RIPJAWS KM780 RGB Mechanical Gaming Keyboard Review @ Madshrimps
- Logitech G910 Orion Spark RGB Mechanical Keyboard Review @ NikKTech