Subject: General Tech | April 1, 2015 - 08:12 PM | Jeremy Hellstrom
Tagged: gaming, battlefield hardline
[H]ard|OCP has put together an exhaustive look at what you will need to play Battlefield Hardline, using both Mantle and D3D. They tested five GPUs from each vendor, including the Titan X as well as the GTX 980 and R9 290X in both single and dual GPU configurations. The Mantle path appears to need more optimization, as it was outperformed by D3D 11 in testing; hopefully that will change over time. At 1440p the Titan X is unable to keep up with the GTX 980 SLI configuration but does outperform the CrossFired R9 290X setup; when the resolution was increased to 4K, the Titan X came out on top. At release the game will support dual GPUs, but as [H]'s testing shows, the scaling is not as good as it could be; again, this will hopefully be addressed over time.
"Battlefield Hardline has been released finally. We gather twelve video card comparisons and find out what you need to enjoy this game in multiplayer and campaign modes. We will look at multi-GPU scaling, D3D vs. Mantle, and VRAM usage to find out where the best money is spent to enjoy this new game."
Here is some more Tech News from around the web:
- Wot I Think: Pillars Of Eternity @ Rock, Paper, SHOTGUN
- Pretty 17 – Half-Life 2: Update Comes To Steam @ Rock, Paper, SHOTGUN
- Bloodborne: An immersively thick cut above its gaming rivals @ The Register
- Classic Team Fortress: Steam Release For Fortress Forever @ Rock, Paper, SHOTGUN
- Dragon Age II 4-Years Later Review @ OCC
- GOG refund policy: "Hitting 'Buy' doesn't waive your rights" @ HEXUS
- Battlefield Hardline DRM Follow-Up @ [H]ard|OCP
Subject: General Tech | April 1, 2015 - 07:56 PM | Jeremy Hellstrom
Tagged: celeron, N3000, N3050, pentium, Intel, 14 nm, N3150, N3700, Airmont
Intel has released four low-power 14 nm Braswell SoCs, with Airmont cores and Generation 8 graphics, to replace the Bay Trail-D processors currently being sold. There are two dual-core Celeron models as well as a quad-core Celeron and a quad-core Pentium; the core count is also the thread count, as these processors do not support Hyper-Threading. Frequencies range from 1.04 GHz base and 2.08 GHz boost on the low-end Celeron up to 1.6 GHz base and 2.4 GHz boost on the top-end Pentium. All but the lowest clocked Celeron run at a 6W TDP; that one runs at 4W. You can expect to see these in lower-end laptops and desktops very soon. Follow the links from The Register for a bit more information on Intel's new low-power SoCs.
"CPU World reports that Intel will offer four new Atom products based on its 14-nanometer "Braswell" process, to be marketed under the Celeron and Pentium brands."
Here is some more Tech News from around the web:
- Windows 7 is still gaining users while Windows 8 plateaus @ The Inquirer
- Microsoft to slash price of top-level MSDN subs for Visual Studio 2015 @ The Register
- Ethernet Alliance plots 1.6 terabit-per-second future @ The Register
- KitGuru TV: 3D NAND and SSD interfaces
- NFV will revolutionise telecoms, and we won't even know @ The Inquirer
Process Technology Overview
We have been very spoiled throughout the years, and we likely did not realize exactly how spoiled until it became obvious that the rate of process technology advances had hit a virtual brick wall. Every 18 to 24 months a new, faster, more efficient process node was opened up to fabless semiconductor firms, and with it came a new generation of products that would blow our hair back. Now we are at a virtual standstill when it comes to new process nodes from the pure-play foundries.
Few expected the 28 nm node to live nearly as long as it has. Some of the first cracks in the façade actually came from Intel. Their 22 nm Tri-Gate (FinFET) process took a little bit longer to get off the ground than expected. We also noticed some interesting electrical features from the products developed on that process. Intel skewed away from higher clockspeeds and focused on efficiency and architectural improvements rather than staying at generally acceptable TDPs and leapfrogging the competition by clockspeed alone. Overclockers noticed that the newer parts did not reach the same clockspeed heights as previous products such as the 32 nm based Sandy Bridge processors. Whether this decision was intentional from Intel or not is debatable, but my gut feeling here is that they responded to the technical limitations of their 22 nm process. Yields and bins likely dictated the max clockspeeds attained on these new products. So instead of vaulting over AMD’s products, they just slowly started walking away from them.
Samsung is one of the first pure-play foundries to offer a working sub-20 nm FinFET product line. (Photo courtesy of ExtremeTech)
When 28 nm was released, the plan on the books was to transition to 20 nm products based on planar transistors, thereby bypassing the added expense of developing FinFETs. It was widely expected that FinFETs would not be required to address the needs of the market. Sadly, that did not turn out to be the case. There are many other factors as to why 20 nm planar parts are not common, but the limitations of that particular node have made it a relatively niche process appropriate for smaller, low-power ASICs (like the latest Apple SoCs). The Apple A8 is rumored to be around 90 mm², a far cry from the traditional midrange GPU that ranges from 250 mm² to 400+ mm².
The essential difficulty of the 20 nm planar node appears to be a lack of power scaling to match the increased transistor density. TSMC and others have successfully packed more transistors into every square millimeter as compared to 28 nm, but the electrical characteristics did not scale proportionally. Yes, there are improvements per transistor, but when designers pack all those transistors into a large design, TDP and voltage issues start to arise: driving that many transistors takes more power, which in turn produces more heat. The GPU guys probably looked at this and figured out that while they could achieve a higher transistor density and a wider design, they would have to downclock the entire GPU to hit reasonable TDP levels. Add in the yield and binning concerns of a new process, and the advantages of going to 20 nm would be slim to none at the end of the day.
Features and Specifications
If you are looking for the biggest, baddest power supply on the planet, then we have an exclusive review for you today. But you better have a truck, a couple of strong friends, and very deep pockets!
Miller Electric has been manufacturing large, industrial-grade power supplies since 1929. They are one of the few power supply manufacturers who actually design and build their own products, right here in the USA. Miller's The Power of Blue series of high-capacity power supplies includes models that go all the way up to an astounding 10,500 watts!
While we were not able to obtain a review sample of Miller’s flagship 10.5kW unit, we do have an exclusive review of the Miller XMT 300 PC, which can deliver up to 4,500 watts of pure DC power. Talk about having some extra reserve capacity… wow!
The Miller XMT 300 PC is an external power supply that is about twice the size of a typical mid-tower case and is designed to normally sit on the floor. It features a unique single, high-capacity +12V rail capable of delivering up to 375A (4,500W). A power distribution module mounts inside the PC where the normal ATX power supply would go and breaks down the incoming +12V to the minor rails +3.3V, +5V, etc., along with providing a standard set of cables and connectors. This allows one XMT 300 PC to power multiple PCs at the same time; up to twenty computers.
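For the curious, the headline figures check out with some quick arithmetic (Python used here purely as a calculator; the per-PC budget is our own back-of-the-envelope division, not a Miller spec):

```python
volts, amps = 12.0, 375          # single +12V rail, 375A peak
total_watts = volts * amps
print(total_watts)               # 4500.0 -- matches the rated 4,500W

# Spread evenly across the claimed twenty simultaneous PCs:
print(total_watts / 20)          # 225.0W per machine
```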
Miller XMT 300 PC Key Features:
• Monstrous, single rail +12V output (up to 375A peak)
• External main power unit sits on the floor
• Can support multiple PCs at the same time
• #6AWG copper cables for minimal voltage drop
• Automatic fan speed control for optimal cooling and minimal noise
• High efficiency operation (up to 87%)
• Active Power Factor Correction
• 1-Phase or 3-Phase line voltage
• 3-Year warranty
Miller XMT 300 PC Specifications:
Subject: Storage | March 31, 2015 - 07:01 PM | Jeremy Hellstrom
Tagged: ssd, sata, Samsung, msata, M.2 SATA, 850 EVO, 500gb, 1TB, 120gb
As Al's review of the 850 EVO exists in a cat-like superposition of being biased both for and against Samsung, perhaps you would like a second opinion. That is where The Tech Report's review, published just a few short hours ago, comes in handy. Their findings were perfectly in line with the others: exactly the same performance as the 2.5" drives, but in a nice bite-sized form factor. The only drawback is capacity; the new M.2 lineup is missing a 1TB model at the moment.
"Samsung's 850 EVO SSD debuted in December inside the usual 2.5" case. Now, the drive is spreading to smaller mSATA and M.2 form factors. We've examined the new drives to see how the mini lineup compares to its full-sized forbear."
Here are some more Storage reviews from around the web:
- Samsung 850 EVO mSATA and M.2 @ Bjorn3d
- Samsung 850 Evo mSATA/M.2 SSD @ HardwareHeaven
- Samsung 850 EVO M.2 SATA SSD @ The SSD Review
- 480GB HyperX Predator M.2 PCIe SSD @ Bjorn3d
- Crucial BX100 SSD @ Benchmark Reviews
- The OCZ Vector 180 SSD Review @ Hardware Canucks
- Enermax 2.5 and 3.5-Inch Mobile Drive Rack Roundup @ eTeknix
- Kingston SDXC UHS-1 Memory Card @ The SSD Review
- Silicon Power Stream S06 4TB USB 3.0 HDD Review @ Madshrimps
- Synology DS1815+ @ techPowerUp
Subject: General Tech | March 31, 2015 - 06:43 PM | Scott Michaud
Tagged: raptr, pc gaming
A little late this month, but there is yet another Top 20 PC Games of the month list from Raptr. This one counts the person-hours played per title over the month of February and presents them as a percentage of the recorded total. This month finally shows some shake-up in the top five games, but not a lot. Still, this was the first time since November that we saw anything other than League of Legends, World of Warcraft, DOTA 2, Counter-Strike: Global Offensive, and Smite -- in that order -- at the top of the list.
Basically, Diablo III rose about a percent and a half to take the fifth-place spot from Smite. First through fourth maintain the exact same order as they have been in since October. In terms of percentage, League of Legends jumped back above 20% total usage and World of Warcraft dropped below 10% total time. Both Valve titles in the top five, DOTA 2 and CS:GO, saw an increase due to content patches.
There was a lot of shuffling in the bottom fifteen, but each of those games has less total play time than the amount that WoW lost. Heroes of the Storm, a MOBA from Blizzard, joined the list at 1.09%, landing at lucky number thirteen. The only game to fall off the list was The Elder Scrolls V: Skyrim. Don't worry, fans of DirectX 9, Star Wars: The Old Republic is still on the list.
Oh, and Dragon Age: Inquisition fell ten places, but caught itself at #19.
Subject: Editorial | March 31, 2015 - 06:04 PM | Ryan Shrout
Tagged: video, pcper, hwc, hardware canucks, giveaway, contest
One of our favorite holidays is creeping up on us: St. Patrick's Day! What's better than an open excuse to skip work and down some beers? How about getting a boatload of free PC hardware as well?!?
That's right, PC Perspective and Hardware Canucks have teamed up with sponsors NVIDIA, EVGA, ASUS, Crucial and Phanteks to bring our readers and YouTube subscribers a mega-epic prize pack you are going to have to see to believe!
Here is the total list:
- Grand Prize
- ASUS ROG Swift G-Sync Monitor
- EVGA GTX 980 ACX 2.0
- ASUS Maximus VII Hero
- ASUS Strix Claw Mouse
- ASUS Strix Tactic Pro KB
- ASUS Strix Pro Headset
- Crucial MX200 1TB SSD
- Phanteks Enthoo Luxe Case
- Phanteks PH-TC14S
- 2 x Phanteks PH-F140MP
- 2 x Phanteks PH-F140SP
- 2nd Prize
- EVGA GTX 960 ACX 2.0
- ASUS Maximus VII Hero
- ASUS Gladius Mouse
- Crucial BX100 1TB SSD
- Phanteks PH-TC12LS
- 2 x Phanteks PH-F140MP
- 3rd Prize
- EVGA GTX 960 ACX 2.0
- ASUS Gladius Mouse
We are hosting this contest on our YouTube channels, so here are the rules for entry:
- Subscribe to PCPer's YouTube channel
- Leave a comment on PCPer's contest video
- Subscribe to Hardware Canucks' YouTube channel
- Leave a comment on Hardware Canucks' contest video
Sorry, this is open only to US and Canada users; the cost and complexity of international shipping is just too drastic for this much hardware.
You have until March 30th to enter and we'll announce the winner on March 31st.
Subject: General Tech | March 31, 2015 - 04:47 PM | Scott Michaud
Tagged: windows 10, microsoft, build 10049
Less than two weeks after releasing the last preview build, 10041, Microsoft has pushed an update for users in the “Fast” ring. We have been asking for more rapid releases and we are beginning to get them. I spent quite a bit of Monday downloading, installing, and rebooting to install Build 10049. Now that I have used it for a bit, I can give my opinion.
Before we get to what's new, I would like to get into what is fixed (and broken). First, apparently Visual Studio 2015 has some issues, particularly with deploying to external devices. On the other hand, my usage of Visual Studio 2013 seems fine and stable. Second, a bug is preventing Hyper-V from being enabled for users who want to create a virtual machine. If you upgrade to 10049 from a previous build where Hyper-V was already activated, however, "everything works fine."
One of the listed bugs for Build 10041 (the previous build) was that Windows Update would tell you to restart to complete updates even if nothing was installed, and that the messages could be “ignored safely”. I never had that happen in 10041, but have seen it this afternoon in 10049. No big deal.
As for fixed? When I upgraded to 10041, StarCraft II stopped working and apparently the bug extended to Borderlands 2 and The Pre-Sequel, League of Legends, and others. This has been fixed in 10049. I can play StarCraft II without problems. Yay! Also, many sections of the new Settings app crashed when I attempted to open them. This nuisance has been bugging me since one of the earlier builds from last year. It has mostly been fixed now. The only hiccup is “Apps & features”, which sometimes (but not always) crashes after the loading bar completes.
Apparently Cortana has been given some nondescript update. They might be referring to its integration with Spartan, which I have yet to test, but it is still unable to, for instance, set a timer or launch Photoshop.
It took me two installs to actually get it on my system, but it seems to be very stable for a pre-release operating system with a bunch of unfinished APIs and drivers. Looking good (but I'm still scared of Windows Dev Certification)!
Subject: General Tech | March 31, 2015 - 12:18 PM | Jeremy Hellstrom
Tagged: skybridge, HPC, arm, amd
The details are a little sparse, but we now have hints of AMD's plans for next year and 2017. In 2016 we should see AMD chips with ARM cores as part of the Skybridge architecture, which Josh described almost a year ago; these will be pin compatible, allowing the same motherboard to run either an ARM or an AMD64 processor depending on your requirements. The GPU portion of their APUs will move forward on a two-year cycle, so we should not expect any big jumps in the next year, but they are talking about an HPC-capable part by 2017. The final point that The Register translated covers that HPC part, which is supposed to utilize a new memory architecture nine times faster than existing GDDR5.
"Consumer and commercial business lead Junji Hayashi told the PC Cluster Consortium workshop in Osaka that the 2016 release CPU cores (an ARMv8 and an AMD64) will get simultaneous multithreading support, to sit alongside the clustered multithreading of the company's Bulldozer processor families."
Here is some more Tech News from around the web:
Introduction, Specifications and Packaging
Following the same pattern that Samsung led with the 840 Pro and 840 EVO, history has repeated itself with the 850 Pro and 850 EVO. With the 850 EVO launching late last year and being quite successful, it was only a matter of time before Samsung expanded past the 2.5" form factor for this popular SSD. Today is that day:
Today we will be looking at the MSATA and M.2 form factors. To clarify, the M.2 units are still using a SATA controller and connection, and must therefore be installed in a system capable of linking SATA lanes to its M.2 port. As both products are SATA, the DRAM cache based RAPID mode included with their Magician value added software is also available for these models. We won't be using RAPID for this review, but we did take a look at it in a prior article.
Given that the 850 EVO uses VNAND, a vastly different technology than the planar NAND used in the 840 EVO, we suspect it is not subject to the same flash cell drift related issues (hopefully to be corrected soon) seen in the 840 EVO. Only time will tell for sure on that front, but we have not seen any of those issues present in 850 EVO models since their launch.
Cross sectional view of Samsung's 32-layer VNAND. Photo by TechInsights.
Samsung sampled us the M.2 SATA in 120GB and 500GB, and the MSATA in 120GB and 1TB. Since both are SATA-based, these are only physical packaging differences. The die counts are the same as the 2.5" desktop counterparts. While the pair of 120GB models should be essentially identical, we'll throw both in with the results to validate the slight differences in stated specs below.
Subject: Cases and Cooling | March 30, 2015 - 07:32 PM | Jeremy Hellstrom
Tagged: noctua, NH-U9S, NH-D9L, cpu cooler, air cooling
You might want to hold off reading this review, as these coolers are very likely to grace our pages in the near future, but if you can't wait then HiTech Legion is the place to go to check out these small coolers from Noctua. The 125mm NH-U9S fits in 4U cases while the 110mm NH-D9L can fit in 3U spaces, making them perfect not only for rack-mounted cases but also for SFF builds. They are also lighter than usual; with the 92mm fan installed they weigh 618g and 531g respectively. For small builds with moderate-TDP processors, these are certainly worth your consideration.
"While Noctua’s new NH-U9S and NH-D9L were designed to comply with rack mount system standards, their low profile and horizontal airflow make them a natural choice for SFF and HTPC systems where CPU cooler space is limited. The Noctua NH-U9S meets 4U standards at 125mm tall, while the NH-D9L takes it a step further to meet the 110mm requirement of 3U standard."
Here are some more Cases & Cooling reviews from around the web:
- Noctua NH-L9x65 CPU Cooler Review: A New Low-Profile King @ Modders-Inc
- CRYORIG H5 Universal @ techPowerUp
- CRYORIG H5 Universal CPU Cooler @ eTeknix
- Cooler Master Hyper D92 CPU Cooler Review @ Neoseeker
- Gigabyte Waterforce @ Kitguru
- Scythe Ashura CPU Cooler Review @ NikKTech
- Phanteks Enthoo Evolv Computer Case @ Benchmark Reviews
- Corsair Graphite Series 780T tower @ HardwareOverclock
- Raven RV01 Chassis (SST-RV01B-W), Elegance updated @ Bjorn3d
- In Win 703 @ techPowerUp
A monitor for those that like it long
It takes a lot to really impress someone that sits in front of dual 2560x1600 30-in IPS screens all day, but the LG 34UM95 did just that. With a 34-in diagonal 3440x1440 resolution panel forming a 21:9 aspect ratio, built on LG IPS technology for flawless viewing angles, this monitor creates a work and gaming experience that is basically unmatched in today's market. Whether you need to open up a half-dozen Excel or Word documents, keep an eye on your Twitter feed while looking at 12 browsers or run games at near Eyefinity/Surround levels without bezels, the LG 34UM95 is a perfect option.
Originally priced north of $1200, the 34UM95 and many in LG's 21:9 lineup have dropped in price considerably, giving them more avenues into users' homes. There are obvious gaming advantages to the 34-in display compared to a pair of 1920x1080 panels (no bezel, 20% more pixels) but if you have a pair of 2560x1440 screens you are going to be giving up a bit. Some games might not handle 21:9 resolutions well either, just as we continue to see Eyefinity/Surround unsupported occasionally.
Productivity users will immediately see an improvement, both those of us inundated with spreadsheets, web pages, and text documents as well as the more creative types with Adobe Premiere timelines. I know that Ken would definitely have approved of us keeping this monitor here at the office for his use.
Check out the video above for more thoughts on the LG 34UM95!
Subject: Motherboards | March 30, 2015 - 03:57 PM | Jeremy Hellstrom
Tagged: gigabyte, Z97X Gaming 7, z97, Intel
The Gigabyte Z97X Gaming 7 motherboard offers a nice balance between price and performance; at $172 it is a bit pricey, but when on sale around the $150 mark it is a great deal. The onboard audio provided by Realtek's ALC1150 produced very good sound, and while the Qualcomm Killer E2201 NIC doesn't offer benefits over a more generic NIC, there are those who prefer the software which comes with it. The board overclocked very well for [H] manually, providing noticeable boosts with solid performance; however, the EasyTune software gave them some trouble. Check out the full review right here.
"GIGABYTE's Z97X Gaming 7 promises solid overclocking and performance. The feature list for the Z97X Gaming 7 is long and includes gamer focused features like a dedicated audio amplifier, Sound Blaster X-Fi MB3 support and more. We've had mixed results with the GIGABYTE lately, so the real question is; does it work?"
Here are some more Motherboard articles from around the web:
- Gigabyte X99 SOC Champion Overclocking Motherboard Review @ Hardware Asylum
- Gigabyte X99 SOC Champion, Budget Overclocker? @ Bjorn3d
- ASRock X99 WS-E @ The SSD Review
- Asus X99 Deluxe Redux, Battle Of the BIOS! @ Bjorn3d
- MSI 970 GAMING @ Modders-Inc
Subject: Mobile | March 30, 2015 - 03:43 PM | Ryan Shrout
Tagged: Tegra X1, tegra, shield portable, shield, portable, nvidia
UPDATE (3/31/15): Thanks to another tip we can confirm that the new SHIELD P2523 will have the Tegra X1 SoC in it. From this manifest document you'll see the Tegra T210 listed (the same part marketed as X1) as well as the code name "Loki." Remember that the first SHIELD Portable device was code named Thor. Oh, so clever, NVIDIA.
Based on a rumor posted by Brad over at Lilliputing, it appears we can expect an updated NVIDIA SHIELD Portable device sometime later in 2015. According to both the Bluetooth and Wi-Fi certification websites, a device going by the name "NVIDIA Shield Portable P2523" has been submitted. There isn't a lot of detail though:
- 802.11a/b/g/n/ac dual-band 2.4 GHz and 5 GHz WiFi
- Bluetooth 4.1
- Android 5.0
- Firmware version 3.10.61
We definitely have a new device here, as the initial SHIELD Portable did not include 802.11ac support at all. And though no data is there to support it, you have to assume that NVIDIA would be using the new Tegra X1 processor in any new SHIELD devices coming out this year. I already previewed the new SHIELD console at GDC that utilizes that same SoC, but a portable unit might require a lower clocked, lower power version of the processor to help with heat and battery life.
There’s no information about the processor, screen, or other hardware. But if the new Shield portable is anything like the original, it’ll probably consist of what looks like an Xbox-style game controller with an attached 5 inch display which you can fold up to play games on the go.
And if it’s anything like the new NVIDIA Shield console, it could have a shiny new NVIDIA Tegra X1 processor to replace the aging Tegra 4 chip found in the original Shield Portable.
I wouldn’t be surprised if it also had a higher-resolution display, more memory, or other improvements.
Keep an eye out - NVIDIA may be making a push for even more SHIELD hardware this summer.
Subject: General Tech | March 30, 2015 - 01:35 PM | Jeremy Hellstrom
Tagged: Surface Pro 3, microsoft
While the ARM-based Surface model seemed likely to disappear, there are many hints that the Surface Pro models powered by x86 processors are going nowhere and that even Windows RT will stick around. More evidence came today from The Register, who read through a Microsoft post and highlighted several updates to the UEFI in the Surface Pro 3 aimed at enterprise users. Some of the updates are minor but very useful; you can now set the boot device in the UEFI instead of needing to physically push a button during boot. One security feature key to enterprise adoption is the ability to control which devices are functional on the Surface, and with this update you can disable various connections as well as the USB ports. The final feature, the ability to make changes to the UEFI remotely, has been enabled, but the tool needed to do so is not yet available.
The device originally seemed doomed to failure but Microsoft has found a market for their tablet and we will be seeing new models soon.
"As explained in a blog post by Redmond's JC Hornbeck, the latest update to the Surface Pro 3's Unified Extensible Firmware Interface (UEFI) adds new features for enterprise customers but only minor improvements for consumers."
Here is some more Tech News from around the web:
- Chip rumor-gasm: Intel to buy Altera! Samsung to buy AMD! ... or not @ The Register
- Intel reportedly in talks to buy Altera for £7bn in IoT push @ The Inquirer
- If Samsung bought AMD would it be a good move? @ Kitguru
- Samsung Galaxy S6 & S6 Edge Launch Date & Prices @ Tech ARP
- What To Do If You Destroyed Your Apple iPhone @ Tech ARP
- Did we just wake up in an alternate universe? BlackBerry turns a profit @ The Register
- Turning A Basement Into A Big Linux Server Room @ Phoronix
- Tech ARP 2015 Mega Giveaway
Subject: General Tech | March 29, 2015 - 08:59 PM | Scott Michaud
Tagged: renegade x, UE3, udk, unreal engine 3
If you are looking for a game to play, Totem Arts has released their fourth public beta for Renegade X. You have probably heard of the game by now but, if not, it is a third-person shooter in the style of Command & Conquer: Renegade. Each team has a base that provides weapons, vehicles, and upgrades in exchange for credits. Players then use this superiority to destroy each other's bases (and each other directly, of course).
The game is powered by Unreal Engine 3 through the UDK program. From what I can tell, they are using the February 2014 build, which is a little over a year old but relatively up-to-date technologically. Epic Games added most DirectX 11 functionality back in the March 2011 UDK beta release. It looks quite good too.
And it's free (not free-to-play). Check it out.
Subject: Graphics Cards | March 27, 2015 - 04:02 PM | Jeremy Hellstrom
Tagged: gtx titan x, linux, nvidia
Perhaps somewhere out there is a Linux user who wants a TITAN X; if so, they will like the results of Phoronix's testing. The card works perfectly straight out of the box with the latest 346.47 driver as well as the 349.12 beta; if you want to use Nouveau, don't buy this card. The TITAN X did not win any awards for power efficiency, but in OpenCL tests, synthetic OpenGL benchmarks, and Unigine on Linux it walked away a clear winner. Phoronix, and many others, hope that AMD is working on an updated Linux driver to accompany the new 300 series cards we will see soon, to help them be more competitive on open source systems.
If you are sick of TITAN X reviews by now, just skip to their 22 GPU performance roundup of Metro Redux.
"Last week NVIDIA unveiled the GeForce GTX TITAN X during their annual GPU Tech Conference. Of course, all of the major reviews at launch were under Windows and thus largely focused on the Direct3D performance. Now that our review sample arrived this week, I've spent the past few days hitting the TITAN X hard under Linux with various OpenGL and OpenCL workloads compared to other NVIDIA and AMD hardware on the binary Linux drivers."
Here are some more Graphics Card articles from around the web:
- Nvidia Geforce GTX Titan X 12GB @ Kitguru
- Nvidia GeForce GTX Titan X @ Legion Hardware
- Asus GeForce GTX 970 DirectCU Mini @ Kitguru
- ASUS STRIX GTX 960 DirectCU II OC @ [H]ard|OCP
- Zotac GeForce GTX980 AMP Omega Edition @ Bjorn3d
- PowerColor R9 285 2GB Turbo Duo @ Modders-Inc
- The Best Graphics Solution You Can Buy For Around £1000: Sapphire 295X2’s @ eTeknix
Subject: General Tech | March 27, 2015 - 02:23 PM | Jeremy Hellstrom
Tagged: Xeon Phi, silvermont, knights landing, Intel
Today a bit more information about Intel's upcoming Knights Landing platform appeared at The Register. The 60 core, 240 thread figure is quoted once again, though now we know there are over 8 billion transistors on the chip, which does not include the 16 GB of near memory also present on the package. The processor will support six memory channels, three in each of two memory controllers on the die, with a total of 384 GB of far memory. The terms near and far are new, representing onboard and external memory respectively. There is a lot more information you can dig into by following the link from The Register to this long article posted at The Platform.
"Intel has set some rumours to rest, giving a media and analyst briefing outlining details of its coming 60-plus core Knights Landing Xeon Phi chip."
Here is some more Tech News from around the web:
- Silent server monitoring: A neat little cure that doesn't kill the patient @ The Register
- Qualcomm, MediaTek shifting some 28nm chip orders away from TSMC @ DigiTimes
- Quebec Plans To Require Website Blocking, Studies New Internet Access Tax @ Slashdot
- Intel and Micron team up on 3D NAND flash memory for 10TB SSDs @ The Inquirer
- Toshiba and Sandisk announce BiCS as the first 48-layer 3D flash chip @ The Inquirer
It's more than just a branding issue
As a part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to state in as succinct a fashion as possible given the time constraints and that the article subject was on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:
First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target; in this example that would be 48 FPS.
AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. For both above and below the variable refresh area, AMD allows gamers to continue to select a VSync enabled or disabled setting. That setting will be handled as you are used to it today when your game frame rate extends outside the VRR window. So, for our 34UM67 monitor example, if your game is capable of rendering at a frame rate of 85 FPS then you will either see tearing on your screen (if you have VSync disabled) or you will get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, lower than the minimum VRR window, then you will again see the result of tearing (with VSync off) or the potential for stutter and hitching (with VSync on).
But what happens with this FreeSync monitor and a theoretical G-Sync monitor below the window? AMD's implementation means that you get the option of disabling or enabling VSync. For the 34UM67, as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).
G-Sync treats this "below the window" scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display actually refreshes at 58 Hz, each frame being "drawn" one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple-drawing each frame, taking the refresh rate to 56 Hz. It's a clever trick that keeps the VRR goals intact and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work, hence the current implementation in a G-Sync module.
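The frame-multiplication behavior described above is simple enough to model. Here is a minimal Python sketch of that logic; note that the ~50 Hz target is our own assumption, chosen only because it is the simplest rule consistent with all three measured data points (NVIDIA has not published the module's actual heuristic):

```python
def effective_refresh(fps, min_hz=30, max_hz=144, target_hz=50):
    """Model the panel's physical refresh rate for a given frame rate.

    Inside the VRR window the panel simply refreshes with each frame.
    Below min_hz, the module redraws each frame n times; the target_hz
    floor is a guess that happens to fit our oscilloscope measurements.
    """
    if fps >= min_hz:
        return min(fps, max_hz)   # normal variable refresh operation
    n = 2
    while fps * n < target_hz:    # multiply until refresh is high enough
        n += 1
    return fps * n

# The measurements from our testing:
print(effective_refresh(29))  # 58 Hz (frames doubled)
print(effective_refresh(25))  # 50 Hz (frames doubled)
print(effective_refresh(14))  # 56 Hz (frames quadrupled)
```

A simple minimum-multiple rule (smallest n with fps × n ≥ 30) would predict 42 Hz at 14 FPS rather than the 56 Hz we measured, which is why the sketch assumes a higher floor.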
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to try and understand and teach the implementation differences with the help of some science. The video below is where the heart of this story is focused, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.
Subject: General Tech | March 27, 2015 - 07:03 AM | Scott Michaud
Tagged: mx master, mx, mouse, logitech
In the universe of computer mice, Logitech is one of the best known manufacturers. This one, the Logitech MX Master Wireless Mouse, is not part of their “G Series”. At a price of $99.99 USD, or $119.99 CAD, it is their most expensive offering in that class.
The MX Master is a five-button, right-handed mouse. While that is not particularly exciting, one interesting feature is the horizontal scrolling wheel on the thumb rest. The wheel on top scrolls up and down, while the one on the side scrolls left and right (or can be reconfigured with Logitech's software). It is also a laser mouse capable of tracking on many types of surfaces, including thicker sheets of glass. It can be paired to three separate devices at once, either by Bluetooth or Logitech's proprietary receiver. Its rechargeable battery lasts about 40 days at six hours of use per day, four minutes of charging yields about six hours of usage, and you can apparently even use the mouse while tethered.
The Logitech MX Master will be available in April for $99.99 USD.