Subject: Graphics Cards | June 20, 2017 - 02:02 PM | Jeremy Hellstrom
Tagged: msi, LIGHTNING Z, gtx 1080 ti, factory overclocked
MSI have expanded their Lightning line with a new GTX 1080 Ti GPU. The card ships with three profiles, including one bearing the family name, which sets the GPU to a boost clock of 1721 MHz with a 1607 MHz base. The other two modes are Gaming, which runs at 1695 MHz boost / 1582 MHz base, and Silent, which runs at 1582 MHz boost / 1480 MHz base.
This GPU shares the high-end features found on many MSI cards: the TRI-FROZR cooler with TORX 2.0 fans and SuperPipes, Military Class 4 components, and a 10-layer PCB with 14 power phases for the GPU and 3 for the memory. What is somewhat new is the RGB infection, which can be controlled by MSI's Mystic Light app to create your own personalized light show.
Check out the full PR below.
MSI is proud to officially announce the latest of its legendary LIGHTNING graphics cards. Built to be perfect, the new GeForce® GTX 1080 Ti LIGHTNING Z combines cutting edge new technology with proven features such as TRI-FROZR design with TORX 2.0 Fans, SuperPipe technology and Military Class 4 components. The GeForce® GTX 1080 Ti LIGHTNING Z is nothing short of an engineering masterpiece.
Unmatched Thermal Design
MSI’s reputation in thermal design is well-known to be excellent. The improved TRI-FROZR design on the GTX 1080 Ti LIGHTNING Z utilizes two 10cm and one 9cm TORX 2.0 Fans combining the advantages of both traditional fan blade and dispersion fan blade, generating huge amounts of airflow while remaining virtually silent. Two 8mm SuperPipes transfer heat much faster to the fins, enabling up to a whopping 700W of heat dissipation.
Mystic Light Sync with Brilliant RGB Effect
MSI’s Mystic Light enables you to customize the RGB effects of your hardware to give your system a different look whenever you feel like it. Using the MSI Mystic Light software, you can even synchronize colors and effects of your graphics card, motherboard, case-fans and peripherals. Give yourself or the audience a show!
Dual BIOS and Enhanced Power Design
The special LN2 BIOS on the card provides extreme overclockers more capability for overclocking records without special hardware modifications. By removing restrictions, the full potential of the graphics card is unlocked. The enhanced power design contains more power phases than other models to ensure plenty of power is available for record-breaking performance. LIGHTNING’s custom 10-layer PCB is fitted with 14 phases for GPU and 3 phases for Memory to ensure power delivery can handle the most extreme loads.
Military Class 4 Components
Equipped with Military Class 4 components, the MSI GTX 1080 Ti LIGHTNING Z is built to deliver the best quality and stability. The components have gone through rigorous testing by a third-party laboratory to satisfy the MIL-STD-810G standard. It features DrMOS 60A power phases, the highest rating available, ensuring plenty of power. With Hi-C CAP cores, Super Ferrite Chokes, and Solid CAPs, each aspect of the LIGHTNING Z ensures the best possible performance.
On-board and in control
With MSI's exclusive OC kits you're in complete control of the GTX 1080 Ti LIGHTNING Z. V-Check points allow you to accurately measure GPU, Memory and PLL voltages. Multiple Temp Monitor checks the real-time temperatures of the GPU, Memory and PLL while Quadruple Overvoltage allows you to overvolt those same components in order to achieve higher clock speeds.
Subject: Graphics Cards | June 18, 2017 - 07:33 AM | Scott Michaud
Tagged: graphics drivers, amd
During E3, AMD released a new graphics driver, 17.6.2. It doesn’t list any general improvements, and just a single fixed issue: improved performance on DiRT 4 with 8xMSAA with the most recent game build. If you play this game, then you should consider updating your graphics driver.
If you aren't intending to play DiRT 4 and you're already on 17.6.1, then you can probably skip it. That said, if you have an issue that's not listed in the Known Issues section (those definitely aren't fixed yet), then you can give it a try.
Subject: General Tech, Graphics Cards | June 17, 2017 - 09:23 PM | Ken Addison
Tagged: nicehash, mining, cryptocurrency
Over the last several weeks, we have been experimenting with the most recent GPU-shortage-inducing coin mining craze, with Ken's article as a jumping off point. On a recent podcast, I mentioned the idea of running a community coin mining group as a way for individuals to contribute to PC Perspective. I received several requests for the wallet and setup information to make this happen, so I thought it would be worthwhile to gather all the necessary links and info in a single location.
We have been running a Patreon campaign for a couple of years now as a way for readers and viewers who find PC Perspective a useful resource to directly contribute to the site. It might be because you want to keep the PCPer staff stable, or because you use an ad blocker and are looking for a way to even things out. But there are always some who don't have the ability or desire to sign up for a new service, so contributing your idle GPU cycles is another option if you want to donate to the PCPer team.
How do you do it? Ken has created a step by step guide below - thanks for your support in this and all of our previous endeavors!
- Bitcoin: 1HHhVWPRpCUst9bDYtLstMdD7o5SzANk1W
- Ethereum: 0xa0294763261aa85eB5f1dA3Ca0f03E1B672EED87
For those of you who may be curious to try out this mining stuff on your personal computer, we would recommend looking into the NiceHash application.
For those of you who haven't read our previous article, NiceHash is a service that connects buyers of GPU mining power to sellers who have spare hardware that they are looking to put to use.
As a warning, if you are planning to mine, please be aware of your power consumption. To get a good idea of this, look up the TDP of your graphics card, multiply that wattage by the hours per day you plan to mine and the number of days, divide by 1000 to convert watt-hours to kilowatt-hours, and multiply by the rate you pay for electricity (found on your power bill, in cents per kilowatt-hour in the US). (So it's watts * hours * days / 1000 * per-kWh rate - thanks CracklingIce)
Given the current rates of value for these cryptocurrencies, power is a small portion of the gross profit made by mining, but it is important to be aware of this before you are presented with a huge power bill that you weren't expecting.
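The back-of-the-envelope formula above is easy to script. Here's a minimal sketch; the function name and the example card/rate numbers are ours for illustration, not from NiceHash:

```python
def mining_power_cost(tdp_watts, hours_per_day, days, rate_per_kwh):
    """Estimate electricity cost of mining, per the formula above:
    watts * hours * days / 1000 gives kilowatt-hours, then multiply
    by your utility rate in dollars per kWh."""
    kwh = tdp_watts * hours_per_day * days / 1000
    return kwh * rate_per_kwh

# e.g. a 250 W card mining around the clock for a month at $0.12/kWh
cost = mining_power_cost(250, 24, 30, 0.12)
print(f"${cost:.2f}")  # prints $21.60
```

Run your own TDP and local rate through it before committing a card to the pool.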
First, download the latest version of the NiceHash miner application from their website.
After your download has finished, extract the ZIP file and load the NiceHashMiner.exe program.
Once the application has been launched and you've accepted the terms of the EULA, the NiceHash Miner will start to download the appropriate mining applications for your given hardware.
Note: during this installation process, your antivirus program might detect malware. The miner executables being downloaded are safe, but many antivirus programs flag them as malware, because miners found on a PC without the owner's permission are a telltale sign of malicious software.
After the installation process is completed, you'll be brought to the main screen of the application.
From here, choose the server location closest to you, add the Bitcoin address (in this case: 1HHhVWPRpCUst9bDYtLstMdD7o5SzANk1W), and choose a unique worker name (up to 7 characters long).
From here, hit the benchmark button, select the devices you want to mine on (we would recommend GPUs only, CPUs don't earn very much), and hit the Start button.
Once the benchmarking is done, you'll be brought back to the main screen of the application where you can hit the Start button.
Once you hit the start button, a command prompt window will launch where you can see the miner at work (this can be hidden from the NiceHash setting pane), and you can view the stats of your computer in the original NiceHash application window.
And that's it, your computer will now be mining towards the PCPER community pool!
Subject: Graphics Cards | June 14, 2017 - 08:42 PM | Tim Verry
Tagged: zotac, gtx 1080 ti, factory overclocked, gp102, SFF
Zotac recently unveiled a slimmed down GTX 1080 Ti graphics card that uses a dual slot, dual fan cooler with a short PCB. The aptly named Zotac GTX 1080 Ti Mini measures 8.3” (211mm) long and will be the smallest GTX 1080 Ti on the market. Despite the miniaturization, Zotac is still offering a decent factory overclock on the Pascal GPU (but not the memory) with a boost clock of 1620 MHz versus the reference boost clock of 1582 MHz.
Zotac uses two 8-pin PCI-E power connectors to drive the card with its GTX 1080 Ti GPU (3584 CUDA cores) and 11GB of GDDR5X memory at 11 Gbps. The slimmed down graphics card features a metal backplate, dual shrouded fans, and a heatsink with aluminum fins and five 6mm heat pipes. The card has three DisplayPort 1.4 ports, one HDMI 2.0b port, and one DL-DVI output, with support for up to four simultaneous displays.
The Zotac GTX 1080 Ti Mini should enable quite a bit of horsepower in small form factor systems. The graphics card is model number ZT-P10810G-10P and Zotac has it listed on its website. Unfortunately, Zotac is not yet talking pricing or availability for the shortened card.
It appears that overclocking is not out of the question, but I am curious just how far it could be pushed especially in a small case with tight quarters and less airflow.
Subject: General Tech, Graphics Cards | June 13, 2017 - 01:39 PM | Jeremy Hellstrom
Tagged: nvidia, free games, evga, destiny 2
Were you a fan of the original Destiny, or simply a fan of free games who happens to be shopping for a new NVIDIA GPU? EVGA have just launched a new giveaway: if you pick up one of their GTX 1080 or 1080 Ti cards, they will provide you with a code that not only gets you a free copy of Destiny 2 but also access to the beta.
As usual you need to have an EVGA account so you can register your GPU and so the code can be provided to your account. From there head on over to NVIDIA to redeem the code and patiently await the start of the beta and final release of the game.
June 13th, 2017 - Get Game Ready with EVGA GeForce GTX 10 Series and experience Destiny 2 on PC. For a limited time, buy a select EVGA GeForce GTX 1080 Ti or EVGA GeForce GTX 1080 graphics card and get Destiny 2 at PC Launch and [Early] Access to the PC Beta!
GeForce GTX 10 Series GPUs bring the beautiful world of Destiny 2 to life in stunning 4K. Experience incredibly smooth, tear-free gameplay with NVIDIA G-SYNC™ and share your greatest gameplay moments with NVIDIA ShadowPlay using GeForce Experience.
About Destiny 2:
Humanity's last safe city has fallen to an overwhelming invasion force, led by Ghaul, the imposing commander of the brutal Red Legion. He has stripped the city's Guardians of their power, and forced the survivors to flee. You will venture to mysterious, unexplored worlds of our solar system to discover an arsenal of weapons and devastating new combat abilities. To defeat the Red Legion and confront Ghaul, you must reunite humanity's scattered heroes, stand together, and fight back to reclaim our home.
Learn more and see qualifying EVGA cards at https://www.evga.com/articles/01112/destiny-2-game-ready/
Subject: Graphics Cards | June 13, 2017 - 01:17 PM | Jeremy Hellstrom
Tagged: nvidia, gtx 1080 ti, GTX 1080 Ti GAMING X, msi, Twin Frozr VI, 4k
MSI's latest version of the GeForce GTX 1080 Ti is their GAMING X, and it has the design features you would expect: Twin Frozr VI, Hi-C CAPs, Super Ferrite Chokes and Japanese Solid Caps. When benchmarking the card, [H]ard|OCP saw performance significantly higher than the quoted 1657MHz boost speed; the average was 1935MHz before they overclocked, and an impressive 2038MHz was the highest stable in-game frequency. They tested both the default and overclocked frequencies against a battery of benchmarks, including the newly released Prey. The card performed admirably at 4K, with many games still running well with all graphics options at maximum. Drop by for a look.
"We review a custom GeForce GTX 1080 Ti based video card with custom cooling and a factory overclock built for overclocking. Can the MSI GeForce GTX 1080 Ti GAMING X truly deliver a consistent enjoyable high-end graphics setting gameplay experience in games at 4K finally? Is a single card viable for current generation gaming at 4K?"
Here are some more Graphics Card articles from around the web:
- Asus ROG Strix GeForce GTX 1080 OC Edition 8GB 11Gbps Video Card Review @ Bjorn3d
- 15-Way NVIDIA/AMD OpenCL GPU Linux Benchmarks Of Ethereum Ethminer @ Phoronix
- XFX RX 460 4GB Heatsink Edition Review @ Bjorn3d
- XFX Rs XXX Edition Rx 570 4GB OC Review @ Bjorn3d
Astute readers of the site might remember the original story we did on Bitcoin mining in 2011, the good ole' days where the concept of the blockchain was new and exciting and mining Bitcoin on a GPU was still plenty viable.
However, that didn't last long, as the race for cash led people to develop Application Specific Integrated Circuits (ASICs) dedicated solely to mining Bitcoin quickly while sipping power. Use of the expensive ASICs drove the difficulty of Bitcoin mining through the roof and killed any chance of profitability for mere mortals mining cryptocurrency.
Cryptomining saw a resurgence in late 2013 with the popular adoption of alternate cryptocurrencies, specifically Litecoin, which was based on the scrypt algorithm instead of SHA-256 like Bitcoin. This meant that the ASICs developed for mining Bitcoin were useless. This is also the period of time that many of you may remember as the "Dogecoin" era, my personal favorite cryptocurrency of all time.
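To illustrate the difference, both primitives happen to live in Python's standard hashlib. This is only a sketch of the hash functions themselves, not real block validation, and the header bytes are made up: Bitcoin's proof of work is a cheap double SHA-256 over the block header, while scrypt also demands a sizable scratch buffer per hash, which is the memory-hardness that fixed-function SHA-256 silicon couldn't supply.

```python
import hashlib

header = b"example block header"  # stand-in for a real 80-byte header

# Bitcoin-style proof of work: double SHA-256, trivially cheap to bake
# into fixed-function silicon.
btc_pow = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()

# Litecoin-style proof of work: scrypt with n=1024, r=1, p=1, which
# needs roughly n * r * 128 bytes (~128 KB) of scratch memory per hash,
# the "memory-hard" property that Bitcoin ASICs lacked.
ltc_pow = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1,
                         dklen=32).hex()

print(btc_pow)
print(ltc_pow)
```

Litecoin really does feed the header in as both password and salt; everything else here is just enough to show the two cost models side by side.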
Defenders of these new "altcoins" claimed that Scrypt was different enough that ASICs would never be developed for it, and GPU mining would remain viable for a larger portion of users. As it turns out, the promise of money always wins out, and we soon saw Scrypt ASICs. Once again, the market for GPU mining crashed.
That brings us to today, and what I am calling "Third-wave Cryptomining."
While the mass populace stopped caring about cryptocurrency as a whole, the dedicated group that was left continued to develop altcoins. These currencies are based on various algorithms and other proof-of-work schemes (see technologies like Storj, which use the blockchain for a decentralized Dropbox-like service!).
As you may have predicted, for reasons that might be difficult to quantify historically, one cryptocurrency from this wave of development has become very popular: Ethereum.
Ethereum is based on the Dagger-Hashimoto algorithm and has a whole host of quirks that make it different from other cryptocurrencies. We aren't here to get deep in the woods on the methods behind different blockchain implementations, but if you have some time, check out the Ethereum White Paper. It's all very fascinating.
Subject: Graphics Cards | June 8, 2017 - 05:26 PM | Jeremy Hellstrom
Tagged: radeon, Crimson Edition 17.6.1, amd
In the very near future AMD will be releasing an updated driver, focused on improving performance in Prey and DiRT 4.
For DiRT 4 it will enable a Multi GPU profile and up to 30% performance improvement when using 8xMSAA on a Radeon RX 580 8GB compared to the previous release.
Subject: Graphics Cards, Displays | June 6, 2017 - 06:06 PM | Scott Michaud
Tagged: hdr, sdr, nvidia, computex
Dmitry Novoselov of Hardware Canucks saw an NVIDIA SDR vs HDR demo, presumably at Computex based on timing and the intro bumper, and noticed that the SDR monitor looked flat. According to his post in the YouTube comments, he asked NVIDIA to gain access to the monitor settings, and they let him... and he found that the brightness, contrast, and gamma settings were way off. He then performed a factory reset, to test how the manufacturer defaults hold up in the comparison, and did his video based on those results.
I should note that video footage of HDR monitors will not correctly describe what you can see in person. Not only is the camera not HDR, and thus not capable of showing the full range of what the monitor is displaying, but also who knows what the camera’s (and later video processing) exposure and color grading will actually correspond to. That said, he was there and saw it in person, so his eyewitness testimony is definitely valid, but it may or may not focus on qualities that you care about.
Anywho, the test was Mass Effect: Andromeda, which has a native HDR profile. To his taste, he apparently prefers the SDR content in a lot of ways, particularly how the blown-out areas behave. He claims that he's concerned about game-to-game quality, because there will be inconsistency between how one color grading professional chooses to process a scene versus another, but I take issue with that. Even in standard color range, there will always be an art director who decides what looks good and what doesn't.
They are now given another knob, and it’s an adjustment that the industry is still learning how to deal with, but that’s not a downside to HDR.
Subject: Graphics Cards | June 2, 2017 - 03:02 PM | Jeremy Hellstrom
Tagged: amd, radeon, linux
When Phoronix does a performance round up they do not mess around. Their latest look at the performance of AMD cards on Linux stretches all the way back to the HD 2900XT and encompasses almost every single GPU released between that part and the RX 580, with a pair of Firepro cards and the Fury included as well. For comparative performance numbers you will see 28 NVIDIA cards on these charts, which makes the charts some of the longest you have seen. Drop by to check out the state of AMD performance on Linux in a variety of games as well as synthetic benchmarks.
"It's that time of the year where we see how the open-source AMD Linux graphics driver stack is working on past and present hardware in a large GPU comparison with various OpenGL games and workloads. This year we go from the new Radeon RX 580 all the way back to the Radeon HD 2900XT, looking at how the mature Radeon DRM kernel driver and R600 Gallium3D driver is working for aging ATI/AMD graphics hardware. In total there were 51 graphics cards tested for this comparison of Radeon cards as well as NVIDIA GeForce hardware for reference."
Here are some more Graphics Card articles from around the web:
- PowerColor Red Devil Radeon RX 580 Video Card Review @ Hardware Asylum
- 21-Way NVIDIA Fermi/Kepler/Maxwell/Pascal OpenCL GPU Comparison @ Phoronix
- 28-Way NVIDIA GeForce GPU Comparison On Ubuntu: From GeForce 8 To GeForce 1080 @ Phoronix
- ASUS GTX 1080 ROG Strix OC 11Gbps @ Kitguru
- MSI GTX 1080 Gaming X Plus 8GB @ Kitguru