Subject: Graphics Cards | October 25, 2016 - 01:21 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, msi, GTX 1050 Ti, gtx 1050, GP107
The Guru of 3D tested out MSI's GeForce GTX 1050 and 1050 Ti, with MSRPs of $109 and $139 respectively. The non-Ti version has the lowest Texture Mapping Unit count of this generation but a higher GPU frequency than the Ti model; it also has the smallest amount of memory at 2GB, though at least the memory is clocked the same in both models. DirectX 12 testing offers variable results: in many games the two cards bookend the RX 460, with the GTX 1050 a bit slower and the 1050 Ti a bit faster, but this does not hold true in all titles. DirectX 11 results were more favourable for this architecture; both cards climbed in the rankings, with the 1050 Ti offering acceptable performance. Check out their full review here.
"Last week Nvidia announced the GeForce GTX 1050 series, with two primary models. In this article we'll review the MSI GeForce GTX 1050 and 1050 Ti Gaming X, two graphics cards aimed at the budget minded consumer. We say budget minded as these cards are very affordable and positioned in an attractive 109 and 139 dollar (US) segment."
Here are some more Graphics Card articles from around the web:
- Sub-$150 Pascal: NVIDIA GeForce GTX 1050 & GTX 1050 Ti Review @ Techgage
- The NVIDIA GTX 1050 Ti & GTX 1050 Review @ Hardware Canucks
- MSI GTX 1050 Gaming X 2G Review @ OCC
- MSI GTX 1050 Ti Gaming X 4 GB @ techPowerUp
- MSI GTX 1050 Gaming X 2 GB @ techPowerUp
- Nvidia's GeForce GTX 1060 @ The Tech Report
- Gigabyte GTX 1070 Xtreme Gaming 8 GB @ techPowerUp
- MSI RX 470 Gaming X 8G Review @ OCC
- The XFX Radeon RX 470 RS Black Edition @ Tech ARP
Subject: General Tech | October 18, 2016 - 09:01 AM | Scott Michaud
Tagged: pascal, nvidia, GTX 1050 Ti, gtx 1050
NVIDIA has just announced that the GeForce GTX 1050 ($109) and GeForce GTX 1050 Ti ($139) will launch on October 25th. Both of these Pascal-based cards target the 75W thermal point, which allows them to be powered by the PCIe bus without being tethered directly to the power supply. Like the GTX 750 Ti before it, this allows users to drop it into many existing desktops, upgrading them with discrete graphics.
Most of NVIDIA's press deck focuses on the standard GTX 1050. This $109 SKU contains 2GB of GDDR5 memory and 640 CUDA cores, although the core frequency has not been announced at the time of writing. Instead, NVIDIA has provided a handful of benchmarks, comparing the GTX 1050 to the earlier GTX 650 and the Intel Core i5-4760k integrated graphics.
It should be noted that, to hit their >60FPS targets, Gears of War 4 and Grand Theft Auto V needed to be run at medium settings, and Overwatch was set to high. (DOTA2 and World of Warcraft were maxed out, though.) As you might expect, NVIDIA reminded the press about GeForce Experience's game optimization setting just a few slides later. The implication seems to be that, while it cannot max out these games at 1080p, NVIDIA will at least make it easy for users to experience its best-case scenario, while maintaining 60FPS.
So yes, while it's easy to claim 60 FPS if you're able to choose the settings that fit this role, it's a much better experience than the alternative parts they list. On the GTX 650, none of these titles can hit an average of 30 FPS, and integrated graphics cannot even hit 15 FPS. This card seems to be intended for users who are interested in playing eSports titles maxed out at 1080p60 while enjoying newer blockbusters, albeit at reduced settings, but have an old, non-gaming machine they can salvage.
Near the end of their slide deck, they also mention that the GTX 1050 Ti exists. It's basically the same use case as above, with its 75W TDP and all, but with $30 more performance. The VRAM doubles from 2GB to 4GB, which should allow higher texture resolutions and more mods, albeit still targeting 1080p. It also adds another 128 CUDA cores, a 20% increase, although, again, that is somewhat meaningless until we find out what the card is clocked at.
Update: Turns out we did find clock speeds! The GTX 1050 will have a base clock of 1354 MHz and a Boost clock of 1455 MHz while the GTX 1050 Ti will run at 1290/1392 MHz respectively.
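With the clock speeds in hand, a back-of-the-envelope peak-throughput comparison is possible. This is a rough rule of thumb, not an NVIDIA figure; it assumes 2 FLOPs per CUDA core per cycle (one fused multiply-add) at the boost clock:

```python
# Hedged sketch: peak single-precision throughput from CUDA core count
# and boost clock, assuming 2 FLOPs per core per cycle (FMA).
def peak_tflops(cuda_cores, boost_mhz):
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

print(f"GTX 1050:    ~{peak_tflops(640, 1455):.2f} TFLOPS")   # ~1.86
print(f"GTX 1050 Ti: ~{peak_tflops(768, 1392):.2f} TFLOPS")   # ~2.14
```

Note that the Ti's 20% core-count advantage is partly offset by its lower clocks, so the theoretical gap between the two is closer to 15%.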
NVIDIA's promotional video
Obviously, numbers from a vendor are one thing, and a third-party benchmark is something else entirely (especially when the vendor benchmarks do not compare their product to the latest generation of their competitor). Keep an eye out for reviews.
Subject: Graphics Cards | October 6, 2016 - 03:17 PM | Tim Verry
Tagged: windforce, pascal, nvidia, GTX 1080, gigabyte
Gigabyte is launching a new graphics card with a blower style cooler that it is calling the GTX 1080 TT. The card, which is likely based on the NVIDIA reference PCB, uses a lateral-blower style single “WindForce Turbo Fan” fan. The orange and black shrouded fan takes design cues from the company’s higher end Xtreme Gaming cards and it has a very Mass Effect / Halo Forerunners vibe to it.
The GV-N1080TTOC-8GD is powered by a single 8-pin PCI-E power connector and has a 180W TDP. Despite not using more than one external power connector, the card still has a bit of overclocking headroom (a total of 225W is available within the PCI-E spec, and overdrawing on the 8-pin has been done before when the BIOS does not lock the card out of doing so). External video outputs include one DVI, one HDMI, and three DisplayPorts. I wish that the DVI port had been cut so that the blower cooler could have a much larger vent to exhaust air out of the case, but it is what it is.
Out of the box the Gigabyte GTX 1080 TT runs the Pascal-based 2560 CUDA core GPU at 1632 MHz base and 1772 MHz boost. In OC Mode the GPU runs at 1657 MHz base and 1797 MHz boost. The 8 GB of GDDR5X memory is left untouched at the stock 10 GHz in either case. For comparison, reference clock speeds are 1607 MHz base and 1733 MHz boost. As far as factory overclocks go, these are not bad (they are usually at least this conservative).
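To put "not bad, but conservative" in perspective, here is a tiny sketch computing the percentage gains of the factory clocks over reference (all numbers are from the paragraph above):

```python
# Factory overclock deltas for the Gigabyte GTX 1080 TT vs. reference.
reference = {"base": 1607, "boost": 1733}  # MHz
gaming    = {"base": 1632, "boost": 1772}  # out-of-the-box clocks
oc_mode   = {"base": 1657, "boost": 1797}  # OC Mode clocks

for name, clocks in (("Gaming Mode", gaming), ("OC Mode", oc_mode)):
    for kind in ("base", "boost"):
        gain = (clocks[kind] - reference[kind]) / reference[kind] * 100
        print(f"{name} {kind}: +{gain:.1f}% over reference")
```

Even in OC Mode the gains are under 4%, which is why further manual overclocking headroom seems likely.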
The heatsink uses three direct-contact 6mm copper heat pipes for the GPU, while aluminum plates on the VRM and memory chips transfer heat to aluminum fin channels; the blower fan at the back of the card pushes case air through these channels and out of the case. It may be possible to push the card beyond the OC Mode clocks, though it is not clear how stable boost clocks will be under load (or how loud the fan will be). We will have to wait for reviews on that. If you have a cramped case, this may be a decent GTX 1080 option that is cheaper than the Founder's Edition design.
There is no word on pricing or an exact release date yet, but I would estimate it at around $640 at launch.
Subject: General Tech | October 4, 2016 - 05:45 PM | Jeremy Hellstrom
Tagged: powerlink, pascal, evga, deals
EVGA sent along a newsletter which is worth mentioning as there are a few good deals to be had, even if you have already picked up one of their cards. Anyone who recently bought a Pascal based EVGA card or is planning to in the near future can get up to four EVGA Powerlink cable management ... thingies. It is for your PCIe power connectors and wraps around your GPU, allowing you to power your card without exposing those wires and connectors, great for modders or those who prefer a clean looking build. You do need to create an EVGA account and register your card, do keep that in mind.
The PCIe power connectors on the Powerlink are adjustable, so no matter which card you purchased you will be able to use the adapter. There are capacitors inside which are intended to help ensure smooth power delivery, so this is not simply an extension cord. They also have some deals on previous generation NVIDIA cards as well as their TORQ mouse.
There is also a rather unique deal for those who game on the go as well as at home. As it says below, every purchase of an EVGA SC17 980m laptop (758-21-2633-T1) comes with a free GTX 1070 FTW as long as supplies last.
Subject: Graphics Cards | October 2, 2016 - 12:12 PM | Sebastian Peak
Tagged: rumor, report, pascal, nvidia, GTX 1050 Ti, graphics card, gpu, GP107, geforce
A report published by VideoCardz.com (via Baidu) contains pictures of an alleged NVIDIA GeForce GTX 1050 Ti graphics card, which is apparently based on a new Pascal GP107 GPU.
Image credit: VideoCardz
The card shown is also equipped with 4GB of GDDR5 memory, and contains a 6-pin power connector - though such a power requirement might be specific to this particular version of the upcoming GPU.
Image credit: VideoCardz
Specifications for the GTX 1050 Ti were previously reported by VideoCardz, with a reported GPU-Z screenshot. The card will apparently feature 768 CUDA cores and a 128-bit memory bus, with clock speeds (for this particular sample) of 1291 MHz base, 1392 MHz boost (with some room to overclock, from this screenshot).
Image credit: VideoCardz
An official announcement for the new GPU has not been made by NVIDIA, though if these PCB photos are real it probably won't be far off.
Subject: Graphics Cards | September 27, 2016 - 10:04 PM | Tim Verry
Tagged: water cooling, pascal, hybrid cooler, gtx 1070, GP104, evga
EVGA is preparing to launch the GTX 1070 FTW Hybrid which is a water cooled card that pairs NVIDIA's GTX 1070 GPU with EVGA's Hybrid cooler and custom FTW PCB. The factory overclocked graphics card is currently up for pre-order for $500 on EVGA's website.
The GTX 1070 FTW Hybrid uses EVGA's custom PCB that features two 8-pin power connectors that drive a 10+2 power phase and dual BIOS chips. The Hybrid cooler includes a shrouded 100mm axial fan and a water block that directly touches both the GPU and the memory chips. The water block connects to an external 120mm radiator and a single fan that can be swapped out and/or powered by a motherboard using a standard four pin connector. Additionally, the cooler has a metal back plate and RGB LED back-lit EVGA logos on the side and windows on the front. Display outputs include one DVI, one HDMI, and three DisplayPort connectors.
As far as specifications go, EVGA did not get too crazy with the factory overclock, but users should be able to push it quite far on their own, assuming they get a decent chip from the silicon lottery. The GP104 GPU has 1920 CUDA cores clocked at 1607 MHz base and 1797 MHz boost. The 8 GB of memory, however, is clocked at the stock 8,000 MHz. For comparison, reference clock speeds are 1506 MHz base and 1683 MHz boost.
Interestingly, EVGA rates the GTX 1070 FTW Hybrid at 215 watts versus the reference card's 150 watts. It is also the same TDP rating as the GTX 1080 FTW Hybrid card.
The table below outlines the specifications of EVGA's water cooled card compared to the GTX 1070 reference GPU and the GTX 1080 FTW Hybrid.
|                | GTX 1070       | GTX 1070 FTW Hybrid | GTX 1080 FTW Hybrid |
| Rated Clock    | 1506 MHz       | 1607 MHz            | 1721 MHz            |
| Boost Clock    | 1683 MHz       | 1797 MHz            | 1860 MHz            |
| Memory Clock   | 8000 MHz       | 8000 MHz            | 10000 MHz           |
| TDP            | 150 watts      | 215 watts           | 215 watts           |
| MSRP (current) | $379 ($449 FE) | $500                | $730                |
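As a rough sketch from the table's numbers, memory bandwidth follows directly from the effective data rate and bus width. The 256-bit bus width is from NVIDIA's public GTX 1070/1080 specs rather than the article, and the "MHz" figures in the table are effective data rates (MT/s):

```python
# Hedged sketch: theoretical memory bandwidth from effective data rate.
# Assumes a 256-bit bus (NVIDIA's published spec for GTX 1070/1080).
def bandwidth_gbs(effective_mt_s, bus_bits=256):
    # bytes/s = (transfers/s) x (bus width in bytes)
    return effective_mt_s * 1e6 * (bus_bits / 8) / 1e9

print(f"GTX 1070 FTW Hybrid (8000 MT/s):  ~{bandwidth_gbs(8000):.0f} GB/s")   # ~256
print(f"GTX 1080 FTW Hybrid (10000 MT/s): ~{bandwidth_gbs(10000):.0f} GB/s")  # ~320
```

This is why a memory overclock toward the 1080's 10 GHz would be meaningful: it is a straight 25% bandwidth increase on the same bus.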
According to EVGA, the Hybrid cooler brings GPU and memory temperatures down to 45°C and 57°C respectively, compared to reference temperatures of 80°C and 85°C. Keeping in mind that these are EVGA's own numbers (you can see our Founder's Edition temperature results here), the Hybrid cooler seems to be well suited for keeping Pascal GPUs in check even when overclocked. In reviews of the GTX 1080 FTW Hybrid, reviewers found that the cooler allowed stable 2GHz+ GPU clock speeds, letting the card hit its maximum boost clocks and stay there under load. Hopefully the GTX 1070 version will have similar results. I am interested to see whether the memory chips they are using will be capable of hitting at least the 10 GHz of the 1080 cards, if not more, since they are being cooled by the water loop.
You can find more information on the factory overclocked water cooled graphics card on EVGA's website. The card is available for pre-order at $500 with a 3 year warranty.
Pricing does seem a bit high at first glance, but looking around at other custom GTX 1070 cards, it is only at about a $50 premium which is not too bad in my opinion. I will wait to see actual reviews before I believe it, but if I had to guess the upcoming card should have a lot of headroom for overclocking and I'm interested to see how far people are able to push it!
Subject: General Tech | September 14, 2016 - 01:06 PM | Jeremy Hellstrom
Tagged: pascal, tesla, p40, p4, nvidia, neural net, m40, M4, HPC
The Register has packaged a nice explanation of the basics of how neural nets work into its quick look at NVIDIA's new Pascal-based HPC cards, the P4 and P40. The tired joke about Zilog or Dick Van Patten stems from research showing that 8-bit precision is most effective when feeding data into a neural net: using 16 or 32-bit values slows processing down significantly while adding little precision to the results. NVIDIA is also perfecting a hybrid mode, where you can opt for a less precise answer produced by your local, presumably limited, hardware, or you can upload the data to the cloud for the full treatment. This is great for those with security concerns or when a quicker answer is more valuable than a more accurate one.
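For a sense of what 8-bit inference involves, here is a minimal sketch of symmetric linear quantization, where float weights are mapped to int8 with a single scale factor. This is illustrative only; the weights are made up and this is not NVIDIA's actual scheme:

```python
# Hedged sketch: symmetric linear quantization of float weights to 8-bit
# integers, the kind of reduced-precision math the P4/P40 accelerate.

def quantize_int8(weights):
    """Map floats onto int8 range [-127, 127] with one scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    return [v * scale for v in quantized]

weights = [0.92, -0.41, 0.003, -0.88, 0.27]  # made-up example weights
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# 8-bit storage quarters memory traffic versus 32-bit floats while
# losing very little accuracy for inference workloads.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, f"max round-trip error ~{max_err:.4f}")
```

The round-trip error is bounded by half the scale factor, which is why the accuracy loss is negligible for inference even though the storage is a quarter of FP32.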
As for the hardware, NVIDIA claims the optimizations on the P40 will make it "40 times more efficient" than an Intel Xeon E5 CPU, and it will also provide slightly more throughput than the currently available Titan X. You can expect to see these arrive in the market sometime over the next two months.
"Nvidia has designed a couple of new Tesla processors for AI applications – the P4 and the P40 – and is talking up their 8-bit math performance. The 16nm FinFET GPUs use Nv's Pascal architecture and follow on from the P100 launched in June. The P4 fits on a half-height, half-length PCIe card for scale-out servers, while the beefier P40 has its eyes set on scale-up boxes."
Here is some more Tech News from around the web:
- Windows 10 Anniversary Update might not arrive on your PC until November @ The Inquirer
- iOS 10 reviewed: There’s no reason not to update @ Ars Technica
- iOS 10 rollout goes titsup as update 'bricks' iPhones and iPads @ The Inquirer
- DevOps and the Art of Secure Application Deployment @ Linux.com
- HTC to unveil new Desire smartphones on September 20 @ DigiTimes
- Using a thing made by Microsoft, Apple or Adobe? It probably needs a patch today @ The Register
- New York Fines Viacom, Mattel and Hasbro For Tracking Kids Online @ Slashdot
- Microsoft's Service Fabric for Linux hits public preview @ The Register
Subject: Graphics Cards | September 6, 2016 - 05:45 PM | Scott Michaud
Tagged: nvidia, pascal, gtx 1050, geforce
I don't know why people insist on encoding screenshots of form-based windows as JPEG. You have very little color variation outside of text, which is typically thin and high-contrast against its surroundings. JPEG's discrete cosine transform will cause rippling artifacts in the background, which should be solid color, and the file will almost certainly be larger. Please, everyone, at least check how big a PNG will be before encoding as JPEG. (In case you notice that I encoded it in JPEG too, that's because re-compressing JPEG artifacts makes PNG's file size blow up, forcing me to actually need to use JPEG.)
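To illustrate why flat-background screenshots favor PNG, here is a minimal sketch using Python's zlib: PNG compresses filtered pixel data with the same DEFLATE algorithm, so a mostly solid-color image shrinks enormously. The synthetic "screenshot" bytes are made up for the demo:

```python
import zlib

# A synthetic 800x600 single-channel "form screenshot": mostly one
# background byte, with occasional rows of dark "text" pixels.
width, height = 800, 600
background = b"\xf0" * width
text_row = b"\x10\xf0\xf0\x10" * (width // 4)
rows = [text_row if y % 20 < 2 else background for y in range(height)]
raw = b"".join(rows)

# DEFLATE (as used inside PNG) collapses the repeated flat regions.
compressed = zlib.compress(raw, 9)
ratio = len(raw) / len(compressed)
print(f"raw: {len(raw)} bytes, deflated: {len(compressed)} bytes, ~{ratio:.0f}x smaller")
```

Real PNGs add per-row filtering before DEFLATE, which helps even more on gradients; the point is that flat-color UI captures are nearly free to compress losslessly, while JPEG pays for them with ringing artifacts.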
It also makes it a bit more difficult to tell whether a screenshot has been manipulated, because the hitches make everything look suspect. Regardless, BenchLife claims to have a leaked GPU-Z result for the GeForce GTX 1050. They claim that it will be using the GP107 die at 75W, although the screenshot claims neither of these. If true, this means that it will not be a further cut-down version of GP106, as seen in the two GTX 1060 parts, which would explain a little bit why they wanted both of them to remain in the 1060 level of branding. (Although why they didn't call the 6GB version the 1060 Ti is beyond me.)
What the screenshot does suggest, though, is that it will have 4GB of GDDR5 memory, on a 128-bit bus. It will have 768 shaders, the same as the GTX 950, although clocked about 15% higher (boost vs boost) and 15W lower, bringing it back into the range of PCIe bus power (75W). That doesn't mean that it will not have a six-pin external power connector, but that could be the case, like the 750 Ti.
This would give it about 2.1 TeraFLOPs of performance, which is on par with the GeForce GTX 660 from a few generations ago, as well as the RX 460, which also has a 75W TDP.
Subject: Systems | September 3, 2016 - 12:10 AM | Scott Michaud
Tagged: razer, blade, blade stealth, kaby lake, pascal
The Razer Blade and the Razer Blade Stealth seem to be quite different in their intended usage. The regular model is slightly more expensive than its sibling, but it includes a quad-core (eight thread) Skylake processor and an NVIDIA GTX 1060. The Stealth model, on the other hand, uses a Kaby Lake (the successor to Skylake) dual-core (four thread) processor, and it uses the Intel HD Graphics 620 iGPU instead of adding a discrete part from AMD or NVIDIA.
The Stealth model weighs about 2.84 lbs, while the regular model is (relatively) much heavier at 4.1 - 4.3 lbs, depending on the user's choice of screen. The extra weight is likely due in part to the much larger battery, which is needed to power the discrete GPU and last-generation quad-core CPU. Razer claims that the Stealth's 53.6 Wh battery will power the device for 9 hours. They do not seem to make any claims about how long the non-Stealth's 70Wh battery will last. Granted, that would depend on workload anyway.
This is where the interesting choice begins. Both devices are compatible with the Razer Core, which allows externally-attached desktop GPUs to be plugged into Razer laptops. If you look at their website design, the Razer Blade Stealth promotes the Core more prominently, even including a “Buy Now” button for it on the header. They also advertise 100% AdobeRGB color support on the Stealth, which is useful for graphics designers because it can be calibrated to either sRGB (web and video) or print (magazines) color spaces.
To me, the Stealth seems aimed at a user who wants to bring their laptop to work (or school) on a daily basis, and possibly plug it into a discrete GPU when they get home. Alternatively, the Razer Blade without a suffix is for someone who wants a strong, powerful PC that, while not as fast as a full desktop, is decently portable and even VR-ready without external graphics. The higher resolution options, despite the slower internal graphics, also suggest that the Stealth is more business, while the Blade is more gaming.
Before we go, Razer has also included a license for Fruity Loops Studio 12 Producer Edition. This is a popular piece of software used to create music by layering individual instruments and tracks. Even if you license Adobe Creative Cloud, this is one of the areas that Audition can technically overlap with but is really not designed for. Instead, think GarageBand.
The Razer Blade Stealth is available now, from $999.99 (128GB QHD) to $1999.00 (1TB 4K).
The Razer Blade is also available now, from $1799.99 (256GB 1080p) to $2699.99 (1TB QHD+).
Specifications and Card Breakdown
The flurry of retail-built cards based on NVIDIA's new Pascal GPUs has been hitting us hard at PC Perspective. So much so, in fact, that, coupled with new gaming notebooks, new monitors, new storage, and a new church (you should listen to our podcast, really), output has slowed dramatically. How do you write reviews for all of these graphics cards when you don't even know where to start? My answer: blindly pick one and start typing away.
Just after launch day of the GeForce GTX 1060, ASUS sent over the GTX 1060 Turbo 6GB card. Despite the name, the ASUS Turbo line of GTX 10-series graphics cards is the company's most basic, most stock iteration of graphics cards. That isn't necessarily a drawback though - you get reference level performance at the lowest available price and you still get the promises of quality and warranty from ASUS.
With a target MSRP of just $249, does the ASUS GTX 1060 Turbo make the cut for users looking for that perfect mainstream 1080p gaming graphics card? Let's find out.