Rumor: NVIDIA GeForce GTX TITAN Black and GTX 790 Incoming

Subject: Graphics Cards | January 21, 2014 - 03:49 PM |
Tagged: rumor, nvidia, kepler, gtx titan black, gtx titan, gtx 790

How about some fresh graphics card rumors for your Tuesday afternoon?  The folks at VideoCardz.com have collected some information about two potential NVIDIA GeForce cards that are going to hit your pocketbook hard.  If you thought the mid-range GPU market was crowded, wait until you see the changes NVIDIA might have for you soon on the high end.

First up is the NVIDIA GeForce GTX TITAN Black Edition, a card that will actually have the same specifications as the GTX 780 Ti but with full-performance double-precision floating point and a move from 3GB to 6GB of memory.  The all-black version of the GeForce GTX 700-series cooler is particularly awesome looking.  

gtx790pic.jpg

Image from VideoCardz.com

The new TITAN would sport the same GPU as the GTX 780 Ti, only TITAN BLACK would have higher double-precision computing performance, thus more FP64 CUDA cores. The GTX TITAN Black Edition is also said to feature a 6GB memory buffer. This is twice as much as the GTX 780 Ti, and it pretty much confirms we won’t be seeing any 6GB Ti’s.

The rest is pretty much well known: TITAN BLACK has 2880 CUDA cores, 240 TMUs and 48 ROPs.

VideoCardz.com says this will come in at $999.  If true, this is a pure HPC play as the GTX 780 Ti would still offer the same gaming performance for enthusiasts.  
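To put the double-precision unlock in perspective, here is a rough back-of-envelope sketch of theoretical throughput. The GK110 figure of 64 FP64 units per SMX is the known architecture spec, but the clock speed used below is purely an assumption since TITAN Black clocks were not confirmed at the time:

```python
# Back-of-envelope theoretical throughput for a fully enabled GK110.
# The 889 MHz clock is an assumed figure for illustration, not a confirmed spec.
cuda_cores = 2880          # FP32 CUDA cores (15 SMX x 192)
fp64_units = 960           # GK110: 64 FP64 units per SMX x 15 SMX
clock_ghz = 0.889          # assumption, not an announced clock

fp32_tflops = cuda_cores * 2 * clock_ghz / 1000   # 2 FLOPs per fused multiply-add
fp64_tflops = fp64_units * 2 * clock_ghz / 1000

print(f"FP32: {fp32_tflops:.2f} TFLOPS")  # ~5.12
print(f"FP64: {fp64_tflops:.2f} TFLOPS")  # ~1.71
```

On the GTX 780 Ti the driver limits FP64 to a fraction of the FP32 rate, which is why an unlocked TITAN Black at the same price reads as a compute play rather than a gaming upgrade.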

Secondly, there looks to be an upcoming dual-GPU graphics card using a pair of GK110 GPUs that will be called the GeForce GTX 790.  The specifications that VideoCardz.com says it has indicate that each GPU will have 2496 enabled CUDA cores and a smaller 320-bit memory interface with 5GB designated for each GPU.  Cutting back on the memory interface, shader counts and even clock speeds would allow NVIDIA to manage power consumption at the targeted 300 watt level.  
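A quick sketch of what the narrower bus costs in peak bandwidth. The 7 Gbps GDDR5 data rate is the GTX 780 Ti's figure and is only an assumption for the rumored GTX 790, which could well run slower memory:

```python
def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 780 Ti: 384-bit bus at 7 Gbps
full = gddr5_bandwidth_gbs(384, 7.0)   # 336.0 GB/s
# Rumored GTX 790, per GPU: 320-bit bus; 7 Gbps is an assumption
cut = gddr5_bandwidth_gbs(320, 7.0)    # 280.0 GB/s
print(full, cut)
```

Even at identical memory clocks, each GPU on the rumored card would give up roughly a sixth of a 780 Ti's bandwidth; lower clocks would widen that gap further.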

gtx790table.jpg

Image from VideoCardz.com

Head over to VideoCardz.com for more information about these rumors but if all goes as they expect, you'll hear about these products quite a bit more in February and March.

What do you think?  Are these new $1000 graphics cards something you are looking forward to?  

Source: VideoCardz

GeForce GTX 750 Ti May Be Maxwell... and February?

Subject: General Tech, Graphics Cards | January 20, 2014 - 04:19 AM |
Tagged: maxwell, nvidia

Well this is somewhat unexpected (and possibly wrong). Maxwell, NVIDIA's new architecture to replace Kepler, is said to appear in February in the form of a GeForce GTX 750 Ti. The rumors, which sound iffy to me, claim that this core will be produced at TSMC on a 28nm fabrication process and later transition to the foundry's 20nm lines.

As if the 700-series family tree was not diverse enough.

nvidiamaxwellroadmap.jpg

2013 may have been much closer than expected.

Swedish site SweClockers has been contacted by "sources" who claim that NVIDIA has already alerted partners to prepare a graphics card launch. Very little information is given beyond that. They do not even have access to a suggested GM1## architecture code. They just claim that partners should expect a new video card on the 18th of February (what type of launch that is remains unclear).

This also raises questions about why the mid-range card will come before the high-end. If the 28nm rumor is true, it could just be that NVIDIA did not want to wait around until TSMC could fabricate their high-end part if they already had an architecture version that could be produced now. It could be as simple as that.

The GeForce GTX 750 Ti is rumored to arrive in February to replace the GTX 650 Ti Boost.

Source: SweClockers

NVIDIA G-Sync DIY Kit For ASUS VG248QE Monitor Now Available for $199

Subject: Displays | January 17, 2014 - 06:35 PM |
Tagged: vg248qe, nvidia, gaming, g-sync, DIY, asus

NVIDIA's new G-Sync variable refresh rate technology is slowly being rolled out to consumers in the form of new monitors and DIY upgrade kits that can be used to add G-Sync functionality to existing displays. The first G-Sync capable monitor to support the DIY upgrade kit path is the ASUS VG248QE, a 24" 1080p 144Hz TN panel. The monitor itself costs around $270, and you can now purchase a G-Sync DIY upgrade kit from NVIDIA for $199.

The upgrade kit comes with a replacement controller board, power supply, HDMI cable, plastic spudger, IO shields, and installation instructions. Users will need to take apart the VG248QE monitor, remove the old PCBs, and install the G-Sync board in their place. According to NVIDIA, the entire process takes about 30 minutes, though if this is your first time digging into monitor internals it will likely take closer to an hour.

The NVIDIA G-Sync DIY kit below the ASUS VG248QE monitor.

For help with installation, NVIDIA has posted a video of the installation process on YouTube. If you find text and photos easier, you can follow the installation guides written up for PC Perspective by Allyn Malventano and reader Levi Kendall. Both DIY kit reviews stated that the process, while a bit involved, was possible for most gamers to perform with a bit of guidance.

You can order the DIY upgrade kit yourself from this NVIDIA page.

Alternatively, ASUS is also releasing an updated version of the VG248QE monitor with the G-Sync board pre-installed in the first half of this year. This updated G-Sync monitor will have an MSRP of $399.

With the G-Sync kit at $199, will you be going the DIY path or waiting for a new monitor with the technology pre-installed?

Read more about NVIDIA's G-Sync display technology at PC Perspective including first impressions, installation, and more!

Source: NVIDIA

NVIDIA's take on AMD's under-documented free sync

Subject: General Tech | January 8, 2014 - 01:20 PM |
Tagged: tom petersen, nvidia, g-sync, free sync, CES 2014, amd

AMD's free sync has been getting a lot of well-deserved attention at this year's CES; Ryan had a chance to see it in action, so check out his look at AMD's under-reported and under-utilized feature if you haven't already.  AMD missed an opportunity with this technology, which NVIDIA picked up on with its G-Sync.  NVIDIA has responded to The Tech Report's comments from yesterday: Tom Petersen stated that while free sync may be an alternative on laptops, desktop displays are a different beast.  They utilize different connections and there is generally a scaler chip between the GPU and the display.  Read his full comments here.

gsync.jpg

"AMD demoed its "free sync" alternative to G-Sync on laptops. Desktop displays are different, Nvidia says, and they may not support the variable refresh rate tech behind AMD's solution—at least not yet."

Here is some more Tech News from around the web:

Tech Talk

CES 2014: NVIDIA Shows Modified ASUS PQ321Q 4K Monitor with G-Sync

Subject: Graphics Cards, Displays | January 8, 2014 - 04:01 AM |
Tagged: pq321q, PQ321, nvidia, gsync, g-sync, CES 2014, CES, asus, 4k

Just before CES, Allyn showed you the process of modifying the ASUS VG248QE to support NVIDIA G-Sync variable refresh rate technology.  It wasn't the easiest mod we have ever done, but even users without a lot of skill will be able to accomplish it.  

But at the NVIDIA booth at CES this year the company was truly showing off G-Sync technology to its fullest capability.  By taking the 3840x2160 ASUS PQ321Q monitor and modifying it with the same G-Sync module technology we were able to see variable refresh rate support in 4K glory.

4kgsync1.jpg

Obviously you can't see much from the photo above about the smoothness of the animation, but I can assure you that in person this looks incredible.  In fact, 4K might be the perfect resolution for G-Sync to shine, as running games at that high a resolution will definitely bring your system to its knees, dipping below that magical 60 Hz / FPS rate.  But when it does with this modified panel, you'll still get smooth game play and a tear-free visual experience.

4kgsync2.jpg

The mod is actually using the same DIY kit that Allyn used in his story, though it likely has a firmware update for compatibility.  Even with the interesting debate from AMD about the support for VRR in the upcoming DisplayPort 1.3 standard, it's impossible to see the ASUS PQ321Q in 4K with G-Sync and not instantly fall in love with PCs again.

Sorry - there are no plans to offer this upgrade kit for ASUS PQ321Q owners!

Coverage of CES 2014 is brought to you by AMD!

PC Perspective's CES 2014 coverage is sponsored by AMD.

Follow all of our coverage of the show at http://pcper.com/ces!

Author:
Manufacturer: AMD

DisplayPort to Save the Day?

During an impromptu meeting with AMD this week, the company's Corporate Vice President for Visual Computing, Raja Koduri, presented me with an interesting demonstration of a technology that allowed the refresh rate of a display on a Toshiba notebook to perfectly match the render rate of the game demo being shown.  The result was an image that was smooth and with no tearing effects.  If that sounds familiar, it should.  NVIDIA's G-Sync was announced in November of last year and does just that for desktop systems and PC gamers.

Since that November unveiling, I knew that AMD would need to respond in some way.  The company had basically been silent since learning of NVIDIA's release but that changed for me today and the information discussed is quite extraordinary.  AMD is jokingly calling the technology demonstration "FreeSync".

slides04.jpg

Variable refresh rates as discussed by NVIDIA.

During the demonstration, AMD's Koduri had two identical systems side by side based on a Kabini APU.  Both were running a basic graphics demo of a rotating windmill.  One was a standard software configuration while the other had a modified driver that communicated with the panel to enable variable refresh rates.  As you likely know from our various discussions about variable refresh rates and G-Sync technology from NVIDIA, this setup results in a much better gaming experience as it produces smoother animation on the screen without the horizontal tearing associated with v-sync disabled.  

Obviously AMD wasn't using the same controller module that NVIDIA is using on its current G-Sync displays, several of which were announced this week at CES.  Instead, the internal connection on the Toshiba notebook was the key factor: Embedded DisplayPort (eDP) apparently has a feature to support variable refresh rates on LCD panels.  This feature was included for power savings on mobile and integrated devices, as refreshing the screen without new content can be a waste of valuable battery resources.  But, for performance and gaming considerations, this feature can be used to initiate a variable refresh rate meant to smooth out game play, as AMD's Koduri said.
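A tiny sketch of why matching the refresh rate to the render rate helps. The numbers are purely illustrative, not measurements from either vendor's implementation:

```python
import math

def scanout_fixed_ms(render_ms, refresh_hz=60):
    """V-sync on a fixed-refresh panel: a finished frame must wait for
    the next refresh boundary, so any frame that misses an interval is
    held (and displayed late) until the next multiple."""
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

def scanout_vrr_ms(render_ms, max_hz=144):
    """Variable refresh: the panel scans out as soon as the frame is
    ready, bounded only by the panel's maximum refresh rate."""
    return max(render_ms, 1000.0 / max_hz)

# A 20 ms frame (50 FPS) is held to the 33.3 ms boundary on a 60 Hz
# panel, but is displayed at 20 ms with variable refresh.
print(scanout_fixed_ms(20.0))  # ~33.33
print(scanout_vrr_ms(20.0))    # 20.0
```

That gap between "frame done" and "frame shown" is exactly the stutter both G-Sync and this demo eliminate when frame rates dip below the panel's fixed refresh.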

Continue reading our thoughts on AMD's initial "FreeSync" variable refresh rate demonstration!!

CES 2014 Podcast Day 3 - Corsair, Coolermaster, NVIDIA, and more!

Subject: General Tech | January 8, 2014 - 12:30 AM |
Tagged: video, podcast, corsair, coolermaster, nvidia, Samsung, exynos, Allwinner, AX1500i

CES 2014 Podcast Day 3 - 01/07/14

It's time for podcast fun at CES!  Join us as we talk about the third day of the show including exciting announcements from Corsair, Coolermaster, NVIDIA, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano and Ken Addison

Program length: 49:24

 

ASUS ROG Poseidon Series Liquid Cooled Graphics Cards

Subject: Graphics Cards, Cases and Cooling | January 6, 2014 - 04:00 PM |
Tagged: water cooling, ROG, nvidia, gtx 780, gtx 770, CES 2014, CES, asus

In keeping with their tradition of pushing innovation and performance boundaries through their ROG product line, ASUS today released NVIDIA-based 700-series video cards as part of their Poseidon series of liquid-cooled products. ASUS released both a GTX 780 and a GTX 770-based product with the hybrid Poseidon cooling solution.

ASUS_POSEIDON-GTX780-P-3GD5_a.jpg

Courtesy of ASUS

All Poseidon series graphics cards come with a hybrid cooling solution, using a combination of fan-based and water-based cooling to propel these cards to new performance heights. The card's GPU is cooled by the DirectCU H2O cooler, with water routed through the integrated barbs to the copper-based cooler. The water inlets are threaded, accepting G1/4" male fittings. The memory and VRM components are cooled by a massive dual-fan heatpipe cooler, exhausting air through the rear panel port. Both cards feature the red and black ROG coloration with the Poseidon series name displayed prominently along the right front edge of the card.

ASUS_POSEIDON-GTX780-P-3GD5_b.jpg

Courtesy of ASUS

Both Poseidon series graphics cards, the ROG Poseidon GTX 770 and ROG Poseidon GTX 780, include ASUS' DIGI+ VRM and Super Alloy Power circuitry to ensure stability and component life under the most grueling conditions. When paired with the Poseidon cooling, ASUS claims the GPU runs 20% cooler and three times quieter than a comparable reference card, with card operating temperatures 16 C lower than the same reference solution.

More after the break.


Source: ASUS

Nvidia's renamed Tegra K1 SoC uses Denver and Kepler

Subject: General Tech | January 6, 2014 - 02:08 PM |
Tagged: tegra k1, tegra, SoC, nvidia, kepler, k1, cortex a15, CES, arm, A15

The Tegra K1, formerly known as Project Logan, is the first big news out of CES from NVIDIA and represents a bit of a change from what we were expecting.  The current belief was that the SoC would have four 28nm Cortex A15 processors, but that will be only one flavour of the K1; a Denver-based dual-core version will also be released.  Those ARMv8 64-bit processors will natively handle 64-bit applications, while the A15 version that The Tech Report had taken pictures of will be limited to 32-bit applications, though that will not matter in many mobile applications.   You should also check out Ryan's deep dive into the new Denver and Kepler version here.

die.jpg

"In early 2011, during a CES press event, Nvidia revealed its Project Denver CPU initiative. On Sunday evening, at another CES press conference, the company provided a glimpse of the first Denver-based processor: the Tegra K1. This next-generation SoC features dual Denver CPU cores clocked at up to 2.5GHz. The cores were designed by Nvidia, and they're compatible with the 64-bit ARMv8 instruction set. They have a seven-way superscalar pipeline and a hefty 192KB of L1 cache."

Here is some more Tech News from around the web:

Tech Talk

 


Author:
Subject: Mobile
Manufacturer: NVIDIA

Once known as Logan, now known as K1

NVIDIA has bet big on Tegra.  Since the introduction of the SoC's first iteration, that much was clear.  With the industry push to mobile computing and the decreased importance of the classic PC design, developing and gaining traction with a mobile processor was not only an expansion of the company’s portfolio but a critical shift in the mindset of a graphics giant. 

The problem thus far is that while NVIDIA continues to enjoy success in the markets of workstation and consumer discrete graphics, the Tegra line of system-on-a-chip processors has faltered.  Design wins have been tough to come by. Other companies with feet already firmly planted on this side of the hardware fence continue to innovate and seal deals with customers.  Qualcomm is the dominant player for mobile processors with Samsung, MediaTek, and others all fighting for the same customers NVIDIA needs.  While press conferences and releases have been all smiles and sunshine since day one, the truth is that Tegra hasn’t grown at the rate NVIDIA had hoped.

Solid products based on NVIDIA Tegra processors have been released.  The first Google Nexus 7 used the Tegra 3 processor and was considered by most to be the best Android tablet on the market until it was succeeded by the 2013 iteration of the Nexus 7.  Tegra 4 slipped backwards, though – the NVIDIA SHIELD mobile gaming device was the answer for a company eager to show the market it could build compelling and relevant hardware.  It has only partially succeeded in that task.

denver2.jpg

With today’s announcement of the Tegra K1, previously known as Logan or Tegra 5, NVIDIA hopes to once again spark a fire under partners and developers, showing them that NVIDIA’s dominance in the graphics fields of the PC has clear benefits to the mobile segment as well.  During a meeting with NVIDIA about Tegra K1, Dan Vivoli, Senior VP of marketing and a 16-year employee, equated the release of the K1 to the original GeForce GPU.  That is a lofty ambition and puts a lot of pressure on the entire Tegra team, not to mention the K1 product itself, to live up to.

Tegra K1 Overview

What we previously knew as Logan or Tegra 5 (and it actually was called Tegra 5 until just a couple of days ago) is now being released as the Tegra K1.  The ‘K’ designation indicates the graphics architecture that powers the SoC, in this case Kepler.  Also, it’s the first one.  So, K1.

The CPU complex of the Tegra K1 looks very familiar: four ARM Cortex-A15 “r3” cores and 2MB of L2 cache, with a fifth A15 core used for lower-power situations.  This 4+1 design is the same that was introduced with the Tegra 4 processor last year and allows NVIDIA to implement a unique style of “big.LITTLE” design.  Some slight modifications to the cores are included with Tegra K1 that improve performance and efficiency, but not by much – the main CPU is very similar to the Tegra 4’s.
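The 4+1 arrangement is a cluster-migration scheme: software runs on either the low-power companion core or the main quad, never both at once. A toy sketch of the idea follows; the thresholds and hysteresis are invented for illustration and are not NVIDIA's actual governor values:

```python
def pick_cluster(load_percent, current="companion",
                 up_threshold=70, down_threshold=20):
    """Toy cluster-migration policy for a 4+1 design: jump to the main
    quad-core cluster under heavy load, fall back to the single
    low-power companion core once load stays light. The gap between
    the two thresholds provides hysteresis so the system doesn't
    ping-pong between clusters. All values are illustrative."""
    if current == "companion" and load_percent > up_threshold:
        return "main_quad"
    if current == "main_quad" and load_percent < down_threshold:
        return "companion"
    return current

print(pick_cluster(90))                       # heavy load -> main_quad
print(pick_cluster(10, current="main_quad"))  # idle -> companion
print(pick_cluster(50, current="main_quad"))  # stays put (hysteresis)
```

The payoff is that light workloads (idle screens, music playback) never light up the power-hungry quad, which is the battery-life argument for the fifth core.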

NVIDIA also unveiled late last night another version of the Tegra K1 that replaces the quad A15 cores with two of the company's custom-designed Denver CPU cores.  Project Denver, announced in early 2011, is NVIDIA's attempt at building its own core design based on the ARMv8 64-bit ISA.  This puts this iteration of the Tegra K1 on the same level as Apple's A7 and Qualcomm's Krait processors.  When these are finally available in the wild, it will be incredibly intriguing to see how well NVIDIA's architects did in the first true CPU design from the GPU giant.

Continue reading about NVIDIA's new Tegra K1 SoC with Kepler-based graphics!

Coverage of CES 2014 is brought to you by AMD!

PC Perspective's CES 2014 coverage is sponsored by AMD.

Follow all of our coverage of the show at http://pcper.com/ces!