NVIDIA launches the GTX 760 @ $250; let the price wars begin again!

Subject: Graphics Cards | June 25, 2013 - 01:28 PM |
Tagged: geforce, GK104, gtx 760, nvidia, msi, MSI N760 TF 2GD5/OC

To start off with the good news, the GTX 760 is now available, at $250 to $260 for the MSI model that [H]ard|OCP reviewed.  This is no paper launch, nor another $400+ card for you to dream about, but instead a solid performing card at a decent price.  Power is provided by an 8-pin and a 6-pin PCIe power connector, perhaps a little more than the card needs but perfect for overclockers who want the extra juice.  Performance-wise the card trumps the GTX 660 Ti and matches the GTX 670 and HD 7950 Boost in almost every test, while costing a good $50-75 less.  Even better news is that certain sites testing Frame Rating and SLI performance saw great scaling in real performance. 

Read on to get the whole picture from [H]ard|OCP.

H_gtx760.jpg

"Today NVIDIA is launching the GeForce GTX 760. The GeForce GTX 760 will be replacing a video card and offering what use to be high-end memory performance, at a mainstream price. We will evaluate a retail MSI N760 TF 2GD5/OC video card with comparisons to find out whether or not this is a true value."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Palit Releases GTX 780 Super JetStream Graphics Card With Triple Fan JetStream Cooler

Subject: Graphics Cards | June 24, 2013 - 11:57 PM |
Tagged: palit, gtx 780, gaming, super jetstream, jetstream

AIB partner Palit has announced a speedy GTX 780 of its own with the GTX 780 Super JetStream graphics card. This card has a triple-fan cooler and is one of the fastest GTX 780s announced so far (matching the GPU clocks of the Gainward Phantom GLH).

The Palit GTX 780 Super JetStream clocks the GPU’s 2,304 CUDA cores to 980 MHz base and 1033 MHz boost. Palit has also slightly overclocked the 3GB of GDDR5 memory to 6200 MHz. For comparison, NVIDIA clocks the reference card at 863 MHz base, 900 MHz boost, and 6008 MHz memory. Palit is also producing a non-Super JetStream card clocked at 902 MHz base and 954 MHz boost.

Palit Factory Overclocked GTX 780 Super JetStream.jpg

The differentiating factor here, beyond the factory overclock, is Palit’s own JetStream cooler. This cooler, well, cools an aluminum fin stack (copper base) using two 80mm fans on either side of a single center-mounted 90mm fan. The fans sit beneath a black and gold shroud. According to Palit, the JetStream cooler runs 6 dB quieter and 10 degrees Celsius cooler than the reference NVIDIA cooler.

Additionally, the GTX 780 Super JetStream comes with an 8-phase PWM with DrMOS technology.

Palit has not yet released details on where and when the GPU will be available, or how much it will cost.

Source: HEXUS

Gainward GTX 780 Phantom GLH Features Massive HSF and Factory Overclock

Subject: Graphics Cards | June 24, 2013 - 03:28 PM |
Tagged: phantom glh, gtx 780, gk110, gaming, gainward

The rumored GTX 760 graphics cards are still not available, but graphics enthusiasts do have a number of new factory overclocked GTX 780 cards with custom coolers to drool over. One such new GTX 780 card is the so-called GTX 780 Phantom GLH card from Gainward. This card is a 2.5-slot monster that pairs the GTX 780 GPU with custom power phases and a giant block of aluminum and copper to support a healthy factory overclock.

Gainward GTX 780 Phantom GLH Graphics Card.jpg

This new Gainward Phantom GLH card pushes the GTX 780 GPU further than the company's own GTX 780 Phantom. It has a base clock of 980 MHz, a boost clock of 1033 MHz, and slightly overclocked 6200 MHz memory. Of course, being based on NVIDIA's GTX 780 chip, the Phantom GLH features 2,304 CUDA cores and 192 Texture Units within 12 SMX units. The Phantom GLH's 3GB of overclocked GDDR5 memory affords the card 297.6 GB/s of memory bandwidth. Gainward claims that the new card is up to 19% faster than NVIDIA's reference GTX 780 graphics card.

To put that in perspective, the Gainward GTX 780 Phantom (non-GLH) is clocked at 902 MHz base and 954 MHz boost. Further, NVIDIA's stock GTX 780 has GPU clockspeeds of 863 MHz base, 900 MHz boost, and 6008 MHz for the memory. In other words, it is an impressive factory overclock, and I'm interested to see how much headroom is left for enthusiasts to push the chip further with the included cooler.

Gainward GTX 780 Phantom GLH with Box.jpg

Other features of the upcoming Gainward GTX 780 Phantom GLH include an 8-phase PWM with DrMOS technology, a large aluminum fin stack with removable fans that is connected to a copper GPU block via five 8mm heatpipes, and an EXPERTmode option in the company's overclocking utility. Video outputs are the same as the reference design, with two DVI, one DisplayPort, and one HDMI port.

There is no word on pricing or when (and where) it will be available, but expect this beastly card to come at a premium. Although, as one of the fastest factory overclocked GTX 780 cards (soon to be) available, it may be worth it!

Also read: NVIDIA GeForce GTX 780 3GB Graphics Card Review - GK110 Mini

Source: Gainward

Frame Rating: AMD plans driver release to address frame pacing for July 31st

Subject: Graphics Cards | June 20, 2013 - 04:05 PM |
Tagged: radeon, nvidia, geforce, frame rating, fcat, crossfire, amd

Well, the date has been set.  AMD publicly stated on its @AMDRadeon Twitter account that a new version of the prototype driver we originally previewed with the release of the Radeon HD 7990 in April will be released to the public on July 31st.  All this for a problem that many in the industry didn't think existed.  

Since that April release AMD has been very quiet about its driver changes and has actually refused to send me updated prototypes over the spring.  Either they have it figured out or they are worried they haven't - but it looks like we'll find out at the end of next month, and I feel pretty confident that the team will be able to address the issues we brought to light.

For those of you that might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter with regard to AMD's CrossFire multi-GPU technology.
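If you want a concrete sense of what "frame pacing" actually measures, here is a minimal, hypothetical sketch - not our FCAT-based Frame Rating capture pipeline, just an illustration - of how a list of frame display times can be reduced to a simple pacing number. Two runs can share the same average frame rate while one of them feels far worse, because what the eye notices is the swing between consecutive frames.

```python
# Hypothetical illustration only; this is NOT the FCAT / Frame Rating toolchain.
# Given the times (in milliseconds) at which successive frames appeared on screen,
# compute frame-to-frame deltas and a crude pacing summary.

def frame_deltas_ms(timestamps_ms):
    """Time between each pair of consecutive frames."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pacing_summary(timestamps_ms):
    deltas = frame_deltas_ms(timestamps_ms)
    avg = sum(deltas) / len(deltas)
    worst = max(deltas)
    # How far the worst frame strays from the average: near 1.0 is smooth,
    # well above 1.0 is the uneven delivery that reads as stutter.
    return {"avg_ms": round(avg, 1), "worst_ms": round(worst, 1), "worst_vs_avg": round(worst / avg, 2)}

smooth = [0.0, 16.7, 33.4, 50.1, 66.8, 83.5]   # evenly paced, roughly 60 FPS
uneven = [0.0, 5.0, 33.4, 38.0, 66.8, 72.0]    # same average, alternating short/long frames

print(pacing_summary(smooth))   # worst_vs_avg ~1.0
print(pacing_summary(uneven))   # worst_vs_avg ~2.0, despite the same average frame rate
```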

AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.

BF3_2560x1440_PLOT_0.png

So what can we expect on July 31st?  A driver that will give users the option to disable or enable the frame pacing technology they are developing - though I am still of the mindset that disabling is never advantageous.  More to come in the next 30 days!

Source: Twitter

Rumored NVIDIA GTX 760 Specifications Leaked: The First Mid-Range 700-Series GPU Option

Subject: Graphics Cards | June 20, 2013 - 12:28 PM |
Tagged: nvidia, kepler, gtx 760, GK104, gk-104, gaming

There have been rumors of a new mid-range Kepler-based graphics card coming as the next entry in the GTX 700-series. This new GPU is rumored to be called the GeForce GTX 760. If the specifications are true, the card will fit between the existing GTX 660 and GTX 660 Ti graphics cards in terms of both hardware specifications and pricing. While it will carry GTX 700-series branding, it will not have the faster 7 Gbps memory clockspeed of the other 700-series cards.

As far as specifications go, Videocardz claims to have the final list in a recent news post. The GTX 760 is rumored to be the latest graphics card to use NVIDIA's GK-104 "Kepler" GPU. The GTX 760 will have some units disabled, leaving a GPU with 1,152 CUDA cores, 96 Texture Mapping Units (TMUs), and 32 Raster Operations Processors (ROPs). The GPU supports NVIDIA's latest GPU Boost 2.0 technology, which automatically ratchets up the Boost clockspeed so long as temperature allows. It has a base clockspeed of 980 MHz and a boost clockspeed of 1,033 MHz.
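As a rough mental model of what a temperature-limited boost scheme does, here is a toy sketch for illustration only. The 980/1033 MHz limits come from the rumored specs above; the 80°C target and 13 MHz step are placeholder numbers I picked, and none of this is NVIDIA's actual GPU Boost 2.0 algorithm. Think of a control loop that nudges the clock up in small bins while the GPU is under its temperature target and backs off when it runs hot:

```python
# Toy sketch of a temperature-target boost loop; NOT NVIDIA's actual GPU Boost 2.0 logic.
# The 980/1033 MHz limits come from the rumored GTX 760 specs above; the 80 C target
# and 13 MHz step are illustrative placeholders.

BASE_MHZ, MAX_BOOST_MHZ = 980, 1033
TEMP_TARGET_C, STEP_MHZ = 80, 13

def next_clock(current_mhz, gpu_temp_c):
    """Step the clock up while under the temperature target, back it off when over."""
    if gpu_temp_c < TEMP_TARGET_C:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    return max(current_mhz - STEP_MHZ, BASE_MHZ)

# A few control iterations with a rising, then falling, temperature reading:
clock = BASE_MHZ
for temp_c in (65, 70, 74, 78, 81, 83, 79):
    clock = next_clock(clock, temp_c)
    print(f"{temp_c} C -> {clock} MHz")
```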

GTX 760 graphics cards will allegedly come in both 2GB and 4GB GDDR5 memory flavors. NVIDIA is clocking the memory at 6 Gbps (1502 MHz), which makes it the first 700-series part not to take advantage of faster memory chips. However, there is a bit of a saving grace, as NVIDIA has moved to a 256-bit memory bus. That gives the GTX 760 a respectable bump in memory bandwidth to 192 GB/s versus the GTX 660/GTX 660 Ti's 144.2 GB/s.
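If you want to see where those bandwidth numbers come from, peak GDDR5 bandwidth is simply the effective data rate per pin multiplied by the bus width in bytes. A quick back-of-the-envelope sketch (using the 6008 MHz effective rate from the rumored specs above) matches the figures quoted here:

```python
# Peak memory bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8
def peak_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(6.008, 256))  # rumored GTX 760, 256-bit bus: ~192.3 GB/s
print(peak_bandwidth_gbs(6.008, 192))  # GTX 660 / 660 Ti, 192-bit bus: ~144.2 GB/s
print(peak_bandwidth_gbs(7.010, 256))  # GTX 770's 7 Gbps memory for comparison: ~224.3 GB/s
```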

Compared to the existing mid-range 600-series cards, the GTX 760 has base and boost GPU clockspeeds equal to the GTX 660 (and faster than the GTX 660 Ti). Memory clockspeed is also unchanged on the new card, though it has a wider memory bus. The GTX 760 has 192 more CUDA cores than the GTX 660, but 192 fewer than the GTX 660 Ti. The TMU count also sits evenly between the two 600-series cards, and the GTX 760 has 8 more ROPs enabled than both the 660 and 660 Ti.

Graphics cards with the upcoming GTX 760 GPU will be powered by two 6-pin PCI-E power connectors, and the GPU has a 170W TDP. That power consumption puts the card between the 150W GTX 660 Ti and the higher-end 230W GTX 770. It appears that the card will not come with the high-end stock metallic cooler used on the other 700-series cards, though the various AIBs are likely to fit the GPU with their own custom aftermarket coolers. Video outputs on the cards will include DVI-I, DVI-D, HDMI, and DisplayPort.

The chart below compares the specifications between the GTX 660, GTX 660 Ti, GTX 770, and the rumored GTX 760.

              GTX 760     GTX 660     GTX 660 Ti  GTX 770
CUDA Cores    1,152       960         1,344       1,536
TMUs          96          80          112         128
ROPs          32          24          24          32
GPU Base      980 MHz     980 MHz     915 MHz     1046 MHz
GPU Boost     1033 MHz    1033 MHz    980 MHz     1085 MHz
Memory Bus    256-bit     192-bit     192-bit     256-bit
Memory Clock  1502 MHz    1502 MHz    1502 MHz    1752 MHz
Bandwidth     192 GB/s    144.2 GB/s  144.2 GB/s  224 GB/s
TDP           170 W       140 W       150 W       230 W
Architecture  GK-104      GK-106      GK-104      GK-104

The card is supposedly going to be released on June 25th for around $300. It will compete with AMD's Radeon HD 7950 with Boost. Further, the card will be an alternative to NVIDIA's own GTX 660 Ti and an upgrade for gamers still running GTX 560 cards based on the company's older Fermi architecture.

Source: Videocardz

Rumor: AMD Gets Exclusive Optimization for all Frostbite 3 Games

Subject: Graphics Cards | June 18, 2013 - 03:39 PM |
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd

UPDATE #3

The original source article at IGN.com has been updated with some new information.  Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported." 

The quote from an EA rep says as follows:

"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."

END UPDATE #3

This could be a huge deal for NVIDIA and AMD in the coming months - according to a story at IGN.com, AMD has entered into an agreement with EA that gives it exclusive rights to optimization for all games based on the Frostbite 3 engine.  That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014.  Here is the quote that is getting my attention:

Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

bf4.jpg

Battlefield 4 will be exclusively optimized for AMD hardware.

This is huge news for AMD as the Frostbite 3 engine will be used for all EA published games going forward with the exception of sports titles.  The three mentioned above are huge but this also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect so I can't really emphasize enough how big of a win this could be for AMD's marketing and developer relations teams. 

I am particularly interested in this line as well:

While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

The world of PC optimizations and partnerships has been around for a long time so this isn't a huge surprise for anyone that follows PC gaming.  What is bothersome to me is that EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed - something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners.  In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and to improve performance of these games for the public release.  Without these builds, NVIDIA would be at a big disadvantage.  This is exactly what happened with the recent Tomb Raider release.

UPDATE

AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy.  In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

So what do we take away from that statement, made in a story published in March, and today's rumor?  We have to take AMD at its word until we see solid evidence otherwise, or until enough cases like this pile up that I feel I am being duped.  AMD wants us all to know that it is playing the game the "right way" - a stance that just happens to run counter to this rumor. 

END UPDATE

tombraider.jpg

NVIDIA had performance and compatibility issues with Tomb Raider upon release.

The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA.  When Batman: Arkham Asylum launched, AMD basically said that NVIDIA had locked it out of supporting antialiasing.  In 2008, Assassin's Creed dropped DX 10.1 support, supposedly because NVIDIA, whose GeForce cards didn't support the feature at the time, asked for its removal.  Then there was the claim that NVIDIA was disabling CPU core support for PhysX to help prop up GeForce sales.  At the time, AMD PR spun all of this as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc.  But times change as opportunity changes.

The cold truth is that this is why AMD decided to take the chance NVIDIA was allegedly unwilling to take and grab the console design wins that are often described as "bad business."  If settling for razor-thin margins on the consoles is the risk, the reward AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.

ps4controller.jpg

Will the advantage be with AMD thanks to PS4 and Xbox One hardware?

At E3 I spoke in-depth with both NVIDIA and AMD executives about this debate and as you might expect both have very different opinions about what is going to transpire in the next 12-24 months.  AMD views this advantage (being in the consoles) as the big bet that is going to pay off for the more profitable PC space.  NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and they don't have the engineers to innovate on the technology side.  In my view, having Radeon-based processors in the Xbox One and Playstation 4 (as well as the Wii U I guess) gives AMD a head start but won't win them the race for the hearts and minds of PC gamers. There is still a lot of work to be done for that.

Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority.  There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion. 

batmanaa.jpg

Remember the issues with Batman: Arkham Asylum?  I do.

I asked both NVIDIA and AMD for feedback on this story but only AMD has replied thus far.  Robert Hallock, PR manager for gaming and graphics, Graphics Business Unit at AMD sent me this:

It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.

Not much there, but he is also not denying the original report from IGN.  It might just be too early for a more official statement.  I will update this story with information from NVIDIA if I hear anything else.

What do YOU think about this announcement though?  Is this good news for AMD and bad news for NVIDIA?  Is it good or bad for the gamer and, in particular, the PC gamer?  Your input will help guide our upcoming talks with NVIDIA and AMD on the subject. 

UPDATE #2

Just so we all have some clarification on this and on the potential validity of the rumor, this is where I sourced the story from this afternoon:

taylorquote.png

END UPDATE #2

Source: IGN

E3 2013: AMD tells the press their gaming initiatives

Subject: General Tech, Graphics Cards, Processors, Shows and Expos | June 13, 2013 - 02:26 AM |
Tagged: E3, E3 13, amd

The Electronic Entertainment Expo (E3) is the biggest event of the year for millions of gamers. The majority of coverage ends up gawking over the latest news out of Microsoft, Sony, or Nintendo, and we will certainly provide our insights in those areas if we believe they have been insufficiently explained, but E3 is a big time for PC gamers too.

AMD_fx.jpg

5 GHz and unlocked to go from there.

AMD, specifically, has a lot to say this year. In the year of the next-gen console reveals, AMD provides the CPU architecture for two of the three devices and has also designed each of the three GPUs. That just leaves a slight win for IBM, which is responsible for the Wii U's main processor, for whatever that is worth. Unless the Steam Box comes to light without ties to AMD, it is about as close to a clean sweep as any hardware manufacturer could get.

But for the PCs among us...

If you watched the EA press conference, you probably saw lots of sports. If you stuck around after the sports, you probably saw Battlefield 4 being played by 64 players on stage. AMD has been pushing very strongly for developer relations over the last year. DICE, formerly known as an NVIDIA-friendly developer, did not exhibit Battlefield 4 "The Way It's Meant to be Played" at the EA conference. According to one of AMD's Twitter accounts:

 

 

On the topic of "Gaming Evolved" titles, AMD is partnering with Square Enix to optimize Thief for GCN and A-Series APUs. The press release specifically mentioned Eyefinity and Crossfire support along with a DirectX 11 rendering engine; of course, the enhancements with real, interesting effects tend to be the seemingly boring ones they do not mention.

The last major point from their E3 event was the launch of their 5 GHz FX processors. For more information on that part, check out Josh's thoughts from a couple of days ago.

Source: AMD

MSI's Lightning strikes thrice; overclocking the GTX 770

Subject: Graphics Cards | June 10, 2013 - 07:28 PM |
Tagged: gtx 770, msi N770 Lightning, overclocking

[H]ard|OCP liked the new GTX 770 Lightning from MSI but thought they would like it better overclocked, perhaps even more than a GTX 680 or HD 7970.   The triplets below are, from top to bottom, the GTX 680, the GTX 770 and the HD 7970, all from the overclocked Lightning family.  By using MSI's Afterburner utility, [H] pushed the card to 1241MHz on the core and 7.8GHz effective for the RAM, higher than the factory overclock.  That speed boost put its performance on par with the overclocked GTX 680, but it seems the impressive speeds the 7970 Lightning is capable of leave it comfortably in the lead.

lightning_thrice.jpg

"We take the new MSI N770 Lightning and overclock it to its maximum potential. We will compare it with a highly overclocked MSI GeForce GTX 680 Lightning and GIGABYTE Radeon HD 7970. Each GPU is getting its best chance to show us how well it can perform, as all of these GPUs are highly overclocked."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Computex 2013: The Comedic Return of the Ultra GPUs

Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 10, 2013 - 02:49 AM |
Tagged: Ultra, geforce titan, computex

So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.

It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products would be classified by their model numbers and often assigned a suffix like "Ultra", "Pro", or "LE". These suffixes actually meant a lot: a card bearing one could perform noticeably better (or worse) than the suffix-less model and could even overlap with other number classes.

colorful-gtx-titan-ultra-edition,B-V-387931-13.png

Image Credit: zol.com.cn via Tom's Hardware

Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard, slightly cut-down GK110-based GeForce GTX Titan GPU with the usual 2688 CUDA cores and 6GB of GDDR5. While the GK110 chip has the potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 CUDA cores enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for cooling and some amount of factory overclock.

But, this is not the first time we have heard of a Titan Ultra...

Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.

So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.

If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.

AMD wants you to know there is a Radeon HD 7970 GHz Edition for $419

Subject: Graphics Cards | June 7, 2013 - 02:33 PM |
Tagged: amd, radeon, hd 7970 ghz edition, HD 7970, never settle

AMD just passed me a note that I found to be very interesting.  In an obvious response to the release of the NVIDIA GeForce GTX 770 that offers the GK104 GPU (previously only in the GTX 680) for a lower price of $399, AMD wants you to know that at least ONE Radeon HD 7970 GHz Edition card is priced lower than the others.

sapphire7970ghz.png

The Sapphire Vapor-X HD 7970 GHz Edition is currently listed on Newegg.com for $419, a cool $30 less than the other HD 7970 GHz Edition cards.  This is not a card-wide price drop to $419 though.  AMD had this to say:

In late May I noted that we would be working with our partners to improve channel supply of the AMD Radeon™ HD 7970 GHz Edition to North American resellers like Newegg.com. Today I’m mailing to let you know that this process has begun to bear fruit, with the Sapphire Vapor-X HD 7970 GHz Edition now listing for the AMD SEP of $419 US. Of course, this GPU is also eligible for the Never Settle Reloaded AND Level Up programs!

Improving supply is an ongoing process, of course, but we’re pleased with the initial results of our efforts and hope you might pass word to your readers if you get a chance.

This "ongoing process" might mean that we'll see other partners' cards sell for this lower price, but it also might not.  In AMD's defense, our testing shows that in single-GPU configurations, the Radeon HD 7970 GHz Edition does very well against the GTX 770, especially at higher resolutions.

I did ask AMD what other partners think about one competitor getting unique treatment from AMD to offer this lower-priced unit, but I haven't received an answer yet.  I'll update here when I do!

For today though, if you are looking for a Radeon HD 7970 GHz Edition that also comes with the AMD Never Settle game bundle (Crysis 3, Bioshock Infinite, Far Cry 3: Blood Dragon and Tomb Raider), it's hard to go wrong with that $419 option.

Source: Newegg.com

Computex 2013: Gigabyte Preparing Custom GTX Titan With WindForce 450W Cooler... Some Assembly Required

Subject: Graphics Cards | June 7, 2013 - 01:20 PM |
Tagged: windforce 450w, windforce, gtx titan, gk110, gigabyte

Back in April, Gigabyte showed off its new custom WindForce 450W GPU HSF, but did not name which specific high end graphics cards it would be used with. So far, NVIDIA's highest-end single GPU solution, the GTX Titan, has been off limits for GPU manufacturers as far as putting custom air coolers on the cards (NVIDIA has restricted designs to its reference cooler or factory installed water blocks).

It seems that Gigabyte has found a solution to the cooler restriction, however. The company will be selling a GTX TITAN with model number GV-NTITAN-6GDB later this year that will come with NVIDIA's reference cooler pre-installed along with a bundled WindForce 3X 450W cooler and instructions for switching out the coolers.

Gigabyte GTX Titan GPU With Windforce 450W Cooler.jpg

Gigabyte is showing off the custom GTX Titan at Computex, as discovered by TechPowerUp.

Users that do take Gigabyte up on its offer to switch to the custom WindForce cooler will still be covered under the company's standard warranty policy, which is a good thing. The kit is likely to be more expensive than your standard TITAN, though, as Gigabyte has to ship the card with two coolers and absorb increased support costs. On the other hand, users could swap out the coolers and then sell the unused TITAN reference cooler to offset some of the cost of the kit.

Gigabyte is actually showing off the new graphics card with the WindForce 3X 450W cooler at Computex this week. The dual-slot WindForce cooler is said to run a GTX 680 2°C cooler and 23.3 dB quieter than the reference cooler in the Furmark benchmark. The major benefit of the WindForce is not raw cooling power but its three large fans, which can spin at lower RPMs to deliver the same cooling performance as the reference NVIDIA design at a much lower noise level. Should you be looking to push the TITAN to the extreme, a water block would be your best bet, but for many users I think the allure of a quieter air-cooled TITAN may be enough for Gigabyte to snag a few adventurous enthusiasts willing to put up with assembling the new card themselves.

More information on the WindForce 3X 450W cooler can be found here.

Source: Gigabyte

Computex 2013: ASUS Working On GTX 770 Poseidon With Hybrid Waterblock and Air Cooler HSF

Subject: Graphics Cards | June 4, 2013 - 12:04 AM |
Tagged: poseidon, nvidia, kepler, gtx 770, gk-104, computex 2013, computex, ASUS ROG, asus

NVIDIA took the wraps off of its latest-generation Geforce GTX 770 GPU last week, and manufacturers have begun announcing not only reference designs but custom and factory overclocked versions of this GK-104 "Kepler" GPU refresh. One card in particular that caught my attention was the ASUS GTX 770 Poseidon graphics card, which combines NVIDIA's GK-104 GPU with a hybrid heatsink and fan combo that allows the simultaneous use of water and air cooling!

ASUS_ROG_Poseidon_GraphicsCard_with_Hybrid_DirectCU_H2O_and_CoolTech_Fan.jpg

According to the branding, and a hands-on report by TechPowerUp at Computex in Taipei, Taiwan, the GTX 770 Poseidon graphics card is part of the company's Republic of Gamers (ROG) line and likely sports beefy VRM hardware and a factory GPU overclock. Of course, the GTX 770 GPU uses NVIDIA's Kepler architecture and is essentially the GTX 680 with some seriously overclocked memory and refined GPU Boost technology. That means 1,536 CUDA cores, 128 texture units, and 32 ROPs (raster operation units) within 4 GPCs (Graphics Processing Clusters). This is the full GK-104 chip, despite the x70 name. For more information on the GTX 770 GPU, check out our recent review of the NVIDIA GTX 770 card.

Update: ASUS has just launched the new ROG graphics cards at a Computex press conference. According to the ASUS press release:

"ROG Poseidon graphics card with hybrid DirectCU H2O cooling
The new ROG Poseidon graphics card features an NVIDIA® GeForce® GTX 700 Series GPU and a hybrid DirectCU H2O thermal design that supports both air and liquid cooling. Developed by ASUS, its CoolTech fan combines blower and axial fans in one design, forcing air in multiple directions over the heatsink to maximize heat dissipation. Liquid cooling reduces operating temperatures by up to 31 degrees Celsius for cooler running with even greater overclocking potential. ROG Poseidon also features a red pulsing ROG logo for a distinctive dash of style."

(end update)

Back on the Poseidon specifically, the card is a short GTX 770 with a distinctive cooler built around a full-cover water block that spans the entire card, including the GPU, memory, and VRM areas. ASUS further added a more traditional air cooler above the GPU itself to help dissipate heat; it is a circular aluminum fin array with a fan that sits in the middle. The entire hybrid cooler is then covered by a ROG-themed shroud with a configurable LED-backlit Republic of Gamers logo on the side that can be controlled via software.

ASUS_ROG_Poseidon_GraphicsCard_with_Gquarter_Fitting_and_LED_light.jpg

The water cooling portion acts like any other full-cover water block, allowing coolant to move heat away from the metal contact plate (the bottom of the block) touching the various components. The inlet and outlet poke out from the side of the card, which is a bit odd, but the shroud prevents them from exiting at 90 degrees like typical blocks. If your case width is tight, you may need to get creative to fit a 90-degree barb extender (I apologize if that's not the technical term) onto the existing tubing connectors (heh). The air cooler's fan can run with or without the card being connected to a water loop. When water cooling is used, the fan can be turned off to reduce noise or left on to allow for higher overclocks and/or lower temperatures.

Unfortunately, that is all of the information currently available, as ASUS has not yet officially launched the custom GTX 770 graphics card. Pricing, availability, and clockspeed details are still unknown.

For more information, stay tuned to the press.asus.com/events livestream page as it might be announced at a Computex press conference this week since the company is showing off the hardware at the show!

Source: ASUS

Samsung Galaxy Tab 3 10.1: Intel inside an Android?

Subject: General Tech, Graphics Cards, Processors, Mobile | June 3, 2013 - 03:00 AM |
Tagged: Intel, atom, Clover Trail+, SoC, Samsung, Galaxy Tab 3 10.1

While Reuters is being a bit cagey with its source, if true, Intel may have nabbed just about the highest-profile Android tablet design win possible. The still-unannounced Samsung Galaxy Tab 3 10.1 is expected to embed Intel's Clover Trail+ system-on-a-chip (SoC). Samsung would not be the largest contract available in the tablet market, but its previous tablets have shipped millions of units each; it is a good OEM vendor to have.

Source: BGR India

Samsung is also known for releasing multiple versions of the same device for various regions and partners. The Galaxy Tab 10.1 and Galaxy Tab 2 10.1 did not have a variety of models with differing CPUs like, for instance, the Galaxy S4 phone did; the original "10.1" contained an NVIDIA Tegra 2 and the later "2 10.1" embedded a TI OMAP 4430 SoC. It is entirely possible that Intel won every Galaxy Tab 3 10.1 variant, but it is also entirely possible that it did not.

Boy Genius Report India (BGR India, video above) also claims more specific hardware based on a pair of listings at GLBenchmark. The product is registered under the name Santos10: GT-P5200 being the 3G version, and GT-P5210 being the Wi-Fi version.

These specifications are:

  • Intel Atom Z2560 800-933 MHz dual-core SoC (4 threads, 1600 MHz Turbo)
  • PowerVR SGX 544MP GPU (OpenGL ES 2.0)
  • 1280x800 display
  • Android 4.2.2

I am not entirely sure what Intel has to offer with Clover Trail+ besides, I would guess, reliable fabrication. Raw graphics performance is still about half that of Apple's A6X, although, if the leaked resolution is true, the tablet has substantially fewer pixels to push unless it is attached to an external display.
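To put that last point in rough numbers (the 1280x800 figure comes from the leaked listing above, and 2048x1536 is the Retina panel on Apple's A6X-equipped fourth-generation iPad), the pixel-count gap works out like this:

```python
# Rough pixel-count comparison; this says nothing about real-world GPU performance.
tab3_pixels = 1280 * 800      # leaked Galaxy Tab 3 10.1 resolution
ipad_pixels = 2048 * 1536     # Retina display on Apple's A6X-equipped iPad

print(tab3_pixels)                          # 1,024,000 pixels
print(ipad_pixels)                          # 3,145,728 pixels
print(round(ipad_pixels / tab3_pixels, 2))  # ~3.07x as many pixels to push
```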

Maybe Intel made it too cheap to refuse?

Source: Reuters

Galaxy's Factory Overclocked GTX 770 Graphics Card Is Now Available for $400

Subject: Graphics Cards | June 2, 2013 - 12:43 AM |
Tagged: nvidia, gtx 770, graphics card, gk-104, galaxy

Galaxy recently made its custom factory overclocked GTX 770 graphics card available. The new card is not the fastest GTX 770, and doesn't quite embrace the supa-pipe as much (as Josh would say), but it looks to be a good deal all the same, giving you a quieter HSF and a decently-overclocked Geforce GTX 770 GPU for $399.99.

The Galaxy GeForce GTX 770 2GB (77XPH6DV6KXZ) takes NVIDIA's GTX 770 GPU with 1,536 GK-104 based CUDA cores and overclocks it to 1110 MHz base and 1163 MHz boost clockspeeds. The 2GB of GDDR5 memory is only clocked at the reference 7010 MHz, however.

Galaxy GTX 770 Graphics Card.jpg

The card has the same video outputs as other GTX 770 cards: two DL-DVI, one HDMI, and one DisplayPort output. The card with its dual slot, dual fan cooler is 10” in length and requires a 600W PSU at minimum (not solely for the GPU). It needs one 8-pin and one 6-pin PCI-E power connector.

Galaxy provides a two year warranty for the card. It is available now for around $400 at various retailers.

Read more about other factory overclocked GTX 770 graphics cards at PC Perspective!

Source: Newegg

ASUS Launches GTX 770 DirectCU II OC Graphics Card

Subject: Graphics Cards | June 1, 2013 - 06:00 PM |
Tagged: nvidia, kepler, gtx 770, graphics card

NVIDIA recently unveiled its GTX 770 GPU. Sitting between the GTX 680 and GTX 780, the Geforce GTX 770 is a refined GK104 with higher clockspeeds and improved GPU boost. It features 1536 CUDA cores and a 256-bit memory bus.

While the stock GTX 770 comes clocked at 1046 MHz base and 1085 MHz boost, ASUS is factory overclocking its DirectCU II OC card with a maximum boost GPU clockspeed of 1110 MHz. The 2GB of GDDR5 memory on the card will come clocked at 7010 MHz.

ASUS GTX 770 DirectCU II OC Graphics Card.jpg

The differentiating factor here (aside from the overclock) is the custom DirectCU II cooler. ASUS has fitted the overclocked GTX 770 with a DirectCU cooler that uses copper heatpipes that directly contact the GPU and attach to an aluminum fin stack. The heatsink is, in turn, cooled by two 80mm fans. ASUS claims that the GTX 770 DirectCU II OC is up to 20% cooler and three times quieter than the reference NVIDIA cooler. Other features include a 10-phase DIGI+ VRM and “Super Alloy Power” capacitors, chokes, and MOSFETs. The dual slot card is 10.7” long and includes two DL-DVI, one HDMI, and one DisplayPort video output. ASUS' GPU Tweak software will allow users to adjust core and memory clockspeeds, voltage, fan speeds, and the power control target.

The ASUS GTX 770 DirectCU II OC is shipping now and will be available at retailers soon. In fact, the card is available at Newegg right now for just under $410.

Read more about NVIDIA's GTX 770 GPU: NVIDIA GeForce GTX 770 Review - GK104 Speed Bump @ PC Perspective!

Source: Videocardz

NVIDIA Launches New High-Performance 700M Graphics Cards

Subject: Graphics Cards | June 1, 2013 - 05:11 PM |
Tagged: gtx 700M, nvidia, mobile gpu, kepler, 780m, 700m

Earlier this year (beginning of April), NVIDIA introduced the first set of mobile graphics cards in its 700M series. These were relatively low-end cards that featured at most 384 CUDA cores and were based on NVIDIA's 600-series Kepler architecture.

NVIDIA is now adding higher-end mobile GPUs to the 700M family with the GTX 760M, GTX 765M, GTX 770M, and GTX 780M. These chips are still based on Kepler (600-series), but feature more CUDA cores, more memory, a wider memory bus, and faster clockspeeds. The GTX 780M is not quite the mobile equivalent to the desktop GTX 680, but NVIDIA is matching it up against AMD's 8970M GPU and claims that it can run games like Sleeping Dogs, Assassin's Creed III, and Borderlands 2 at Ultra settings (1080p). The GTX 770M is also capable of running modern games, though some detail settings may need to be turned down.

The chart below details the various specifications and compares the new GTX 700M cards to the existing GT 700M GPUs. At the high end, NVIDIA has the GTX 780M with 1,536 CUDA cores, a base clock of 823 MHz, and 4GB of GDDR5 memory (1250 MHz) on a 256-bit bus. The GTX 770M occupies the mid-range mobile gaming slot with 960 CUDA cores, a base clock of 811 MHz, and a memory clock of 1GHz. The GTX 760M and GTX 765M have similar hardware specifications, but the GTX 765M has a higher GPU base clock of 850 MHz versus the GTX 760M's 657 MHz base clock. The low end GTX 700M GPUs (760M and 765M) feature 768 CUDA cores, a 128-bit memory bus, and memory clockspeeds of 1GHz.

                GT 720M   GT 735M   GT 740M   GT 750M   GTX 760M  GTX 765M  GTX 770M  GTX 780M
CUDA Cores      96        384       384       384       768       768       960       1536
GPU Base Clock  938 MHz   889 MHz   980 MHz   967 MHz   657 MHz   850 MHz   811 MHz   823 MHz
Memory Clock    1000 MHz  1000 MHz  2500 MHz  2500 MHz  1000 MHz  1000 MHz  1000 MHz  1250 MHz
Bus Width       64-bit    64-bit    128-bit   128-bit   128-bit   128-bit   192-bit   256-bit
New?            -         -         -         -         New       New       New       New

Further, GPU Boost 2.0, Geforce Experience software, and NVIDIA Optimus support are features of the new GTX 700M graphics cards. You can read more about these NVIDIA technologies in this article by motherboard reviewer Morry Teitelman.

These cards are based on NVIDIA's 600-series despite the 700M moniker. They should provide OEMs with some good gaming options on the NVIDIA side of things and allow for some more competition in the gaming notebook hardware space against the existing AMD cards.

Source: NVIDIA

EVGA Outfits GTX 780 With Hydro Copper Water Block

Subject: Graphics Cards | June 1, 2013 - 01:38 PM |
Tagged: watercooling, nvidia, hydro copper, gtx 780, gpu, gk110, evga

EVGA GTX 780 Hydro Copper GPUs

While NVIDIA restricted partners from going with aftermarket coolers on the company's GTX TITAN graphics card, the recently released NVIDIA GTX 780 does not appear to have the same limits placed upon it. As such, many manufacturers will be releasing GTX 780 graphics cards with custom coolers. One such design that caught my attention was the Hydro Copper full cover waterblock from EVGA.

EVGA GTX 780 with Hydro Copper Water Block (2).jpg

This new cooler will be used on at least two upcoming EVGA graphics cards, the GTX 780 and GTX 780 Classified. EVGA has not yet announced clockspeeds or pricing for the Classified edition, but the GTX 780 Hydro Copper will be a GTX 780 GPU clocked at 980 MHz base and 1033 MHz boost. The 3GB of GDDR5 memory is stock clocked at 6008 MHz, however. It uses a single 8-pin and a single 6-pin PCI-E power connector. This card is selling for around $799 at retailers such as Newegg.

The GTX 780 Classified Hydro Copper will have a factory overclocked GTX 780 GPU and 3GB of GDDR5 memory at 6008 MHz, but beyond that details are scarce. The 8+8-pin PCI-E power connectors do suggest a healthy overclock (or at least that users will be able to push the cards after they get them).

Both the GTX 780 and GTX 780 Classified Hydro Copper graphics cards feature two DL-DVI, one HDMI, and one DisplayPort video outputs.

EVGA GTX 780 Classified with Hydro Copper Water Block (1).jpg

The Hydro Copper cooler itself is the really interesting bit about these cards though. It is a single-slot, full-cover waterblock that will cool the entire graphics card (GPU, VRM, memory, etc.). It has two inlet/outlet ports that can be swapped around to accommodate SLI setups or other custom water tube routing. A configurable LED-backlit EVGA logo adorns the side of the card and can be controlled in software. A 0.25 x 0.35 pin matrix is used in the portion of the block above the GPU to increase the surface area and aid in cooling. Unfortunately, while the card and cooler are single slot, you will actually need two case PCI expansion slots due to the two DL-DVI connectors.

It looks like a neat card, and it should perform well. I'm looking forward to seeing reviews of the card and how the cooler holds up to overclocking. Buying an overclocked card with a pre-installed waterblock is not for everyone, but for some, a water-cooled GPU that keeps its warranty will be worth more than pairing a stock card with a custom block.

Source: EVGA

Never mind the 780; here comes the GTX 770

Subject: Graphics Cards | May 30, 2013 - 02:55 PM |
Tagged: nvidia, kepler, gtx 770, gtx 680, GK104, geforce, MSI GTX660 HAWK

$400 is a tempting number, much less expensive than the $650 price tag on the GTX 780 and right in line with the existing GTX 670 as well as AMD's HD 7970.  You will probably not see many at that price though; $450 is more likely, as very few reference cards will be released and all manufacturers will be putting their own spins on the design, which brings the price in line with the GTX 680.  Performance-wise these cards outpace the two current single-GPU flagships, not by enough to make it worth upgrading from a 7970 or 680, but certainly enough to attract owners of previous-generation cards.  [H]ard|OCP reviewed MSI's Lightning model, with dual fans, an overclock of 104MHz on the base clock and 117MHz on the boost, plus a completely unlocked BIOS for even more tweaking choices.
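For reference, those offsets sit on top of NVIDIA's stock GTX 770 clocks of 1046 MHz base and 1085 MHz boost (quoted in the ASUS DirectCU II post further down the page), so a little napkin math puts the Lightning's out-of-the-box clocks roughly here:

```python
# Napkin math only: factory clocks = reference GTX 770 clocks + MSI's advertised offsets.
reference_base_mhz, reference_boost_mhz = 1046, 1085
lightning_base_offset_mhz, lightning_boost_offset_mhz = 104, 117

print(reference_base_mhz + lightning_base_offset_mhz)    # 1150 MHz base
print(reference_boost_mhz + lightning_boost_offset_mhz)  # 1202 MHz boost
```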

If you want to see how well it fares on our new Frame Rating metric you will have to read Ryan's full review here.

H770.jpg

"NVIDIA debuts the "new" GeForce GTX 770 today. The GeForce GTX 770 is poised to provide refreshed performance, for a surprising price. We evaluate a retail MSI GeForce GTX 770 Lightning flagship video card from MSI with specifications that will make any enthusiast smile. The $399 price point just got a kick in the pants."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

AMD Catalyst 13.6 Beta Drivers For Windows and Linux Now Available

Subject: Graphics Cards | May 28, 2013 - 11:32 PM |
Tagged: gpu, drivers, catalyst 13.6 beta, beta, amd

AMD has released its Catalyst 13.6 beta graphics driver, and it fixes a number of issues under both Windows 8 and Linux. The new beta driver is also compatible with the existing Catalyst 13.5 CAP1 (Catalyst Application Profile) which improves performance of several PC games.

As far as the Windows version of the driver goes, Catalyst 13.6 adds OpenCL GPU acceleration support to Adobe's Premiere Pro CC software and enables AMD Wireless Display technology on systems with the company's A-Series APUs and either Broadcom or Atheros Wi-Fi chipsets. AMD has also made a couple of tweaks to its Enduro technology, including correctly identifying when a Metro app idles and offloading the corresponding GPU tasks to integrated graphics instead of the discrete card. The new beta driver also resolves an issue with audio dropouts over HDMI.

AMD Catalyst Drivers.jpg

On the Linux side of things, Catalyst 13.6 beta adds support for the following when using AMD's A10, A8, A6, and A4 APUs:

  • Ubuntu 13.04
  • Xserver 1.14
  • GLX_EXT_buffer_age

The driver fixes several bugs as well, including black screen and corruption issues in TF2, an issue with OpenGL applications and VSYNC, and UVD playback issues in XBMC where the taskbar would disappear and/or the system would experience a noticeable performance drop during UVD-accelerated video playback.

You can grab the new beta driver from the AMD website.

Source: AMD

Trimming the TITAN; NVIDIA's GTX 780

Subject: Graphics Cards | May 24, 2013 - 06:10 PM |
Tagged: nvidia, gtx 780, gk110, geforce

With 768 more CUDA cores than the 680 but 384 fewer than the TITAN, the 780 offers improvements over the previous generation and will be available for about $350 less than the TITAN.  As you can see in [H]ard|OCP's testing, it does outperform the 680 and 7970, but not by a huge margin, which hurts the price-to-performance ratio and makes it more attractive for 680 owners to simply pick up a second card for SLI.  AMD owners with previous-generation cards and deep pockets might be tempted to pick up a pair of these cards, as they show very good Frame Rating results in Ryan's review.

H_780.jpg

"NVIDIA's new GeForce GTX 780 video card has finally been unveiled. We review the GTX 780 with real world gaming with the most intense 3D games, including Metro: Last Light. If the GTX TITAN had you excited but was a bit out of your price range, the GTX 780 should hold your excitement while being a lot less expensive."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP