GK104 takes a step down
While the graphics power found in the new GeForce GTX 690, the GeForce GTX 680 and even the Radeon HD 7970 is incredibly impressive, if we are really honest with ourselves the real meat of the GPU market shops at price points much lower than $999. Today's not-so-well-kept-secret release of the GeForce GTX 670 attempts to bring the price of entry to the NVIDIA Kepler architecture down to a more attainable level while also resetting the performance-per-dollar metrics of the GPU world once again.
The GeForce GTX 670 is in fact a very close cousin to the GeForce GTX 680 with only a single SMX unit disabled and a more compelling $399 price tag.
The GTX 670 GPU - Nearly as fast as the GTX 680
The secret is out - GK104 finds its way onto a third graphics card in just two months - but in this iteration the hardware has been reduced slightly.
The GTX 670 block diagram we hacked together above is really just a GTX 680 diagram with a single SMX unit disabled. While the GTX 680 sported a total of 1536 CUDA cores broken up into eight 192-core SMX units, the new GTX 670 will include 1344 cores. This also drops the texture unit count to 112 (from 128 on the GTX 680), though the ROP count stays at 32 thanks to the continued use of a 256-bit memory interface.
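The arithmetic behind those cuts is easy to sanity check; the per-SMX figures below are simply derived from the GTX 680's published totals (1536 cores and 128 texture units across 8 SMX units):

```python
# Back-of-the-envelope check of the GK104 configurations described above.
# Per-SMX figures derived from the GTX 680 totals: 1536 / 8 cores, 128 / 8 texture units.
CORES_PER_SMX = 192
TEX_PER_SMX = 16

def gk104_config(smx_count):
    """Return (CUDA cores, texture units) for a GK104 part with the given SMX count."""
    return smx_count * CORES_PER_SMX, smx_count * TEX_PER_SMX

print(gk104_config(8))  # GTX 680 -> (1536, 128)
print(gk104_config(7))  # GTX 670 -> (1344, 112)
```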
Subject: Graphics Cards | April 27, 2012 - 08:49 PM | Tim Verry
Tagged: gtx 680 GC 2GB, gtx 680, gpu, galaxy
Popular NVIDIA Add In Board partner Galaxy Microsystems has announced a new custom GeForce GTX 680 graphics card. Specifically, the Galaxy GTX 680 GC 2GB is their custom-PCB, custom-cooled version of the NVIDIA GTX 680 (which we reviewed here), and it also comes overclocked from the factory.
The GTX 680 GC 2GB comes with a base clock of 1110 MHz and a boost clock of 1176 MHz, a healthy overclock compared to the reference clock speeds of 1006 MHz and 1058 MHz, respectively. Beyond the factory overclock, Galaxy has implemented a custom PCB design that appears to have eschewed the stacked PCI-E power connectors in favor of the traditional side-by-side approach, which allowed them to use two large fans for the cooler.
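For context, the size of that factory overclock works out as follows (clock figures taken from the paragraph above):

```python
# Galaxy GTX 680 GC factory clocks vs. NVIDIA reference clocks (MHz)
ref_base, ref_boost = 1006, 1058
gc_base, gc_boost = 1110, 1176

base_oc = (gc_base / ref_base - 1) * 100
boost_oc = (gc_boost / ref_boost - 1) * 100
print(f"base: +{base_oc:.1f}%, boost: +{boost_oc:.1f}%")  # base: +10.3%, boost: +11.2%
```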
The cooler in question uses an aluminum fin array with quad nickel-plated heat pipes to keep the GPU core, memory, and VRMs nice and frosty. Looking somewhat like the XFX 7970 Black Edition, the Galaxy GTX 680 GC 2GB features a custom dual fan cooling solution with brushed aluminum cooling shroud, LED accents, and two large fans (which look to be about 90mm).
Galaxy stated that “overclocking enthusiasts will find the improved cooling of the Galaxy GTX 680 GC to be indispensable for pushing their clocks to the absolute maximum for the best possible performance.”
Galaxy has stated that the new GTX 680 GC card is available now and is retailing for around $540 USD, which puts it a bit more than $40 over Galaxy's reference version -- not too bad. [Update: it is already sold out on Newegg, but Amazon has 4 left.]
When the NVIDIA GeForce GTX 680 launched in March we were incredibly impressed with the performance and technology that the GPU was able to offer while also being power efficient. Fast forward nearly a month, and we are still having problems finding the card in stock - a HUGE negative for the perception of the card and the company at this point.
Still, we are promised by NVIDIA and its partners that they will soon have more on shelves, so we continue to look at the performance configurations and prepare articles and reviews for you. Today we are taking a look at the Galaxy GeForce GTX 680 2GB card - their most basic model that is based on the reference design.
If you haven't done all the proper reading about the GeForce GTX 680 and the Kepler GPU, you should definitely check out my article from March that goes into a lot more detail on that subject before diving into our review of the Galaxy card.
The Card, In Pictures
The Galaxy GTX 680 is essentially identical to the reference design with the addition of some branding along the front and top of the card. The card is still a dual-slot design, still requires a pair of 6-pin power connections, and uses a fan that is very quiet relative to the competition from AMD.
Subject: Graphics Cards | March 26, 2012 - 11:36 AM | Tim Verry
Tagged: nvidia, gtx680, gpu, galaxy, 4gb gtx 680
A new article over at Chinese site EXP Review suggests that graphics card manufacturer Galaxy is gearing up with three new NVIDIA GTX 680 graphics cards. The pictured cards include a reference model, a heavily overclocked Hall of Fame card, and models with 4 GB of GDDR5 memory.
Galaxy is upping the memory ante with a new NVIDIA GTX 680 that has 4 GB of memory - twice that of the reference cards. The card will use Hynix memory chips (8 on the front, 8 on the rear of the card) and an improved 5+2 phase power supply with DirectFETs in place of the usual MOSFET design. In addition, the card features 6+8 pin PCI-E power connectors and an aftermarket Galaxy Gemini cooler. The Gemini HSF uses heatpipes and an aluminum fin array cooled by two 90mm fans. The extra cooling enabled Galaxy to offer the new card with a 10% factory overclock, for a base clock speed of 1.1 GHz.
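As a rough sanity check on that 10% figure, applying it to the GTX 680's reference base clock of 1006 MHz lands almost exactly on the quoted 1.1 GHz:

```python
ref_base_mhz = 1006                 # NVIDIA GTX 680 reference base clock
oc_base_mhz = ref_base_mhz * 1.10   # Galaxy's reported 10% factory overclock
print(f"{oc_base_mhz:.0f} MHz")     # ~1107 MHz, i.e. roughly 1.1 GHz
```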
The other interesting card is the upcoming Galaxy GTX 680 Hall of Fame edition. This card is based on a white PCB with two 8 pin PCI-E power connectors (like those of the Asus DirectCU II and MSI Lightning). Further, it is cooled with three 90mm fans and five heatpipes leading to an aluminum fin array. The card will come equipped with dual BIOS support and overclocking software. It is not directly stated, but the Hall of Fame edition should be more overclockable thanks to the expanded cooling solution. Also, it may come with 4 GB of memory like the card above.
In our PCPer Live Review, it was stated that while NVIDIA could do reference cards with 4 GB of memory, they chose not to in order to hit certain price points and profit margins. Instead, they left it up to the Add In Board partners to offer cards with extra RAM and factory overclocks. Galaxy is prepping their 4 GB cards, but they are not likely to be the only vendor offering cards with increased memory. More photos are available over at EXPReview.
Four Displays for Under $70
Running multiple displays on your PC is becoming a trend that everyone is trying to jump on board with thanks in large part to the push of Eyefinity from AMD over the past few years. Gaming is a great application for multi-display configurations but in truth game compatibility and game benefits haven't reached the level I had hoped they would by 2012. But while gaming still has a way to go, the consumer applications for having more than a single monitor continue to expand and cement themselves in the minds of users.
Galaxy is the only NVIDIA partner that is really taking this market seriously, with an onslaught of cards branded as MDT, Multiple Display Technology. Using non-NVIDIA hardware in conjunction with NVIDIA GPUs, Galaxy has created some unique products for consumers, like the recently reviewed GeForce GTX 570 MDT. Today we are going to be showing you the new Galaxy MDT GeForce GT 520, which brings support for a total of four simultaneous display outputs to a card with a reasonable cost of under $120.
The Galaxy MDT GeForce GT 520
Long-time readers of PC Perspective likely already know what to expect based on the GPU we are using here, but the Galaxy MDT model offers quite a few interesting changes.
The retail packaging clearly indicates the purpose of this card for users looking at running more than two displays. The GT 520 is not an incredibly powerful GPU when it comes to gaming but Galaxy isn't really pushing the card in that manner. Here are the general specs of the GPU for those that are interested:
- 48 CUDA cores
- 810 MHz core clock
- 1GB DDR3 memory
- 900 MHz memory clock
- 64-bit memory bus width
- 4 ROPs
- DirectX 11 support
Subject: Mobile | January 11, 2012 - 12:04 AM | Matt Smith
Tagged: tablet, Samsung, mobile phone, galaxy, CES
One of the more unusual products debuted by a major manufacturer this CES has to be the Samsung Galaxy Note. It’s a 5.3” device that runs Android 2.3 with the Samsung TouchWiz interface.
The Note is unusual because of its size. The 5” to 7” range is a bit of a no man’s land in the world of mobile devices. Such products are considered too small to be a real tablet, but also too large to be a decent phone. Though there have been attempts to introduce products in this range, they haven’t sold in huge numbers.
Apparently, Samsung thinks the market is worth some serious effort. They’re making a big deal of this device – to my eye, it looked as if there were more of these available on the show floor than any product the company offers. And as if to drive the point home, the CES bus I took back from the convention center today – like most of the buses at CES – was wrapped in Samsung Galaxy Note advertisements.
So what’s it like? Well, it’s like a big phone. Or a small tablet. Since it runs Android 2.3 and uses TouchWiz, the interface is basically identical to the rest of Samsung’s massive line of Android phones. Plastic is the material of choice in the construction of the chassis, which doesn’t lend the product a premium feel.
It does make the Samsung Galaxy Note light, however. Official numbers put it at 178g (about .4 pounds) which is less than half the weight of your typical 7” tablet. The thickness of 9.25mm (about .4 inches) doesn’t seem outstanding, but the curved rear cover helps reduce perceived thickness.
Samsung is known for its mobile displays, and the Note doesn’t disappoint. It uses a Samsung AMOLED panel with a resolution of 1280x800. This allows the small Note to offer as many or more usable pixels than much larger tablets, and it also contributes to an extremely sharp image. Unfortunately, there wasn’t streaming video available to view at the time I used the device, but games look excellent. Maximum display brightness was high as well.
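That sharpness claim holds up to a quick back-of-the-envelope pixel density calculation, using the 1280x800 resolution and the 5.3-inch diagonal mentioned earlier (and assuming the panel fills that diagonal):

```python
import math

# Galaxy Note panel: 1280x800 pixels on a 5.3-inch diagonal (figures from the article)
w_px, h_px, diag_in = 1280, 800, 5.3
ppi = math.hypot(w_px, h_px) / diag_in  # diagonal pixel count / diagonal inches
print(f"{ppi:.0f} PPI")  # ~285 PPI, well above typical 2012-era tablets
```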
Like the ASUS MeMO, the Note includes a stylus. Useful? Not so far as I can tell. Sure, it does a fine job of accepting handwriting, but I have a hard time seeing this smaller device used as an electronic notepad. Is there really an audience for that outside of some enterprise environments?
Inside there is a 1.5 GHz dual-core processor as well as 1GB of memory and 16 or 32 gigabytes of internal storage. In my use the device felt smooth, but no more so than most other high-end smartphones I looked at both during CES and before.
The Note is equipped for use with cellular networks including HSPA, 4G LTE and EDGE. North American availability will come via AT&T. Pricing has not been announced - $199 is of course typical for high-end handsets, but Samsung phones have gone for higher prices before. The $249 to $299 price range (with contract) seems more likely.
Will the Note be a success? Perhaps. Samsung has already sold over a million units in Europe, where the Note was introduced late last year. However, the Note so far is planned to ship in North America without Ice Cream Sandwich support built in (an upgrade will bring it, but there’s no release date). That could be a major knock against the Note. Availability will be in spring, so we’ll soon find out the Note’s fate.
PC Perspective's CES 2012 coverage is sponsored by MSI Computer.
Follow all of our coverage of the show at http://pcper.com/ces!
Galaxy Continues the MDT Push
One of the key selling points for the AMD Radeon series of graphics cards the last few generations has been Eyefinity - the ability to run more than two displays off of a single card while also allowing for 3+ display gaming configurations. NVIDIA-based solutions required a pair of GPUs running in SLI for this functionality, either standard SLI or the "SLI-on-a-card" solutions like the GTX 590.
However, another solution has appeared from Galaxy, an NVIDIA partner that has created a series of boards with the MDT moniker - Multi-Display Technology. Using a separate on-board chip the company has created GTX 560 Ti, GTX 570 and GTX 580 cards that can output to 4 or 5 monitors using only a single NVIDIA GPU, cutting down on costs while offering a feature that no other single-GPU solution could.
Today we are going to be reviewing the Galaxy GeForce GTX 570 MDT X4 card that promises 4 display outputs and a triple-panel seamless gaming surface option for users that want to explore gaming on more than a single monitor inside the NVIDIA ecosystem.
Introduction, Specs, Design and Ergonomics
Samsung's Galaxy S II smartphone debuted in the U.S. with Sprint, AT&T, and T-Mobile in September, and we finally got our hands on a review sample. The smartphone runs the Android 2.3 "Gingerbread" operating system and includes an 8 MP camera with LED flash and 1080p video recording, a front-facing 2 MP camera, and Samsung’s custom TouchWiz user interface.
T-Mobile and Sprint’s versions sport a 4.52-inch display, but AT&T’s version has a 4.3-inch screen that matches the original international version of the Galaxy S II. We are reviewing T-Mobile's Galaxy S II with 16 GB of internal memory (16 GB and 32 GB options are available). The Sprint and AT&T versions are outfitted with a dual-core 1.2 GHz Orion processor, but the T-Mobile version we are reviewing today sports a Qualcomm Snapdragon S3 1.5 GHz dual-core CPU.
Subject: Graphics Cards | September 12, 2011 - 12:26 PM | Jeremy Hellstrom
Tagged: gtx560 ti, nvidia, nvidia surround, gtx560 ti mdt x5, galaxy
The GTX 560 Ti MDT X5 (Multi Display Technology) can send a signal to 5 displays simultaneously, something no NVIDIA card could do before. Previously, AMD had the only GPUs that could handle multiple displays from a single card; on the NVIDIA side you needed an SLI setup to manage NVIDIA Surround. [H]ard|OCP were lucky enough to get to play with this $330 mid-range GPU that knows a new trick and were suitably impressed by its ability to provide good gaming performance on three 1080p monitors. They also mention a GeForce 210 model that can handle up to four monitors for the non-gamer.
"With the new Galaxy GeForce GTX560Ti MDT X5 video card it is possible to output to 5 displays even though this is a single GPU video card. Enjoy NVIDIA multi-display spanned resolution gaming without the need for two cards! Can this GTX 560 Ti based video card stand up in the latest games when spanned across three displays? We are surprised."
Here are some more Graphics Card articles from around the web:
- ASUS ROG MARS 2 3GB Video Card Review w/ NVIDIA Surround @ Legit Reviews
- ASUS GTX560 Ti DirectCU II TOP Graphics Card Review @ OCIA
- Zotac Geforce GTX 590 @ XSReviews
- KFA2 GTX570 @ OC3D
- MSI N580GTX Lightning Xtreme Edition Review @ t-break
- Thermalright Shaman VGA Cooler @ Tweaktown
- DeepCool V6000 VGA Cooler Review @ eTeknix
- AMD Eyefinity and Nvidia Surround Technologies @ X-bit Labs
- Sapphire Radeon HD 6850 Vapor-X OC @ Tweaktown
- Sapphire Radeon HD 6850 Vapor-X Review @ Neoseeker
- Faster Than Radeon HD 6970: Sapphire Radeon HD 6950 2 GB Toxic Edition @ X-bit Labs
- Gigabyte HD 6770 Silent Cell 1GB GDDR5 DirectX11 Video Card Review @ Hi Tech Legion
- PowerColor Radeon HD6770 video card review @ TechwareLabs
- Powercolor LCS HD 6990 4GB Review @ OCC
- AMD Radeon HD 6450 @ Phoronix
Subject: Graphics Cards | August 2, 2011 - 10:46 AM | Tim Verry
Tagged: graphics, gpu, galaxy
Galaxy, a popular maker of NVIDIA graphics cards, recently announced that it is extending the warranty on its graphics card products to three years. "Galaxy has listened to the enthusiast market and we are glad to move from a 2 year warranty to a 3 year warranty by registration." The new extended warranty will apply to all graphics cards purchased after August 1st, 2011 that are then registered with Galaxy. Products will further bear the seal shown below to let customers know that the graphics card qualifies.
Seeing warranties being extended is always a good thing, especially in a world where the once popular lifetime warranty is rare. What do you think of the extended warranty? Will this be enough to push you towards a Galaxy branded card on your next purchase?