Introduction and Technical Specifications
Courtesy of EVGA
The EVGA Z87 Stinger is EVGA's Z87-based answer for the small form-factor crowd. Sporting the mini-ITX form factor, the board is feature packed and offers support for the latest generation of Intel LGA1150-based processors. While its MSRP of $229.99 may seem steep for its small stature, the Z87 Stinger's feature list makes it well worth the outlay.
The EVGA Z87 Stinger board features a 6-phase power delivery system and an impressive 10-layer PCB. Additionally, EVGA designed the CPU socket with a higher gold content and used solid state capacitors throughout the board to ensure problem-free operation under all operating conditions. The following features are integrated into the Z87 Stinger: 4 SATA 6Gb/s ports; 1 mPCIe/mSATA 6Gb/s port; 1 eSATA 6Gb/s port; an Intel GigE NIC; 1 PCI-Express x16 slot; onboard power, reset, and BIOS reset buttons; a BIOS Select switch; a 2-digit diagnostic LED display; and USB 2.0 and 3.0 port support.
Technical Specifications (taken from the EVGA website)
- Based on Intel Z87 chipset
- 2 x 240-pin DIMM sockets; maximum of 16GB of DDR3 (2666MHz+ in dual channel configuration)
- 4 x Serial ATA 600MB/sec (4 internal) with support for RAID 0 and RAID 1
- Audio connector (Line-in, Line-out, MIC)
- 6 Channel Creative Sound Core3D
- 1 x 10/100/1000 (Intel i217)
- mITX form factor; Length: 6.7in (170.18mm), Width: 6.7in (170.18mm)
- Operating system support: Windows 8 32/64bit, Windows 7 32/64bit, Windows Vista 32/64bit, Windows XP 32/64bit
- This product comes with a 3 year warranty. Registration is recommended.
Quality time with G-Sync
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October, we were at the event and were able to listen to NVIDIA executives, product designers, and engineers discuss and elaborate on what it is, how it works, and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, while delivering the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists, and why the system in place today is flawed. This led up to an explanation of how G-Sync works, including its integration via extended VBLANK signals, and detailed how NVIDIA was enabling the graphics card to retake control over the entire display pipeline.
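As a back-of-the-envelope illustration (my own simplification, not NVIDIA's implementation), the practical difference between a fixed 60 Hz refresh and extending VBLANK can be sketched like this:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # ~16.7 ms per refresh cycle at 60 Hz

def fixed_refresh_display_ms(render_ms: float) -> float:
    """With a fixed refresh, a finished frame waits for the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def variable_refresh_display_ms(render_ms: float) -> float:
    """With G-Sync-style variable refresh, VBLANK is extended until the frame is ready."""
    return render_ms

# A 20 ms frame just misses the 16.7 ms deadline: a fixed refresh cannot show
# it until the second refresh (~33.3 ms), while variable refresh shows it at 20 ms.
print(fixed_refresh_display_ms(20.0))
print(variable_refresh_display_ms(20.0))
```

That waiting-for-the-boundary behavior is exactly the stutter (or, with V-Sync off, the tearing) that G-Sync eliminates.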
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better. It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays that we received this week were modified versions of the 144Hz ASUS VG248QE gaming panels, the same ones that will in theory be upgradeable by end users as well sometime in the future. These monitors are TN panels, 1920x1080 and though they have incredibly high refresh rates, aren't usually regarded as the highest image quality displays on the market. However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
Introduction and Design
Contortionist PCs are a big deal these days as convertible models take the stage to help bridge the gap between notebook and tablet. But not everyone wants to drop a grand on a convertible, and not everyone wants a 12-inch notebook, either. Meanwhile, these same people may not wish to blow their cash on an underpowered (and far less capable) Chromebook or tablet. It’s for these folks that Lenovo has introduced the IdeaPad Flex 14 Ultrabook, which occupies a valuable middle ground between the extremes.
The Flex 14 looks an awful lot like a Yoga at first glance, with the same sort of acrobatic design and a thoroughly IdeaPad styling (Lenovo calls it a “dual-mode notebook”). The specs are also similar to that of the x86 Yoga, though with the larger size (and later launch), the Flex also manages to assemble a slightly more powerful configuration:
The biggest internal differences here are the i5-4200U CPU, which is a 1.6 GHz Haswell model with a TDP of 15 W and the ability to Turbo Boost (versus the Yoga 11S’ i5-3339Y, which is Ivy Bridge with a marginally lower TDP of 13 W and no Turbo Boost), the integrated graphics improvements that follow with the newer CPU, and a few more ports made possible by the larger chassis. Well, and the regression to a TN panel from the Yoga 11S’ much-appreciated IPS display, which is a bummer. Externally, your wallet will also appreciate a $250 drop in price: our model, as configured here, retails for just $749 (versus the $999 Yoga 11S we reviewed a few months back).
You can actually score a Flex 14 for as low as $429 (as of this writing), by the way, but if you’re after any sort of respectable configuration, that price quickly climbs above the $500 mark. Ours is the least expensive option currently available with both a solid-state drive and an i5 CPU.
Streaming games straight from NVIDIA
Over the weekend NVIDIA released a December update for the SHIELD Android mobile gaming device that included a very interesting, and somewhat understated, new feature: Beta support for NVIDIA GRID.
You have likely heard of GRID before; NVIDIA has been pushing it as part of the company's vision of bringing GPU computing to every facet and market. GRID was aimed at creating GPU-based server farms to enable mobile, streaming gaming for users across the country and around the world. While initially NVIDIA only talked about working with partners to launch streaming services based on GRID, they have obviously changed their tune slightly with this limited release.
If you own a SHIELD, and install the most recent platform update, you'll find a new icon in your NVIDIA SHIELD menu called GRID Beta. The first time you start this new application, it will attempt to measure your bandwidth and latency to offer up an opinion on how good your experience should be. NVIDIA is asking for at least 10 Mbps of sustained bandwidth, and wants round trip latency under 60 ms from your location to their servers.
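Those two thresholds are simple enough to express as a check. This is only an illustrative sketch of the kind of test the app performs; the function name is my own, while the 10 Mbps and 60 ms figures are NVIDIA's stated minimums:

```python
MIN_BANDWIDTH_MBPS = 10   # sustained bandwidth NVIDIA asks for
MAX_LATENCY_MS = 60       # round-trip latency to the GRID servers

def grid_connection_ok(bandwidth_mbps: float, latency_ms: float) -> bool:
    """Return True only if a connection meets both of NVIDIA's stated minimums."""
    return bandwidth_mbps >= MIN_BANDWIDTH_MBPS and latency_ms <= MAX_LATENCY_MS

print(grid_connection_ok(25.0, 45.0))  # fast, nearby connection
print(grid_connection_ok(25.0, 95.0))  # plenty of bandwidth, but too far away
```

Note that either failure alone, thin bandwidth or high latency, is enough to degrade the experience.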
Currently, servers are ONLY located in Northern California, so the further out you are, the more likely you will be to run into problems. However, doing some testing in Kentucky and Ohio resulted in very playable gaming scenarios, though we did run into some connection problems that might be load-based or latency-based.
After the network setup portion, users are shown eight games that they can try: Darksiders, Darksiders II, Street Fighter X Tekken, Street Fighter IV, Alan Wake, The Witcher 2, Red Faction: Armageddon, and Trine 2. You are free to play them free of charge during this beta, though you can be sure they will be removed at some point; consider yourself reminded. Saves work well, and we were able to save and resume games of Darksiders II on GRID easily and quickly.
Starting up the game was fast, about on par with starting up a game on a local PC, though obviously the server is loading it in the background. Once the game is up and running, you are met with some button mapping information provided by NVIDIA for that particular game (great addition) and then you jump into the menus as if you were running it locally.
Introduction and Technical Specifications
Courtesy of Noctua
Noctua is a well-known name in the enthusiast world for its high-end CPU cooler products. Their flagship cooler, the NH-D14, features a nickel-plated copper base with dual radiator towers actively cooled by low-noise 120mm and 140mm fans. The NH-D14 can be used with all current Intel and AMD CPU offerings. The cooler was put to the test against other similarly classed air- and water-based cooling systems to see just how well Noctua's design would hold up. The Noctua NH-D14 does not come cheap with a retail price of $99.99, but its performance and utility should make up for that initial outlay.
The Noctua NH-D14 cooler is everything you would expect in a premium CPU cooler - nickel-plating for corrosion resistance, twin-tower radiators for massive heat dissipation potential, and copper / aluminum hybrid design for optimal heat transfer from the CPU. Noctua designed the NH-D14 with a total of six heat pipes, laid out in a U-shaped design which passes through the copper base plate and terminates in the radiator towers. The bottom of the copper base plate leaves the factory ground flat and polished to a mirror-like finish, ensuring optimal interfacing with the CPU surface.
Noctua included the following components with the base cooler: SecureFirm2™ multi-socket mounting kit, NF-P14 140mm fan, NF-P12 120mm fan, four fan mounting brackets, a dual-ended fan power cable, two single-fan low-noise adapter cables, a case badge, and NT-H1 thermal compound.
PC Component Selections
It's that time of year again, when those of us lucky enough to do so get to share the best in technology with our friends and family. You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey...maybe for you. :)
This year we are going to break up the guide into categories. We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets, and one for PC accessories. Then, after those specific categories, we'll have an open-ended collection of pages where each PC Perspective team member can throw in some wildcards.
Our Amazon code is: pcper04-20
Intel Core i7-4770K Haswell Processor
The Intel Core i7-4770K is likely the best deal in computing performance today, able to power just about any configuration of PC you can think of without breaking much of a sweat. You want to game? This part has you covered. You want to encode some video? The four cores and included HyperThreading support provide just about as much power as you could need. Yes, there are faster processors in the form of the Ivy Bridge-E and even 10+ core Xeon processors, but those are significantly more expensive. For a modest price of $299 you can get what is generally considered the "best" processor on the market.
Corsair Carbide Series Air 540 Case
Cases are generally considered a PC component that is more about the preference of the buyer, but there are still fundamentals that make cases good, solid cases. The new Corsair Carbide Air 540 is unique in a lot of ways. The square-ish shape allows for a division of your power supply, hard drives, and SSDs from the other motherboard-attached components. Even though the case is a bit shorter than others on the market, there is plenty of working room inside thanks to the Corsair dual-chamber setup, and it even includes a pair of high-performance Corsair AF140L fans for intake and exhaust. The side panel window is HUGE, allowing you to show off your goods, and nice touches like the rubber-grommeted cable routing cutouts and dust filters make this one of the best mid-range cases available.
Another retail card reveals the results
Since the release of the new AMD Radeon R9 290X and R9 290 graphics cards, we have been very curious about the latest implementation of AMD's PowerTune technology and its scaling of clock frequency as a result of the thermal levels of each graphics card. In the first article covering this topic, I addressed the questions from AMD's point of view - is this really a "configurable" GPU as AMD claims or are there issues that need to be addressed by the company?
The biggest problems I found were in the highly variable clock speeds from game to game and from a "cold" GPU to a "hot" GPU. This affects the way many people in the industry test and benchmark graphics cards, as running a game for just a couple of minutes could result in average and reported frame rates that are much higher than what you see 10-20 minutes into gameplay. This was rarely something that had to be dealt with before (especially on AMD graphics cards), so it caught many off-guard.
Because of the new PowerTune technology, as I have discussed several times before, clock speeds are starting off quite high on the R9 290X (at or near the 1000 MHz quoted speed) and then slowly drifting down over time.
Another wrinkle occurred when Tom's Hardware reported that retail graphics cards they had seen were showing markedly lower performance than the reference samples sent to reviewers. As a result, AMD quickly released a new driver that attempted to address the problem by normalizing fan speeds to a target RPM rather than a fan duty cycle (percentage). The result was consistent fan speeds on different cards and thus much closer performance.
However, with all that being said, I was still testing retail AMD Radeon R9 290X and R9 290 cards that were PURCHASED rather than sampled, to keep tabs on the situation.
Introduction, Specifications and Packaging
If you're into the laptop storage upgrade scene, you hear the same sorts of arguments all the time. "Do I go with an HDD for large capacity and low cost/GB, but suffer the performance hit?" "I want an SSD, but can't afford the capacity I need!" The ideal for this scenario is to combine both - go with a small capacity SSD for your operating system and apps, while going with a larger HDD for bulk storage at a lower cost/GB. The catch here is that most mobile platforms only come with a single 2.5" 9.5mm storage bay, and you just can't physically fit a full SSD and a full HDD into that space, can you? Well, today Western Digital has answered that challenge with the Black2 Dual Drive:
Yup, we're not kidding. This is a 120GB SSD *and* a 1TB HDD in a single package. Not a hybrid. Two drives, and it's nothing short of a work of art.
The 7 Year Console Refresh
The consoles are coming! The consoles are coming! Ok, that is not necessarily true. One is already here and the second essentially is too. This of course brings up the great debate between PCs and consoles. The past has been interesting when it comes to console gaming, as often the consoles would be around a year ahead of PCs in terms of gaming power and prowess. This is no longer the case with this generation of consoles. Cutting edge is now considered mainstream when it comes to processing and graphics. The real incentive to buy this generation of consoles is a lot harder to pin down as compared to years past.
The PS4 retails for $399 US and the upcoming Xbox One is $499. The PS4’s price includes a single controller, while the Xbox’s package includes not just a controller, but also the next generation Kinect device. These prices would be comparable to some low end PCs which include keyboard, mouse, and a monitor that could be purchased from large brick and mortar stores like Walmart and Best Buy. Happily for most of us, we can build our machines to our own specifications and budgets.
As a directive from on high (the boss), we were given the task of building our own low-end gaming and productivity machines at a price as close to that of the consoles and explaining which solution would be superior at the price points given. The goal was to get as close to $500 as possible and still have a machine that would be able to play most recent games at reasonable resolutions and quality levels.
Does downloading make a difference?
I posted a story earlier this week that looked at the performance of the new PS4 when used with three different 2.5-in storage options: the stock 500GB hard drive, a 1TB hybrid SSHD, and a 240GB SSD. The results were fairly interesting (and got a good bit of attention), but some readers wanted more data. In particular, many asked how things might change if you went the full digital route and purchased games straight from Sony's PlayStation Network. I will also compare boot times for each of the tested storage devices.
You should definitely check out the previous article if you missed it. It not only goes through the performance comparison but also details how to change the hard drive on the PS4 from the physical procedure to the software steps necessary. The article also details the options we selected for our benchmarking.
- HGST 500GB 5400 RPM HDD - $50 - $0.10/GB
- Seagate 1TB Hybrid SSHD - $122 - $0.12/GB
- Corsair 240GB Force GS SSD - $189 - $0.78/GB
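The cost-per-GB figures above follow directly from the quoted prices; a quick sketch verifying the division:

```python
# Sanity-check of the cost-per-GB figures from the quoted prices and capacities.
drives = [
    ("HGST 500GB 5400 RPM HDD", 50, 500),
    ("Seagate 1TB Hybrid SSHD", 122, 1000),
    ("Corsair 240GB Force GS SSD", 189, 240),
]
for name, price_usd, capacity_gb in drives:
    print(f"{name}: ${price_usd / capacity_gb:.2f}/GB")
```

The SSD's premium is obvious here: nearly 8x the per-gigabyte cost of the stock hard drive.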
Today I purchased a copy of Assassin's Creed IV from the PSN store (you're welcome Ubisoft) and got to testing. The process was the same: start the game then load the first save spot. Again, each test was run three times and the averages were reported. The PS4 was restarted between each run.
The top section of results is the same as was presented earlier - average load times for AC IV when the game is installed from the Blu-ray. The second set is new and includes average load times for AC IV after installation from the PlayStation Network; no disc was in the drive during testing.
Load time improvements
On Friday Sony released the PlayStation 4 onto the world. The first new console launch in 7 years, the PS4 has a lot to live up to, but our story today isn't going to attempt to weigh the value of the hardware or software ecosystem. Instead, after our PS4 teardown video from last week, we got quite a few requests for information on storage performance with the PS4 and what replacement hardware might offer gamers.
Hard Drive Replacement Process
Changing the hard drive in your PlayStation 4 is quite simple, a continuation of Sony's policy with the PS3.
The process starts with the semi-transparent panel on the top of the unit, to the left of the light bar. Obviously, make sure your PS4 is completely turned off and unplugged first.
Simply slide it to the outside of the chassis and wiggle it up to release. There are no screws or anything to deal with yet.
Once inside you'll find a screw with the PS4 shape logos on it; that is the screw you need to remove to pull out the hard drive cage.
Introduction and Features
Today Western Digital launched an important addition to their Personal Cloud Storage NAS family - the My Cloud EX4:
The My Cloud EX4 is Western Digital's answer to the increased demand for larger personal storage devices. When folks look for places to consolidate all of their bulk files, media, system backups, etc, they tend to extend past what is possible with a single hard drive. Here is Western Digital's projection on where personal storage is headed:
Where the My Cloud was a single drive solution, the My Cloud EX4 extends that capability to span up to four 3.5" drives. When it comes to devices that span across several drives, the number 4 is a bit of a sweet spot, as it enables several RAID configurations:
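As a rough sketch of why four bays are a sweet spot, here are the usable capacities the common modes yield with four identical drives. This is a simplification that ignores formatting overhead, and since RAID 1 behavior across four bays varies by implementation, I take the conservative full-mirror reading:

```python
def usable_tb(mode: str, n_drives: int, drive_tb: float) -> float:
    """Approximate usable capacity under common RAID modes, ignoring overhead."""
    if mode == "RAID 0":
        return n_drives * drive_tb           # striping, no redundancy
    if mode == "RAID 1":
        return drive_tb                      # full mirror (conservative reading)
    if mode == "RAID 5":
        return (n_drives - 1) * drive_tb     # one drive's worth of parity
    if mode == "RAID 10":
        return (n_drives // 2) * drive_tb    # striped mirrored pairs
    raise ValueError(f"unknown mode: {mode}")

for mode in ("RAID 0", "RAID 1", "RAID 5", "RAID 10"):
    print(f"{mode}: {usable_tb(mode, 4, 4.0):.0f} TB usable from 4 x 4 TB")
```

Four drives are the minimum for RAID 10 and make RAID 5's parity overhead a reasonable 25%, which is why this drive count opens up so many options.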
Everything but online capacity expansion (where the user swaps drives one at a time for larger-capacity models) is supported. While WD has stated that feature will be available in a future update, I find it a bit risky to intentionally and repeatedly fail an array by pulling drives and forcing rebuilds. It just makes more sense to back up the data and re-create a fresh array with the new, larger drives installed.
Ok, so we've got the groundwork down with a 4-bay NAS device. What remains to be seen is how Western Digital has implemented the feature set. There is a lot to get through here, so let's get to it.
NVIDIA Tegra Note Program
Clearly, NVIDIA’s Tegra line has not been as successful as the company had hoped and expected. The move for the discrete GPU giant into the highly competitive world of the tablet and phone SoCs has been slower than expected, and littered with roadblocks that were either unexpected or that NVIDIA thought would be much easier to overcome.
The truth is that this was always a long play for the company; success was never going to be overnight and anyone that thought that was likely or possible was deluded. Part of it has to do with the development cycle of the ARM ecosystem. NVIDIA is used to a rather quick development, production, marketing and sales pattern thanks to its time in high performance GPUs, but the SoC world is quite different. By the time a device based on a Tegra chip is found in the retail channel it had to go through an OEM development cycle, NVIDIA SoC development cycle and even an ARM Cortex CPU development cycle. The result is an extended time frame from initial product announcement to retail availability.
Partly due to this, and partly due to limited design wins in the mobile markets, NVIDIA has started to develop internally designed end-user devices that utilize its Tegra SoC processors. This has the benefit of being much faster to market – while most SoC vendors develop reference platforms during the normal course of business, NVIDIA is essentially going to perfect and productize them.
More Details from Lisa Su
The executives at AMD like to break their own NDAs. Then again, they are the ones typically setting these NDA dates, so it isn’t a big deal. It is no secret that Kaveri has been in the pipeline for some time. We knew a lot of the basic details of the product, but there were certainly things that were missing. Lisa Su went up onstage and shared a few new details with us.
Kaveri will be made up of 4 “Steamroller” cores, which are enhanced versions of the previous Bulldozer/Trinity/Vishera families of products. Nearly everything in the processor is doubled. It now has dual decode, more cache, larger TLBs, and a host of other smaller features that all add up to greater single thread performance and better multi-threaded handling and performance. Integer performance will be improved, and the FPU/MMX/SSE unit now features 2 x 128 bit FMAC units which can “fuse” and support AVX 256.
However, there was no mention of the fabled 6 core Kaveri. At this time, it is unlikely that particular product will be launched anytime soon.
Introduction and Design
With few exceptions, it’s generally been taken for granted that gaming notebooks are going to be hefty devices. Portability is rarely the focus, with weight and battery life alike usually sacrificed in the interest of sheer power. But the MSI GE40 2OC—the lightest 14-inch gaming notebook currently available—seeks to compromise while retaining the gaming prowess. Trending instead toward the form factor of a large Ultrabook, the GE40 is both stylish and manageable (and perhaps affordable at around $1,300)—but can its muscle withstand the reduction in casing real estate?
While it can’t hang with the best of the 15-inch and 17-inch crowd, in context with its 14-inch peers, the GE40’s spec sheet hardly reads like it’s been the subject of any sort of game-changing handicap:
One of the most popular CPUs for Haswell gaming notebooks has been the 2.4 GHz (3.4 GHz Turbo) i7-4700MQ. But the i7-4702MQ in the GE40 2OC is nearly as powerful (managing 2.2 GHz and 3.2 GHz in those same areas respectively), and it features a TDP that’s 10 W lower at just 37 W. That’s ideal for notebooks such as the GE40, which seek to provide a thinner case in conjunction with uncompromising performance. Meanwhile, the NVIDIA GTX 760M is no slouch, even if it isn’t on the same level as the 770s and 780s that we’ve been seeing in some 15.6-inch and 17.3-inch gaming beasts.
Elsewhere, it’s business as usual, with 8 GB of RAM and a 120 GB SSD rounding out the major bullet points. Nearly everything here is on par with the best of rival 14-inch gaming models with the exception of the 900p screen resolution (which is bested by some notebooks, such as Dell’s Alienware 14 and its 1080p panel).
EVGA Brings Custom GTX 780 Ti Early
Reference cards for new graphics card releases are very important for a number of reasons. Most importantly, these are the cards presented to the media and reviewers that judge the value and performance of these cards out of the gate. These various articles are generally used by readers and enthusiasts to make purchasing decisions, and if first impressions are not good, it can spell trouble. Also, reference cards tend to be the first cards sold in the market (see the recent Radeon R9 290/290X launch) and early adopters get the same technology in their hands; again the impressions reference cards leave will live in forums for eternity.
All that being said, retail cards are where partners can differentiate and keep the various GPUs relevant for some time to come. EVGA is probably the most well-known NVIDIA partner and is clearly their biggest outlet for sales. The ACX cooler is one we saw popularized with the first GTX 700-series cards, and the company has quickly adapted it to the GTX 780 Ti, released by NVIDIA just last week.
I would normally have a full review for you as soon as possible, but thanks to a couple of upcoming trips that will keep me away from the GPU test bed, that will take a little while longer. However, I thought a quick preview was in order to show off the specifications and performance of the EVGA GTX 780 Ti ACX.
As expected, the EVGA ACX design of the GTX 780 Ti is overclocked. While the reference card runs at a base clock of 875 MHz and a typical boost clock of 928 MHz, this retail model has a base clock of 1006 MHz and a boost clock of 1072 MHz. This means that all 2,880 CUDA cores are going to run somewhere around 15% faster on the EVGA ACX model than the reference GTX 780 Ti SKUs.
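That ~15% figure falls straight out of the clock deltas quoted above:

```python
# Clock speeds (MHz) from the article: reference GTX 780 Ti vs. EVGA ACX model.
ref_base, ref_boost = 875, 928
acx_base, acx_boost = 1006, 1072

base_gain = (acx_base / ref_base - 1) * 100    # percent uplift on base clock
boost_gain = (acx_boost / ref_boost - 1) * 100  # percent uplift on boost clock
print(f"base: +{base_gain:.1f}%, boost: +{boost_gain:.1f}%")
```

Both deltas land right around 15%, which is where the headline overclock figure comes from.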
We should note that though the cooler is custom built by EVGA, the PCB design of this GTX 780 Ti card remains the same as the reference models.
An issue of variance
AMD just sent along an email to the press with a new driver to use for Radeon R9 290X and Radeon R9 290 testing going forward. Here is the note:
We’ve identified that there’s variability in fan speeds across AMD R9 290 series boards. This variability in fan speed translates into variability of the cooling capacity of the fan-sink.
The flexibility of AMD PowerTune technology enables us to correct this variability in a driver update. This update will normalize the fan RPMs to the correct values.
The correct target RPM values are 2200RPM for the AMD Radeon R9 290X ‘Quiet mode’, and 2650RPM for the R9 290. You can verify these in GPU-Z.
If you’re working on stories relating to R9 290 series products, please use this driver as it will reduce any variability in fan speeds. This driver will be posted publicly tonight.
Great! This is good news! Except it also creates some questions.
When we first tested the R9 290X and the R9 290, we discussed the latest iteration of AMD's PowerTune technology. That feature attempts to keep clocks as high as possible under the constraints of temperature and power. I took issue with the high variability of clock speeds on our R9 290X sample, citing this graph:
I then did some digging into the variance and the claims that AMD was building a "configurable" GPU. In that article we found that there were significant performance deltas between "hot" and "cold" GPUs; we noticed that doing simple, quick benchmarks would produce certain results that were definitely not real-world in nature. At the default 40% fan speed, Crysis 3 showed 10% variance with the 290X at 2560x1440:
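For clarity, the variance percentage is just the relative drop from a cold run to a hot run. The FPS numbers in this sketch are hypothetical placeholders chosen to illustrate the arithmetic, not our measured data:

```python
cold_avg_fps = 50.0   # hypothetical: short benchmark run on a cool GPU
hot_avg_fps = 45.0    # hypothetical: same test 10-20 minutes into gameplay

# Relative performance drop from the cold run to the hot run.
variance_pct = (cold_avg_fps - hot_avg_fps) / cold_avg_fps * 100
print(f"{variance_pct:.0f}% drop from cold to hot")
```

This is why a two-minute benchmark can overstate real-world performance: it only ever samples the cold end of that range.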
GK110 in all its glory
I bet you didn't realize that October and November were going to become the onslaught of graphics card releases they have been. I know I did not, and I tend to have a better background on these things than most of our readers. Starting with the release of the AMD Radeon R9 280X, 270X, and R7 260X in the first week of October, it has pretty much been a non-stop battle between NVIDIA and AMD for the hearts, minds, and wallets of PC gamers.
Shortly after the Tahiti refresh came NVIDIA's move into display technology with G-Sync, a variable refresh rate feature that will work with upcoming monitors from ASUS and others as long as you have a GeForce Kepler GPU. The technology was damned impressive, but I am still waiting for NVIDIA to send over some panels for extended testing.
Later in October we were hit with the R9 290X, the Hawaii GPU that brought AMD back into the world of ultra-class single-GPU card performance. It has produced stellar benchmarks and undercut the prices (then, at least) of the GTX 780 and GTX TITAN. We tested it in both single and multi-GPU configurations and found that AMD had made some impressive progress in fixing its frame pacing issues, even with Eyefinity and 4K tiled displays.
NVIDIA dropped a driver release with ShadowPlay that allows gamers to record playback locally without a hit on performance. I posted a roundup of R9 280X cards which showed alternative coolers and performance ranges. We investigated the R9 290X Hawaii GPU and the claims that performance is variable and configurable based on fan speeds. Finally, the R9 290 (non-X model) was released this week to more fanfare than the 290X thanks to its nearly identical performance and $399 price tag.
And today, yet another release. NVIDIA's GeForce GTX 780 Ti takes the performance of the GK110 and fully unlocks it. The GTX TITAN uses one fewer SMX and the GTX 780 has three fewer SMX units so you can expect the GTX 780 Ti to, at the very least, become the fastest NVIDIA GPU available. But can it hold its lead over the R9 290X and validate its $699 price tag?
Introduction, Specifications and Packaging
It has been a while since OCZ introduced their Vector SSD, and it was in need of a refresh to bring its pricing more in-line with the competition, which had been equipping their products with physically smaller flash dies (therefore reducing cost). Today, OCZ launched a refresh to their Vector - now dubbed the Vector 150:
The OCZ strategy changed up a while back. They removed a lot of redundancy and confusing product lines, consolidating everything into a few simple solutions. Here's a snapshot of that strategy, showing the prior and newer iterations of three simple solutions:
The Vector 150 we look at today falls right into the middle here. I just love the 'ENTHUSIST' icon they went with:
Why is there a bourbon review on a PC-centric website?
We can’t live, eat, and breathe PC technology all the time. All of us have outside interests that may not intersect with the PC and mobile market. I think we would be pretty boring people if that were the case. Yes, our professional careers are centered in this area, but our personal lives do diverge from the PC world. You certainly can’t drink a GPU, though I’m sure somebody out there has tried.
The bottle is unique to Wyoming Whiskey. The bourbon has a warm, amber glow about it as well. Picture courtesy of Wyoming Whiskey
Many years ago I became a beer enthusiast. I loved to sample different concoctions, I would brew my own, and I settled on some personal favorites throughout the years. Living in Wyoming is not necessarily conducive to sampling many different styles and types of beers, and so I was in a bit of a rut. A few years back a friend of mine bought me a bottle of Tomatin 12 year single malt scotch, and I figured this would be an interesting avenue to move down since I had tapped out my selection of new and interesting beers (Wyoming has terrible beer distribution).