Subject: General Tech | February 20, 2014 - 07:35 PM | Jeremy Hellstrom
Tagged: vlan, gaming, fun, fragging frogs
This weekend, specifically at 10 AM ET on February 22nd, the fifth Virtual LAN Party kicks off, hosted by the famous Fragging Frogs and PC Perspective, with a good chance of some secret visitors from the Red Team. There is no set end time for the event, nor for any of the games, so if you have any time on Saturday in which you do not find yourself saving innocents from a fate worse than death* then hop on and play with us. You get to hang out with Forum members you have built friendships with, shoot those you hold a grudge against, and have the chance to frag Ryan, Josh, and all the other PC Per staff brave enough to set foot into this battle royale.
*The only acceptable excuse to bail on us.
The games are many, both old and new, with a strong likelihood of every Battlefield game since 1942 being represented, along with spicy versions of Unreal Tournament 2004, Torchlight II, Hawken, a game involving heavy weapons versus dinosaurs, and much more. You can see the whole list here and can suggest others if we missed one of your favourites. To prepare for the event you should install TeamSpeak 3, our chat client of choice; you can find the server info right here. Other than that, just show up when you can and hop into the game of your choice.
We have been holding VLANs on a very irregular schedule since 2010, but if this weekend goes well we might just start eating a bit more fibre and adopt a more regular schedule. There were once even tournaments held, though that was back when UT2K4 was shiny and new, which we could start up again. For even more fun, you could help the Frogs get back to the regular schedule we held to in the golden days of yore. Don't forget to consider a jump in The Pond once you are finished playing; keep that CPU and GPU working full out!
This is also a great time to thank Lenny and AMD for the wonderful good deed that they did for one of our long term members. You can read about their good works here and be sure to thank them for putting the effort into helping out one of our own.
We look forward to seeing you there and shooting you several times.
Subject: General Tech, Graphics Cards | February 20, 2014 - 05:45 PM | Ken Addison
Tagged: nvidia, mining, maxwell, litecoin, gtx 750 ti, geforce, dogecoin, coin, bitcoin, altcoin
As we have talked about on several different occasions, altcoin mining (anything that is NOT Bitcoin specifically) is a force on the current GPU market whether we like it or not. Traditionally, miners have bought only AMD-based GPUs, due to their performance advantage over the NVIDIA competition. However, with the continued development of the cudaMiner application over the past few months, NVIDIA cards have been gaining performance in Scrypt mining.
The biggest performance change we've seen yet has come with a new version of cudaMiner released yesterday. This new version (2014-02-18) brings initial support for the Maxwell architecture, which was just released yesterday in the GTX 750 and 750 Ti. With support for Maxwell, mining starts to become a more compelling option with this new NVIDIA GPU.
With the new version of cudaMiner on the reference version of the GTX 750 Ti, we were able to achieve a hashrate of 263 KH/s, impressive when you compare it to the previous generation, Kepler-based GTX 650 Ti, which tops out at about 150 KH/s.
As you may know from our full GTX 750 Ti review, the GM107 overclocks very well. We were able to push our sample to the highest configurable offset of +135 MHz, with an additional 500 MHz added to the memory frequency and a 31 mV bump to the voltage offset. All of this combined for a ~1200 MHz clock speed while mining and an additional 40 KH/s or so of performance, bringing us to just under 300 KH/s with the 750 Ti.
As we compare the performance of the 750 Ti to AMD GPUs and previous-generation NVIDIA GPUs, we start to see how impressively this card stacks up considering its $150 MSRP. For less than half the price of the GTX 770, and roughly the same price as an R7 260X, you can achieve the same performance.
When we look at power consumption based on the TDP of each card, the comparison only becomes more impressive. At 60W, no card comes close to the performance of the 750 Ti when mining. This means you will spend less to run a 750 Ti than an R7 260X or GTX 770 for roughly the same hash rate.
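To sketch what that power gap means on the electric bill, the snippet below estimates annual electricity cost for 24/7 mining. Only the 750 Ti's 60W figure comes from this article; the 115W and 230W TDPs assumed for the R7 260X and GTX 770, and the $0.10/kWh rate, are illustrative assumptions.

```python
# Rough annual electricity cost of running each card flat-out.
RATE_PER_KWH = 0.10  # USD; assumed residential rate

def annual_cost(watts, rate=RATE_PER_KWH):
    """Cost of running a load of `watts` continuously for one year."""
    kwh_per_year = watts / 1000 * 24 * 365
    return kwh_per_year * rate

# Only the 750 Ti's 60 W is from the article; the others are assumed TDPs.
for name, tdp in [("GTX 750 Ti", 60), ("R7 260X", 115), ("GTX 770", 230)]:
    print(f"{name}: ~${annual_cost(tdp):.2f} per year of 24/7 mining")
```

At those assumed rates the 750 Ti saves on the order of $50 to $150 per year per card versus the alternatives, before the smaller power supply is even considered.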
Taking a look at the performance per dollar ratings of these graphics cards, we see the two top performers are the AMD R7 260X and our overclocked GTX 750 Ti.
However, when looking at the performance per watt differences across the field, the GTX 750 Ti looks even more impressive. While most miners may think they don't care about power draw, it can help your bottom line: being able to buy a smaller, less expensive power supply moves up the payoff date for the hardware. This also bodes well for future Maxwell-based graphics cards that we will likely see released later in 2014.
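Both ratios are easy to compute yourself. In this sketch, the 750 Ti figures (263 KH/s stock, just under 300 KH/s overclocked, $150, 60W) come from the article; the R7 260X line uses an assumed ~260 KH/s hashrate, $150 street price, and 115W TDP for comparison.

```python
# Performance per dollar and per watt for the cards discussed.
# 750 Ti rows use the article's numbers; the R7 260X row is assumed.
cards = {
    #                 KH/s, USD, watts
    "GTX 750 Ti":    (263, 150, 60),
    "GTX 750 Ti OC": (300, 150, 60),   # "just under 300" rounded up
    "R7 260X":       (260, 150, 115),  # assumed figures
}

for name, (khs, price, watts) in cards.items():
    print(f"{name}: {khs / price:.2f} KH/s per $, {khs / watts:.2f} KH/s per W")
```

On perf-per-dollar the two are close, but on perf-per-watt the 750 Ti roughly doubles the assumed 260X figure, which is the Maxwell story in miniature.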
Subject: Motherboards | February 20, 2014 - 03:29 PM | Jeremy Hellstrom
Tagged: z87, Killer Fatal1ty, asrock
ASRock has picked up some impressive branding recently, but the question of performance remains. They have been in the motherboard business for a while but have always been considered a bargain brand by most enthusiasts. [H]ard|OCP reviewed their flagship Z87 board recently to see if perhaps it is time to reconsider that impression. With three PCIe 3.0 x16 slots and four PCIe 2.0 x1 slots it does seem a high-end motherboard, and the inclusion of an onboard Killer NIC may also attract some attention. If this board really wants to stand out, though, it is the overclocking and stability that matter, and seeing as it finished the review wearing a gold medal you can get a rough idea of how that testing went. Read the full review to see ASRock's impressive Killer Fatal1ty board in action.
"If you’re a long time reader of HardOCP then you probably know we are not huge fans of ASRock as a brand overall. Recently we purchased a new motherboard to spend some time with the ASRock Z87 KILLER FATAL1TY, and as a result our opinions are changing. Is that change for the better or for worse?"
Here are some more Motherboard articles from around the web:
- EVGA Z87 Stinger (Intel LGA 1150) @ techPowerUp
- ASUS RAMPAGE IV BLACK EDITION @ techPowerUp
- Gigabyte GA-Z87X-OC @ X-bit Labs
- Gigabyte G1 Sniper Z87 Motherboard Review @ Hardware Asylum
- ASRock Z87 Extreme11/ac @ SSD Review
- GIGABYTE Z87X-UD5 TH Intel Z87 Thunderbolt Motherboard Review @ Legit Reviews
- ASUS Z87I-PRO (Intel LGA 1150) @ techPowerUp
- Gigabyte GA-Z87X-UD4H Motherboard Review @ Modders-Inc
- MSI Z87I Gaming AC Mini-ITX @ Kitguru
- ASUS A88X-PRO AMD FM2+ Motherboard Review @ Legit Reviews
- Gigabyte GA-F2A88X-UP4 AMD FM2+ Motherboard Review @ OCIA
- ECS KBN-I + AMD E1-2100 "Kabini" APU @ Phoronix
Subject: Storage | February 20, 2014 - 02:37 PM | Jeremy Hellstrom
Tagged: LaCie, external drive, 5TB, thunderbolt
That didn't take very long: Toshiba just announced their 5TB drive and now LaCie has announced an external drive with 5TB of storage. You will need Thunderbolt to properly interface with it, perhaps a good thing for users since transferring 5TB over USB 2.0 is not the most enjoyable experience. It also means you can pick up the five-bay model, called the 5big, and have 25TB of external storage available to you.
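A quick back-of-the-envelope calculation shows why the interface matters at these capacities. The ~35 MB/s figure for real-world USB 2.0 throughput is an assumption; 785 MB/s is LaCie's quoted peak for the Thunderbolt 5big.

```python
# Time to move 5 TB over each interface.
TB = 1e12  # decimal terabytes, as drive makers count them

def transfer_hours(size_bytes, mb_per_s):
    """Hours to transfer size_bytes at a sustained mb_per_s."""
    return size_bytes / (mb_per_s * 1e6) / 3600

# USB 2.0 throughput is an assumed real-world figure, not a spec value.
for bus, speed in [("USB 2.0 (~35 MB/s assumed)", 35),
                   ("Thunderbolt 5big (785 MB/s quoted)", 785)]:
    print(f"{bus}: {transfer_hours(5 * TB, speed):.1f} hours")
```

Under those assumptions, a full 5TB copy drops from roughly a day and a half to under two hours.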
CUPERTINO, CA – Today, LaCie announced the availability of 5TB, 7200 rpm hard-drive capacities in its 5big Thunderbolt Series, 2big Thunderbolt Series and d2 Thunderbolt Series. Delivering external storage products that range from 5TB single drive systems to 25TB RAID solutions boosts storage capacity by 20 percent. This increase showcases the company's commitment to provide the fastest, highest capacity storage solutions on the market.
Increasingly larger file formats for film and photography have driven the demand for more storage capacity. The availability of 5TB hard drives enables LaCie to deliver significantly more storage capacity in its same compact desktop designs. This saves professionals valuable desktop space.
LaCie's 5big Thunderbolt now features a capacity of up to 25TB, which makes it the largest 5-bay storage solution on the market. Combined with industry-leading speeds up to 785 MB/s*, it is the ideal product for video professionals to pair with a Thunderbolt-enabled computer, like the new Mac Pro, to drive 4K workflows. Photography professionals will appreciate the larger capacities of the d2 Thunderbolt and 2big Thunderbolt, with the same fast transfer speeds and responsive photo browsing that they depend on from these products.
The new capacities are also available on the LaCie 2big Quadra and d2 Quadra storage solutions. All products can be purchased at the LaCie Online Store and LaCie Resellers.
Subject: General Tech | February 20, 2014 - 02:17 PM | Ken Addison
Tagged: podcast, video, toshiba, raptr, R9 290X, r9 290, pcper, OEM, maxwell, gtx 750 ti, desktop pc, 750 ti, 5TB
PC Perspective Podcast #288 - 02/20/2014
Join us this week as we discuss the release of the NVIDIA GTX 750 Ti, Upgrading Crappy Desktops, 5TB Hard Drives and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano
Subject: General Tech | February 20, 2014 - 01:36 PM | Jeremy Hellstrom
Tagged: VIA, rumours
VIA, that once-famous company which has petered out of the North American market, is back in the news. According to DigiTimes, they recently joined forces with a firm owned by the Chinese government and are now moving production over to new facilities. VIA holds only 20% of this new joint venture, which could signal the final end of their existence as a producer of x86 processors. The move could be influenced by Intel, which licenses both PCIe and x86 technology to VIA, but this is deemed unlikely as Intel would like to stay on the Chinese government's good side. The current Nano and V7 are Vista capable and appear in mobile devices in the AP region.
"VIA Technologies is rumored to have started shifting its x86 CPU technologies and related personnel to its newly formed IC design joint venture with a China government-owned investment firm, according to market watchers, adding that VIA recently notified clients that it will stop supplying x86 processors temporarily."
Here is some more Tech News from around the web:
- SkyDrive is dead! All hail Microsoft OneDrive! Happy now, Uncle Rupe? @ The Register
- Microsoft cries out to UK government against open source @ The Inquirer
- Cisco's the new Tivo, pumps out 'DVR in the cloud' offering @ The Register
Subject: General Tech | February 20, 2014 - 02:40 AM | Scott Michaud
Tagged: square enix, pc gaming, final fantasy
Update: Fixed a couple of points as per the comments.
Final Fantasy might be returning to the PC as its publisher, Square Enix, grows more interested in the platform. Final Fantasy VII and VIII were both available on the PC within a few months of their original PlayStation releases. Since then, Final Fantasy has been basically non-existent on the platform, beyond the two MMO releases (XI and XIV).
Within the last year, both titles were re-released on Steam to decent sales. Yoshinori Kitase, producer for the franchise, told Eurogamer that this popularity has grabbed their attention. He acknowledged that the developer does not have a lot of experience creating a good PC experience, but said they could be very interested in the future.
It's an early stage for us. We haven't got an awful lot of experience in this field. So when we have more know-how and experience in this market we would be very interested.
Kitase also noted that, by ignoring the PC platform, their games are completely off the table in several markets. He did not mention any markets by name, but China only recently reopened its borders to Microsoft, Sony, and Nintendo after banning them in 2000. In Brazil, a PS4 launched at a little over 3x the US price, after converting into USD, because of tax and other distribution issues.
Also, while not mentioned in the article, Square Enix has been very active in porting their back-catalog to mobile platforms. This seems to be a time of re-evaluation for the company. While they have had recent troubles with projecting sales figures, mostly with Eidos releases, they have at least dodged Games for Windows Live in favor of Steam.
Also, ending with a pun, Final Fantasy VII supports Cloud Saves. Hehehe.
Subject: General Tech | February 19, 2014 - 07:22 PM | Scott Michaud
Tagged: wolfenstein, Doom 4, bethesda
Of course, many will disagree with the concept of pre-ordering. Many have said that you should wait until reviews are released before making a purchase. Ironically, a couple of these people have also argued against the merits of game reviews, which is food for thought. Still, there is a solid argument for not spending your money blindly. It can be nice to get bonus content for reserving copies, such as developer commentary, but it can easily get ridiculous.
What if I don't own that platform!!!
In this case, if you pre-order Wolfenstein: The New Order, you will get access to the beta for the upcoming Doom game (carefully not called "Doom 4") inside the box. Unfortunately, they will not provide many more details than that. They will not say whether it will be single-player or multiplayer, when it will start, how long it will last, or what platforms it will run on.
"Beta timing and platform options are subject to Bethesda Softworks' discretion."
Personally, I cannot see how this would be possible. Wouldn't it be absolutely terrible PR if a gamer purchased Wolfenstein for a platform that the Doom Beta was not available for? I would have to expect that this is only in there for legal reasons, in case an issue arises. Still, that would absolutely suck. Bethesda does like the PC platform, however. I guess we have that going for us.
Wolfenstein: The New Order will be available on May 20th in North America.
Subject: Editorial, General Tech | February 19, 2014 - 06:15 PM | Scott Michaud
Tagged: bioshock infinite
The team behind the original BioShock and BioShock: Infinite has decided to call it quits. After seventeen years, depending on where you start counting, the company dissolved to form another, much smaller studio. Only about fifteen employees will transition to the new team. The rest are being provided financial support, given a bit of time to develop their portfolios, and can attend a recruitment day to be interviewed by other studios and publishers. They may also be offered employment elsewhere within Take-Two Interactive.
The studio formed by the handful of remaining employees will look to develop games based on narrative, which is definitely their strength. Each game will be distributed digitally and Take-Two will continue to be their parent company.
While any job loss is terrible, I am interested in the future project. BioShock: Infinite sold millions of copies, but I wonder if its size ultimately caused it harm. It was pretty and full of detail, at the expense of requiring a large team. The game had a story which respected your intelligence (you might not understand it, and that was okay), but I have little confidence that it was anywhere close to the team's original vision. From budget constraints to the religious beliefs of development staff, we already know about several aspects of the game that changed significantly. Even Elizabeth, according to earlier statements from Ken Levine, was on the bubble because of her AI's complexity. I can imagine how difficult it is to resist those changes when looking at man-hour budgets. I cannot, however, imagine BioShock: Infinite without Elizabeth. A smaller team might help them concentrate their effort where it matters and keep the artistic vision from becoming too diluted.
As for BioShock? The second part of the Burial at Sea DLC is said to wrap up the entire franchise. 2K will retain the license if they want to release sequels or spin-offs. I doubt Ken Levine will have anything more to do with it, however.
Subject: Graphics Cards | February 19, 2014 - 04:43 PM | Jeremy Hellstrom
Tagged: geforce, gm107, gpu, graphics, gtx 750 ti, maxwell, nvidia, video
We finally saw Maxwell yesterday, with a new design for the SMs, called SMM, each of which consists of four blocks of 32 dedicated, non-shared CUDA cores. In theory that should allow NVIDIA to pack more SMMs onto a chip than they could with the previous SMX units. This new design was released on a $150 card, which means we don't really get to see what it is capable of yet. At that price it competes with AMD's R7 260X and R7 265, at least if you can find them at their MSRPs and not at inflated cryptocurrency levels. Legit Reviews contrasted the performance of two overclocked GTX 750 Ti cards to those two cards, as well as to the previous-generation GTX 650 Ti Boost, on a wide selection of games to see how it stacks up performance-wise, which you can read here.
That is of course after you read Ryan's full review.
"NVIDIA today announced the new GeForce GTX 750 Ti and GTX 750 video cards, which are very interesting to use as they are the first cards based on NVIDIA's new Maxwell graphics architecture. NVIDIA has been developing Maxwell for a number of years and have decided to launch entry-level discrete graphics cards with the new technology first in the $119 to $149 price range. NVIDIA heavily focused on performance per watt with Maxwell and it clearly shows as the GeForce GTX 750 Ti 2GB video card measures just 5.7-inches in length with a tiny heatsink and doesn't require any internal power connectors!"
Here are some more Graphics Card articles from around the web:
- MSI GTX 750 Ti Gaming Video Card Review @HiTech Legion
- NVIDIA GeForce GTX 750 Ti @ Benchmark Reviews
- ASUS GTX 750 OC 1 GB @ techPowerUp
- MSI GTX 750 Ti Gaming 2 GB @ techPowerUp
- NVIDIA GeForce GTX 750Ti the Arrival of Maxwell @HiTech Legion
- Palit GTX 750 Ti StormX Dual 2 GB @ techPowerUp
- The GTX 750 Ti Review; Maxwell Arrives @ Hardware Canucks
- Nvidia GeForce GTX 750 Ti vs. AMD Radeon R7 265 @ Legion Hardware
- MSI GTX750Ti OC Twin Frozr @ Kitguru
- NVIDIA GeForce GTX 750 Ti 2 GB @ techPowerUp
- NVIDIA GeForce GTX 750 Ti "Maxwell" On Linux @ Phoronix
- A quick look at Mantle on AMD's Kaveri APU @ The Tech Report
- Sapphire Radeon R9 Tri-X OC video card @ Hardwareoverclock
- AMD Radeon R9 290: Still Not Good For Linux Users @ Phoronix
- AMD Radeon R7 265 2GB Video Card Review @ Legit Reviews
- Sapphire Radeon R7 260X OC 2GB Graphics Card Review @ Techgage
- XFX Double Dissipation R9 280X @ [H]ard|OCP
Subject: General Tech | February 19, 2014 - 04:01 PM | Jeremy Hellstrom
Tagged: gaming, titanfall, modding
If you didn't get lucky enough to get in on the Titanfall demo then all you can do is read the previews and wonder if what you are missing out on is really as good as people say it is. The impressions we've seen have been very positive and describe what seems to be a new style of online shooter. The basics remain the same and we have all seen footage of the three-storey mechs which give the game its name, but Rock, Paper, SHOTGUN also describes how parkour is a big part of the game and easier to get used to than in Mirror's Edge. The launch process also sounds like an improvement: when starting, you end up in a private area, which makes it easy to pick who you play with if you have a group of up to 12 people together. Even with the limit of six players per side, the map won't feel empty thanks to the designed inclusion of bots on both teams. It is also nice to hear that Respawn is already acknowledging the modding community for the PC version of their game.
"There are a lot of different ways to make videogame fights meaningful. Singleplayer games do it by couching your shotgun blasts and pistol whips in the context of a story. Multiplayer games do it by emphasising competition via scoreboards, and by layering XP bonuses and equipment progression on top as rewards for each kill. Titanfall aims to do it with a mixture of all of the above, and based on its limited beta, finds mixed success."
Here is some more Tech News from around the web:
- Microsoft kicks off a week of Xbox 360 game price cuts @ The Inquirer
- > HACK INFO ON DISRUPT ONTO RPS FRONT PAGE @ Rock, Paper, SHOTGUN
- Irrational Games, developers of BioShock series, to close @ HEXUS
- Jacking Into The Matrix: EVE And Oculus’ Utopian Dreams @ Rock, Paper, SHOTGUN
- Mein Gott: Wolfenstein Preorders Secure DOOM Beta Access @ Rock, Paper, SHOTGUN
An Upgrade Project
When NVIDIA started talking to us about the new GeForce GTX 750 Ti graphics card, one of the key points they emphasized was the potential for this first-generation Maxwell GPU to be used in upgrading smaller form factor or OEM PCs. Without the need for an external power connector, the GTX 750 Ti provided a clear performance delta over integrated graphics with minimal cost and minimal power consumption, or so the story went.
Eager to put this theory to the test, we decided to put together a project looking at the upgrade potential of off the shelf OEM computers purchased locally. A quick trip down the road to Best Buy revealed a PC sales section that was dominated by laptops and all-in-ones, but with quite a few "tower" style desktop computers available as well. We purchased three different machines, each at a different price point, and with different primary processor configurations.
The lucky winners included a Gateway DX4885, an ASUS M11BB, and a Lenovo H520.
Subject: Mobile | February 19, 2014 - 02:25 PM | Tim Verry
Tagged: windows 8, viewsonic, viewpad 10i, tablet, celeron n2910, Bay Trail, android 4.2
ViewSonic is launching a new 10-inch tablet called the Viewpad 10i. The tablet is powered by an Intel Bay Trail processor and offers a dual-boot configuration of the Windows 8 and Android 4.2 operating systems. The slate weighs 650 grams and is available online for around $500 USD.
The Viewpad 10i has a 10.1” IPS capacitive multi-touch display with a resolution of 1280x800. ViewSonic has also included two 2MP cameras (front and rear), a built-in speaker, and a dedicated Windows button below the display. External connectivity includes micro USB and micro SD ports in addition to 802.11n Wi-Fi and Bluetooth wireless radios.
Internal specifications of the Viewpad 10i include an Intel Celeron N2910 “Bay Trail” processor, 2GB of RAM, and a 64GB SSD. The Bay Trail chip is a 7.5W TDP (4.5W SDP) part with a 1.6 GHz quad-core CPU, an Intel HD Graphics GPU clocked at 756 MHz, and 2 MB of cache. A 7,000 mAh battery offers up to six hours of battery life.
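The six-hour claim actually squares nicely with the chip's SDP. A quick sanity check, assuming a standard 3.7 V nominal Li-ion pack voltage (ViewSonic does not publish the pack voltage, so that figure is an assumption):

```python
# Check the claimed battery life against the quoted 7,000 mAh cell.
capacity_ah = 7.0
nominal_v = 3.7                      # assumed Li-ion nominal voltage
energy_wh = capacity_ah * nominal_v  # ~25.9 Wh pack

avg_draw_w = energy_wh / 6           # averaged over the claimed 6 hours
print(f"Average system draw: {avg_draw_w:.1f} W")  # ~4.3 W
```

That ~4.3 W average sits right next to the N2910's 4.5 W SDP, so the runtime claim looks plausible for light workloads.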
You can find more photos of Viewsonic's new tablet here.
The ability to dual boot Windows and Android is neat, but it does come at a premium versus competing 10-inch Bay Trail tablets that run a single OS out of the box. Is the approximately $500 price tag worth it?
Read more about Intel's Bay Trail architecture at PC Perspective.
Subject: General Tech | February 19, 2014 - 12:33 PM | Jeremy Hellstrom
Tagged: security, router, TheMoon
A worm known as TheMoon has been in the news recently, but the actual infection of Linksys routers has likely been spreading for quite a while now. You may have also read about the backdoor on Linksys/Cisco and Netgear routers which has been open for almost a decade and can be exploited as simply as connecting to port 8083 if you can get direct access to the router. Some of these vulnerabilities can be mitigated by turning off remote administration and UPnP services, but it seems your consumer-level router is still a huge security risk. Your best bet is to spend a weekend and follow the advice of most Slashdot commenters: flash your router with OpenWRT or a version of Tomato and you will have better security and control over your router. Just don't do it to the modem your ISP provided you with.
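If you want a quick look at whether that management port answers on your own network, here is a minimal sketch using only the standard library. The 192.168.1.1 gateway address is an assumption; substitute your router's actual address, and remember an open port is only a hint that further investigation is needed.

```python
# Probe a single TCP port on the router from inside the LAN.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:   # refused, timed out, or unreachable
        return False

router = "192.168.1.1"  # assumed default gateway; change to yours
if port_open(router, 8083):
    print("Port 8083 is reachable -- check your firmware version.")
else:
    print("Port 8083 refused or filtered.")
```

The same check with remote administration disabled should fail from the WAN side, which is exactly the mitigation described above.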
"The remote-access management flaw that allowed TheMoon worm to thrive on Linksys routers is far from the only vulnerability in that particular brand of hardware, though it might be simpler to call all home-based wireless routers gaping holes of insecurity than to list all the flaws in those of just one vendor. An even longer list of Linksys (and Cisco and Netgear) routers were identified in January as having a backdoor built into the original versions of their firmware in 2005 and never taken out."
Here is some more Tech News from around the web:
- Oops: Security Holes In Belkin Home Automation Gear @ Slashdot
- Intel unveils Xeon E7 v2 for data centres with focus on data analytics @ The Inquirer
- Ignore the pie-in-the-sky storage roadmaps. This is what's REALLY afoot @ The Register
- How NOT to evaluate hard disk reliability: Backblaze vs world+dog @ The Register
- How to Operate Your Spycams with ZoneMinder on Linux (part 1) @ Linux.com
Introduction and Technical Specifications
Courtesy of SilverStone
SilverStone Technology is a well-known brand name with high quality solutions in the form of everything from cases to case-mounted fan controllers and displays. They have also gone through several iterations of CPU all-in-one liquid cooling solutions, with their newest models being part of the Tundra Series. The Tundra Series TD02 liquid cooler is designed to cool CPUs of any make, including the latest offerings from both Intel and AMD. The cooler comprises a massive 2x120mm radiator attached to a copper base plate with an integrated pump. To best measure the TD02's performance, we set it against several other high-performance liquid and air-based coolers. With a retail MSRP of $129.99, the TD02 comes in at the higher end of the all-in-one cooler price range.
Courtesy of SilverStone
Subject: General Tech, Processors, Mobile | February 19, 2014 - 03:28 AM | Scott Michaud
Tagged: Intel, SoC, atom, haswell, Haswell-E, Airmont, Ivy Bridge-EX
Every few months, we get another snapshot of some of Intel's products. This timeline has a rough placement for every segment, from their Internet of Things (IoT) product, the Quark, up to the Xeon E7 v2. While it covers from now through December, it is not designed to be a strict schedule and might contain an error or two.
Image Credit: VR-Zone
First up is Ivy Bridge-EX (Xeon E7 v2). PCMag has an interesting in-depth rundown of these parts, although some aspects are a little fuzzy. These 22nm chips range from 6 to 15 cores and can access up to 1.5TB of memory per socket. Intel also claims they will support up to four times the I/O bandwidth for disk and network transactions. Naturally, they have all the usual virtualization and other features that are useful for servers. Most support Turbo Boost and all but one have Hyper-Threading Technology.
Jumping back to the VR-Zone editorial, the timeline suggests that the Quark X1000 will launch in April. As far as I can tell, this is new information. Quark is Intel's ultra low-end SoC that is designed for adding intelligence to non-computing devices. One example given by Intel at CES was a smart baby bottle warmer.
The refresh of Haswell is also expected to happen in April.
Heading into the third quarter, we should see Haswell-E make an appearance for the enthusiast desktop and moderately high-end server. This should be the first time since Sandy Bridge-E (2011) that expensive PCs get a healthy boost to single-threaded performance, clock for clock. Ivy Bridge-E, while a welcome addition, was definitely aimed at reducing power consumption.
Ending the year should be the launch of Airmont at 14nm. The successor to Silvermont, Airmont will be the basis of Cherry Trail tablets and lower end PCs at the very end of the year. Moorefield, which is Airmont for smartphones, is not listed on this roadmap and should not surface until 2015.
Subject: General Tech, Graphics Cards | February 19, 2014 - 12:01 AM | Scott Michaud
Tagged: raptr, gaming evolved, amd
The AMD Gaming Evolved App updates your drivers, optimizes your game settings, streams your gameplay to Twitch, accesses some social media platforms, and now gives out prizes. Points are awarded for playing games using the app, optimizing game settings, and so forth. These can be exchanged for rewards ranging from free games to Sapphire R9-series graphics cards.
This program has been in beta for a little while now, without the ability to redeem points. The system has been restructured to encourage using the entire app, lowering the accumulation rate for playing games and adding other goals. Beta participants do not lose all of their points; rather, their balances are rescaled to be more in line with the new system.
Subject: General Tech, Graphics Cards | February 18, 2014 - 09:03 AM | Scott Michaud
Tagged: nvidia, gtx titan black, geforce titan, geforce
NVIDIA has just announced the GeForce GTX Titan Black. Based on the full high-performance Kepler (GK110) chip, it is mostly expected to be a lower-cost development platform for GPU processing applications. All 2,880 single precision (FP32) CUDA cores and 960 double precision (FP64) CUDA cores are unlocked, yielding 5.1 TeraFLOPS of single precision and 1.3 TeraFLOPS of double precision floating point performance. The chip contains 1,536 KB of L2 cache and will be paired with 6GB of video memory on the board.
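Working backward from the quoted single precision figure gives a feel for the clock speed involved: FP32 throughput is cores × 2 operations per clock (one fused multiply-add) × clock rate. A quick sketch:

```python
# Back out the implied clock from the quoted FP32 throughput.
# FLOPS = cores * 2 ops/clock (fused multiply-add) * clock rate.
fp32_cores = 2880
fp32_tflops = 5.1

clock_mhz = fp32_tflops * 1e12 / (fp32_cores * 2) / 1e6
print(f"Implied clock: ~{clock_mhz:.0f} MHz")
```

That works out to roughly 885 MHz, i.e. the throughput figure is quoted near the card's base clock rather than its boost clock.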
The original GeForce GTX Titan launched last year, almost to the day. Also based on the GK110 design, it also featured full double precision performance with only one SMX disabled. Of course, no component at the time contained a fully-enabled GK110 processor. The first product with all 15 SMX units active was not realized until the Quadro K6000, announced in July but only available in the fall. It was followed by the GeForce GTX 780 Ti (with a fraction of its FP64 performance) in November, and the fully powered Tesla K40 less than two weeks after that.
For gaming applications, this card is expected to have comparable performance to the GTX 780 Ti... unless you can find a use for the extra 3GB of memory. Games do not see much benefit from the extra 64-bit floating point performance because the majority of their calculations are done at 32-bit precision.
The NVIDIA GeForce GTX Titan Black is available today at a price of $999.
What we know about Maxwell
I'm going to go out on a limb and guess that many of you reading this review would not have normally been as interested in the launch of the GeForce GTX 750 Ti if a specific word hadn't been mentioned in the title: Maxwell. It's true, the launch of GTX 750 Ti, a mainstream graphics card that will sit in the $149 price point, marks the first public release of the new NVIDIA GPU architecture code named Maxwell. It is a unique move for the company to start at this particular point with a new design, but as you'll see in the changes to the architecture as well as the limitations, it all makes a certain bit of sense.
For those of you that don't really care about the underlying magic that makes the GTX 750 Ti possible, you can skip this page and jump right to the details of the new card itself. There I will detail the product specifications, performance comparison and expectations, etc.
If you are interested in learning what makes Maxwell tick, keep reading below.
The NVIDIA Maxwell Architecture
When NVIDIA first approached us about the GTX 750 Ti, they were very light on details about the GPU powering it. Even though it was confirmed to be built on Maxwell, the company hadn't yet determined if it was going to do a full architecture deep dive with the press. In the end they went somewhere in between the full detail we are used to getting with a new GPU design and that original, passive stance. It looks like we'll have to wait for the enthusiast-class GPU release to really get the full story, but I think the details we have now paint the picture quite clearly.
During the course of designing the Kepler architecture, and then implementing it in the Tegra line in the form of the Tegra K1, NVIDIA's engineering team developed a better sense of how to improve the performance and efficiency of the basic compute design. Kepler was a huge leap forward compared to the likes of Fermi, and Maxwell promises to be equally revolutionary. NVIDIA wanted to address GPU power consumption as well as find ways to extract more performance from the architecture at the same power levels.
The logic of the GPU design remains similar to Kepler's. There is a Graphics Processing Cluster (GPC) that houses Streaming Multiprocessors (SMs) built from a large number of CUDA cores (stream processors).
GM107 Block Diagram
Readers familiar with the look of Kepler GPUs will instantly see changes in the organization of the various blocks of Maxwell. There are more divisions, more groupings and fewer CUDA cores "per block" than before. As it turns out, this reorganization was part of the ability for NVIDIA to improve performance and power efficiency with the new GPU.
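The arithmetic behind that reorganization is straightforward. The four-blocks-of-32 layout comes from NVIDIA's disclosure; the five-SMM count for GM107 is how the GTX 750 Ti's 640 CUDA cores break down, and the roughly-90%-of-SMX-performance figure is NVIDIA's own reported claim rather than something we have measured.

```python
# Maxwell SMM organization vs. Kepler's SMX.
BLOCKS_PER_SMM = 4
CORES_PER_BLOCK = 32
cores_per_smm = BLOCKS_PER_SMM * CORES_PER_BLOCK   # 128 cores per SMM

gm107_smms = 5                  # GM107 as shipped on the GTX 750 Ti
total_cores = cores_per_smm * gm107_smms
print(f"GM107 CUDA cores: {total_cores}")          # 640

# Kepler's SMX packed 192 cores; an SMM has two-thirds as many,
# yet NVIDIA reportedly claims ~90% of SMX performance per SM.
cores_per_smx = 192
print(f"SMM/SMX core ratio: {cores_per_smm / cores_per_smx:.2f}")
```

Fewer cores per block means simpler scheduling and shorter wire runs, which is where much of the efficiency gain comes from.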
Subject: General Tech, Cases and Cooling | February 17, 2014 - 08:36 PM | Scott Michaud
Tagged: passive cooling, cooling
Somewhere in the world, someone is developing a passively-cooled desktop built around copper water pipes. Thirty-six (36) of them pass through what looks like an aluminum block attached to the socket LGA 1155 heatsink mount. As the copper pipes heat up, the heat passes to the air within them. Convection forces this air to exhaust upward through the copper chimney and replaces it with cool air from below.
All Images, Credit: "Monster", CoolEnjoy.net Forums
From the 3D prototype, it looks like two passively-cooled discrete GPUs are intended to fit just above the elbow in the chimney. Even from the rendering, it is clear that quite a lot of thought and effort has gone into this project. I cannot tell how they intend to access PCIe slots from up there, be it a larger motherboard or an extension adapter, but options probably exist.
Initial testing with a Core i5-4440 (stock frequencies) shows around 65°C at full CPU load. This should be in line with a typical air-based cooler.
Either way, this is the most impressive "SuperPipe" cooler that I have seen.
Your move, MSI.