Tokyo Tech Goes Green with KFC (NVIDIA and Efficiency)

Subject: General Tech, Graphics Cards, Systems | November 21, 2013 - 09:47 PM |
Tagged: nvidia, tesla, supercomputing

GPUs are very efficient in terms of operations per watt. Their architecture is best suited to a gigantic bundle of similar calculations (such as running the same set of operations on each entry of a large blob of data). These are also the tasks which take up the most computation time in, not surprisingly, 3D graphics (where you need to do something to every pixel, fragment, vertex, etc.). The same is true of scientific calculations, financial and other "big data" services, weather prediction, and so forth.
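To make that concrete, here is a minimal sketch (in NumPy, purely for illustration; the article doesn't reference any particular toolkit) of the kind of per-element, data-parallel work that maps so well onto thousands of GPU threads:

```python
import numpy as np

# The same operation applied to every entry of a large blob of data:
# on a GPU, each element can be handled by its own thread.
pixels = np.random.rand(10_000_000).astype(np.float32)

# A per-element transform, e.g. a brightness/gamma-style adjustment.
adjusted = np.sqrt(pixels) * 0.5 + 0.25

print(adjusted[:5])
```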

nvidia-submerge.png

Tokyo Tech KFC achieves over 4 GFLOPS per watt of power draw from the 160 Tesla K20X GPUs in its cluster. That is about 25% more calculations per watt than the current leader of the Green500 (the CINECA Eurora System in Italy, at 3.208 GFLOPS/W).
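A quick back-of-envelope check of that figure, using only the numbers quoted above:

```python
kfc = 4.0      # Tokyo Tech KFC, GFLOPS per watt (approximate)
eurora = 3.208 # CINECA Eurora, the current Green500 leader

print(f"{kfc / eurora - 1:.1%}")  # -> 24.7%, i.e. roughly 25%
```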

One interesting trait: this supercomputer is cooled by oil immersion. NVIDIA offers passively cooled Tesla cards which, as I understand it, are well suited to this sort of fluid system. I am fairly certain that all of the fans are removed before the servers are dunked (I had assumed they would simply be left spinning).

By the way, was it intentional to name computers dunked in giant vats of heat-conducting oil, "KFC"?

Intel has done a similar test, which we reported on last September, submerging numerous servers for over a year. Another benefit of being green is that you are not nearly as concerned about air conditioning.

With another nice supercomputer win, NVIDIA is carrying that efficiency into the practical market.


Source: NVIDIA
Subject: Systems

The 7 Year Console Refresh

Be sure you jump to the second page to see our recommendations for gaming PC builds that are inexpensive yet compete well with the capabilities and performance of the PlayStation 4 and Xbox One!!

The consoles are coming!  The consoles are coming!  OK, that is not entirely true: one is already here and the second essentially is too.  This of course brings up the great debate between PCs and consoles.  Console gaming has an interesting past; at launch, consoles would often be around a year ahead of PCs in terms of gaming power and prowess.  That is no longer the case with this generation, where the consoles' "cutting edge" processing and graphics are what PCs consider mainstream.  The real incentive to buy this generation of consoles is a lot harder to pin down than in years past.

ps4apu.jpg

The PS4 retails for $399 US and the upcoming Xbox One for $499.  The PS4's price includes a single controller, while the Xbox One's package includes not just a controller but also the next-generation Kinect device.  These prices are comparable to some low-end PCs, complete with keyboard, mouse, and monitor, that could be purchased from large brick and mortar stores like Walmart and Best Buy.  Happily for most of us, we can build our machines to our own specifications and budgets.

As a directive from on high (the boss), we were given the task of building our own low-end gaming and productivity machines at a price as close to that of the consoles as possible, and then explaining which solution would be superior at those price points.  The goal was to get as close to $500 as possible and still have a machine able to play most recent games at reasonable resolutions and quality levels.

Continue reading our PC vs. PS4 vs. Xbox One hardware comparison: Building a Competing Gaming PC!!

Manufacturer: Sony

Does downloading make a difference?

This is PART 2 of our testing of the PlayStation 4 storage system with the stock hard drive, a hybrid SSHD, and an SSD.  Previously we compared performance based on Blu-ray installations; today we add titles downloaded from PSN to the mix.  Be sure you read PART 1, PlayStation 4 (PS4) HDD, SSHD and SSD Performance Testing.

I posted a story earlier this week that looked at the performance of the new PS4 when used with three different 2.5-in storage options: the stock 500GB hard drive, a 1TB hybrid SSHD and a 240GB SSD.  The results were fairly interesting (and got a good bit of attention) but some readers wanted more data.  In particular, many asked how things might change if you went the full digital route and purchased games straight from Sony's PlayStation Network.  I will also compare boot times for each of the tested storage devices.

You should definitely check out the previous article if you missed it. It not only goes through the performance comparison but also details how to change the hard drive on the PS4 from the physical procedure to the software steps necessary. The article also details the options we selected for our benchmarking.

psn1.jpg

Today I purchased a copy of Assassin's Creed IV from the PSN store (you're welcome, Ubisoft) and got to testing.  The process was the same: start the game, then load the first save spot.  Again, each test was run three times and the averages were reported, with the PS4 restarted between each run.
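For the curious, the bookkeeping amounts to nothing more than averaging three stopwatch readings per configuration; a tiny sketch (the numbers below are placeholders, not our measured results):

```python
# Each configuration gets three timed runs (game start -> first save
# loaded), with the PS4 restarted between runs; the average is reported.
def average_load_time(runs_seconds):
    return sum(runs_seconds) / len(runs_seconds)

stock_hdd_runs = [52.1, 51.8, 52.6]  # placeholder stopwatch readings
print(f"Average: {average_load_time(stock_hdd_runs):.1f} s")
```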

load3.png

The top section of results is the same as was presented earlier: average load times for AC IV when the game is installed from the Blu-ray.  The second set is new and shows average load times for AC IV after installation from the PlayStation Network; no disc was in the drive during testing.

Continue reading our story on the performance testing of HDD, SSD and SSHD with downloaded and Blu-ray installed games on PS4!!

Manufacturer: Sony

Load time improvements

This is PART 1 of our testing of the PlayStation 4 storage system with the stock hard drive, a hybrid SSHD, and an SSD.  In PART 2 we look at the changes introduced with PSN-downloaded games versus Blu-ray installed games, as well as boot time differences.  Be sure you read PART 2, PlayStation 4 (PS4) Blu-ray and Download Storage Performance, Boot Times.

On Friday Sony released the PlayStation 4 onto the world.  As the first new console launch in 7 years, the PS4 has a lot to live up to, but our story today isn't going to attempt to weigh the value of the hardware or software ecosystem.  Instead, after our PS4 teardown video from last week, we got quite a few requests for information on the PS4's storage performance and what replacement hardware might offer gamers.

Hard Drive Replacement Process

Changing the hard drive in your PlayStation 4 is quite simple, a continuation of Sony's policy with the PS3.

01_0.jpg

The process starts with the semi-transparent panel on the top of the unit, to the left of the light bar.  Obviously, make sure your PS4 is completely turned off and unplugged first.

02_0.jpg

Simply slide it to the outside of the chassis and wiggle it up to release.  There are no screws or anything to deal with yet.

03_0.jpg

Once inside you'll find a screw stamped with the PlayStation shape logos; that is the screw you need to remove to pull out the hard drive cage.

Continue reading our analysis of PS4 HDD, SSHD and SSD Performance!!

Sony PlayStation 4 (PS4) Teardown and Disassembly

Subject: General Tech, Systems | November 15, 2013 - 02:42 PM |
Tagged: video, teardown, ps4, playstation 4, APU, amd

Last night Ken and I headed over to the local Best Buy to pick up my preorder of the new PlayStation 4.  What would any hardware geek immediately do with this hardware?  Obviously we take a screwdriver to it and take it apart.

In this video, which is a recording of our live stream that started last night at 12:30am EST, you'll see us unbox the PS4, turn it on, take it apart and put it back together.  And I only had to fix one piece with gaffer tape, so there's that.

ps4teardown.jpg

(We'll have a collection of high-resolution photos later today as well.)

Though it is currently out of stock, Amazon.com appears to be getting more PS4s in pretty regularly, so keep an eye out if you are still interested in picking one up.

Happy Friday!

NVIDIA Grid GPUs Available for Amazon EC2

Subject: General Tech, Graphics Cards, Systems | November 5, 2013 - 09:33 PM |
Tagged: nvidia, grid, AWS, amazon

Amazon Web Services allows customers (individuals, organizations, or companies) to rent servers of various specifications to match their needs. Many websites are hosted at their data centers, mostly because you can purchase different (or multiple) servers if you have big variations in traffic.

I, personally, sometimes use it as a game server for scheduled multiplayer events. The traditional method is spending $50-80 USD per month on a... decent... server running all day, every day, and using it a couple of hours per week. With Amazon EC2, we hosted a 200 player event (100 vs 100) by purchasing a dual-Xeon instance (ironically the fastest single-threaded option) connected to Amazon's internet backbone by 10 Gigabit Ethernet. This server cost just under $5 per hour, all expenses considered. It was not much of a discount but it ran like butter.
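For context, the math behind that comparison (a rough sketch using the figures above; the dedicated-server price is the midpoint of the quoted range):

```python
dedicated_monthly = 65.0   # midpoint of the $50-80 USD/month range
ec2_hourly = 5.0           # "just under $5 per hour all expenses considered"
hours_per_week = 2         # a couple of hours of events per week

ec2_monthly = ec2_hourly * hours_per_week * 4  # ~4 weeks per month
print(f"Dedicated: ${dedicated_monthly:.2f}/mo, EC2: ${ec2_monthly:.2f}/mo")
# -> Dedicated: $65.00/mo, EC2: $40.00/mo -- "not much of a discount"
```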

nvidia-grid-bracket.png

This leads me to today's story: NVIDIA GRID GPUs are now available at Amazon Web Services. Both companies hope their customers will use (or create services based on) these instances. Applications they expect to see are streamed games, CAD and media creation, and other server-side graphics processing. These Kepler-based instances, named "g2.2xlarge", will be available alongside the older Fermi-based Cluster Compute Instances ("cg1.4xlarge").

It is also noteworthy that the older Fermi-based Tesla instances are about 4x as expensive. GRID GPUs are based on GK104 (or GK107, but those are not available on Amazon EC2) and not the more compute-oriented GK110, so the new instances would probably be a step backwards for customers intending to run GPGPU workloads for computational science or "big data" analysis. The newer GRID systems do not have 10 Gigabit Ethernet, either.

So what does it have? Well, I created an AWS instance to find out.

aws-grid-cpu.png

Its CPU is advertised as an Intel E5-2670 with 8 threads and 26 Compute Units (CUs). This is particularly odd as that particular CPU has eight cores and 16 threads; Amazon also usually rates it at 22 CUs per 8 threads. This made me wonder whether the CPU is split between two clients or whether Amazon disabled Hyper-Threading to push the clock rates higher (and ultimately led me to just log in to an instance and see). As it turns out, HT is still enabled and the processor registers as having 4 physical cores.
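If you want to run the same sanity check from inside an instance, something along these lines would do it (one possible approach, assuming the third-party psutil package; I make no claim this is how the numbers above were gathered):

```python
import psutil  # third-party package: pip install psutil

physical = psutil.cpu_count(logical=False)  # 4 on this instance
logical = psutil.cpu_count(logical=True)    # 8 advertised threads

print(f"{physical} physical cores, {logical} logical processors")
print("Hyper-Threading appears enabled" if logical > physical
      else "Hyper-Threading appears off or absent")
```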

The GPU was slightly more... complicated.

aws-grid-gpu.png

The NVIDIA control panel apparently does not work over remote desktop, and the GPU registers as a "Standard VGA Graphics Adapter". Actually, two are visible in Device Manager, although one has the yellow exclamation mark of driver woe (random integrated graphics that wasn't disabled in the BIOS?). GPU-Z was not able to pick much up from it, but it was of some help.

Keep in mind: I did this without contacting either Amazon or NVIDIA. It is entirely possible that the OS I used (Windows Server 2008 R2) was a poor choice. OTOY, as part of this announcement, offers Amazon Machine Images (AMIs) for Linux and Windows installations integrated with their ORBX middleware.

I spot three key pieces of information: the base clock is 797 MHz, the memory size is 2990 MB, and the default drivers are Forceware 276.52 (??). The core and default clock rate, GK104 and 797 MHz respectively, are characteristic of the GRID K520 with its two GK104 GPUs clocked at 800 MHz. However, since the K520 gives each GPU 4GB and this instance only has 3GB of vRAM, I can tell that the product is slightly different.
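That inference boils down to a simple spec comparison; expressed as a sketch (the K520 figures are assumptions pulled from public spec sheets):

```python
# Published GRID K520 figures (assumed): 800 MHz core, 4 GB per GPU.
K520_CLOCK_MHZ, K520_VRAM_MB = 800, 4096

observed_clock_mhz, observed_vram_mb = 797, 2990  # readings from GPU-Z

clock_matches = abs(observed_clock_mhz - K520_CLOCK_MHZ) <= 10
vram_matches = observed_vram_mb >= K520_VRAM_MB * 0.9

print(f"Clock consistent with K520: {clock_matches}")  # True
print(f"vRAM consistent with K520:  {vram_matches}")   # False
```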

I was unable to query the device's shader count. The K520 (similar to a GeForce GTX 680) has 1536 shaders per GPU, which sounds about right (but, again, pure speculation).

I also tested the server with TCPing to measure its networking performance versus the cluster compute instances. I did not run anything like Speedtest or Netalyzr. With a normal cluster instance I achieve about 20-25ms pings; with this instance I was more in the 45-50ms range. Of course, your mileage may vary and this should not be used as any official benchmark. If you are considering the instance for your product, launch one and run your own tests; it is not expensive. Still, it seems to be less responsive than the Cluster Compute instances, which is odd considering its intended gaming usage.
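TCPing simply times a TCP handshake; a rough equivalent in a few lines of Python (the host and port are placeholders, not my actual test targets):

```python
import socket
import time

def tcp_ping(host, port, timeout=2.0):
    """Return the time, in ms, to complete a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connected; we only care about the handshake time
    return (time.perf_counter() - start) * 1000

print(f"{tcp_ping('example.com', 80):.1f} ms")  # placeholder target
```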

Regardless, now that Amazon has picked up GRID, we might see more services (be they consumer or enterprise) which utilize this technology. The new GPU instances start at $0.65/hr for Linux and $0.767/hr for Windows (excluding extra charges like network bandwidth) on demand. As always with EC2, if you will use these instances a lot, you can get reduced rates by paying a fee upfront.

Official press blast after the break.

Source: NVIDIA

(The Verge) Valve's Steam Machine and Steam Controller

Subject: General Tech, Systems, Shows and Expos | November 4, 2013 - 03:36 PM |
Tagged: valve, Steam Machine, steam os, CES 2014

I guess The Verge, with its Steam Machine photos, proves all three next-gen consoles (trollolol) are designed to look like home theater devices. Of course you will never be able to purchase a Steam Machine from Valve but, since they are releasing their CAD files, I am sure at least one Steam Machine will be built exactly to reference spec.

STEAM_M_console_controller_hero_large_verge_super_wide.jpg

Image Source: The Verge

And, for the record, I think the reference enclosure is classy. Looking like a living room appliance suits it a lot better than looking like a kitchen one.

On a serious note: pictures of the internals. The beta Steam Machines will contain full desktop components arranged so that each has its own sector to breathe from; the hottest parts intake and exhaust as far away from one another as possible. This makes the chassis relatively wide and short: a video card's length deep, about three expansion slots tall, and about three PCIe cards' heights wide. The actual measurements are 12" x 12" x 3" (W x D x H).

Steam-Machine-Open.jpg

Photo Credit: The Verge

This is mostly possible because the GeForce Titan is mounted upside-down and parallel with the motherboard. I have never used a 90-degree PCIe riser myself but, according to Josh Walrath, this is a common accessory in servers (especially 1U and 2U racks). The Titan intakes downward into a relatively unoccupied section of the case and exhausts out the back.

The Verge also had some things to say about the Steam Controller. The design motivations are interesting but I will leave that discussion to the original article (this news post will be long enough when I'm done with it). There are two points that I would like to bring up, though:

The first is a clarification of the original Steam Controller announcement: Valve will produce and sell the Steam Controller on its own. This was originally a big question mark, as it could water down how "reference" Valve's controller actually is. With Valve taking all the reins, the hardware looks more set in stone.

Will Valve still allow OEMs to learn from their design? Who knows.

The second is also interesting.

What Valve left out of the Steam Controller is almost as intriguing as what went in. Though Valve co-founder Gabe Newell told us that the company wanted to put biometric sensors into game controllers, the team discovered that hands weren't a good source of biofeedback since they were always moving around. However, the team hinted to me — strongly — that an unannounced future VR headset might measure your body's reaction to games at the earlobe. Such a device could know when you’re scared or excited, for instance, and adjust the experience to match.

Seeing Google, Valve, and possibly Apple all approach content delivery, mobile, home theater, and wearable computing... simultaneously... has felt like there is a heavy link between them. This only supports that gut feeling. I believe this is the first step in a long portfolio integrating each of these seemingly unrelated technologies. We should really watch how these companies develop them, especially in relation to their other products.

Stay tuned for CES 2014 in early January. This will be the stage for Valve's hardware and software partners to unbutton their lips and spill their guts. I'm sure Josh and Ryan will have no problems cleaning it all up.

Source: The Verge

(WinSupersite) Surface 2 Reviewed

Subject: General Tech, Systems, Mobile | November 1, 2013 - 04:49 PM |
Tagged: windows rt, Surface 2

The Surface 2 is what happened to the Surface RT. Microsoft decided that "RT" has no place on this product except, of course, in its software ("Windows RT"), because they painted themselves into a corner on that one. The message is something like, "It's Windows RT 8.1 but not Windows 8.1; in fact, you cannot run that software on it". I expect, as you have probably heard me voice before, that this will all be a moot point in the semi-near future (and that sucks).

Microsoft's "Official" Surface 2 overviews.

Paul Thurrott over at his Supersite for Windows reviewed the Surface 2 in terms of the original Surface RT. The inclusion of Tegra 4 was a major plus for him, yielding a "night and day" improvement over the previous Tegra 3. In fact, he thinks that everything is at least as good as on the original; there is not a single point on his rubric where the Surface RT beats its successor.

Of course there is a single section where the Surface 2 falls short (it is shared with the Surface RT, and I think you can guess what it is). The ecosystem of apps for Windows RT is the platform's "Achilles' heel". It is better than it once was, with the inclusion of apps like Facebook, but glaring omissions will drive people away. He makes this point almost in passing but I, of course, believe it is a key issue.

It is absolutely lacking in key apps, and you will most likely never see such crucial solutions as full Photoshop, iTunes, or Google Chrome on this platform. But if we're being honest with ourselves here, as we must, these apps are, for better or worse, important. (The addition of Chrome alone would be a huge win for both Windows RT and Surface 2.)

I agree that this is the problem with the Windows RT platform and, in Google Chrome's case, the blame belongs to no one but Microsoft. They will explicitly deny any web browser unless it is a reskin of Internet Explorer (using the "Trident" rendering engine and their JavaScript engine). You will not see full Firefox or full Google Chrome because Gecko, Servo, WebKit, and Blink are not allowed to be installed on end-user machines.

You are paying Microsoft to not let you install third party browsers. Literally.

Not only does this limit the platform's usefulness but it also reduces the pressure to continue innovating. Why add developer features to Internet Explorer when you can control their use through the Windows Store? Sure, Internet Explorer has been on a great trajectory since IE9; I would say that versions 10 and especially 11 could be considered "top 3" contenders as app platforms.

The other alternative is the web, and this is where Internet Explorer 11 plays such a crucial role. While many tier-one online services—Spotify, Pandora, Amazon Cloud Player and Prime Video, and so on—are lacking native Windows RT apps, the web interfaces (should) work fine, and IE 11 is evolving into a full-featured web app platform that should present a reasonable compromise for those users.

Only if Microsoft continues their effort. No-one else is allowed to.

Now that I have expanded on that point, be sure to check out the rest of Paul Thurrott's review. He broke it down into sections, big and small, and stuck his opinion wherever he could. Also check out his preview of the Nokia Lumia 2520 to see whether that device (if either) is worth waiting for.

Lenovo's gaming series, meet the Erazer

Subject: Systems | October 31, 2013 - 01:45 PM |
Tagged: Lenovo, Erazer X700

Earlier this week a deal on the Lenovo Erazer X700 gaming system was posted, and now you have a chance to see how it performs at Benchmark Reviews.  The bundle it arrives with is rather impressive: a backlit keyboard and a gaming mouse whose weight you can adjust to your preference.  While the external aesthetics are interesting, it is the internals that we want to know about, especially the watercooling, which is revealed below.  It performed well, but there are some caveats you should read about in the review if you imagine yourself buying this system and upgrading it in the future.

lenovo_erazer_x700_open2.jpg

"It’s been years since I’ve bought a pre-built desktop computer, so I was interested in the opportunity to check out the Erazer X700 Gaming System that Lenovo offered to us to review. The Erazer occupies a space between the sub-$500 generic boxes most people are satisfied with and the expensive boutique systems at the other end of the scale."

Here are some more Systems articles from around the web:


Seasonic PSUs Will Power HashFast Bitcoin Miners

Subject: General Tech, Cases and Cooling, Systems | October 22, 2013 - 07:10 PM |
Tagged: seasonic, Power Supplies, mining, bitcoin, asic

Seasonic (Sea Sonic Electronics) has announced a design win that will see its power supplies used in HashFast's bitcoin mining rigs. The upcoming rigs feature the company's "Golden Nonce" ASIC(s) and all-in-one water coolers; HashFast offers the single-ASIC Baby Jet and the multi-ASIC Sierra. Both units will be available December 15, starting at $2,250 and $6,300 respectively.

The Seasonic power supplies are high-efficiency models with Japanese capacitors and at least an 80 PLUS Bronze rating. On the high end, Seasonic has PSUs that are up to 93% efficient. HashFast stated that it chose Seasonic for its mining rigs because of the build quality and efficiency. The Baby Jet and Sierra rigs allow users to overclock the ASICs, and the systems can be rather demanding on PSUs.

HashFast Baby Jet BTC Miner.jpg

The Golden Nonce ASIC is a 28nm chip that is rated at 400 GHash/s and 0.65 Watts per Gigahash.

Beyond that, the companies have not gone into specifics. It is good news for Seasonic, and should mean a stable system for bitcoin miners (the 93% efficiency rating is nice as well, as it means less wasted electricity and slightly more bitcoin mining profit).
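As a back-of-envelope sanity check on those numbers (my arithmetic, not HashFast's):

```python
hash_rate_ghs = 400     # Golden Nonce ASIC rating
watts_per_gh = 0.65     # stated efficiency
psu_efficiency = 0.93   # Seasonic's best case

asic_watts = hash_rate_ghs * watts_per_gh   # 260 W at the chip
wall_watts = asic_watts / psu_efficiency    # ~280 W from the wall
print(f"ASIC: {asic_watts:.0f} W, wall: {wall_watts:.0f} W")
```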

The full press blast is below for reference.

Read more about Bitcoin @ PC Perspective!

Source: Seasonic