PCPer Live! Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA Part 2!

Subject: Editorial, Graphics Cards | October 21, 2014 - 07:45 PM |
Tagged: video, pcper, nvidia, live, GTX 980, geforce, game stream, borderlands: the pre-sequel, borderlands

UPDATE: It's time for ROUND 2!

I'm sure that, like the staff at PC Perspective, many of our readers have been obsessively playing the Borderlands games since the first release in 2009. Borderlands 2 arrived in 2012 and once again took hold of the PC gaming mindset. This week marks the release of Borderlands: The Pre-Sequel, which, as the name suggests, takes place before the events of Borderlands 2. The Pre-Sequel has playable characters that were previously known to the gamer only as NPCs, and that, coupled with the new low-gravity gameplay style, should entice nearly everyone who loves the first-person, loot-driven series to come back.

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multiplayer gaming fun as well as some prizes to give away to the community. I will be joined once again by NVIDIA's Andrew Coonrad and Kris Rey to tackle the campaign cooperatively while making a couple of stops to give away some hardware.

livelogo.jpg

Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA Part 2

5pm PT / 8pm ET - October 21st

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Holy crap, that's a hell of a list!! How do you win? It's really simple: just tune in and watch the Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some hardware!

global-landing-page-borderlands-presequal.jpg

2K_Borderlands_Pre-Sequel_AthenaToss_1stPerson.jpg

2K_Borderlands_Pre-Sequel_moonBandits.jpg

Apple Announces New Mac Minis with Haswell. What?

Subject: Editorial, General Tech, Systems | October 17, 2014 - 03:22 PM |
Tagged: Thunderbolt 2, thunderbolt, mac mini, mac, Intel, haswell, apple

I was not planning to report on Apple's announcement but, well, this just struck me as odd.

So Apple has relaunched the Mac Mini with fourth-generation Intel Core processors, after two years of waiting. It is the same height as the Intel NUC, but it is also almost twice the length and twice the width (Apple's 20cm x 20cm versus the NUC's ~11cm x 11cm when the case is included). So, after waiting through the entire Haswell architecture launch cycle, right up until the imminent release of Broadwell, they are going with the soon-to-be outdated architecture to update their two-year-old platform?
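
For a rough sense of scale, the footprints work out like this (a quick sketch using the dimensions cited above; Apple's actual spec is about 19.7 cm square, so 20 cm is a slight round-up):

```python
# Footprint comparison from the dimensions above (both machines are square; cm).
mac_mini_side = 20.0   # Apple's spec, roughly 19.7 cm
nuc_side = 11.0        # approximate, case included

mac_mini_area = mac_mini_side ** 2   # 400 cm^2
nuc_area = nuc_side ** 2             # 121 cm^2

print(f"Mac Mini: {mac_mini_area:.0f} cm^2, NUC: {nuc_area:.0f} cm^2")
print(f"The Mac Mini occupies ~{mac_mini_area / nuc_area:.1f}x the desk area")
```

Same height or not, that is roughly 3.3x the desk area of the NUC.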

((Note: The editorial originally said "two-year-old architecture". I thought that Haswell launched about six months earlier than it did. The mistake was corrected.))

apple-macmini-hero.png

I wonder if, following the iTunes U2 deal, this device will come bundled with Limp Bizkit's "Nookie"...

The price has been reduced to $499, a welcome $100 cut, especially for PC developers who want a Mac to test cross-platform applications on. It also gains Thunderbolt 2. These are welcome additions. I just have two related questions: why today, and why Haswell?

The new Mac Mini started shipping yesterday. 15-watt Broadwell-U parts are expected to launch at CES in January, with 28W parts anticipated in the following quarter.

Source: Apple

Intel Announces Q3 2014: Mucho Dinero

Subject: Editorial | October 15, 2014 - 12:39 PM |
Tagged: revenue, Results, quarterly, Q3, Intel, haswell, Broadwell, arm, amd, 22nm, 2014, 14nm

Yesterday Intel released their latest quarterly numbers, and they were pretty spectacular.  Some serious milestones were reached last quarter, much to the dismay of Intel’s competitors.  Not everything in the results is good, but the overall quarter was a record one for Intel.  The company reported revenue of $14.55 billion with a net income of $3.31 billion.  This is the highest revenue for a quarter in the history of Intel.  It is also the first quarter in which Intel has shipped 100 million processors.

The death of the PC has obviously been overstated, as the PC group had revenue of around $9 billion.  The Data Center group also had a very strong quarter, with revenue in the $3.7 billion range.  These two groups lean heavily on Intel’s 22 nm TriGate process, which is still industry leading.  The latest Haswell-based processors make up around 10% of shipping units so far, and the ramp for these products has been pretty impressive.  Intel’s newest group, the Internet of Things, saw revenue shrink by around 2% quarter over quarter, but grow by around 14% year over year.

Intel-Swimming-in-Money.jpg

Not all news is good news, though.  Intel is trying desperately to get into the tablet and handheld markets, and so far has had little traction.  The group reported revenue in the $1 million range.  Unfortunately, that $1 million is offset by about $1 billion in losses.  This year has seen an overall loss for mobile in the $3 billion range.  While Intel arguably has the best and most efficient process for mobile processors, it is having a hard time breaking into this ARM-dominated area.  There are many factors involved here.  First off, there are more than a handful of strong competitors working directly against Intel to keep them out of the market.  Secondly, x86 processors do not have the software library or support that ARM has in this very dynamic and fast-growing segment.  We also must consider that, while Intel has the best overall process, x86 processors are really only now achieving parity in power/performance ratios.  Intel is also still considered a newcomer in this market when it comes to 3D graphics support.

Intel is quite happy to take this loss as long as they can achieve some kind of foothold in this market.  Mobile is the future, and while there will always be a need for the PC (who does heavy-duty photo editing, video editing, and immersive gaming on a mobile platform?), the mobile market will be driving revenues from here on out.  Intel absolutely needs a presence here if they wish to be a leader in driving technologies for this very important market.  Intel is essentially giving away their chips to get into phones and tablets, and eventually this will pave the way towards greater adoption.  There are still hurdles involved, especially on the software side, but Intel is working hard with developers and Google to make sure support is there.  Intel is likely bracing for a new generation of 20 nm and 16 nm FinFET ARM-based products that will start showing up in the next nine months.  The past several years have seen Intel push mobile up to high priority in terms of process technology.  Previously these low-power, low-cost parts were relegated to an N+1 process technology from Intel, but with the strong competition from ARM licensees and pure-play foundries, Intel can no longer afford that.  We will likely see 14 nm mobile parts from Intel sooner rather than later.

Intel has certainly shored up a lot of their weaknesses over the past few years.  Their integrated 3D/GPU support has improved in leaps and bounds, their CPUs' IPC and power consumption are certainly industry leading, and they continue to pound out impressive quarterly reports.  Intel is firing on all cylinders at this time, and the rest of the industry is struggling to keep up.  It will be interesting to see if Intel can keep up this pace, and it will be imperative for the company to continue to push into mobile markets.  I have never counted Intel out, as they have a strong workforce, a solid engineering culture, and some really amazingly smart people (except Francois… he is just slightly above average - he is a GT-R aficionado, after all).

Next quarter appears to be more of the same.  Intel is expecting revenue in the $14.7 billion range, plus or minus $500 million.  Strong sales of PC and server parts continue to buoy Intel to these impressive results.  Net income and margins again look to be similar to what this past quarter brought to the table.  We will see the introduction of the first 14 nm Broadwell processors, which is an important step for Intel.  14 nm development and production have taken longer than people expected, and Intel has had to lean on their very mature 22 nm process longer than they wanted to.  This has allowed a few extra quarters for the pure-play foundries to try to catch up.  Samsung, TSMC, and GLOBALFOUNDRIES are all producing 20 nm products with a fast transition to 16/14 nm FinFET by early next year.  This is not to say that these 16/14 nm FinFET products will be on par with Intel’s 14 nm process, but it at least gets them closer.  In the near term, though, these changes will have very little effect on Intel and their product offerings over the next nine months.

Source: Intel

PCPer Live! Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA

Subject: Editorial, Graphics Cards | October 13, 2014 - 10:28 PM |
Tagged: video, pcper, nvidia, live, GTX 980, geforce, game stream, borderlands: the pre-sequel, borderlands

UPDATE: If you missed this week's live stream, you can watch the gameplay via this YouTube embed!!

I'm sure that, like the staff at PC Perspective, many of our readers have been obsessively playing the Borderlands games since the first release in 2009. Borderlands 2 arrived in 2012 and once again took hold of the PC gaming mindset. This week marks the release of Borderlands: The Pre-Sequel, which, as the name suggests, takes place before the events of Borderlands 2. The Pre-Sequel has playable characters that were previously known to the gamer only as NPCs, and that, coupled with the new low-gravity gameplay style, should entice nearly everyone who loves the first-person, loot-driven series to come back.

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multiplayer gaming fun as well as some prizes to give away to the community. I will be joined by NVIDIA's Andrew Coonrad and Kris Rey to tackle the campaign cooperatively while making a couple of stops to give away some hardware.

livelogo.jpg

Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA

5pm PT / 8pm ET - October 14th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Holy crap, that's a hell of a list!! How do you win? It's really simple: just tune in and watch the Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some hardware!

global-landing-page-borderlands-presequal.jpg

2K_Borderlands_Pre-Sequel_AthenaToss_1stPerson.jpg

2K_Borderlands_Pre-Sequel_moonBandits.jpg

World of Warcraft: Warlords of Draenor Requirements Listed

Subject: Editorial, General Tech | September 24, 2014 - 03:55 PM |
Tagged: wow, blizzard

When software has been supported and maintained for almost ten years, like World of Warcraft, it is not clear whether the weakest machine that could run it at launch should remain supported, or whether the requirements should creep up over time. For instance, when Windows XP launched, the OS was tuned for 128MB of RAM. Later updates made it highly uncomfortable with anything less than a whole gigabyte. For games, though, we mostly pretend that they represent the time in which they were released.

blizzard-battlenet-real01.jpg

That mental model does not apply to World of Warcraft: Warlords of Draenor. While this is technically an expansion pack, its requirements have jumped again (significantly so, compared to the original release). Even the first expansion pack, Burning Crusade, was able to run on a GeForce 2. Those cards were bundled with the original Unreal Tournament, which was still a relatively new game when the GeForce 2 was released.

Now? Well, the minimum is:

  • Windows XP or later
  • Intel Core 2 Duo E6600 or AMD Phenom X3 8750
  • NVIDIA GeForce 8800 GT, AMD Radeon HD 4850, or Intel HD Graphics 3000
  • 2GB of RAM
  • 35GB HDD

And the recommended is:

  • Windows 7 or 8 (x86-64)
  • Intel Core i5 2400 or AMD FX-4100
  • NVIDIA GeForce GTX 470 or AMD Radeon HD 5870
  • 4GB of RAM
  • 35GB HDD

World of Warcraft, and other MMORPGs, might get a pass on this issue. With its subscription model, there is not really an expectation that a user can go back and see the game in the same state as it launched. It is not a work, but a service -- and that does not devalue its artistic merits. It just is not really the same game now that it was then.

World of Warcraft: Warlords of Draenor will launch on November 13th.

Source: Blizzard

The price of upgrading, DDR4 starts to appear

Subject: Editorial, General Tech, Memory | August 20, 2014 - 04:08 PM |
Tagged: Haswell-E, G.Skill, ddr4-2800, ddr4-2666, ddr4-2400, ddr4-2133, ddr4, crucial, corsair

DDR4 is starting to arrive at NewEgg, and some kits are actually in stock for those who want to be the first on their block to own these new DIMMs and can remortgage their home to do it.  The price of Haswell-E CPUs and motherboards is as yet unknown, but looking over the past few years of Intel's processor launches you can assume the flagship processor will be around $999.99, with the feature-rich motherboards starting around $200 and quickly rising from there.

32gb.png

Both G.SKILL and Crucial have led with 32GB kits of DDR4-2133 and DDR4-2400, and as you can see the price for their DIMMs, and most likely the competition's, will be between $450 and $500.

16gb.png

At the 16GB mark you have more choices, with Corsair joining in and a range of speeds that go up to DDR4-2800, as well as your choice of a pair of 8GB DIMMs or four 4GB DIMMs.  Corsair was kind enough to list the timings: the DDR4-2666 runs at 15-17-17-35 and the DDR4-2800 at 16-18-18-36, though you will certainly pay a price for the RAM with the highest frequencies.
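
Raw frequency alone can mislead here: the absolute time of a CAS access is the CL count divided by the I/O clock, and since DDR transfers twice per clock that works out to 2000 x CL / (MT/s). A minimal sketch comparing Corsair's two listed kits:

```python
# First-access CAS latency in nanoseconds: CL cycles at the I/O clock,
# where the I/O clock (MHz) is half the transfer rate (MT/s).
def cas_ns(transfer_rate_mts: int, cl: int) -> float:
    return cl * 2000.0 / transfer_rate_mts

for name, rate, cl in [("DDR4-2666 CL15", 2666, 15),
                       ("DDR4-2800 CL16", 2800, 16)]:
    print(f"{name}: {cas_ns(rate, cl):.2f} ns")
```

Both kits land within about 0.2 ns of each other (11.25 ns versus 11.43 ns), so the pricier, higher-clocked kit buys you bandwidth rather than lower first-access latency.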

8gb.png

For those on a budget, waiting would seem to be your best choice, especially as Amazon is offering a limited selection of the new kits.  There is only a single 8GB kit from Crucial, although you can buy two of the single DIMMs without heat spreaders for $110.

Intel product releases are always dearly priced, and the introduction of a new generation of RAM is both exciting and daunting.  You will see power reductions, base frequencies that were uncommon in DDR3, and very likely an increase in the ability to overclock these DIMMs, but it is going to cost you.  If Haswell-E is in your sights, you should start planning how to afford replacing your CPU, motherboard, and RAM at the same time; this is no refresh, this is a whole new product line.

Source: NewEgg

PC Perspective Hardware Workshop 2014 @ Quakecon 2014 in Dallas, TX

Subject: Editorial, General Tech, Shows and Expos | July 23, 2014 - 04:43 PM |
Tagged: workshop, video, streaming, quakecon, prizes, live, giveaways

UPDATE: The event is over, but the video is embedded below if you want to see the presentations! Thanks again to everyone that attended and all of our sponsors!

It is that time of year again: another installment of the PC Perspective Hardware Workshop!  Once again we will be presenting on the main stage at Quakecon 2014, being held in Dallas, TX, July 17-20th.

logo-1500px.jpg
 

Main Stage - Quakecon 2014

Saturday, July 19th, 12:00pm CT

Our thanks go out to the organizers of Quakecon for allowing us and our partners to put together a show that we are proud of every year.  We love giving back to the community of enthusiasts and gamers that drive us to do what we do!  Get ready for 2 hours of prizes, games, and raffles, and the chances are pretty good that you'll take something home with you - really, they are pretty good!

Our primary partners at the event are those that threw in for our ability to host the workshop at Quakecon and for the hundreds of shirts we have ready to toss out!  Our thanks to NVIDIA, Seasonic, and Logitech!!

nvidia_logo_small.png

seasonic-transparent.png

logitech-transparent.png

Live Streaming

If you can't make it to the workshop - don't worry!  You can still watch the workshop live on our live page as we stream it over one of several online services.  Just remember this URL: http://pcper.com/live and you will find your way!

 

PC Perspective LIVE Podcast and Meetup

We are planning on hosting any fans that want to watch us record our weekly PC Perspective Podcast (http://pcper.com/podcast) on Wednesday or Thursday evening in our meeting room at the Hilton Anatole.  I don't yet know exactly WHEN or WHERE it will be, but I will update this page accordingly on Wednesday, July 16th, when we get the details.  You might also consider following me on Twitter for updates on that status as well.

After the recording, we'll hop over to the hotel bar for a couple of drinks and hang out.  We have room for at least 50-60 people to join us in the room, but we'll still be recording if just ONE of you shows up.  :)

Prize List (will continue to grow!)

Continue reading to see the list of prizes for the workshop!!!

Battlefield Will Not Be Annualized Says Patrick Söderlund

Subject: Editorial, General Tech | June 17, 2014 - 07:54 PM |
Tagged: battlefield, medal of honor, ea

Last year, we got Battlefield 4. The year before? Medal of Honor: Warfighter. The year before? Battlefield 3. The year before? Medal of Honor (Reboot). We will not be getting a new Medal of Honor this year, because Danger Close was shut down in June 2013. Danger Close developed the two recent Medal of Honor titles and, as EA Los Angeles, many of the previous Medal of Honor titles and many RTS games (Command and Conquer, Red Alert, Lord of the Rings: The Battle for Middle-Earth).

battlefield-hardline.jpg

Many of their employees are now working at DICE LA.

So, in the year when a new Medal of Honor title would be due, we get Battlefield: Hardline. A person with decent pattern recognition might believe that Battlefield, or its spinoffs, would fill the gap left by Medal of Honor. Not so, according to Patrick Söderlund, Executive VP of EA Studios. This echoes E3, where both studios (DICE and Visceral) repeatedly claimed that Battlefield: Hardline was (literally) the product of a fluke encounter and pent-up excitement for cops and robbers.

Of course, they do not close the door on annualized Battlefield releases, either. They just say that it is not their plan to have that be "the way it's going to be forever and ever". Honestly, for all the hatred that annualized releases get, the problem is not the frequency. If EA can bring out a Battlefield title every year, and one that is continually a good game, then more power to them. The problem is that, with an annual release cycle, it is hard to string together success after success, especially when fatigue is an opposing, and (more importantly) ever-increasing, force.

It is the hard, but lucrative road.

Source: PC Gamer

Why would SanDisk buy Fusion-io for $1.1 Billion?

Subject: Editorial, Storage | June 17, 2014 - 09:56 AM |
Tagged: sandisk, fusion-io, buyout

Fusion-io was once a behemoth of flash memory storage. Back when SSDs were having a hard time saturating SATA 3Gb/sec, Fusion-io was making fire-breathing PCIe SSDs full of SLC flash and pushing relatively insane IOPS and throughput figures. Their innovations were a good formula at the time. They made the controller a very simple device, basically just a bridge from the PCIe bus to the flash memory, which meant that most of the actual work was done in the driver. Fusion-io SSDs were therefore able to leverage the CPU and memory of the host system to achieve very high performance.

iops (2010).jpg

Fusion-io ioDrive 160 creams the competition back in 2010.

Being the king of IOPS back in the early days of flash memory storage, Fusion-io was able to charge a premium for their products. In a 2010 review, I priced their 160GB SSD at about $40/GB. In the years since, flash memory (and therefore SSD) prices have steadily dropped while performance has climbed higher and higher, yet Fusion-io products have mostly remained static in price. All of this time, the various iterations of the ioDrive continued to bank on the original model of a simple controller with the bulk of the work taking place in the driver. This carries a few distinct disadvantages, in that the host system has to spend a relatively large amount of CPU and memory resources on handling the Fusion-io devices. While this enables higher performance, it leaves fewer resources available to actually do stuff with the data. This ends up adding to the build cost of a system, as more CPU cores and memory must be thrown at the chassis handling the storage. In more demanding cases, extra systems would need to be added to the rack just to absorb the storage overhead on top of the other required workloads. Lastly, the hefty driver means Fusion-io devices are not bootable, despite early promises to the contrary. This isn't necessarily a deal breaker for enterprise use, but it does require system builders to add an additional storage device (from a different vendor) to handle OS duties.

iops (2014).png

In 2014, the other guys are making faster stuff. Note this chart is 4x the scale of the 2010 chart.

Let's fast-forward to the present. Just over a week ago, Fusion-io announced their new 'Atomic' line of SSDs. The announcement seemed to fall flat, and did little to halt the continuing decline of their stock price. I suspect this was because, despite new leadership, these new products are just another iteration of the same resource-consuming formula. Another reason for the lukewarm reception might have been the fact that Intel launched their P3700 series a few days prior. The P3700 is a native PCIe SSD that employs the new NVM Express communication standard. This open standard was developed specifically for flash memory communication, and it allows more direct access to flash in a manner that significantly reduces the overhead required to sustain high throughput and very high IOPS. NVMe is a very small driver stack with native support built into modern operating systems, and is basically the polar opposite of the model Fusion-io has relied on for years now.

NVMe.png

Intel's use of NVMe enables very efficient access to flash memory with minimal CPU overhead.

Fusion-io's announcement claimed that "The Atomic Series of ioMemory delivers the highest transaction rate per gigabyte for everything from read intensive workflows to mixed workloads." Let's see how this stacks up against the Intel P3700 - an SSD that launched the same week:



Model                   Fusion-io PX600                            Intel P3700
Capacity (TB)           1.0 / 1.3 / 2.6 / 5.2                      0.4 / 0.8 / 1.6 / 2.0
Interface / Flash type  PCIe 2.0 x8 / 20nm MLC                     PCIe 3.0 x4 / 20nm MLC
Read BW (GB/sec)        2.7 / 2.7 / 2.7 / 2.7                      2.7 / 2.8 / 2.8 / 2.8
Write BW (GB/sec)       1.5 / 1.7 / 2.2 / 2.1                      1.2 / 1.9 / 1.9 / 1.9
4K random read IOPS     196,000 / 235,000 / 330,000 / 276,000      450,000 / 460,000 / 450,000 / 450,000
Read transactions/GB    196 / 181 / 127 / 53                       1,125 / 575 / 281 / 225
4K random write IOPS    320,000 / 370,000 / 375,000 / 375,000      75,000 / 90,000 / 150,000 / 175,000
Write transactions/GB   320 / 285 / 144 / 72                       188 / 113 / 94 / 88
4K 70/30 R/W IOPS       Unlisted                                   150,000 / 200,000 / 240,000 / 250,000
Read latency            92us                                       20/115us
Write latency           15us                                       20/25us
Endurance (PBW)         12 / 16 / 32 / 64                          7.3 / 14.6 / 29.2 / 36.5
Endurance / TB (PBW)    12.0 / 12.3 / 12.3 / 12.3                  18.3 / 18.3 / 18.3 / 18.3
Cost                    Unlisted                                   $1,207 / $2,414 / $4,828 / $6,035
Cost/GB                 Unlisted                                   $3.02 / $3.02 / $3.02 / $3.02
Warranty                5 years                                    5 years

Source: Fusion-io / Intel

We are comparing flagship to flagship (in a given form factor) here. Starting from the top, the Intel P3700 is available in generally smaller capacities than the Fusion-io PX600. Both use 20nm MLC flash, but the P3700 uses half the data lanes at roughly twice the per-lane throughput. Regarding Fusion-io's 'transaction rate per GB' point, well, it's mostly debunked by the Intel P3700, which has excellent random read performance all the way down to its smallest 400GB capacity point. The seemingly unreal write specs seen from the PX600 are, well, actually unreal. Flash memory writes take longer than reads, so the only logical explanation for the inversion we see here is that Fusion-io's driver is passing those random writes through RAM first. Writing to RAM might be quicker, but you can't sustain it indefinitely, and it consumes more host system resources in the process. Moving further down the chart, we see Intel coming in with a ~50% higher endurance rating (in petabytes written per TB of capacity) than the Fusion-io. The warranties may be of equal duration, but the Intel drive is (on paper / stated warranty) guaranteed to outlast the Fusion-io part when used in a heavy write environment.
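
For the curious, the 'transactions/GB' rows in the chart are simply the vendors' rated random IOPS divided by capacity in (decimal) gigabytes. A minimal sketch that reproduces the read rows from the spec-sheet numbers:

```python
# Read transactions per GB = rated 4K random read IOPS / capacity (GB).
px600 = {1000: 196_000, 1300: 235_000, 2600: 330_000, 5200: 276_000}
p3700 = {400: 450_000, 800: 460_000, 1600: 450_000, 2000: 450_000}

for name, models in (("Fusion-io PX600", px600), ("Intel P3700", p3700)):
    for capacity_gb, iops in models.items():
        print(f"{name} {capacity_gb} GB: {iops / capacity_gb:,.0f} transactions/GB")
```

Run it and you get the same 196-down-to-53 curve for the PX600 and 1,125-down-to-225 for the P3700, which is why the smallest P3700 looks so strong on this metric.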

For pricing, Intel launched the P3700 at a competitive $3/GB. Pricing data for Fusion-io is not available, as they sit behind a bit of a 'quote wall', and no pricing at all was included with the Atomic product launch press materials. Let's take a conservative guess and assume the new line is half the cost/GB of their previous long-standing flagship, the Octal. One vendor lists pricing directly at $124,995 for 10.24TB ($12.21/GB) and $99,995 for 5.12TB ($19.53/GB), both of which require minimum support contracts as an additional cost. Half of $12/GB is still more than twice the $3/GB figure from Intel.

My theory as to why SanDisk is going for Fusion-io?

  • A poor track record since the Fusion-io IPO has driven the stock price way down, making it prime for a buyout.
  • SanDisk is one of the few remaining flash memory companies that does not own its own high-end controller tech.
  • The recent Fusion-io product launch was overshadowed by a much larger company (Intel) launching a superior competing product at a lower cost/GB.

So yeah, the buyout seemed inevitable. The question that remains is what SanDisk will do with them once they've bought them. Merging the two will mean that Fusion-io can include 'in-house' flash and (hopefully) offer their products at a lower cost/GB, but that can only succeed if the SanDisk flash performs adequately. Assuming it does, there's still the issue of relatively high costs when compared to freshly competing products from Intel and others. Last but not least is the ioDrive driver model, which grows increasingly dated while the rest of the industry adopts NVMe.

AMD Restructures. Lisa Su Is Now COO.

Subject: Editorial, General Tech, Graphics Cards, Processors, Chipsets | June 13, 2014 - 06:45 PM |
Tagged: x86, restructure, gpu, arm, APU, amd

According to VR-Zone, AMD reworked their business last Thursday, sorting each of their projects into two divisions and moving some executives around. The company is now segmented into the "Enterprise, Embedded, and Semi-Custom Business Group" and the "Computing and Graphics Business Group". The company used to be divided into "Computing Solutions", which handled CPUs, APUs, chipsets, and so forth; "Graphics and Visual Solutions", which is best known for GPUs but also contains console royalties; and "All Other", which was... everything else.

amd-new2.png

Lisa Su, former general manager of global business, has moved up to Chief Operating Officer (COO), along with other changes.

This restructure makes sense for a couple of reasons. First, it pairs some unprofitable ventures with other, highly profitable ones. AMD's graphics division has been steadily adding profitability to the company while its CPU division has been mostly losing money. Secondly, "All Other" is about as nebulous as a name can get. Instead of having three unbalanced divisions, one of which makes no sense to someone glancing at AMD's quarterly earnings reports, they should now have two roughly equal segments.

At the very least, it should look better to an uninformed investor. Someone who does not know the company might look at the sheet and assume that, if AMD divested from everything except graphics, the company would be profitable. That is, if they did not know that console contracts came into the graphics division because the compute division had x86 APUs, and so forth. This setup is now aligned more to customers than to products.

Source: VR-Zone

Podcast #304 - GeForce GTX TITAN Z, Core i7-4790K, Gigabyte Z97X-SOC Force and more!

Subject: Editorial | June 12, 2014 - 02:28 PM |
Tagged: Z97X-SOC Force, video, titan z, radeon, project tango, podcast, plextor, nvidia, Lightning, gtx titan z, gigabyte, geforce, E3 14, amd, 4790k, 290x

PC Perspective Podcast #304 - 06/12/2014

We have lots of reviews to talk about this week including the GeForce GTX TITAN Z, Core i7-4790K, Gigabyte Z97X-SOC Force, E3 News and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom and Allyn Malventano

Program length: 1:11:36

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Join the PC Perspective Team and Austin Evans LIVE on Tonight's Podcast!

Subject: Editorial | June 4, 2014 - 07:42 PM |
Tagged: video, pcper, live, austin evans

Tonight's live edition of the PC Perspective Podcast is going to have a special guest, the Internet's Austin Evans. You likely know of Austin through his wildly popular YouTube channel or maybe his dance moves.

But seriously, Austin Evans is a great guy with a lot of interesting input on technology. Stop by our live page at http://www.pcper.com/live at 10pm ET / 7pm PT for all the fun!

Make sure you don't miss it by signing up for our PC Perspective Live Mailing List!

pcperlive.png

Source: PCPer Live!

TrueCrypt Taken Offline Doesn't Pass My Smell Test

Subject: Editorial, General Tech | May 29, 2014 - 02:17 AM |
Tagged: TrueCrypt

It should not pass anyone's smell test, but it apparently does, according to tweets and other articles. Officially, the TrueCrypt website (which redirects to their SourceForge page) claims that, with the end of Windows XP support (??), the TrueCrypt development team wants users to stop using their software. Instead, they suggest a switch to BitLocker, Mac OS X's built-in encryption, or whatever random encryption suite comes up when you search your Linux distro's package manager (!?). Not only that, but several versions of Windows (such as 7 Home Premium) do not have access to BitLocker. Lastly, none of these are a good solution for users who want a single encrypted container across multiple OSes.

A new version (don't use it!!!) called TrueCrypt 7.2 was released and signed with their private encryption key.

TrueCrypt_Logo.png

The developers have not denied the end of support, nor its full-of-crap reasoning. (Seriously, because Microsoft deprecated Windows XP almost two months ago, they pull support for a two-year-old version now?)

They have also not confirmed it. They have been missing since at least "the announcement" (or earlier, if they were not the ones who made it). Going missing and unreachable on the day of your supposedly gigantic resignation announcement does not support the validity of that announcement.

To me, that is about as unconfirmed as you can get.

Still, people are believing the claims that TrueCrypt 7.1a is not secure. The version has been around since February 2012 and, beyond people looking at its source code, has passed a significant portion of a third-party audit. Even if you believe the website, it only says that TrueCrypt will not be updated for security. It does not say that TrueCrypt 7.1a is vulnerable to any known attack.

In other words, the version that has been good enough for over two years, and several known cases of government agencies being unable to penetrate it, is probably as secure today as it was last week.

"The final version", TrueCrypt 7.2, is a decrypt-only solution. It allows users to unencrypt existing vaults, although who knows what else it does, to move it to another solution. The source code changes have been published, and they do not seem shady so far, but since we cannot even verify that their private key has not leaked, I wouldn't trust it. A very deep compromise could make finding vulnerabilities very difficult.

So what is going on? Who knows. One possibility is that they were targeted for a very coordinated hack, one which completely owned them and their private key, performed by someone(s) who spent a significant amount of time crafting a fake 7.2 version. Another possibility is that they were legally gagged and forced to shut down operations, but managed to negotiate a method for users to decrypt existing data with a neutered build.

One thing is for sure: if this is a GoG-style publicity stunt, I will flip a couple of tables.

We'll see. ┻━┻ ¯\_(ツ)_/¯ ┻━┻

Source: TrueCrypt

Mozilla Firefox to Implement Adobe DRM for Video

Subject: Editorial, General Tech | May 14, 2014 - 09:56 PM |
Tagged: ultraviolet, mozilla, DRM, Adobe Access, Adobe

Needless to say, DRM is a controversial topic and I am clearly against it. I do not blame Mozilla. The non-profit organization responsible for Firefox knew that they could not oppose Chrome, IE, and Safari while being a consumer software provider. I do not even blame Apple, Google, and Microsoft for their decisions, either. This problem is much bigger and it comes down to a total misunderstanding of basic mathematics (albeit at a ridiculously abstract and applied level).

22-mozilla-2.jpg

Simply put, piracy figures are meaningless. They are a measure of how many people use content without paying (assuming they are even accurate). You know what is more useful? Sales figures. Piracy figures are measurements, dependent variables, and so is revenue. Measurements cannot influence other measurements. Specifically, measurements cannot influence anything because they are, themselves, the result of influences. That is what "a measure" is.

Implementing DRM is not a measurement, however. It is a controllable action whose influence can be recorded. If you implement DRM and your sales go down, it hurt you. You may notice piracy figures decline. However, you should be too busy to care because you should be spending your time trying to undo the damage you did to your sales! Why are you looking at piracy figures when you're bleeding money?

I have yet to see a DRM implementation that correlated with an increase in sales. I have, however, seen some which correlate to a massive decrease.

The thing is, Netflix might know that, and I am pretty sure that some of the web browser companies know that. They do not necessarily want to implement DRM. What they want is content and, surprise, the people who are in charge of the content are definitely not enlightened to that logic. I am not even sure if they realize that content which leaks before its release date is, by definition, not being leaked by end users.

But whatever. Technology companies, who want that content available on their products, are stuck finding a way to appease those content companies in a way that damages their users and shrinks their potential market the least. For Mozilla, this means keeping as much open as possible.

do-not-hurt-2.jpg

Since Mozilla does not have existing relationships with Hollywood, Adobe Access will be the actual method of displaying the video. They are careful to note that this only applies to video. They believe their existing relationships in text, images, and games will prevent the disease from spreading. This is basically a plug-in architecture with a sandbox that is open source and as strict as possible.

This sandbox is intended to prevent a security vulnerability from having access to the host system, to give a method of controlling the DRM module's performance if it hitches, and to not allow the DRM to query the machine for authentication. The last part is something they wanted to highlight, because it shows their effort to protect the privacy of their users. They also imply a method for users to opt out, but did not go into specifics.

As an aside, Adobe will support their Access DRM software on Windows, Mac, and Linux. Mozilla is pushing hard for Android and Firefox OS, too. According to Adobe, Access DRM is certified for use with Ultraviolet content.

I accept Mozilla's decision to join everyone else but I am sad that it came to this. I can think of only two reasons for including DRM: for legal (felony) "protection" under the DMCA or to make content companies feel better while they slowly sink their own ships chasing after numbers which have nothing to do with profits or revenue.

Ultimately, though, they made a compromise. That is always how we stumble and fall down slippery slopes. I am disappointed but I cannot suggest a better option.

Source: Mozilla

Mozilla Makes Suggestions to the FCC about Net Neutrality

Subject: Editorial, General Tech | May 5, 2014 - 08:08 PM |
Tagged: mozilla, net neutrality

Recently, the FCC has been moving to give up on Net Neutrality. Mozilla, being dedicated to the free (as in speech) and open internet, has offered a simple compromise. Their proposal is that the FCC classify internet service providers (ISPs) as common carriers on the server side, forcing restrictions on them to prevent discrimination of traffic to customers, while allowing them to remain "information services" to consumers.

mozilla-fcc.png

In other words, force ISPs to allow services to have unrestricted access to consumers, without flipping unnecessary tables with content distribution (TV, etc.) services. Like all possibilities so far, it could have some consequences, however.

"Net Neutrality" is a hot issue lately. Simply put, the internet gives society an affordable method of sharing information. How much is "just information" is catching numerous industries off guard, including ones which Internet Service Providers (ISPs) participate in (such as TV and Movie distribution), and that leads to serious tensions.

On the one hand, these companies want to protect their existing business models. They want consumers to continue to select their cable and satellite TV packages, on-demand videos, and other services at controlled profit margins and without the stress and uncertainty of competing.

On the other hand, if the world changes, they want to be the winner in that new reality. Yikes.

mozilla-UP.jpg

A... bad... photograph of Mozilla's "UP" anti-datamining proposal.

Mozilla's proposal is very typical of them. They tend to propose compromises which divide an issue such that both sides get the majority of their needs. Another good example is "UP", or User Personalization, which tries to cut down on data mining by giving the browser a method to tell websites what they actually want to know (and letting the user tell the browser how much to share). The user compromises by giving the amount of information they find acceptable, so the website can compromise and take only what it needs (rather than developing methods to grab anything and everything it can). It feels like a similar thing is happening here. This proposal gives users what they want, freedom to choose services without restriction, without tossing ISPs into "Title II" common carrier status altogether.

Of course, this probably comes with a few caveats...

The first issue that pops into my mind is: "What is a service?" I see this causing problems for peer-to-peer applications (including BitTorrent Sync and Crashplan, excluding Crashplan Central). Neither endpoint would necessarily be classified as "a server", or at least convince a non-technical lawmaker that this is the case, and thus ISPs would not need to apply common carrier restrictions to that traffic. This could be a serious issue for WebRTC. Even worse, companies like Google and Netflix would have no incentive to help fight those battles -- they're already legally protected. It would have to be defined, very clearly, what makes "a server".

Every method will get messy for someone. Still, at least the discussion is happening.

Source: Mozilla

Post Tax Day Celebration! Win an EVGA Hadron Air and GeForce GTX 750!

Subject: Editorial, General Tech, Graphics Cards | April 30, 2014 - 10:05 AM |
Tagged: hadron air, hadron, gtx 750, giveaway, evga, contest

Congrats to our winner: Pierce H.! Check back soon for more contests and giveaways at PC Perspective!!

In these good old United States of America, April 15th is a trying day. Circled on most of our calendars is the final deadline for paying up your bounty to Uncle Sam so we can continue to have things like freeway systems and universal Internet access. 

But EVGA is here for us! Courtesy of our long-time sponsor, you can win a post-Tax Day prize pack that includes both an EVGA Hadron Air mini-ITX chassis (reviewed by us here) and an EVGA GeForce GTX 750 graphics card. 

evgacontestapril.jpg

Nothing makes paying taxes better than free stuff that falls under the gift limit...

With these components under your belt you are well down the road to PC gaming bliss, upgrading your existing PC or starting a new one in a form factor you might not have otherwise imagined. 

Competing for these prizes is simple and open to anyone in the world, even if you don't suffer the same April 15th fear that we do. (I'm sure you have your own worries...)

  1. Fill out the form at the bottom of this post to give us your name and email address, in addition to the reasons you love April 15th! (Seriously, we need some good ideas for next year to keep our heads up!) Also, this does not mean you should leave a standard comment on the post to enter, though you are welcome to do that too.
     
  2. Stop by our Facebook page and give us a LIKE (I hate saying that), head over to our Twitter page and follow @pcper, and heck, why not check out our many videos and subscribe to our YouTube channel?
     
  3. Why not do the same for EVGA's Facebook and Twitter accounts?
     
  4. Wait patiently for April 30th, when we will draw the winner and update this news post with the winner's name and tax documentation! (Okay, probably not that last part.)

A huge thanks goes out to our friends and supporters at EVGA for providing us with the hardware to hand out to you all. If it weren't for sponsors like this, PC Perspective just couldn't happen, so be sure to give them some thanks when you see them around the In-tar-webs!!

Good luck!

Source: EVGA

AMD AM1 Retested on 60 Watt Power Supply

Subject: Editorial | April 23, 2014 - 09:51 PM |
Tagged: TDP, Athlon 5350, Asus AM1I-A, amd, AM1

If I had one regret about my AM1 review that posted a few weeks ago, it was that I used a pretty hefty (relatively speaking) 500 watt power supply for a part that is listed at a 25 watt TDP.  Power supplies really do not hit their rated efficiency numbers until they are under significant load, with efficiency typically peaking somewhere around 50% of capacity.  Even the most efficient 500 watt power supply is going to inflate the consumption numbers of the diminutive parts that we are currently testing.

am1p_01.jpg

Keep it simple... keep it efficient.

Ryan had sent along a 60 watt notebook power supply with an ATX cable adapter at around the same time as I started testing the AMD Athlon 5350 and Asus AM1I-A.  I was somewhat roped into running that previously mentioned 500 watt power supply for comparative reasons.  I was using a 100 watt TDP A10-6790 APU with a pretty loaded Gigabyte A88X-based ITX motherboard.  That combination would have likely fried the 60 watt (12V x 5A) notebook power supply under load.

Now that I had a little extra time on my hands, I was able to finally get around to seeing exactly how efficient this little number could get.  I swapped the old WD Green 1 TB drive for a new Samsung 840 EVO 500 GB SSD.  I removed the BD-ROM drive completely from the equation as well.  Neither of those parts uses a lot of wattage, but I am pushing this combination to go as low as I possibly can.

power-idle.png

power-load.png

The results are pretty interesting.  At idle we see the 60 watt supply (sans spinning drive and BD-ROM) hitting 12 watts as measured from the wall.  The 500 watt power supply and those extra pieces added another 11 watts of draw.  At load we see somewhat similar numbers, but not nearly as dramatic a difference as at idle.  The 60 watt system draws 29 watts while the 500 watt system sits at 37 watts.
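
Boiling those wall readings down (a quick sketch; the 23 watt idle figure is the 12 watt reading plus the 11 watt delta quoted above, and remember the 500 watt configuration also carried the spinning drive and BD-ROM, so this is a platform comparison rather than a pure PSU comparison):

```python
# Wall draw in watts for the two configurations tested above.
configs = {
    "idle": {"60 W adapter + SSD": 12, "500 W ATX + HDD/BD-ROM": 23},
    "load": {"60 W adapter + SSD": 29, "500 W ATX + HDD/BD-ROM": 37},
}

for state, readings in configs.items():
    small = readings["60 W adapter + SSD"]
    big = readings["500 W ATX + HDD/BD-ROM"]
    print(f"{state}: {big - small} W saved, {(big - small) / big:.0%} lower draw")
```

That is an 11 watt (48%) reduction at idle and an 8 watt (22%) reduction at load, which lines up with low-load PSU efficiency being the biggest offender.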

am1p_02.jpg

So how do you get from a 60 watt notebook power adapter to ATX standard? This is the brains behind the operation.

The numbers for both power supplies are good, but we do see a nice jump in efficiency from using the smaller unit and an SSD instead of a spinning drive.  Either way, the Athlon 5350 and the AMD AM1 infrastructure sip power compared to most desktop processors.

Source: AMD

Ars Technica Estimates Steam Sales and Hours Played

Subject: Editorial, General Tech | April 16, 2014 - 01:56 AM |
Tagged: valve, steam

Valve does not release sales or hours-played figures for any game on Steam, and it is rare to find a publisher who will volunteer that information. That said, Steam user profiles list that information on a per-account basis. If someone, say Ars Technica, had access to sufficient server capacity, say an Amazon Web Services instance, and a reasonable understanding of statistics, then they could estimate it.

Oh look, Ars Technica estimated exactly that by extrapolating from over 250,000 random accounts.

SteamHW.png

If interested, I would definitely look through the original editorial for all of its many findings. Here, if you let me (and you can't stop me even if you don't), I would like to add my own analysis on a specific topic. The Elder Scrolls V: Skyrim on the PC, according to VGChartz, sold 3.42 million copies at retail, worldwide. The thing is, Steamworks was required for every copy sold, at retail or online. According to Ars Technica's estimates, 5.94 million copies were registered with Steam.

5.94 minus 3.42 leaves 2.52 million copies sold digitally; more than 40% of PC sales were made through Steam and other digital distribution platforms. Also, this means that the PC was the game's second-best selling platform, ahead of the PS3 (5.43m) and behind the Xbox 360 (7.92m), minus any digital sales on those platforms if they exist, of course. Despite its engine being programmed in DirectX 9, it is still a fairly high-end game. That is a fairly healthy install base for decent gaming PCs.
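
The arithmetic, spelled out as a quick sketch (both inputs are the third-party estimates cited above, so treat the output accordingly):

```python
steam_registered = 5.94e6  # Ars Technica's estimate of copies registered on Steam
retail_sales     = 3.42e6  # VGChartz's worldwide PC retail figure (all require Steam)

digital = steam_registered - retail_sales
print(f"Estimated digital copies: {digital / 1e6:.2f} million")
print(f"Digital share of PC sales: {digital / steam_registered:.0%}")  # ~42%
```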

Did you discover anything else on your own? Be sure to discuss it in our comments!

Source: Ars Technica

GDC 2014: Shader-limited Optimization for AMD's GCN

Subject: Editorial, General Tech, Graphics Cards, Processors, Shows and Expos | March 30, 2014 - 01:45 AM |
Tagged: gdc 14, GDC, GCN, amd

While Mantle and DirectX 12 are designed to reduce overhead and keep GPUs loaded, the conversation shifts when you are limited by shader throughput. Modern graphics processors are dominated by, sometimes, thousands of compute cores. Video drivers are complex packages of software, and one of their many tasks is converting your scripts, known as shaders, into machine code for the hardware. If this machine code is efficient, it could mean drastically higher frame rates, especially at extreme resolutions and intense quality settings.

amd-gcn-unit.jpg

Emil Persson of Avalanche Studios, probably known best for the Just Cause franchise, published his slides and speech on optimizing shaders. His talk focuses on AMD's GCN architecture, due to its existence in both console and PC, while bringing up older GPUs for examples. Yes, he has many snippets of GPU assembly code.

AMD's GCN architecture is actually quite interesting, especially dissected as it was in the presentation. It is simpler than its ancestors and much more CPU-like, with resources mapped to memory (and caches of said memory) rather than "slots" (although drivers and APIs often pretend those relics still exist), and with vectors mostly treated as collections of scalars, and so forth. Tricks which attempt to pack operations together into vectors, such as using dot products, can just put irrelevant restrictions on the compiler and optimizer... as the hardware breaks those vector operations back down into the very same component-by-component ops that you thought you were avoiding.
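
To illustrate that last point with a toy sketch (plain Python, not shader code, and the exact instruction sequence is ultimately up to AMD's shader compiler): a four-component dot product is just a chain of scalar multiplies and adds, which is the same component-by-component sequence a scalar-per-lane ISA like GCN ends up executing, typically as one multiply followed by three fused multiply-adds.

```python
# A "vector" dot product decomposed into the scalar, component-by-component
# operations that actually run on the hardware.
def dot4(a, b):
    acc = a[0] * b[0]            # one multiply...
    for i in (1, 2, 3):
        acc = a[i] * b[i] + acc  # ...then three multiply-adds (FMA-friendly)
    return acc

print(dot4((1.0, 2.0, 3.0, 4.0), (5.0, 6.0, 7.0, 8.0)))  # 70.0
```

Write it as a vector op in your source if you like, but there is no shortcut hiding underneath; the compiler has an easier time when the code does not pretend otherwise.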

Basically, and it makes sense coming from GDC, this talk rarely glosses over points. It goes over execution speed of one individual op compared to another, at various precisions, and which to avoid (protip: integer divide). Also, fused multiply-add is awesome.

I know I learned.

As a final note, this returns to the discussions we had prior to the launch of the next-generation consoles. Developers are learning how to make their shader code much more efficient on GCN, and that could easily translate to leading PC titles. Especially with DirectX 12 and Mantle, which lighten the CPU-based bottlenecks, learning how to do more work per FLOP addresses the other side. Everyone was looking at Mantle as AMD's play for success through harnessing console mindshare (and in terms of Intel vs AMD, it might help). But honestly, I believe that it will be trends like this presentation which prove more significant... even if behind the scenes. Of course developers were always having these discussions, but now console developers will probably be talking about only one architecture - that is a lot of people talking about very few things.

This is not really reducing overhead; this is teaching people how to do more work with less, especially in situations (high resolutions with complex shaders) where the GPU is most relevant.

Mozilla Dumps "Metro" Version of Firefox

Subject: Editorial, General Tech | March 16, 2014 - 03:27 AM |
Tagged: windows, mozilla, microsoft, Metro

If you use the Firefox browser on a PC, you are probably using its "Desktop" application. They also had a version for "Modern" Windows 8.x that could be used from the Start Screen. You probably did not use it because fewer than 1000 people per day did. This is more than four orders of magnitude smaller than the number of users for Desktop's pre-release builds.

Yup, less than one-thousandth.

22-mozilla-2.jpg

Jonathan Nightingale, VP of Firefox, stated that Mozilla would not be willing to release the product without committing to its future development and support. There was not enough interest to take on that burden and it was not forecast to have a big uptake in adoption, either.

From what we can see, it's pretty flat.

The code will continue to exist in the organization's Mercurial repository. If "Modern" Windows gets a massive influx of interest, they could return to what they had. It should also be noted that there never was a version of Firefox for Windows RT. Microsoft will not allow third-party rendering engines as a part of their Windows Store certification requirements (everything must be based on Trident, the core of Internet Explorer). That said, this is also true of iOS, and Firefox Junior exists within those limitations. It is not truly Firefox, little more than a re-skinned Safari (as permitted by Apple), but it exists. I have heard talk of a Firefox Junior for Windows RT, Internet Explorer reskinned by Mozilla, but not in any detail. The organization is very attached to its own technology because, if whoever made the underlying engine does not support new features or lags in JavaScript performance, the re-skin has no way to fix that itself.

Paul Thurrott of WinSupersite does not blame Mozilla for killing "Metro" Firefox. He acknowledges that they gave it a shot and did not see enough pre-release interest to warrant a product. He places some of the blame on Microsoft for the limitations it places on browsers (especially on Windows RT). In my opinion, this is just a symptom of the larger problem of Windows post-7. Hopefully, Microsoft can correct these problems and do so in a way that benefits their users (and society as a whole).

Source: Mozilla