
Samsung 850 EVO SKUs leaked, leads to initial pricing, specs

Subject: Storage | October 28, 2014 - 01:30 PM |
Tagged: ssd, sata, Samsung, 850 EVO

Thanks to an updated SKU list and some searching, we've come across some initial photos, specs, and pricing for the upcoming Samsung 850 EVO.

8310217.01.prod_.jpg

You may have heard of an 850 EVO 1TB listing over at Frys, but there's actually more information out there. Here's a quick digest:

Specs:

  • Memory: 3D V-NAND
  • Read: 550MB/sec
  • Write: 520MB/sec
  • Weight: 0.29 lbs

Pricing (via Antares Pro listings at time of writing):

  • 120GB (MZ-75E120B/AM): $100 ($0.83 / GB)
  • 250GB (MZ-75E250B/AM): $146 ($0.58 / GB)
  • 500GB (MZ-75E500B/AM): $258 ($0.52 / GB)
  • 1TB (MZ-75E1T0B/AM): $477 ($0.48 / GB)
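
The per-gigabyte figures are just the price divided by the advertised (decimal) capacity; a quick sketch of the math:

```python
# Price-per-GB math behind the listings above (decimal gigabytes,
# as drive makers advertise capacity).
listings = {
    "120GB": (120, 100),
    "250GB": (250, 146),
    "500GB": (500, 258),
    "1TB":   (1000, 477),
}

for name, (capacity_gb, price_usd) in listings.items():
    print(f"{name}: ${price_usd / capacity_gb:.2f} / GB")
```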

In addition to the above, we saw the 1TB model listed for $500 at Frys, and also found the 500GB for $264 at ProVantage. The shipping date on the Frys listing was initially November 3rd, but that has since shifted to November 24th, presumably due to an influx of orders.

We'll be publishing a full capacity roundup on the 850 Pro in anticipation of the 850 EVO launch, which, based on these leaks, is imminent.

GeForce GTX 970 Coil Whine Concerns

Subject: Graphics Cards | October 28, 2014 - 12:09 PM |
Tagged: maxwell, GTX 970, geforce, coil whine

Coil whine is the undesirable effect of electrical components creating audible noise when operating. Let's look to our friends at Wikipedia for a concise and accurate description of the phenomenon:

Coil noise is, as its name suggests, caused by electromagnetic coils. These coils, which may act as inductors or transformers, have a certain resonant frequency when coupled with the rest of the electric circuit, as well as a resonance at which they will tend to physically vibrate.

As the wire that makes up the coil passes a variable current, a small amount of electrical oscillation occurs, creating a small magnetic field. Normally this magnetic field simply works to establish the inductance of the coil. However, this magnetic field can also cause the coil itself to physically vibrate. As the coil vibrates physically, it moves through a variable magnetic field, and feeds its resonance back into the system. This can produce signal interference in the circuit and an audible hum as the coil vibrates.

Coil noise can happen, for example, when the coil is poorly secured to the circuit board, is poorly damped, or if the resonant frequency of the coil is close to the resonant frequency of the electric circuit. The effect becomes more pronounced as the signal passing through the coil increases in strength, and as it nears the resonant frequency of the coil, or as it nears the resonant frequency of the circuit. Coil noise is also noticed most often when it is in the humanly audible frequency.

Coil noise is also affected by the irregularities of the magnetic material within the coil. The flux density of the inductor is affected by these irregularities, causing small currents in the coil, contaminating the original signal. This particular subset of coil noise is sometimes referred to as magnetic fluctuation noise or the Barkhausen effect. Coil noise can also occur in conjunction with the noise produced by magnetostriction.
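
To put rough numbers on why a graphics card's power circuitry can end up singing in the audible band, consider the resonant frequency of an LC filter, f = 1/(2π√(LC)). The component values below are hypothetical, but in the ballpark of a GPU VRM output stage:

```python
import math

# Resonant frequency of an LC filter: f = 1 / (2 * pi * sqrt(L * C)).
# Values are illustrative, not measurements from any GTX 970 board.
L = 1.0e-6   # 1 uH inductor
C = 1.0e-3   # 1,000 uF of bulk output capacitance

f_hz = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"{f_hz:.0f} Hz")  # ~5,000 Hz - squarely within human hearing (20 Hz - 20 kHz)
```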

If you frequently upgrade your graphics cards, you may have witnessed this problem with a particular install, or you might have been one of the lucky ones who never dealt with the issue. If your computer sits under your desk, in a loud room, or you only game with headphones, it's also possible that you just never noticed.

inductor.jpg

Possibly offending inductors?

The reason this comes up today is that reports are surfacing of GeForce GTX 970 cards from various graphics card vendors exhibiting excessive coil whine or coil noise. These reports are coming in from multiple forum threads around the internet, a collection of YouTube videos of users attempting to capture the issue, and even official statements from some of NVIDIA's partners. Now, just because the internet is talking about it doesn't necessarily mean it's a "big deal" relative to the number of products being sold. However, after several Twitter comments and emails requesting we look into the issue, I thought it was pertinent to start asking questions.

As far as I can tell today, GTX 970 cards from multiple vendors including EVGA, MSI and Gigabyte all have users reporting issues and claims of excessive coil noise. For my part here, I have two EVGA GTX 970 cards and an MSI GTX 970, none of which are producing sound at what I would call "excessive" levels. Everyone's opinion of excessive noise is going to vary, but as someone who sits next to a desk-high test bed and hears hundreds of cards a year, I am confident I have a good idea of what to listen for.

We are still gathering data on this potential issue, but a few of the companies mentioned above have issued official or semi-official statements on the problem.

From MSI:  

The coil whine issue is not specific to the 900 series, but can happen with any high-end GPU, and MSI is looking into ways to minimize the issue. If you still have concerns regarding this issue, then please contact our RMA department.

From EVGA:

We have been watching the early feedback on GTX 970 and inductor noise very closely, and have actively taken steps to improve this. We urge anyone who has this type of concern to contact our support so we can address it directly.

From NVIDIA: 

We’re aware of a small percentage of users reporting excessive “coil whine” noises and are actively looking into the issue.

We are waiting for feedback from other partners to see how they plan to respond.

Since all of the GTX 970 cards currently shipping are non-reference, custom-built PCB designs, NVIDIA's input on the problem is mostly limited to recommendations. NVIDIA knows that it is their name and brand being associated with any noisy GeForce cards, so I would expect plenty of discussions and calls behind closed doors to make sure partners are addressing user concerns.

IMG_9794.JPG

Interestingly, the GeForce GTX 970 was the one card of this Maxwell release where all of NVIDIA's partners chose to go the route of custom designs rather than adopting the NVIDIA reference design. On the GTX 980, however, you'll find a mix of both, and I would wager that NVIDIA's reference boards do not exhibit any above-average noise levels from coils. (I have actually tested four reference GTX 980s without coil whine coming into play.) Sometimes offering all of these companies the option to be creative and to differentiate can backfire if the utmost care isn't taken in component selection.

Ironically, the fix is simple: a little glue on those vibrating inductor coils and the problem goes away. But most of the components are sealed, making the simple fix a non-starter for the end user (and I wouldn't recommend attempting it anyway). It does point to a lack of diligence from board manufacturers willing to skimp on hardware in a way that makes this a big enough issue that I am sitting here writing about it today.

As an aside, if you hear coil whine when running a game at 500-5000 FPS, I don't think that counts as being a major problem for your gaming. I have seen a video or two running a DX9 render test at over 4500 FPS - pretty much any card built today will make noises you don't expect when hitting that kind of performance level.

As for my non-official discussions on the topic with various parties, everyone continues to reiterate that the problem is not as widespread as some of the forum threads would have you believe. It's definitely higher than normal, and getting public acknowledgements from EVGA and MSI basically confirms that, but one person told me the complaint and RMA levels are where they were expected to be considering the "massively fast sell out rates" the GTX 970 is experiencing.

Of course, AMD isn't immune to coil whine issues either. If you remember back to the initial launch of the Radeon R9 290X and R9 290, we had similar coil whine issues and experienced those first hand on reference card designs. (You can see a video I recorded of an XFX unit back in November of 2013 here.) You can still find threads on popular forums from that time period discussing the issue and YouTube never seems to forget anything, so there's that. Of course, the fact that previous card launches might have seen issues along the same line doesn't forgive the issue in current or later card releases, but it does put things into context.

So, let's get some user feedback; I want to hear from GTX 970 owners about their experiences to help guide our direction of research going forward.

Click here to take our short poll for GTX 970 owners!

Source: Various

Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the PlayStation 4 variants of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?

For those of us who focus more on the world of PC gaming, however, an email the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, alongside thoughts on the value of pixel counts and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.

unity1.jpg

So, if we take this anonymous developer's information as true, and this whole story is based on that assumption, then we have learned some interesting things.

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
     
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
     
  3. The Ubisoft team see Unity as being "crazily optimized" for the architecture and consoles even as we just now approach the 1 year anniversary of their release.
     
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited to the remaining 50% of CPU performance to power the AI, etc.

It would appear that just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the Playstation 4 and Xbox One undershoots the needs of game developers to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have reached performance limits, that's a bad sign for game developers that really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom built cores or using a Cell architecture - we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that we have seen more advanced development teams hit peak performance.

unity2.jpg

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team are completely off their rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                 PlayStation 4          Xbox One
  Processor      8-core Jaguar APU      8-core Jaguar APU
  Motherboard    Custom                 Custom
  Memory         8GB GDDR5              8GB DDR3
  Graphics Card  1152 Stream Unit APU   768 Stream Unit APU
  Peak Compute   1,840 GFLOPS           1,310 GFLOPS
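
Those peak compute numbers fall straight out of the usual shader-count arithmetic: stream processors × GPU clock × 2 FLOPs per cycle (one fused multiply-add). A quick sketch, assuming the commonly reported 800 MHz (PS4) and 853 MHz (Xbox One) GPU clocks:

```python
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    # Each stream processor retires one fused multiply-add (2 FLOPs) per cycle.
    return stream_processors * clock_mhz * 2 / 1000

print(f"PS4:      {peak_gflops(1152, 800):.0f} GFLOPS")  # ~1843, quoted as 1,840
print(f"Xbox One: {peak_gflops(768, 853):.0f} GFLOPS")   # ~1310
```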

The custom built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super low cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4 and the Radeon R7 250X is faster than what resides in the Xbox One.

xboxonegpu.jpg

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.

Even if this developer quote is 100% correct that doesn't mean that the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on performance efficiency of current generation hardware, will be coming to the Xbox One and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next that is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.

unity3.jpg

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is an enormous discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 share the same architecture as the PC now.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of whose hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher performance hardware, selling the consoles at a loss out of the gate and preparing each platform for the next 7-10 years properly. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.

Connected Data announces Transporter Genesis Private Cloud Appliance

Subject: Storage | October 27, 2014 - 04:17 PM |
Tagged: Transporter Genesis, transporter, connected data

Connected Data (which has merged with Drobo) has really been pushing their new Transporter line. When we saw them at this past CES, there was only a small desktop appliance meant to connect and sync files between homes or small offices. Now they are stepping up their Transporter game by scaling all the way up to 24TB rack-mount devices!

Transporter Genesis_AG_L.jpg

For those unaware, Transporter is a personal cloud solution, but with software and mobile app support akin to that of Dropbox. Their desktop software tool has seen rapid addition of features, and the company has even rolled out version history support. Features are nice, but what will now set Transporter apart from competing options is scalability:

Transporter.png

The base-level Transporter (right) is a relatively simple device with a single 2.5" HDD installed. These devices scale through the '5' and '15' models, which appear to be built on Drobo hardware. The 'Genesis' models (left) are not simply Drobo 1200i's with blue stickers on them; they are full-blown Xeon systems with redundant power supplies, an 80GB SSD, up to 32GB of RAM, and 24TB of raw storage capacity. Here is what a typical business rollout of Transporter might look like with these new additions at play:

Transporter2.png

Features currently supported across the line:

  • 256 Bit AES communication
  • Transporter Desktop software solution (Windows and Mac)
  • Transporter mobile app (iOS and Android)
  • Redundancy within each node ('5' and above)
  • Redundancy across nodes (via sync)
  • Active Directory support
  • No recurring fees

The 12TB Genesis 75 comes in at $9,999, but the '15' and '5' should prove to be lower cost options. The base model single bay Transporter can be found for just over $100 (BYOHDD). Full press blast after the break.

Transcend's new M.2 SSD, the MTS800

Subject: Storage | October 27, 2014 - 04:00 PM |
Tagged: M.2, ssd, transcend, MTS800

M.2 is quickly gaining popularity thanks to its small size and low power requirements, as well as its potential for higher speeds and other features. Transcend's 128GB MTS800 drive features full AES encryption, wear levelling, and garbage collection, as well as something new: StaticDataRefresh Technology. That is their name for a process which automatically restores the charge levels in the NAND cells, preventing both the accumulation of errors and the loss of performance over time. M.2 drives do come with a price premium (the 128GB model is available for $76 on Amazon), but the performance is impressive: the lowest transfer speed The SSD Review saw during their testing was 265.61MB/s.

629x419xTranscend-MTS800.jpg

"We have been seeing more M.2 SSDs lately, a lot of which are companies’ first steps into the market since the form factor is so new. They have been designed to meet strict size requirements and allow for greater flexibility in product development. They are the perfect fit for mobile devices with their compact size and light weight."

Here are some more Storage reviews from around the web:

Storage

Samsung updates 840 EVO Performance Restoration Tool

Subject: Storage | October 27, 2014 - 02:59 PM |
Tagged: Samsung, firmware, 840 evo

Over the weekend Samsung silently updated their 840 EVO Performance Restoration Tool. The incremental update improved support for some system configurations that were previously not recognizing an installed 840 EVO. Samsung also improved how the GUI progress bar responds during the update process, presumably to correct the near silent failure that occurred when the tool was unable to update the drive's firmware. Previously, the tool would halt at 15% without any clear indication that the firmware could not be updated (this would occur if the tool was unable to issue the necessary commands to the SSD, mainly due to the motherboard being in the wrong storage controller mode or using an incompatible storage driver).

DSC05837.JPG

Still no word on relief for those owners of the original 840 (non EVO or Pro). We've also heard from some users with Samsung OEM TLC-based SSDs that showed the same type of slow down (some variants of the PM851 apparently used TLC flash). More to follow there.

We evaluated the Samsung 840 EVO Performance Restoration Tool here. If you've already successfully run the 1.0 version of the tool, there is no need to re-run the 1.1 version, as it will not do anything additional to an EVO that has been updated and restored.

Source: Samsung

Earphones without the flashy colours and branding

Subject: General Tech | October 27, 2014 - 02:33 PM |
Tagged: audio, Takstar, HD5500

Some people still prefer headsets with a simple design and understated branding, as opposed to models with colours bright enough to pass for emergency beacons and a logo large enough to be spotted from orbit. Takstar understands this and even offers their product for less money than their ostentatious competitors, but that is only half the story, as the headphones still need to sound good. They offer a variety of connection options: a 1/8" adapter designed for mobile devices as well as a larger 1/4" connection for use with stereos. On a mobile device the bass is lacking, which has more to do with a lack of power than anything else, as the headphones sounded much better through the 1/4" plug on a more powerful source. Do not expect a miracle from $75 circumaural headphones, but if you are value conscious you should take a look at TechPowerUp's review.

hd5500.jpg

"Takstar is well-known for their bang-for-the-buck headphones, and today, we take a look at their HD5500s. Priced at $74.50, these headphones are for mobile users who want a solid and well-sounding pair of headphones. We take the HD5500s for a spin to see if they can live up to such expectation."

Here is some more Tech News from around the web:

Audio Corner

Source: techPowerUp
Subject: Systems
Manufacturer: ECS

Introduction

DSC_0484 (Large).JPG

When Intel revealed their miniature PC platform in 2012, the new “Next Unit of Computing” (NUC) was a tiny motherboard with a custom case, and admittedly very little compute power. Well, maybe not so much with the admittedly: “The Intel NUC is an ultra-compact form factor PC measuring 4-inch by 4-inch. Anything your tower PC can do, the Intel NUC can do and in 4 inches of real estate.” That was taken from Intel’s NUC introduction, and though their assertion was perhaps a bit premature, technology does continue its rapid advance in the small form-factor space. We aren’t there yet by any means, but the fact that a mini-ITX computer can be built with the power of an ATX rig (limited to single-GPU, of course) suggests that it could happen for a mini-PC in the not so distant future.

With NUC the focus was clearly on efficiency over performance, and with very low power and noise there were practical applications for such a device to offset the marginal "desktop" performance. The viability of a NUC would definitely depend on the user and their particular needs, of course. If you could find a place for such a device (such as a living room) it may have been worth the cost, as the first of the NUC kits were fairly expensive (around $300 and up) and did not include storage or memory. These days a mini PC can be found starting as low as $100 or so, but most still do not include any memory or storage. They are tiny barebones PC kits after all, so adding components is to be expected...right?

DSC_0809 (Large).JPG

It’s been a couple of years now, and the platform continues to evolve - and shrink to some startlingly small sizes. Of the Intel-powered micro PC kits on today’s market the LIVA from ECS manages to push the boundaries of this category in both directions. In addition to boasting a ridiculously small size - actually the smallest in the world according to ECS - the LIVA is also very affordable. It carries a list price of just $179 (though it can be found for less), and that includes onboard memory and storage. And this is truly a Windows PC platform, with full Windows 8.1 driver support from ECS (previous versions are not supported).

Continue reading our look at the ECS LIVA Mini PC!!

No new Intel for you this year

Subject: General Tech | October 27, 2014 - 12:35 PM |
Tagged: Haswell-EX, Haswell-EP4S, Intel, server, xeon, Broadwell-DE, Skylake

Intel's release schedules have been slowing down; unfortunately, that is in large part because the only competition they face in certain market segments is themselves. For high-end servers it looks like we won't see Haswell-EX or EP4S until the second half of next year, and Skylake chips for entry-level servers until after the third quarter. Intel does have to fight for its share of the SoC and low-power chip markets; DigiTimes reports the Broadwell-DE family and the C2750 and C2350 should arrive in the second quarter, which gives AMD and ARM a chance to gain market share against Intel's current offerings. Along with the arrival of the new chips, we will also see older Itanium, Xeon, Xeon Phi and Atom models discontinued; some may be gone before the end of the year. You have already heard the bad news about Broadwell-E.

index.jpg

"Intel's next-generation server processors for 2015 including new Haswell-EX (Xeon E7 v3 series) and -EP4S (Xeon E5-4600 v3 series), are scheduled to be released in the second quarter of 2015, giving clients more time to transition to the new platform, according to industry sources."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

(Oldish News) Kingdom Hearts 3 on Unreal Engine 4

Subject: General Tech | October 26, 2014 - 11:15 PM |
Tagged: square enix, kingdom hearts 3, unreal engine 4, ue4

I did not report on this the first time because it did not seem like a credible rumor. As it turns out, the reports were citing an interview with the game's director in Famitsu, the Japanese video game magazine. Basically, while Square likes to build their own engines for their RPG projects, their Luminous Engine did not satisfy their needs, so they decided to shift production to Unreal Engine 4. While the game is still not scheduled to come to the PC, we know that the engine feels at home on our platform.

square-KingdomHearts3-logo.jpg

Image Credit: Wikipedia

As an aside, Famitsu is a surprisingly hard website to machine translate for any content after the first page. I will make a mental note to not feed written content through JavaScript in any website that I make, for the sake of international readers. I eventually had to copy and paste the text directly into Microsoft Translate. It was a pretty terrible experience, but I digress. If you wish to see the interview, do not expect your browser's built-in tools to help. Ctrl-C and Ctrl-V.

It seems pretty clear that Kingdom Hearts was not moved to Unreal Engine 4 for PC support. That would just be silly. More likely, their internal engine might have needed a little extra development work and, especially with the vastly different art styles of Kingdom Hearts and Final Fantasy, they moved the two release dates further apart. Maybe they will even release Kingdom Hearts 3 earlier than intended?

But, if it does come to the PC, it seems somewhat more likely that it will function better than Final Fantasy XIII does. That title was locked to 720p and had a few odd quirks, like Esc being the equivalent of "/qq" even though Alt+F4 gives a warning prompt, and it seems to require a keyboard to close (I could not find a way to close the game with the gamepad or mouse alone). That said, while a tangent-to-a-tangent, I did like the option to have the original Japanese dub. Yet again, I digress.

This was not the first time that Square has developed an RPG on Unreal Engine. The Last Remnant, for the Xbox 360 and PC, was developed on Unreal Engine 3. Kingdom Hearts 3 does not have a release date, but it might be sooner than we expect (and probably much earlier than Final Fantasy XV).

Source: Famitsu

The Billion Dollar Businesses of Free to Play

Subject: General Tech | October 26, 2014 - 08:28 PM |
Tagged: pc gaming, free to play

Year to date, League of Legends, Crossfire, and Dungeon Fighter Online are each closing in on one billion dollars in revenue. Yes, three free-to-play MMO titles are closing in on $1 Billion USD in a single year. All three exceed World of Warcraft, which is still the most lucrative subscription MMO. That might change once expansion pack revenue from the upcoming Warlords of Draenor is accounted for, however. The total MMO industry, free-to-play or subscription, is estimated at almost $8 Billion USD, from January through September.

riot-lol-logo.jpg

This is all according to Gamesbeat and their dissection of a SuperData Research (how is that a real name?!) report on the MMO industry. Of course, there is always the possibility that these products will fall short of that milestone by the time January rolls around, but they are pretty close for nine months in and three to go.

The interesting part is why. The article discusses how easily these games can transition between markets due to how low the barrier to entry is. This is especially true in markets that embrace internet cafes, where the game is already installed. The barrier to entry is creating an account; the customer does not even need to think about payment until the free content has generated interest.

The second reason, which is not mentioned in the article, is the curve of revenue by customer type. A flat fee is some value multiplied by the number of legitimate users you have. You will get at most "X" from a customer, maybe a little less for sales, and zero for pirated copies or customers that simply ignore your content. Subscription games turn this into recurring income; it is the number of legitimate users for each month, summed over every month. While this will get more money from the most dedicated players, because they are playing longer, it still has a ceiling. Free-to-play and other microtransaction-based models have no ceiling except for all the content you have ever made, and with consumable content there is effectively no ceiling at all.
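
To put rough numbers on that curve, here is a toy comparison of lifetime revenue per customer; every figure in it is invented purely for illustration and does not come from the SuperData report:

```python
# Toy model of lifetime revenue per customer under each scheme.
# All numbers are invented for illustration only.
def flat_fee(price=60):
    return price                   # one payment, hard ceiling at the sticker price

def subscription(fee=15, months=18):
    return fee * months            # grows with time played, capped by the flat rate

def free_to_play(monthly_spend, months=18):
    return monthly_spend * months  # the spend rate itself is uncapped

print(flat_fee())                  # 60
print(subscription())              # 270
print(free_to_play(5))             # 90   - a light spender
print(free_to_play(200))           # 3600 - a "whale"
```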

This can be good for the consumer or it can be bad, of course. Where a game falls on this spectrum really depends on how it is designed. Also, money is not everything. A game can even be released for free if the developer has reasons other than revenue, whether it is a hobby, a tech demo, or an art piece. It is up to the player (or their gift giver) to decide what is worth their time or money, and that is okay.

Blizzard Is Installing World of Warcraft Servers in Australia

Subject: General Tech | October 26, 2014 - 03:33 AM |
Tagged: wow, blizzard

With the new expansion for World of Warcraft, Blizzard is expanding their infrastructure to better serve their customers in Oceania. The company will not require users who are currently on North American realms to switch, but will be reimbursing server swaps, for as many characters as desired, during the two weeks leading up to Warlords of Draenor's November 13th launch date. This will not affect the time of release, which will be 7:00 PM AEDT / midnight PST (PDT ends on November 2nd).

blizzard-wow-warlords-of-draenor.jpg

The expression "better late than never" definitely applies in this situation. The game has had "Oceanic" realms for quite some time now, but they were still physically located on the west coast of America. Sure, the ideal latency of a packet from Australia to California is around 60ms one-way, or 120ms round-trip, assuming the speed of light in fiber optics is about 2/3rds of light in a vacuum (Update: an earlier version of this post said 30ms; when Googling the distance between Australia and California, Google thought I meant Sydney, Nova Scotia, Canada, 4,000mi, not Sydney, Australia, 7,500mi. Pixy Misa in the comments, who pointed out my error, says they experience about 170ms of latency in practice). The actual latency is significantly higher in the practical world, so getting the servers about 7,500 miles closer should be welcome.
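
For the curious, the corrected back-of-envelope math works out like this (the 7,500-mile figure and the two-thirds-of-c rule of thumb are assumptions, not measurements of any real route):

```python
# Ideal latency over a straight fiber run from Sydney to California.
SPEED_OF_LIGHT_KM_S = 299_792
FIBER_FACTOR = 2 / 3              # light in fiber moves at ~66% of c
distance_km = 7_500 * 1.609       # ~7,500 miles

one_way_ms = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000
print(f"one-way:    {one_way_ms:.0f} ms")     # ~60 ms
print(f"round-trip: {2 * one_way_ms:.0f} ms") # ~120 ms
```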

The transfer does not yet have a date, but refunds will be offered for character migrations between 6:01PM AEDT on October 29th, 2014, until 6:59PM AEDT on November 13, 2014. Just make sure to do realm swaps as a separate transaction from anything else you might buy. Apparently Blizzard acknowledges that their storefront will not be able to pick out the Character Transfer and Guild Master Realm Transfer among other services. While they should have spent a little more time making this promotion robust, I cannot really blame them. This is a one-shot. It is probably not worth the man-hours.

Source: Blizzard

AMD Catalyst 14.9.2 Beta for Civilization: Beyond Earth

Subject: Graphics Cards | October 26, 2014 - 02:44 AM |
Tagged: amd, driver, catalyst

So Ryan has been playing many games lately as a comparison between the latest GPUs from AMD and NVIDIA. While Civilization: Beyond Earth is not the most demanding game in existence on GPUs, it is not trivial either, and from the main processor's (CPU's) perspective it is a contender for the most demanding game around. It also has some of the most thought-out Mantle support of any title using the API, when paired with the AMD Catalyst 14.9.2 Beta driver.

firaxis-civilization-beyond-earth.jpg

And now you can!

The Catalyst 14.9.2 Beta drivers support just about anything using the GCN architecture, from APUs (starting with Kaveri) to discrete GPUs (starting with the HD 7000 and HD 7000M series). Beyond enabling Mantle support in Civilization, it also fixes some issues with Metro, Shadow of Mordor, Total War: Rome 2, Watch_Dogs, and other games.

Also, both AMD and Firaxis are aware of a bug in Civilization: Beyond Earth where the mouse cursor does not click exactly where it is supposed to, if the user enables font scaling in Windows. They are working on it, but suggest setting it to the default (100%) if users experience this issue. This could be problematic for customers with high-DPI screens, but could keep you playing until an official patch is released.

You can get 14.9.2 Beta for Windows 7 and Windows 8.1 at AMD's website.

Source: AMD

AMD Radeon R9 290X Now Selling at $299

Subject: Graphics Cards | October 24, 2014 - 03:44 PM |
Tagged: radeon, R9 290X, leaderboard, hwlb, hawaii, amd, 290x

When NVIDIA launched the GTX 980 and GTX 970 last month, it shocked the discrete graphics world. The GTX 970 in particular was an amazing performer and undercut the price of the Radeon R9 290 at the time. That is something that NVIDIA rarely does and we were excited to see some competition in the market.

AMD responded with some price cuts on both the R9 290X and the R9 290 shortly thereafter (though they refuse to call them that) and it seems that AMD and its partners are at it again.

r9290x1.jpg

Looking on Amazon.com today we found several R9 290X and R9 290 cards at extremely low prices. For example:

The R9 290X's primary competition in terms of raw performance is the GeForce GTX 980, currently selling for $549 and up, if you can find one in stock. That means NVIDIA has a $250 hill to climb when going up against the lowest-priced R9 290X.

r92901.jpg

The R9 290 looks interesting as well:

Several other R9 290 cards are selling for upwards of $300-320, making them bone-headed choices if you can get the R9 290X for the same or a lower price. But consider that the GeForce GTX 970 is selling for at least $329 today (if you can find it) and you can see why consumers are paying close attention.

Will NVIDIA make any adjustments of its own? It's hard to say right now since stock of both the GTX 980 and GTX 970 is so hard to come by, but it's hard to imagine NVIDIA lowering prices as long as parts continue to sell out. NVIDIA believes that its branding and technologies like G-Sync make GeForce cards more valuable, and until they begin to see a shift in the market, I imagine they will stay the course.

For those of you who utilize our Hardware Leaderboard, you'll find that Jeremy has taken these prices into account and updated a couple of the system build configurations.

Source: Amazon.com

Cooler Master's new Nepton 240M in action

Subject: General Tech | October 24, 2014 - 01:22 PM |
Tagged: watercooler, Nepton 240M, Nepton, cooler master, all in one

As with the previous generation, the new Nepton 240M is designed with "ultra-fine micro channel" technology, which quadruples the surface area of the radiator but does provide more resistance to air travelling through the rad. Installation was a breeze, with only one small issue with the gasket that was easily solved. The Tech Report were more than happy with the new Silencio fans, which stayed under 40dB under load; in fact, the noise barely changed compared to when the computer was idle. The pump was also reasonably quiet and powerful enough to keep the CPU nice and cool, though at a cost: the Nepton 240M has an MSRP of $130.

overview.jpg

"The Nepton 240M is a big liquid cooler with a price to match. We strapped it to TR's Casewarmer to see whether it could take the heat."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Avoiding online price creep

Subject: General Tech | October 24, 2014 - 12:29 PM |
Tagged: dirty pool, online retailers, wretched hive of scum and villany, airlines

Have you noticed that prices seem to creep up slightly every time you visit an online ticket site hoping for a deal?  As many are probably already aware, the cookies dumped on your machine when you browse allow sites to keep track of how many times you have visited and to base their pricing on that count.  In other cases they can tell whether you are browsing the mobile version of their site or the desktop site, and of course whether you are logged in as a member or not.  So far none of these practices is technically illegal, but they are also laughably easy to defeat.  Simply browsing in anonymous mode, clearing your cookies, or even just using a different device will reset those prices, and it is a habit you should get into.  Slashdot has linked to a PDF which details many of these questionable practices, and of course those ever-polite commentators under the headline will offer sage and on-topic advice.
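
As a minimal sketch of the cookie side of this (the URL is hypothetical, and real sites fingerprint far more than one cookie), the difference between a browser that keeps cookies and one that starts fresh looks roughly like this:

```python
import requests

URL = "https://example-fares.test/search?dest=LAX"  # hypothetical site

# A persistent session re-sends the site's cookies on every visit,
# so the site can count how many times you've looked at this fare.
tracked = requests.Session()
for _ in range(3):
    tracked.get(URL)

# One-off requests carry no cookies between visits - the equivalent
# of incognito mode or clearing your cookie jar each time.
for _ in range(3):
    requests.get(URL)
```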

index.jpg

"For instance, the study found, users logged in to Cheaptickets and Orbitz saw lower hotel prices than shoppers who were not registered with the sites. Home Depot shoppers on mobile devices saw higher prices than users browsing on desktops. Some searchers on Expedia and Hotels.com consistently received higher-priced options, a result of randomized testing by the websites. Shoppers at Sears, Walmart, Priceline, and others received results in a different order than control groups, a tactic known as “steering.”

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot
Author:
Manufacturer: Firaxis

A Civ for a New Generation

Turn-based strategy games have long been defined by the Civilization series. Civ 5 consumed hours and hours of the PC Perspective team's non-working time (and likely the working hours too), and it looks like the new Civilization: Beyond Earth has the chance to do the same. Early reviews of the game from GameSpot, IGN, and Polygon are quite positive, and that's great news for a PC-only release; they can sometimes get overlooked in the gaming media.

For us, the game offers an interesting opportunity to discuss performance. Beyond Earth is definitely going to be more CPU-bound than the other games that we tend to use in our benchmark suite, but the fact that this game is new, shiny, and even has a Mantle implementation (AMD's custom API) makes it worth at least a look at the current state of performance. Both NVIDIA and AMD have released drivers with specific optimizations for Beyond Earth as well. This game is likely to be popular and it deserves the attention it gets.

Testing Process

Civilization: Beyond Earth, a turn-based strategy game that can take a very long time to complete, ships with an integrated benchmark mode to help users and the industry test performance under different settings and hardware configurations. To enable it, you simply add "-benchmark results.csv" to the Steam game launch options and then start up the game normally. Rather than taking you to the main menu, you'll be transported into a view of a map that represents a somewhat typical game state for a long-term session. The benchmark will use the last settings you ran the game at without the modified launch options, so be sure to configure those before you prepare to benchmark.

The output of this is the "results.csv" file, saved to your Steam game install root folder. In there, you'll find a list of numbers, separated by commas, representing the frame times for each frame rendered during the run. You don't get averages, a minimum, or a maximum without doing a little work. Fire up Excel or Google Docs and remember the formula:

1000 / Average(All Frame Times in ms) = Avg FPS

It's a crude measurement that doesn't take into account any errors, spikes, or other interesting statistical data, but at least you'll have something to compare with your friends.
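
If you would rather not do the spreadsheet work by hand, a small script gets you the same numbers. This is a minimal sketch assuming the file contains comma-separated frame times in milliseconds, as described above:

```python
import csv
import statistics

# Parse the frame times (ms) that Beyond Earth writes out when launched
# with "-benchmark results.csv", then compute some basic statistics.
with open("results.csv", newline="") as f:
    frame_times = [float(cell) for row in csv.reader(f)
                   for cell in row if cell.strip()]

avg_ms = statistics.mean(frame_times)
print(f"frames rendered: {len(frame_times)}")
print(f"average FPS:     {1000 / avg_ms:.1f}")
print(f"minimum FPS:     {1000 / max(frame_times):.1f}")  # slowest frame
print(f"maximum FPS:     {1000 / min(frame_times):.1f}")  # fastest frame
```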

settings.jpg

Our testing settings

Just as I have done in recent weeks with Shadow of Mordor and Sniper Elite 3, I ran some graphics cards through the testing process with Civilization: Beyond Earth. These include the GeForce GTX 980 and Radeon R9 290X only, along with SLI and CrossFire configurations. The R9 290X was run in both DX11 and Mantle.

  • Core i7-3960X
  • ASUS Rampage IV Extreme X79
  • 16GB DDR3-1600
  • GeForce GTX 980 Reference (344.48)
  • ASUS R9 290X DirectCU II (14.9.2 Beta)

Mantle Additions and Improvements

AMD is proud of this release as it introduces a few interesting things alongside the inclusion of the Mantle API.

  1. Enhanced-quality Anti-Aliasing (EQAA): Improves anti-aliasing quality by doubling the coverage samples (vs. MSAA) at each AA level. This is automatically enabled for AMD users when AA is enabled in the game.
     
  2. Multi-threaded command buffering: Utilizing Mantle allows a game developer to queue a much wider flow of information between the graphics card and the CPU. This communication channel is especially good for multi-core CPUs, which have historically gone underutilized with higher-level APIs. You’ll see in your testing that Mantle makes a notable difference in smoothness and performance in high-draw-call late-game testing.
     
  3. Split-frame rendering: Mantle empowers a game developer with total control of multi-GPU systems. That “total control” allows them to design an mGPU renderer that best matches the design of their game. In the case of Civilization: Beyond Earth, Firaxis has selected a split-frame rendering (SFR) subsystem. SFR eliminates the latency penalties typically encountered by AFR configurations.

EQAA is an interesting feature as it improves on the quality of MSAA (somewhat) by doubling the coverage sample count while maintaining the same color sample count as MSAA. So 4xEQAA will have 4 color samples and 8 coverage samples, while 4xMSAA would have 4 of each. Interestingly, Firaxis has decided that EQAA will be enabled in Beyond Earth anytime a Radeon card is detected (running in Mantle or DX11) and AA is enabled at all. So even though in the menus you might see 4xMSAA enabled, you are actually running at 4xEQAA. For NVIDIA users, 4xMSAA means 4xMSAA. Performance differences should be negligible though, according to AMD (who would actually be "hurt" by this decision if it brought down FPS).

Continue reading our article on Civilization: Beyond Earth performance!!

Who rules the ~$250 market? XFX R9 285 Black Edition versus the GTX 760

Subject: Graphics Cards | October 23, 2014 - 04:06 PM |
Tagged: xfx, R9 285 Black Edition, factory overclocked, amd

Currently sitting at $260, the XFX R9 285 Black Edition is a little less expensive than the ASUS ROG STRIKER GTX 760 and significantly more expensive than the ASUS GTX760 DirectCU2 card.  Those prices led [H]ard|OCP to set up a showdown to see which card provided the best bang for the buck, especially once they overclocked the AMD card to 1125MHz core and 6GHz RAM.  In the end it was a very close race between the cards; the performance crown did go to the R9 285 BE, but that performance comes at a premium, as you can get performance almost as good for $50 less.  Of course, both the XFX card and the STRIKER sell at a premium compared to cards with fewer features and a stock setup; you should expect the lower-priced R9 285s to be closer in performance to the DirectCU2 card.

1413885880S78ZQ7Hqqp_1_13_l.jpg

"Today we are reviewing the new XFX Radeon R9 285 Black Edition video card. We will compare it to a pair of GeForce GTX 760 based GPUs to determine the best at the sub-$250 price point. XFX states that it is faster than the GTX 760, but that is based on a single synthetic benchmark, let's see how it holds up in real world gaming."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Intel Broadwell-E Expected for Q1 2016

Subject: General Tech, Processors, Chipsets | October 23, 2014 - 03:25 PM |
Tagged: Intel, Broadwell, Broadwell-E, Haswell-E

VR-Zone China got hold of an alleged Intel leak, go figure, that talks about their next enthusiast processor platform, Broadwell-E. This architecture is mostly Haswell-E with its (rated) feature size shrunk down to 14nm. Given an available BIOS, it is expected to support at least some existing LGA 2011-v3 motherboards with the X99 chipset. Like Haswell-E, they are sticking with a maximum of 40 PCIe lanes. We will need to wait for individual SKUs to see whether one or more models will be limited to 28 lanes, like the Haswell-E-based Core i7-5820K.

intel-broadwell-e-x991.png

Image Credit: Chinese VR-Zone

Intel claims 140W TDP, which is identical to the current three offerings of Haswell-E, for all options. The slide claims six and eight core models will be available (also identical to Haswell-E).

One bullet-point that baffled me is, "Integrated Memory Controller: 4 Channels DDR4 2400, 1 DIMM per Channel". Double-checking with the other writers here, just to make sure, it seems like the slide claims that Broadwell-E will only support four sticks of DDR4. This makes zero sense for a couple of reasons. First, one of the main selling points of the enthusiast platform has been the obscene amount of RAM that workstation users demand. Second, and more importantly, if it is compatible with existing motherboards, what is it going to do? Fail to POST if you install a fifth stick? This has to be a typo or referring to something else entirely.

When will you be able to get it? A bit later than we were hoping. It is expected for Q1 2016, rather than late 2015.

Podcast #323 - GTX 980M Performance, MSI X99S Gaming 9 AC and more!

Subject: General Tech | October 23, 2014 - 01:56 PM |
Tagged: video, podcast, GTX 980M, msi, X99S GAMING 9 AC, amd, nvidia, Intel, Kingwin, APU, Kaveri, 344.48, dsr

PC Perspective Podcast #323 - 10/23/2014

Join us this week as we discuss GTX 980M Performance, MSI X99S Gaming 9 AC and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!