Subject: Editorial, General Tech, Systems, Shows and Expos | June 17, 2013 - 03:16 AM | Scott Michaud
Tagged: xbox one, microsoft, ea, E3 13, E3
Update: Microsoft denies the statements from their support account... but this is still one of the major problems with DRM and closed platforms in general: buying into them means accepting that they can do stuff like this.
Consumers, whether they acknowledge it or not, fear the control that platform holders have over their content. It was hard for many to believe that having your EA account banned for whatever reason, even a dispute with a forum moderator, forfeited your license to games you play through that EA account. Sounds like another great idea for Microsoft to steal.
@dohertymark If your account is banned, you also forfeit the licenses to any games that have licenses tied to it as listed in the ToU. ^AC
— Xbox Support (@XboxSupport1) June 14, 2013
Not stopping there, later in the thread they were asked what would happen in the event of a security breach. You know, whether there is any recourse before destroying access to possibly thousands of dollars of content.
@KillerRamen Ensure your account security features are enabled, and security proofs details are correct. ^ML
— Xbox Support (@XboxSupport1) June 15, 2013
While @XboxSupport1 is not a "verified account", the main @xboxsupport account is, and it acknowledges ownership of the numbered support accounts in its background image. Honestly, there should not have been any doubt that these actually are Microsoft employees.
At this point, we have definitely surpassed absurdity. Sure, you typically need to do something fairly bad for Microsoft to stop charging you for Xbox Live. Removing access to your entire library of games seems, to me, like an attempt to deter cheating and the hardware-modding community.
Great, encourage spite from the soldering-iron crowd; that always works out well.
Don't worry, enthusiasts, you know the PC loves you.
Gaming as a form of entertainment is fundamentally different from gaming as a form of art. When content is entertainment, its message touches you without any intrinsic value and it can be replaced with similar content. Sometimes a certain piece of content, itself, has specific value to society. It is in those times that we should encourage efforts by organizations such as GOG, Mozilla, the W3C, Khronos, and many others. Without help, it could be extremely difficult or impossible for content to be preserved for future generations and future civilizations.
It does not even need to get in the way of the industry and its attempt to profit from the gaming medium; a careless industry, on the other hand, can certainly get in the way of our ability to have genuine art. After all, this is the main reason why I am a PC gamer: the platform allows entertainment to co-exist with communities who support themselves when the official channels do not.
That is, of course, unless Windows learns a little something from the Xbox. I guess do not get your Windows Store account banned in the future?
Subject: General Tech, Shows and Expos | June 14, 2013 - 04:06 AM | Scott Michaud
Tagged: E3, E3 13, ea, dice
How could I resist?
I was surprised: the EA keynote -- usually an event which dances past, carefully not leaving anything like "an impression" on its way out -- stuck with me more than any other keynote. Sure, throughout the EA Sports segment I was cleaning my "office" and paying only a modest level of attention, but I feel that DICE swept the show when they appeared. This, and the rest of the week, brought good, bad, and awesome news for us PC gamers.
You have probably seen the Battlefield 4 multiplayer demo by this point. We linked to it, we discussed it. It seems like the destructibility found in the Battlefield 3 single player campaign was absent from the multiplayer not for a technical reason but rather as a design decision. Sure, we can see the radio tower collapse, but building destruction was quite simplified even when compared to Bad Company 2.
The skyscraper collapse seems like a legitimate aspect of the game this time around and not just a baloney promotional piece. When the building collapses you can watch the control point disappear from the mini-map in the bottom-left corner of the HUD. That gameplay element required quite a bit of design thought; even Bad Company 2 made buildings with Conquest flags indestructible. Maybe the harsh limitations on Battlefield 3's destructibility were more about keeping gameplay unified between the PC and the 24-player-limited consoles?
Sadly, during E3 we found out that mod support will not be available for Battlefield 4. I must compliment DICE GM Karl-Magnus Troedsson for his blunt honesty. It would be much simpler to kick your feet and say "wait and see" about something you know will never see the light of day; instead, he gave us the straight answer. Sure, he said the engine is not ready for a public release, but even then he admitted that the decision was not for our benefit: they do not have a good idea of what boundaries they want to allow modders to access. While disappointing, at least it does not have the condescending tone we experienced with Bad Company 2 and Battlefield 3 mod support requests.
Karl-Magnus Troedsson, DICE GM: We get that question a lot. I always answer the same thing, and then the community calls me bad names. We get the feedback, we understand it. We also would like to see more player-created content, but we would never do something like this if we feel we couldn’t do this 100 percent. That means we need to have the right tools available, we need to have the right security around this regarding what parts of the engine we let loose, so to say. So for BF4 we don’t have any planned mod support, I have to be blunt about saying that. We don’t.
Moving on, though. As we know, Disney decided that LucasArts properties would be best left in the hands of EA. The internet simultaneously joy-teared at the thought of a Star Wars Battlefront title developed by DICE. Sure enough, Star Wars: Battlefront 3 is a thing, and it will be developed using the Frostbite 3 engine.
Still no word on an Indiana Jones title based on Mirror's Edge. Heh heh heh.
Oh by the way, the announcement I am, by far, most excited for is Mirror's Edge. I absolutely loved the first game, despite its terrible dialog, for how genuine and intrinsically valuable it felt. It gave the impression of a passion project, both in gameplay and in narrative theme. Thankfully, the game is being developed and it will come to the PC.
We also found out that Mirror's Edge is planned to be an "open world action adventure title". Normally that would scare me, but that was what we were expecting of the first Mirror's Edge before its linear bait-and-switch.
Cannot tell if good or bad... but we will see at some point in the future.
Subject: General Tech, Systems, Shows and Expos | June 13, 2013 - 04:17 AM | Scott Michaud
Tagged: E3, E3 13, dell, alienware, alienware x51
The launch of Haswell led to many new product launches, and so did E3. The overlap? The Alienware X51 gaming desktop has been refreshed with some very compelling components at a surprisingly compelling price.
Unfortunately, there is a slight difference between the Canadian and the American offerings; it is not a case of one country paying more than the other, however, as things are more shuffled around than outright better. Our Canadian readers start with a base price of $1499.99, and Americans start at $1449.99. Americans can spend an extra $100 to upgrade their DVD drive to a Blu-Ray drive; Canadians get Blu-Ray by default. Therefore, if you desire a Blu-Ray drive, it is $50 cheaper to be Canadian; otherwise, it is $50 cheaper to be American.
Whether you are Canadian or American, I would personally recommend spending the extra $100 to upgrade your RAM from 8GB to 16GB. Sure, 8GB is a lot, but the extra can go a long way, especially with the direction that web browsers have been heading. Both countries also have the option of spending $300 for a 256GB SSD, albeit at the additional cost of downgrading the 2TB HDD to a slower, 5400 RPM 1TB drive.
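For the arithmetic-inclined, the cross-border comparison works out as follows. This is a quick sketch using the listed prices at face value, deliberately ignoring CAD/USD exchange rates:

```python
# Alienware X51 pricing as listed; currencies compared at face value.
US_BASE = 1449.99            # includes DVD drive
CA_BASE = 1499.99            # includes Blu-Ray drive
US_BLURAY_UPGRADE = 100.00

us_with_bluray = US_BASE + US_BLURAY_UPGRADE  # 1549.99

# Blu-Ray config: Canada is cheaper; base config: America is cheaper.
print(f"Blu-Ray config premium in US:  ${us_with_bluray - CA_BASE:.2f}")  # $50.00
print(f"Base config premium in Canada: ${CA_BASE - US_BASE:.2f}")         # $50.00
```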
In all, this actually looks quite compelling for someone who wishes to have a console-esque form-factor near their TV. Unfortunately there are currently no Ubuntu-based options for this X51, although you may freely ($0) choose between Windows 7 Home Premium 64-bit and Windows 8 64-bit.
Subject: General Tech, Graphics Cards, Processors, Shows and Expos | June 13, 2013 - 02:26 AM | Scott Michaud
Tagged: E3, E3 13, amd
The Electronic Entertainment Expo (E3) is the biggest event of the year for millions of gamers. The majority of coverage ends up gawking over the latest news out of Microsoft, Sony, or Nintendo, and we will certainly provide our insights in those areas if we believe they have been insufficiently explained, but E3 is a big time for PC gamers too.
5 GHz and unlocked to go from there.
AMD, specifically, has a lot to say this year. In the year of the next-gen console reveals, AMD provides the CPU architecture for two of the three devices and has also designed each of the three GPUs. This just leaves a slight win for IBM, who is responsible for the Wii U's main processor, for whatever that is worth. Unless a Steam Box comes to light without ties to AMD, this is about as close to a clean sweep as any hardware manufacturer could get.
But for the PCs among us...
If you watched the EA press conference, you probably saw lots of sports. If you stuck around after the sports, you probably saw Battlefield 4 being played by 64 players on stage. AMD has been pushing, very strongly, for developer relations over the last year. DICE, formerly known as an NVIDIA-friendly developer, did not exhibit Battlefield 4 "The Way It's Meant to be Played" at the EA conference. According to one of AMD's Twitter accounts:
— AMD Radeon Graphics (@AMDRadeon) June 12, 2013
On the topic of "Gaming Evolved" titles, AMD is partnering with Square Enix to optimize Thief for GCN and A-Series APUs. The press release specifically mentioned Eyefinity and Crossfire support along with a DirectX 11 rendering engine; of course, the enhancements with real, interesting effects are often the seemingly boring ones they do not mention.
The last major point from their E3 event was the launch of their 5 GHz FX processors. For more information on that part, check out Josh's thoughts from a couple of days ago.
Subject: Mobile, Shows and Expos | June 12, 2013 - 08:47 PM | Ryan Shrout
Tagged: E3, razer, blade, haswell, gtx 765m, geforce
With the launch of Intel's Haswell processor, accessory maker-turned notebook vendor Razer announced a pretty slick machine, the Blade. Based on a quad-core, 37 watt Core i7 Haswell CPU and a GeForce GTX 765M GPU, the Razer Blade packs a lot of punch.
It also includes 8GB of DDR3-1600 memory, an mSATA SSD, and a 14-in 1600x900 display. The design of the unit looks very similar to that of the MacBook Pro, but the black metal finish is an attractive style change.
The embedded battery is fairly large at 70 Whr, and Razer claims this will equate to 6 hours of battery life under non-gaming workloads. With a weight just barely creeping past 4 lbs, the Razer Blade seems both portable and powerful.
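As a back-of-the-envelope sanity check (my arithmetic, not a Razer figure), a 70 Whr battery lasting 6 hours implies a modest average system draw, which seems plausible only if the GTX 765M stays powered down during those non-gaming workloads:

```python
# Implied average power draw from Razer's claimed battery life
battery_whr = 70.0    # stated battery capacity
claimed_hours = 6.0   # claimed non-gaming runtime

avg_draw_watts = battery_whr / claimed_hours
print(f"Implied average draw: {avg_draw_watts:.1f} W")  # 11.7 W
```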
The price tag starts at $1799 so you won't be able to pick one of these up on the cheap, but for users like me that are willing to pay a bit more for performance and style in a slim chassis, the Blade seems like a very compelling option. There are a lot of questions left to answer on this notebook including the thermal concerns of packing that much high frequency silicon into a thin and light form factor. Does the unit get hot in bad places? Can the screen quality match the performance of Haswell + Kepler?
We are working with Razer to get a model in very soon to put it to the test, and I am looking forward to answering whether we have found the best gaming portable on the market.
Subject: Displays, Shows and Expos | June 12, 2013 - 08:24 PM | Ryan Shrout
Tagged: Oculus, oculus rift, VR, E3
I have been a big proponent of the Oculus Rift and its move into the world of consumer-ready VR (virtual reality) technology. I saw it for the first time at Quakecon 2012 where Palmer Luckey and John Carmack sat on stage and addressed the new direction. Since then we saw it at CES and finally got in our own developer kit last month for some extended hands-on.
While I have definitely been impressed with the Rift in nearly every way while using it, the first thing anyone says when putting on the headset for the first time is about the graphics: the resolution of the unit is just too low, and it creates a "screen door" effect. As I wrote in my first preview:
I will say that the low resolution is definitely a barrier for me. Each eye is only seeing a 640x800 resolution in this version of the kit and that close up you can definitely see each pixel. Even worse, this creates a screen door effect that is basically like looking through a window with a screen installed. It's not great but you could get used to it if you had to; I am just hoping the higher resolution version of this kit is closer.
At E3 2013 the team at Oculus was able to put together a very early prototype of an HD version of the screen. By using a new 1920x1080 display, each eye sees 960x1080, roughly twice the pixel count per eye of the initial developer kit.
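To put numbers on that claim, here is the per-eye pixel math using the resolutions stated above:

```python
# Per-eye pixel counts: original devkit vs. the HD prototype
devkit = 640 * 800        # 512,000 pixels per eye
hd_proto = 960 * 1080     # 1,036,800 pixels per eye

print(f"Devkit:   {devkit:,} px/eye")
print(f"HD proto: {hd_proto:,} px/eye")
print(f"Increase: {hd_proto / devkit:.3f}x")  # 2.025x
```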
I got to spend some time with the higher resolution model today and I have to say that the difference is striking - and instantly noticeable. Gone was the obvious screen door effect and I was able to focus purely on the content. The content itself was new as well - Oculus and Epic were showing the Unreal Engine 4 integration with a custom version of the Elemental demo. The colors were crisp, the effects were amazing and only in a couple of rare instances of solid white color did we notice the black lines that plagued the first version.
As of now Oculus doesn't have plans to offer an updated developer kit with the 1080p screen installed but you just never know. They are still looking at several different phone screens and haven't made any final decisions on which direction to go but they are definitely close.
When I inquired about improvements in head-tracking latency and accuracy to address motion sickness concerns (which I seem to have), Oculus was hesitant to say there was any single fix. Instead, a combination of lower latency, better hardware, and better thought-out content is key to reducing these effects in gamers.
Subject: Displays, Shows and Expos | June 12, 2013 - 07:58 PM | Ryan Shrout
Tagged: wqxga, wqhd, monoprice, ips, E3, 2560x1440
While wandering the halls at E3 to talk with NVIDIA and AMD about the future of gaming, I ran across a small booth with Monoprice in it. If you don't know Monoprice, it is an online seller of electronics and cables and much of its merchandise can be found throughout the offices at PC Perspective.
In recent months Monoprice made news with PC gamers as one of the first major retailers to begin selling the low-cost 27-in 2560x1440 monitors shipping from Korea. While the monitors are likely very much the same, buying from a local company in the US rather than trusting an eBay seller in Korea brings a lot of peace of mind to the transaction. Getting a dead pixel guarantee and a 1-year warranty along with it helps too.
On hand at E3 was the Monoprice IPS-ZERO-G monitor, which runs at a 2560x1440 resolution with a single dual-link DVI input. This is an update to the first model Monoprice shipped, with a newer, thinner design and an even better $390 price point.
Monoprice is also offering a model with an internal scaler that allows the display to include additional inputs like HDMI, VGA, and DisplayPort. The 27-in IPS-G Pro will sell for $474 and will also be tuned for AdobeRGB and sRGB options.
In addition to the two 27-in models, Monoprice also has added 30-in 2560x1600 monitors: the IPS CrystalPro and the IPS Pro with the same primary differentiation - input support.
I am looking forward to getting my hands on these Monoprice display options to see if they can live up to the levels of the other Korean-built displays we have in the office. If they do, then I think we have a new reason for PC gamers to celebrate.
Another interesting find at the booth was a set of new HDMI cables using a RedMere controller on the connector to allow for extremely thin (and long) runs. First shown at CES in 2008, the RedMere RM1689 chip runs solely on the power provided by the HDMI output and allows cables to use much less copper for thinner designs. They will obviously cost a bit more than standard options, but you can see from the photo above that the difference is striking.
A necessary gesture
NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC. But the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem. But that is all tied together by an investment in content – the game developers and game publishers that make the games that we play on PCs, tablets, phones and consoles.
The slide above shows NVIDIA's targets for each segment – except for consoles, obviously. NVIDIA GRID will address the cloud gaming infrastructure, GeForce and the GeForce Experience will continue with PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces. I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint at the upcoming Steam Box?
The primary point of focus for today's press meeting was to talk about NVIDIA's commitment to the gaming world and to developers. AMD has been talking up their 4-point attack on gaming, which really starts with their dominance in the console market. But NVIDIA has been the leader in the PC world for many years and doesn't see that changing.
With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine. They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software. They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.
This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more.
Many of these turned out to be very important in the development and advancement of gaming – not just for PCs, but for ALL gaming.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 11, 2013 - 04:06 AM | Scott Michaud
Tagged: wwdc 13, MacBook Air, Mac Pro, apple
Sometimes our "Perspective" is needed on Apple announcements because some big points just do not get covered by the usual sources. Other times, portions of the story can be relevant to our readers. This is one of those days where both are true. Either way, review our thoughts and analysis of Apple's recent ultrabook and, especially, their upcoming desktop offerings.
The MacBook Air has been, predictably, upgraded to Intel's Haswell processors. Battery life is the first obvious benefit of the CPU, and that has been well reported. The 11-inch MacBook Air gains an extra four hours of battery life, now lasting up to 9 hours between charges. The extra space in the 13-inch MacBook Air allows it to last 12 hours between charges.
Less discussed: both MacBook Airs will contain Intel's HD 5000 iGPU, the same GT3 graphics configuration that carries the Iris branding in higher-wattage Haswell parts. You cannot get Intel HD 5000 graphics except in a BGA package, which is soldered in place rather than socketed. While there are several better solutions from competing GPU vendors, Apple will have one of the first shipping implementations of Haswell's top graphics tier, said to have double the performance of previous-generation Ivy Bridge graphics at a fraction of its power consumption.
Also included in the MacBook Air are an 802.11a/b/g/n/ac WiFi network adapter and Bluetooth 4.0. Apple is not typically known to introduce new standards and often lags severely behind what is available on the PC, USB 3.0 being the obvious and recent example, unless they had a hand in developing the standard.
The specifications are somewhat customizable: the user can select between an i5 and an i7 processor, 4GB or 8GB of RAM, and a 128, 256, or 512GB SSD. It shipped the day it was announced, with base prices of $999 for the entry-level 11-inch and $1099 for the entry-level 13-inch.
But now we move on to the dying industry, desktop PCs, where all innovation has died unless it is to graft a touch interface to anything and everything.
"Can't innovate any more, my ass", grunts Phil Schiller, on the keynote stage.
Whether you like it, or think "innovation" is the best word, it's a legitimate new design some will want.
While the new Mac Pro is not a system that I would be interested in purchasing, for issues I will outline soon, these devices are what some users really want. I have been a very strong proponent of OEM devices as they highlight the benefit of the PC industry: choice. You can purchase a device, like the new Mac Pro, from a vendor; alternatively, you can purchase the components individually to assemble yourself and save a lot of money; otherwise, you can hire a small business computer store or technician.
We need more companies, like Apple, to try new devices and paradigms for workstations and other high-performance devices. It is less than ideal for Apple to be the one coming up with these redesigns, since Apple's platform encourages applications to be vendor-specific (to only run on a Mac), but it can still benefit the PC industry by demonstrating that life and demand still exist; trying something new could reap large benefits. Not everyone who wants workstation performance wants a full ATX case with discrete components, and that is okay.
Now when it comes to actual specifications, the typical coverage glossed over what could be easily approximated by a trip to Wikipedia and Google. Sure, some may have been in a rush within the auditorium, but still.
The specifications are:
- Intel Xeon E5-2600 V2-class CPU, Ivy Bridge-E, 12 cores max (suggests single-socket)
- 4-channel DDR3 ECC RAM, apparently 4 DIMMs, which suggests 4x16GB (max)
- Dual FirePro GPUs, 4096 total shaders with 2x6GB GDDR5
- Pretty clearly based on FirePro W9000
- Seems to be slightly underclocked, losing about 0.5 Teraflop per GPU.
- PCIe SSD
- Thunderbolt 2, USB3.0, and WiFi ac (+ a/b/g/n??), Bluetooth 4.0
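That half-teraflop estimate can be sanity-checked with the standard peak-FLOPS formula (shaders × 2 FMA ops per cycle × clock). The W9000's 975 MHz clock is its public spec; the implied Mac Pro clock below is purely my inference, not an Apple figure:

```python
# Peak single-precision FLOPS = shader count * 2 (FMA) * clock
shaders_per_gpu = 2048        # 4096 total shaders / 2 GPUs
w9000_clock_ghz = 0.975       # reference FirePro W9000 clock

w9000_tflops = shaders_per_gpu * 2 * w9000_clock_ghz / 1000.0   # ~3.99
mac_pro_tflops = w9000_tflops - 0.5                             # the ~0.5 TFLOP deficit

# Back out the clock that deficit would imply (inference, not a spec)
implied_clock_mhz = mac_pro_tflops * 1e12 / (shaders_per_gpu * 2) / 1e6
print(f"W9000 peak:        {w9000_tflops:.2f} TFLOPS")
print(f"Implied GPU clock: {implied_clock_mhz:.0f} MHz")  # ~853 MHz
```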
Now, the downside is that basically anything you wish to add to the Mac Pro needs to be done through Thunderbolt, Bluetooth 4.0, or USB 3.0. When you purchase an all-in-one custom design, you forfeit your ability to reach in and modify the components. There is also no mention of pricing, and for a computer with this parts list you should expect a substantial invoice even without "The Apple Tax"; but price is not the point of purchasing a high-end workstation. Apple certainly put in as close to the best-of-the-best as they could.
Now could people stop claiming the PC is dead and work towards sustaining it? I know people love stories of jarring industry shifts, but this is ridiculous.
Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 10, 2013 - 02:49 AM | Scott Michaud
Tagged: Ultra, geforce titan, computex
So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.
It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products would be classified by their model numbers and often assigned a suffix like: "Ultra", "Pro", or "LE". These suffixes actually meant a lot, performing noticeably better (or maybe worse) than the suffix-less number and possibly even overlapping with other number-classes.
Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard, slightly-disabled GK110-based GeForce GTX Titan GPU with the usual 2688 CUDA cores and 6GB of GDDR5. While the GK110 chip has the potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 CUDA cores enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for cooling and ships with some amount of factory overclock.
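Those core counts fall directly out of GK110's SMX layout: 15 SMX units of 192 CUDA cores each, with the shipping Titan fusing one SMX off. A quick sketch:

```python
# GK110 SMX arithmetic behind the 2880 / 2688 core counts
CORES_PER_SMX = 192
TOTAL_SMX = 15

full_gk110 = CORES_PER_SMX * TOTAL_SMX            # 2880: the rumored fully-enabled part
shipping_titan = CORES_PER_SMX * (TOTAL_SMX - 1)  # 2688: one SMX disabled

print(full_gk110, shipping_titan)  # 2880 2688
```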
But, this is not the first time we have heard of a Titan Ultra...
Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.
So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.
If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.