Subject: Editorial, Processors | June 10, 2013 - 10:53 AM | Ryan Shrout
Tagged: SimCity, Richland, giveaway, contest, APU, amd, a10-6800k
Odd, turns out I found two brand new AMD A10-6800K Richland APUs sitting on my desk this morning. I called AMD to ask what this was all about and they said that if I didn't need them, I might as well give them away to our readers.
"Oh, and throw in a free copy of the new SimCity while you're at it," they told me.
Who am I to argue?
So let's have a giveaway!
We are handing out two AMD A10-6800K "Richland" APUs for you to build a brand new PC around, and we are including a key for SimCity with each of them. If you haven't read Josh's review of the A10-6800K APU, you should definitely do so; it will help educate you on exactly what you are getting - for FREE.
To enter, I need you to leave a comment on this very news post below telling us what you would build with a brand new A10 APU - you don't have to be registered to do so but we'd sure like it if you were. (Make sure you leave your correct email address so I can get in touch with you if you win.) Also, feel free to stop by the PC Perspective YouTube channel and either give our videos a gander or subscribe. I think we put out some great content there and we'd like more of you to see it.
I will pick one winner on June 17th and another on June 24th so you have two separate weeks to potentially win!
A big thanks goes out to AMD for supplying the APUs and copies of SimCity for this giveaway. Good luck!!
Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 10, 2013 - 02:49 AM | Scott Michaud
Tagged: Ultra, geforce titan, computex
So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.
It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products would be classified by their model numbers and often assigned a suffix like "Ultra", "Pro", or "LE". These suffixes actually meant a lot: a card could perform noticeably better (or maybe worse) than its suffix-less sibling and possibly even overlap with other number classes.
Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard, slightly-disabled GK110-based GeForce GTX Titan GPU with the usual 2688 CUDA cores and 6GB of GDDR5. While the GK110 chip has the potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 of them enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for cooling and ships with a factory overclock.
But, this is not the first time we have heard of a Titan Ultra...
Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.
So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.
If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 6, 2013 - 08:46 PM | Scott Michaud
Tagged: xbox one, E3 13, E3
So heading up to E3, Microsoft decided to drop their DRM bombshell so it would get buried over the next couple of days. In terms of permissiveness, the Xbox One is not nearly as bad as feared; of course, it is still terrible in certain ways.
Microsoft will allow games to be played offline on the Xbox One... for 24 hours. If your internet connection has been offline for longer than that period (unclear whether the timer starts when internet goes out or from last update) then your system will be locked to live TV and disc-based movies. Games and apps, even ones which should have no online functionality, will cease to function until you reconnect with Xbox servers.
This also means that if the Xbox servers have an outage lasting between 24 hours and "taken offline forever", all gaming and apparently apps will cease to function on the Xbox One.
It's like if Wall-E grew a Freddie Mercury
But at least they will allow some level of used-game transfer... if the publisher agrees. Check out this statement from Microsoft Studios:
In our role as a game publisher, Microsoft Studios will enable you to give your games to friends or trade in your Xbox One games at participating retailers. Third party publishers may opt in or out of supporting game resale and may set up business terms or transfer fees with retailers. Microsoft does not receive any compensation as part of this. In addition, third party publishers can enable you to give games to friends. Loaning or renting games won’t be available at launch, but we are exploring the possibilities with our partners.
So this will be an interesting experiment: how will revenue and profitability be affected for game publishers who deny used game sales? I honestly expect that used game sales actually promote the purchasing of more games and that initiatives to limit used game transfers will reduce user engagement. Of course Microsoft is now taking all of the flak from Sony, who may or may not be considering the same practice, but I am sure at least Microsoft is hoping that everyone will forget this when shiny new trailers erase the collective gamer memory.
In return, however, Microsoft is being fairly permissive when it comes to how many users can be licensed on a single disc. Up to ten family members are allowed access to your collective library.
And, after all, it should not be a surprise that a console game disappears when Microsoft shuts down their servers: consoles were always designed to be disposable. I have been proclaiming that for quite some time. The difference is now, people cannot really deny it.
Subject: Editorial, General Tech, Shows and Expos | June 6, 2013 - 05:42 PM | Scott Michaud
Tagged: unreal engine 4, ue4, E3 13, E3, computex
We are bleeding through the overlap between Computex and E3 media windows; this news has a somewhat relevant fit for both. Unreal Engine 4 is coming and I expect we will see one or more demos and UE4-powered titles over the next week. In fact, I would be fairly shocked if we do not see the end of the Elemental Demo with the Xbox One E3 keynote. We may also potentially see Unreal Engine 4 running on mobile devices and maybe even HTML5 at some point throughout the tradeshow, either canonically through Epic or via a licensee product.
This morning, Epic opened the Unreal Engine 4 Integrated Partners Program (IPP). Of course, they already have a number of members, most of which were partners for Unreal Engine 3.
The founding IPP partners are:
Wwise from Audiokinetic
- Manages large databases of sound effects and voice-overs
- Manages subtitles and multiple dubbings of voice clips
Autodesk Gameware from Autodesk
- Contains multiple packages including Beast, Navigation, and Scaleform
- Scaleform is a Flash rendering engine for HUDs, menus, etc., developed using Flash Professional in 2D or 3D. It is what StarCraft II, Mass Effect, and Borderlands use.
- Beast is a lighting toolkit for global illumination, radiosity, etc.
- Navigation is an AI solver, predominantly for pathfinding.
Simplygon from Donya Labs
- Reduces the polygon count of models so they consume fewer processing resources, especially as they get further away from the camera.
Enlighten from Geomerics
- Another Global Illumination solver, most popular usage being Battlefield 3.
SpeedTree for Games from IDV
- Makes a bunch of efficient trees so studios do not need to hire as many minimum wage peons.
Intel Threading Building Blocks (TBB) from Intel
- Helps developers manage C++ threading for multicore systems.
- Deals with memory management and scheduling tasks
morpheme from NaturalMotion
- Animation and physics software for designers to create animations
- Works with NVIDIA PhysX
euphoria from NaturalMotion
- Simulates animations based on driving conditions via the CPU, most popular usage being GTA IV.
PhysX and APEX from NVIDIA
- You probably know this one.
- GPU-based rigid body, soft body, fluid, and cloth solvers.
- Allows for destructible environments and other complex simulations.
Oculus Rift from Oculus VR
- You probably also know this one, especially if you keep up with our Video Perspectives.
- Head-mounted display with motion tracking for VR.
Bink Video from Rad Game Tools
- ... is not included! Just kidding, that stuff'll survive a nuclear apocalypse.
- Seriously, check in just about any DirectX or OpenGL game's credits if it includes pre-rendered video cutscenes or video-textures.
- I'll wait here.
- In all seriousness, Rad Game Tools has been licensed in over 15,500 titles. It's been a meme to some extent for game programmers. This should be no surprise.
Telemetry Performance Visualizer from Rad Game Tools
- Allows developers to see graphs of what their hardware is working on over time.
- Helps developers know what benefits the most from optimization.
RealD Developer Kit (RDK) from RealD
- Helps game developers create stereoscopic 3D games.
Umbra 3 from Umbra Software
- Determines what geometry can be seen by the player and what should be unloaded to increase performance.
- Sits between artists and programmers so the former do not need to think about optimization, and the latter do not need to claw their eyes out.
IncrediBuild-XGE from Xoreax
- Apparently farms out tasks to idle PCs on your network.
- I am not sure, but I think it is mostly useful for creating a pre-render farm at a game studio for light-baking and such.
E3 is still a little while away, so we do not know exactly how it will play out, but I fully expect Unreal Engine 4 to be a recurring theme over the next week. Keep coming back to PC Perspective, because you know we have a deep interest in where Epic is headed.
Subject: Editorial, General Tech, Motherboards, Processors | June 4, 2013 - 10:40 AM | Ryan Shrout
Tagged: z87, video, overclocking, live, i7-4770k, haswell, ASUS ROG, asus
While we run around with our hair on fire trying to get ready for the Intel Haswell and Z87 product launch this weekend, I wanted to let everyone know about a live stream event we will be holding on Tuesday, June 4th. JJ from ASUS, a crowd favorite for sure, will be joining us LIVE in studio to talk all about the new lineup of ASUS Z87 motherboards. We'll also discuss performance and overclocking capabilities of the new processor and platform.
ASUS Z87 and Haswell Live Stream
10am PT / 1pm ET - June 4th
Be sure you stop by and join in the show! Questions will be answered, prizes will be given out and fun will be had! Who knows, maybe we can break some stuff live as well?? On hand to give away to those of you joining the live stream, we'll have these prizes:
- 2 x ASUS Z87 Motherboards
- 1 x ASUS Graphics card
Methods for winning will be decided closer to the event, but if you are watching live, you'll be included. And we'll ship anywhere in the world!
ASUS and I also want the event to be interactive, so we want your questions. We'll of course be paying attention to the chat room on our live page, but you'll have better luck if you submit your questions about the ASUS Z87 products and Haswell processors ahead of time in the comments section below. You don't have to register to ask, and we'll be able to read them before the show!
I'll update this post with more information after the reviews and stories start to hit, so keep an eye here for more details!!
Subject: Editorial, General Tech | June 4, 2013 - 03:44 AM | Scott Michaud
Tagged: WCS, starcraft 2, HoTS
A little eye-rest before another barrage of Computex news...
Blizzard took over the canon StarCraft II tournament scene as of last year. The goal was to create a unified ranking system between every tournament and help participants deal with scheduling, a problem in recent years. Throughout the entire year, Blizzard is hosting the 2013 StarCraft II World Championship Series. They seem to like breaking rankings into seasons and the 2013 series, alone, will incorporate three of them leading to the year's grand finals in November.
One year per series; three seasons and a grand finals per year; three regional tournaments and a finale per season. This season's finals will take place this weekend, June 8th and 9th, in South Korea.
Tournaments in Europe, Korea, and North America chose the 16 competitors for the 2013 Season 1 Finals this weekend in Korea. The top five competitors in each tournament (top six for Korea) earned their invite. In all: 3 Protoss, 5 Terrans, and 8 Zerg will be participating. I guess their hearts are only half of the swarm.
If the regional matches were any indication, the seasonal finals should be a very entertaining bridge between Computex coverage and E3 2013. Players are getting much better at the game mechanics while still being able to surprise their opponents, and even the audience, with unusual strategies. Players exploit windows of weakness in their opponents from moments of strength; the entertainment mostly comes from watching each player try to delay or lengthen those windows, all while shifting their own weak periods to times when the opponent cannot reasonably exploit them.
What are your opinions of "eSports"? Good concept, bad name?
Subject: Editorial, General Tech, Systems | May 29, 2013 - 07:16 PM | Scott Michaud
Tagged: windows blue, Windows 8.1, windows, microsoft
Personally, I really cannot care too much about the user experience quirks inherent to Windows modernization; the wedge slowly being shoved between the user and their machine is far too concerning. No matter how they modify the interface, restricting what users and developers can install and create on their machine is a deal breaker. But, after that obligatory preface reminding people not to get wound up in UX hiccups and be complacent to the big issues, Windows Blue will certainly address many of those UX hiccups.
As we reported last month, boot-to-desktop and the Start Button were planned for inclusion with Windows 8.1. Even then, the sources were relentless in emphasizing: "Until it ships, anything can change."
Images courtesy of Paul Thurrott.
Mary Jo Foley has gathered quite a few details since then. Firstly, the option to boot directly to desktop will be there; from the sounds of it, it will be disabled by default but not exclusive to Enterprise SKUs. This is somewhat promising, as it would be slightly less likely for Microsoft to kill support for the desktop (and, by extension, x86 applications) after feeling enough pressure to highlight it. Still, assuming because "it makes sense" is a bad way to conduct business.
Also returning is the Start Button, seen in higher quality above; as far as we know, it will be enabled by default. Its functionality will be to bring up the Start Screen or, alternatively, a new All Apps screen visible at ZDNet. Now this has me interested: while I actually like the Start Screen, a list of apps should provide functionality much closer to the Start Menu than Microsoft was previously comfortable with. Previously, the Start Screen attempted to make desktop applications feel less comfortable than modern apps; this interface appears like it would feel more at home on the desktop. While probably still jarring, it looks to make finding desktop applications easier and to get out of the way of your desktop experience quickly.
According to Paul Thurrott, those who wish to personalize the Start Screen will have the option to share their desktop wallpaper with it. For tasteful backgrounds, like the one above, I can see this being of good use.
Just please, do not grief someone with a background full of fake tiles.
As a final note, there is still no word about multiple monitor support for "Modern Apps". If you have tried to use them in the past, you know what I am talking about: basically only one at a time, it will jump between monitors if you bring up the Start Screen, and so forth.
Subject: Editorial, General Tech, Cases and Cooling | May 29, 2013 - 02:03 PM | Scott Michaud
Tagged: Windows key, mouse, microsoft, I Hate This Key
Has this ever happened to you while playing a shooter? You need to get to a position, so you mash the Alt key to sprint and... aw crap, I hit the Windows key... well, now I am dead. Have you ever considered purchasing software or a gaming keyboard which allows you to disable that button?
Have you ever considered purchasing a mouse which also has that button to give both hands something to fear?
Definitely not a member of their Sidewinder product line.
Okay, so I should be fair: the Microsoft Sculpt Comfort mouse is not designed for gaming, and Windows 8-like user experiences revolve heavily around the Start button. The mouse button is also more useful than a redundant Windows key: the blue pad has swipe functionality for extra functions. According to its product page, the swipe gestures register with the computer as mouse buttons 4 and 5.
So you can probably bind them to game functions, if you feel daring.
But, in the end, I still need to congratulate Microsoft for trying to innovate in computer hardware. This is more than just grafting touch functionality onto a mouse surface, as both Apple and Microsoft have done in the past; it tries to make the classical mouse experience better. I doubt it is for most of our audience, but not everything needs to be.
Subject: Editorial | May 27, 2013 - 05:08 PM | PCPer Staff
The Dell Inspiron 15R is a good choice for anyone looking for a reasonably powerful and lightweight laptop. It is powered by a 1.8GHz Core i5-3537U, has 8GB of RAM, a 1TB HDD, and an integrated DVD burner, with a 15.6" 1366 x 768 LED-backlit LCD driven by the HD 4000 graphics on the i5. Not exactly a gaming PC, but at 4.9lbs it is an easy way to bring your work with you wherever you go and have more processing power than a tablet will offer.
Dell Home is offering the 3rd generation 15.6" Inspiron 15R 5521 Core i7 Ivy Bridge laptop with 8GB RAM, 1TB hard drive, and touchscreen for $799.99 with free shipping. The price reflects $289 in instant savings plus an extra $100 off with coupon code JS3KG6045QZ2BR.
Subject: Editorial, General Tech, Systems | May 27, 2013 - 03:08 AM | Scott Michaud
Tagged: xbox one, ps4, consolitis, consoles
So, as a Wired editorial states it: hardcore console gamers don't want much, just the impossible. They want a "super-powered box" tethered to their TV; they want the blockbuster epics and innovative indie titles; they want it to "just work" for what they do. The author, Chris Kohler, wrote his column to demonstrate how this is, and has for quite some time been, highly unprofitable.
I think the bigger problem is that the console manufacturers want the impossible.
Console manufacturers have one goal: get their platform in your house and require their hand be in the pocket of everything you do with it. They need to make an attractive device for that to be true, so they give it enough power to legitimately impress the potential buyer and price it low enough to catch the purchasing impulse. Chances are this involves selling the box under cost at launch and for quite some time after.
But, if all of this juicy control locks the user into overspending in the long run, then it is worth it...
But Microsoft should be thankful that I cost them money to be acquired as a customer.
Well, looking at the Wired article, not only are console gamers ultimately overspending: it is still not enough! Consoles truly benefit no-one! The console manufacturers are not doing any more than maybe breaking even, at some point, eventually, down the line, they hope. Microsoft and Sony throw obnoxious amounts of money against one another in research, development, and marketing. Redundant technologies are formed to pit against their counterparts with billions spent in marketing to try to prove why either choice is better.
All of this money is spent to corral users into a more expensive experience where they can pocket the excess.
Going back to the editorial's claims: with all of this money bleeding out, Microsoft wants to appeal more broadly and compensate the loss with more cash flowing in. Sure, Microsoft has wanted a foothold in the living room for decades at this point, but the Xbox Division bounces between profitability and huge losses; thus, they want to be an entertainment hub if just for the cash alone.
But think back to the start: these troubles are not because it is impossible to satisfy hardcore gamers. They exist because Microsoft and Sony cannot generate revenue from their acquired control faster than they bleed capital away trying to acquire that control, or at least cannot generate it much faster than break-even.
The other solution, which I have long felt is the real answer (hence why I am a PC gamer), is to have a large group of companies create an industry body that governs an open standard. Each company can make a substantial profit by focusing on a single chunk of the platform -- selling graphics processors, maintaining a marketplace, or what-have-you -- while leveraging the success of every other chunk.
This model does work, and it is the basis for one of humanity's most successful technology products: the internet.
As a side note: this is also why PC gaming was so successful... Microsoft, developers, Steam/GoG/other marketplaces, and hardware vendors were another version of this... although Microsoft had the ability to override them and go in whatever direction they wanted. They didn't, until Windows RT.
And the internet might even be the solution. The web browser is capable, today, of providing amazing gaming experiences and it does not even require a plugin. It is getting more powerful, even faster than the rate at which underlying hardware has evolved.
To end on an ironic note, that makes a web browser more capable of offline play than our current understanding of the Xbox One (and Sony has said nothing either way, for that matter).
I guess the takeaway message is: love the web browser, it "just works".
Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM | Scott Michaud
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games
Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months following E3 while the demo was being ported to the console hardware. The most noticeable differences were in the severely reduced particle counts and the non-existent fine lighting details; of course, Epic pumped the contrast in the PS4 version which masked the lack of complexity as if it were a stylistic choice.
Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware at 100% perfect utilization.
Now that we know both the PS4 and, recently, the Xbox One: it is time to dissect more carefully.
A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PC on the market. While there are many ways to interpret that statement, in terms of raw performance it is not valid.
As far as we currently know, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU with an AMD GPU containing 18 GCN compute units, for a total of 1152 shader units. Without knowing driving frequencies, this chip should be slightly faster than the Xbox One's 768 shader units within 12 GCN compute units. Sony claims their system has a theoretical total of 2 teraFLOPs of performance, and the Xbox One would almost certainly be slightly behind that.
Back in 2011, Epic Games created the Samaritan Demo to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, back then, that the demo would theoretically require 2.5 teraFLOPs of performance to run at 30 FPS at true 1080p; ultimately, it ran on a PC with a single GTX 680, good for approximately 3.09 teraFLOPs.
This required performance, (again) approximately 2.5 teraFLOPs, is higher than what is theoretically possible for the consoles, which is less than 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
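The back-of-the-envelope math is simple enough to sketch. This is a rough illustration only: the console clock speeds below are assumed for the sake of the example, since (as noted above) the final driving frequencies have not been confirmed.

```python
# Peak single-precision throughput for a GPU is roughly:
# shader units x ops per clock (2, for a fused multiply-add) x clock speed.
def theoretical_tflops(shader_units, clock_ghz, ops_per_clock=2):
    """Theoretical TFLOPs at 100% utilization, ignoring all overhead."""
    return shader_units * ops_per_clock * clock_ghz / 1000.0

# The 0.8 GHz console clocks are assumptions for illustration only.
ps4      = theoretical_tflops(1152, 0.8)    # 18 CUs x 64 shaders each
xbox_one = theoretical_tflops(768, 0.8)     # 12 CUs x 64 shaders each
gtx_680  = theoretical_tflops(1536, 1.006)  # Kepler base clock

print(f"PS4:      ~{ps4:.2f} TFLOPs")       # ~1.84
print(f"Xbox One: ~{xbox_one:.2f} TFLOPs")  # ~1.23
print(f"GTX 680:  ~{gtx_680:.2f} TFLOPs")   # ~3.09
```

Even with generous clock assumptions, both consoles land under the 2 teraFLOPs mark, well short of the ~2.5 teraFLOPs Samaritan target.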
Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.
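To put the resolution cheat in numbers, a quick sketch (not tied to any particular title) shows why a sub-1080p render target is so tempting:

```python
# Shading cost scales roughly with the number of pixels drawn per frame,
# so a smaller render target saves work before the image is upscaled.
def megapixels(width, height):
    return width * height / 1e6

full_1080p = megapixels(1920, 1080)  # ~2.07 MP per frame
hd_720p    = megapixels(1280, 720)   # ~0.92 MP per frame

# Dropping from 1080p to 720p cuts per-pixel shading work by 2.25x.
print(f"1080p / 720p pixel ratio: {full_1080p / hd_720p:.2f}")  # 2.25
```

That 2.25x saving is performance a developer can spend on particles, lighting, or simply hitting a stable frame rate.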
But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.
Update, 5/24/2013: Mark Rein of Epic Games responds to the statement made by Rajat Taneja of EA. While we do not know his opinion on consoles... we know his opinion on EA's opinion:
— Mark Rein (@MarkRein) May 23, 2013
Subject: Editorial, Graphics Cards | May 23, 2013 - 12:08 PM | Ryan Shrout
Tagged: video, live, gtx 780, gk110
Missed the LIVE stream? You can catch the video replay of it below!
Hopefully by now you have read over our review of the new NVIDIA GeForce GTX 780 graphics card that launched this morning. Taking the GK110 GPU, cutting off some more cores, and setting a price point of $650 will definitely create some interesting discussion.
Join me today at 2pm ET / 11am PT as we discuss the GTX 780, our review and take your questions. You can leave them in the comments below, no registration required.
Subject: Editorial, General Tech | May 22, 2013 - 01:53 AM | Scott Michaud
Tagged: antivirus, antimalware
They might be a good means of guarding you from momentary lapses of judgment, but security is not equivalent to antivirus packages. You always need to consider how much your system is exposed to untrusted and even unsolicited data. Any software which accepts untrusted data has some surface with potential vulnerability to attack.
This, inherently, includes software which accepts data to scan it for malware.
Last week was host to Patch Tuesday, and one of its many updates fixed a vulnerability in Microsoft's Malware Protection Engine (MPE). The affected code is only present in applications which run the 64-bit version of the engine. For home users, these applications are: Microsoft Security Essentials (x86-64), Microsoft Malicious Software Removal Tool (x86-64), and all varieties of Windows Defender (x86-64). For enterprise users, MPE is also a part of Forefront and Endpoint applications and suites.
Despite the irony, I will not beat up on Microsoft. As far as I know, these vulnerabilities are semi-frequently patched in basically every antimalware application. At the very least, Microsoft discloses and remedies problems with reasonable and appropriate policies; they could have just as easily buried this fix and pushed it out silently, or worse, waited until it became actively exploited in the wild and beyond.
But, and I realize I am repeating myself at this point, the biggest takeaway from this news: you cannot let the mere presence of antivirus suites permit you to be complacent. No scanner will detect everything, and some might even be the way in.
Subject: Editorial, General Tech, Graphics Cards, Processors, Systems | May 21, 2013 - 05:26 PM | Scott Michaud
Tagged: xbox one, xbox
Almost exactly three months have passed since Sony announced the Playstation 4 and just three weeks remain until E3. Ahead of the event, Microsoft unveiled their new Xbox console: The Xbox One. Being so close to E3, they are saving the majority of games until that time. For now, it is the box itself as well as its non-gaming functionality.
First and foremost, the raw specifications:
- AMD APU (5 billion transistors, 8 core, on-die eSRAM)
- 8GB RAM
- 500GB storage, Blu-ray reader
- USB 3.0, 802.11n, HDMI out, HDMI in
The hardware is a definite win for AMD. The Xbox One is based upon an APU which is quite comparable to what the PS4 will offer. Unlike previous generations, there will not be too much differentiation based on available performance; I would not expect to see much of a fork in terms of splitscreen and other performance-sensitive features.
A new version of the Kinect sensor will also be present with all units, which developers can depend upon. Technically speaking, the camera is higher resolution and more wide-angle; up to six skeletons can be tracked, with joints able to rotate rather than just hinge. Microsoft is also finally permitting developers to use the Kinect along with a standard controller to, as they imagine, allow a user to raise their controller to block with a shield. That is the hope, but near the launch of the original Kinect, Microsoft filed a patent for sign language recognition, which has not happened yet. Who knows whether the device will be successfully integrated into gaming applications.
Of course, Microsoft is known most for system software, and the Xbox One runs three lightweight operating environments. Compare with Windows 8: there, you have the Modern interface, which runs WinRT applications, and you have the desktop, which is x86 compatible.
The Xbox One borrows more than a little from this model.
The console's home screen, which I am tempted to call the Start Screen, has a very familiar tiled interface. The tiles are not identical to Windows, but they are definitely consistent with it. This interface allows access to Internet Explorer and an assortment of apps. These apps can be pinned to the side of the screen, just like a Windows 8 modern app. I am expecting "a lot of crossover" (to say the least) between this and the Windows Store; I would not be surprised if it is basically the same API. This works both when viewing entertainment content and within a game.
These three operating systems run at the same time. The main operating system is basically a Hyper-V environment which runs the two other operating systems simultaneously in sort-of virtual machines. These operating systems can be layered with low latency, since all you are doing is compositing them in a different order.
Lastly, they made reference to Xbox Live, go figure. Microsoft is seriously increasing their server capacity and expects developers to utilize Azure infrastructure to offload "latency-insensitive" computation for games. While Microsoft promises that you can play games offline, this obviously does not apply to features (or whole games) which rely upon the back-end infrastructure.
And yes, I know you will all beat up on me if I do not mention the SimCity debacle. Maxis claimed that much of the game requires an online connection due to the complicated server requirements; after a crack allowed offline functionality, it was clear that the game mostly operates fine on a local client. How much will the Xbox Live cloud service offload? Who knows, but that is at least their official word.
Now to tie up some loose ends. The Xbox One will not be backwards compatible with Xbox 360 games, although that is no surprise. Also, Microsoft says it will allow users to resell and lend games. That said, from what I have heard, games will be installed and will not require the disc. Apart from concerns about how much you can fit on a single 500GB drive, rumor has it that once the game is installed, loading it elsewhere (the rumor is even less clear on whether "elsewhere" means accounts or machines) will require paying a fee to Microsoft. In other words? Basically not a used game.
Well, that about does it. You can be sure we will add more as information comes forth. Comment away!
Subject: Editorial | May 17, 2013 - 10:36 AM | PCPer Staff
We often get asked about our favorite gaming notebooks and, despite the somewhat aged styling, we still think Alienware makes some of the best. Today's deal offers up the small M14x R2 with a 1600x900 screen, a Core i7-3630QM quad-core processor, GeForce GT 650M discrete graphics and 16GB of DDR3 memory. The $1308 price is a nice $300+ discount off the normal price!
Subject: Editorial, General Tech | May 16, 2013 - 03:45 PM | Scott Michaud
Tagged: web browser, Malware, IE10
If you consider your browser security based solely on whether it will allow you to manually download a malicious executable: IE10 is the best browser ever!
Rod Trent over at Windows IT Pro seems to believe exactly that, after NSS Labs released its report, "Socially Engineered Malware Blocking". In the report, Internet Explorer blocked the user from downloading nearly all known malware (clarification: all known malware within the test). Google Chrome came in second place with a little less than a 17% failure rate, and the other browsers were quite far behind at approximately a 90% failure rate.
Based on that one metric alone, Rod Trent used a cutesy chess image to proclaim IE the... king... of the hill. Not only that, he suggests Safari, Opera, and Firefox consider "shuttering their doors." After about a decade of Internet Explorer suffering from countless different and unique vectors of exploitation, now is the time to proclaim a victor for attacks which require explicit user action?
Buckle in, readers, it's a rant.
Firstly, this reminds me a little bit of Microsoft Security Essentials. Personally, I use it, because it provides enough protection for me. Unlike its competitors, MSE has next to no false positives because it almost entirely ignores zero-day exploits. That trade-off drew criticism from lab tests which focus on zero-day exploits, where Microsoft Security Essentials was ranked second-worst.
Well, time to shutter your doors Micr... oh wait Rod Trent lauded it as award-winning. Huh...
But while we are on the topic of false positives, how do you weigh those in your grading of a browser? According to the report, and common sense, achieving pure success in this metric is dead simple if you permit your browser to simply block every download, good or bad.
If a 100% false positive rate is acceptable, it is trivial to protect users from all malicious downloads. With just a few lines of code, Firefox, Safari, and Opera could displace Internet Explorer and Chrome as the leaders in protection against socially engineered malware. However, flagging every download as "malicious" would break the internet. Finding the balance between accuracy and safety is the real challenge for browsers at the front of protection technology.
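To make the point concrete, here is a deliberately silly Python sketch (my own illustration; no browser actually works this way) of the degenerate "block everything" filter. It posts a perfect malware block rate while also failing every single legitimate download:

```python
# Hypothetical "perfect" download filter: flag every download as
# malicious, no questions asked.
def block_everything(url: str) -> bool:
    return True

# Toy test sets (made-up URLs for illustration).
malicious = ["http://evil.example/payload.exe"]
legitimate = ["http://example.org/firefox-setup.exe"]

malware_block_rate = sum(block_everything(u) for u in malicious) / len(malicious)
false_positive_rate = sum(block_everything(u) for u in legitimate) / len(legitimate)

print(malware_block_rate)   # 1.0 -- a flawless score on the NSS metric
print(false_positive_rate)  # 1.0 -- and a completely broken internet
```

Grading on the block rate alone cannot distinguish this filter from a genuinely smart one; that is why the false positive column matters.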
A browser that is capable of blocking malware without blocking legitimate content would certainly be applause-worthy. I guess time will tell whether Internet Explorer 10 is able to walk the balance, or whether it will just be a nuisance like the first implementations of UAC.
OK, Google did actually release exactly one native Windows application at Google I/O: It's called Android Studio, an application that helps developers create apps that run on Android, Google’s answer to Windows. But don’t worry, Microsoft fans: Internet Explorer (IE) flags the Android Studio download as potential malware.
Ah crap... that was quick.
Now to be fair, Internet Explorer 10 and later have been doing things right. I am glad to see Microsoft support standards and push for an open web after so many years. This feature helps protect users from their own complacency.
Still, be careful when you call checkmate: some places may forfeit your credibility.
Subject: Editorial, General Tech, Cases and Cooling, Processors | May 10, 2013 - 04:23 PM | Scott Michaud
Tagged: c6, c7, haswell, PSU, corsair
I cannot do it, Captain! I don't have the power!
We have been discussing the ultra-low power states of Haswell processors for a little over a week, and how they could be detrimental to certain power supplies. Power supply manufacturers never quite expected a draw as low as 0.05 Amps (0.6W) on the 12V rail while the system is still on. Since then, companies such as Enermax have started to list which power supplies have been tested and are compliant with the new power requirements.
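That headline figure is just Ohm's-law arithmetic, as a quick sanity check shows (my own back-of-the-envelope math, not anything from Corsair or Intel's documentation):

```python
# Haswell's deep sleep states can reportedly draw as little as 0.05 A
# on the 12 V CPU rail. Power (watts) = current (amps) * voltage (volts).
min_current_a = 0.05
rail_voltage_v = 12.0

min_load_w = min_current_a * rail_voltage_v
print(f"{min_load_w:.1f} W")  # prints "0.6 W"
```

A 0.6 W load can fall below the minimum load some older supplies need on that rail to regulate their outputs properly, which is exactly why vendors are now publishing compatibility lists.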
| Series | Model | Haswell Ready? | Status |
|--------|-------|----------------|--------|
| AXi | AX1200i | Yes | 100% Compatible with Haswell CPUs |
| AXi | AX860i | Yes | 100% Compatible with Haswell CPUs |
| AXi | AX760i | Yes | 100% Compatible with Haswell CPUs |
| AX | AX1200 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX860 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX850 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX760 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX750 | Yes | 100% Compatible with Haswell CPUs |
| AX | AX650 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX1050 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX850 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX750 | Yes | 100% Compatible with Haswell CPUs |
| HX | HX650 | Yes | 100% Compatible with Haswell CPUs |
| TX-M | TX850M | Yes | 100% Compatible with Haswell CPUs |
| TX-M | TX750M | Yes | 100% Compatible with Haswell CPUs |
| TX-M | TX650M | Yes | 100% Compatible with Haswell CPUs |
| TX | TX850 | Yes | 100% Compatible with Haswell CPUs |
| TX | TX750 | Yes | 100% Compatible with Haswell CPUs |
| TX | TX650 | Yes | 100% Compatible with Haswell CPUs |
| GS | GS800 | Yes | 100% Compatible with Haswell CPUs |
| GS | GS700 | Yes | 100% Compatible with Haswell CPUs |
| GS | GS600 | Yes | 100% Compatible with Haswell CPUs |
| CX-M | CX750M | Yes | 100% Compatible with Haswell CPUs |
| CX-M | CX600M | TBD | Likely compatible; currently validating |
| CX-M | CX500M | TBD | Likely compatible; currently validating |
| CX-M | CX430M | TBD | Likely compatible; currently validating |
| CX | CX750 | Yes | 100% Compatible with Haswell CPUs |
| CX | CX600 | TBD | Likely compatible; currently validating |
| CX | CX500 | TBD | Likely compatible; currently validating |
| CX | CX430 | TBD | Likely compatible; currently validating |
| VS | VS650 | TBD | Likely compatible; currently validating |
| VS | VS550 | TBD | Likely compatible; currently validating |
| VS | VS450 | TBD | Likely compatible; currently validating |
| VS | VS350 | TBD | Likely compatible; currently validating |
Above is Corsair's slightly incomplete chart as of the time it was copied from their website, 3:30pm on May 10th, 2013; so far it is coming up all good. Their blog should be updated as new products get validated for the new C6 and C7 CPU sleep states.
The best part of this story is just how odd it is given the race to arc-welding (it's not a podcast so you can't Bingo! hahaha!) supplies we have been experiencing over the last several years. Simply put, some companies never thought that component manufacturers such as Intel would race to the bottom of power draws.
Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM | Ryan Shrout
Tagged: video, nvidia, live, frame rating, fcat
Update: Did you miss the live stream? Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!
I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology. Based on direct capture of GPU output via an external system and a high-end capture card, our new setup has helped users see GPU performance in a more "real-world" light than previous benchmarks would allow.
I also know that there are lots of questions about the process, the technology and the results we have shown. In order to try and address these questions and to facilitate new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.
Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input.
The primary part of this live stream will be about education - not about bashing one particular product line or talking up another. And part of that education is your ability to interact with us live, ask questions and give feedback. During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience. The easiest way to get your question addressed, though, will be to leave a comment or inquiry below in this post. It doesn't require registration, and it allows us to think about the questions beforehand, giving them a better chance of being answered during the stream.
Frame Rating and FCAT Live Stream
11am PT / 2pm ET - May 9th
So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!
Subject: Editorial, General Tech, Graphics Cards, Processors | May 8, 2013 - 09:32 PM | Scott Michaud
Tagged: Volcanic Islands, radeon, ps4, amd
So the Southern Islands might not be entirely stable throughout 2013 as we originally reported; seismic activity being analyzed suggests the eruption of a new GPU micro-architecture as early as Q4. These Volcanic Islands, as they have been codenamed, should explode onto the scene opposing NVIDIA's GeForce GTX 700-series products.
It is times like these where GPGPU-based seismic computation becomes useful.
The rumor is based upon a source which leaked a fragment of a slide outlining the processor in block diagram form and specifications of its alleged flagship chip, "Hawaii". Of primary note, Volcanic Islands is rumored to be organized with both Serial Processing Modules (SPMs) and a Parallel Compute Module (PCM).
So apparently a discrete GPU can have serial processing units embedded on it now.
Heterogeneous Systems Architecture (HSA) is a set of initiatives to bridge the gap between massively parallel workloads and branching logic tasks. We usually make reference to this in terms of APUs and bringing parallel-optimized hardware to the CPU. In this case, we are discussing it in terms of bringing serial processing to the discrete GPU. According to the diagram, the chip would contain 8 processor modules, each with two processing cores and an FPU, for a total of 16 cores. There does not seem to be any definite indication of whether these cores would be based upon AMD's license to produce x86 processors or its other license to produce ARM processors. Unlike an APU, this is heavily skewed towards parallel computation rather than a relatively even balance between CPU, GPU, and chipset features.
Now of course, why would they do that? Graphics processors can do branching logic but it tends to sharply cut performance. With an architecture such as this, a programmer might be able to more efficiently switch between parallel and branching logic tasks without doing an expensive switch across the motherboard and PCIe bus between devices. Josh Walrath suggested a server containing these as essentially add-in card computers. For gamers, this might help out with workloads such as AI which is awkwardly split between branching logic and massively parallel visibility and path-finding tasks. Josh seems skeptical about this until HSA becomes further adopted, however.
Still, there is a reason why they are implementing this now. I wonder, if the SPMs are based upon simple x86 cores, how the PS4 will influence PC gaming. Technically, a Volcanic Island GPU would be an oversized PS4 within an add-in card. This could give AMD an edge, particularly in games ported to the PC from the Playstation.
This chip, Hawaii, is rumored to have the following specifications:
- 4096 stream processors
- 16 serial processor cores on 8 modules
- 4 geometry engines
- 256 TMUs
- 64 ROPs
- 512-bit GDDR5 memory interface, much like the PS4.
- 20 nm gate-last silicon fab process
- Unclear if TSMC or "Common Platform" (IBM/Samsung/GLOBALFOUNDRIES)
Softpedia is also reporting on this leak, adding the claim that the GPU will be designed on a 20nm gate-last fabrication process. While gate-last is often considered not worth the extra production effort, Fully Depleted Silicon On Insulator (FD-SOI) is apparently "amazing" on gate-last at 28nm and smaller nodes. This could mean that AMD is eyeing that technology and making this design with the intent of switching to an FD-SOI process later, without the large redesign that an initially easier gate-first production would require.
Well that is a lot to process... so I will leave you with an open question for our viewers: what do you think AMD has planned with this architecture, and what do you like and/or dislike about what your speculation would mean?
Subject: Editorial, Mobile | May 7, 2013 - 12:07 AM | Scott Michaud
Tagged: unreal engine, firefox, asm.js
If, on the other hand, you wish to see an example of a large application compiled for the browser: would Unreal Engine 3 suffice?
Clearly a computer hardware website would take the effort required to run a few benchmarks, and we do not disappoint. Epic Citadel was run in its benchmark mode in Firefox 20.0.1, Firefox 22.0a2, and Google Chrome; true, it was not run for long on Chrome before the tab crashed, but you cannot blame me for trying.
Each benchmark was run at full-screen 1080p "High Performance" settings on a PC with a Core i7 3770, a GeForce GTX 670, and more available RAM than the browser could possibly allocate. The usual Firefox framerate limit was removed; the benchmark was the only tab open on the same fresh profile; the setting layout.frame_rate.precise was tested in both positions because I cannot keep track of the current state of requestAnimationFrame callback delay; and each scenario was performed twice and averaged.
Firefox 20.0.1
- layout.frame_rate.precise true: 54.7 FPS
- layout.frame_rate.precise false: 53.2 FPS
Firefox 22.0a2 (asm.js)
- layout.frame_rate.precise true: 147.05 FPS
- layout.frame_rate.precise false: 144.8 FPS
Google Chrome 26.0.1410.64
- Tab crashed before the benchmark could complete
It is very enticing for Epic as well. A little over a month ago, Mark Rein and Tim Sweeney of Epic were interviewed by Gamasutra about HTML5 support for Unreal Engine. Due in part to the removal of UnrealScript in favor of game code being scripted in C++, Unreal Engine 4 will support HTML5. They are working with Mozilla to make the browser a reasonable competitor to consoles; write once, run on Mac, Windows, Linux, or anywhere compatible browsers can be found. Those familiar with my past editorials know this excites me greatly.
So what do our readers think? Comment away!