Subject: Editorial, General Tech | July 2, 2013 - 03:33 AM | Scott Michaud
Tagged: xbox one, xbox, microsoft, consolitis
Well that was unexpected...
Don Mattrick, a few months ahead of the Xbox One launch and less than two months after its unveiling, has decided to leave his position as president of Microsoft's Interactive Entertainment Business. The news was first made official by a Zynga press release announcing his hiring as their new CEO. Steve Ballmer later published an open letter addressed to all Microsoft employees, open to the public via their news feed, wishing him luck and outlining the immediate steps to follow.
While it was subtle in the email, no replacement has been planned for after his departure on July 8th. Those who report to Don Mattrick will report directly to Steve Ballmer himself, seemingly through the launch of the Xbox One. As scary and unsettling as Xbox One PR has been lately, launching your flagship without a captain is a depressingly fitting apex. This likely means one of three things: Don gave minimal notice of his departure; he was abruptly ousted from Microsoft and Zynga just happened to make convenient PR for all parties involved; or there is literally no sense to be made of the situation.
However the situation came about, the Xbox One will likely launch from a team directly led by Steve Ballmer, and Zynga will have a new CEO. Will his goal be to turn the former social gaming giant back on course? Or will he be there to milk blood from the company before it turns to stone?
I wonder whether his new contract favors cash or stock...
Subject: Editorial, General Tech | July 2, 2013 - 02:12 AM | Scott Michaud
Tagged: google, spdy, QUIC
It missed being a recursive acronym by a single letter...
TCP is known as the go-to protocol for stable connections over the internet. It makes some guarantees: you will not lose bits of data, packets will arrive in order, corrupted packets will be detected and redelivered, and throughput will be roughly metered to what both endpoints can handle. It is easy to develop applications on top of TCP; it solves the hard problems for you.
UDP, on the other hand, sprays its packets out like a fountain in the hope that they land where intended. The protocol is fast, but a pain for applications that need some level of reliability. Quick UDP Internet Connections (QUIC), from Google, leverages UDP to create multiple independent, even encrypted, connections. While TCP could be made faster, doing so is beyond the jurisdiction of web browsers; support is embedded into the operating system itself. That leaves building upon UDP, suffering with TCP, or being incompatible with just about every piece of network hardware installed anywhere.
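The contrast is easy to see at the socket level. This minimal Python sketch (loopback only, purely for illustration) sends a UDP datagram with no handshake and no acknowledgement; any reliability beyond what the local machine happens to provide is the application's problem:

```python
import socket

# UDP: connectionless datagrams -- no delivery, ordering, or duplication guarantees.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)        # fire and forget: no handshake, no ACK

# On loopback this arrives, but UDP itself never promises that it will.
data, _ = receiver.recvfrom(2048)
print(data)                          # b'hello'

sender.close()
receiver.close()
```

A TCP socket (`SOCK_STREAM`) would instead require a `connect()`/`accept()` handshake before any data flows, which is exactly the bookkeeping QUIC re-implements on top of UDP in user space.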
This comes on the heels of SPDY, Google's other open protocol. SPDY is built around HTTP, and both presume a reliable protocol underneath, TCP being the usual candidate. A large advantage of SPDY is that it allows assets to stream simultaneously over a single connection. Unfortunately, TCP will freeze the entire connection (and thus every stream) when a single stream drops a packet. QUIC, based upon UDP, can therefore accelerate SPDY further by allowing truly independent multiplexing.
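SPDY-style multiplexing can be sketched with a toy framing layer. To be clear, the header layout below (2-byte stream id, 4-byte length) is invented for illustration and is not SPDY's or QUIC's actual wire format; the point is only that frames from independent logical streams can interleave over one connection:

```python
import struct

def mux(frames):
    """Pack (stream_id, payload) pairs into one byte string, interleaved."""
    out = b""
    for stream_id, payload in frames:
        out += struct.pack("!HI", stream_id, len(payload)) + payload
    return out

def demux(data):
    """Split the byte string back into per-stream payloads."""
    streams = {}
    offset = 0
    while offset < len(data):
        stream_id, length = struct.unpack_from("!HI", data, offset)
        offset += 6  # header size: 2-byte id + 4-byte length
        streams[stream_id] = streams.get(stream_id, b"") + data[offset:offset + length]
        offset += length
    return streams

# Two logical streams share one "connection"; frames arrive interleaved.
wire = mux([(1, b"<html>"), (2, b"\x89PNG"), (1, b"</html>")])
print(demux(wire))   # {1: b'<html></html>', 2: b'\x89PNG'}
```

Over TCP, a lost packet stalls the whole byte string until retransmission; over UDP, a scheme like this can keep delivering stream 2's frames while stream 1 recovers.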
QUIC will be used for "a small percentage of Chrome dev and canary channel traffic to some Google server", for experimentation purposes. The code itself is licensed under BSD and, as such, could migrate to other browsers in due time.
Subject: Editorial, General Tech | June 19, 2013 - 09:08 PM | Tim Verry
Tagged: xbox one, gaming, DRM, disc
Microsoft faced a major backlash from users following the unveiling of its latest Xbox One console. Users were rather unnerved at Microsoft’s reveal that the new console would be required to “phone home” at least once every 24 hours in order to authenticate games and allow sharing. With Sony carrying the PS3's disc traditions forward and users in an uproar, Microsoft has reconsidered and issued an update to users via a blog post titled (in part) “Your Feedback Matters.”
Amidst the uncertainty caused by various MS sources issuing conflicting statements about functionality and DRM, and an air of pre-E3 secrecy in which MS released just enough info about the DRM to get users scared (can you tell the way MS handled this irked me?), the company talked about the Xbox One moving forward and taking advantage of the ‘digital age.’ The new console would require online authentication (and daily check-ins), but would also allow sharing your game library with up to 10 other people, plus re-downloadable games that could be installed and played on other consoles so long as you logged into your Xbox Live account (the latter bit is similar in nature to Steam on the PC). Further, disc games could be resold or gifted if the publishers allowed it.
That has changed now, however. Microsoft has reconsidered its position and is going back to the way things work(ed) on the existing Xbox 360. Instead of taking the logical approach of keeping the plan but removing the daily authentication requirement for games when the disc is in the tray, Microsoft has taken their ball (er, Xbox One controller) and completely backtracked.
DRM on the Xbox One is now as follows, and these changes go in place of (not in addition to) the previously announced sharing and reselling functionalities.
For physical disc games:
According to Xbox Wire, after their initial setup and installation, disc-based games will not require an internet connection for offline functionality (though multiplayer components will, obviously, need an active connection). Even better, trading and reselling of disc-based games is no longer limited by publishers. Trading, selling, gifting, renting, et al. of physical disc-based games "will work just as it does today on the Xbox 360." Microsoft is also not region locking physical games, which means you will not have to worry about whether games purchased abroad will work on your console at home.
However, in order to play disc-based games you will need to keep the game disc in the tray, even if the game is installed on the hard drive.
Changes to Downloaded games:
As for downloadable games, Microsoft is restricting these titles such that they cannot be shared or resold. Under the previous model you would have been able to share these titles with your family, but no longer. You will still be able to re-download the games.
There is no word on whether or not gamers will still lose access to all of the titles in their game library if their Xbox Live accounts are ever banned. It is likely that gamers will lose any downloadable games though as those are effectively tied to a single Xbox Live account.
While at first glance it may seem as though gamers won this round, in the end no one really won. Instead of Microsoft working around gamers' concerns for physical media and moving forward together, it is as though Microsoft has thrown up its hands in frustration and tossed out all of the innovative aspects for digital/downloadable titles along with the undesirable daily authentication and other invasive DRM measures that gamers clearly indicated they did not want.
I believe that Microsoft should have kept to the original game plan, but added an exception to the daily check-in rules so long as the console was able to authenticate the game offline by identifying a physical game disc in the tray. That way, gamers who are not comfortable with (or capable of) keeping the Xbox One connected to the internet could continue to play games using discs, while those with always-on Xbox One consoles would keep the privilege of sharing their libraries. Doing so would also have helped ease the console gaming populace as a whole into Microsoft's ideal digital age by the time the next Xbox comes out. However, instead of simply toning down the changes, Microsoft has completely backtracked, and now no one wins. Sigh.
What are your thoughts on Microsoft's latest changes to the Xbox One? Was it the right move, or were you looking forward to increased freedom with your digitally-downloaded games?
- The PS4 and Xbox One Hardware Revealed, Console Makers Have Different Goals @ PC Perspective
- E3 2013: Microsoft can ban your Xbox One library @ PC Perspective
Subject: Editorial, General Tech | June 19, 2013 - 06:33 PM | Scott Michaud
Tagged: steam, DRM
You can learn a lot by scanning configuration files, registry entries, and so forth; many have made off with a successful bounty. Most recently, some Steam Beta users dug around in their user interface (UI) files and noticed a few interesting lines, warning the user that the title they are attempting to launch will kick off a friend it is currently being shared with.
"SteamUI_JoinDialog_SharedLicense_Title" "Shared game library"
"SteamUI_JoinDialog_SharedLicenseLocked_OwnerText" "Just so you know, your games are currently in use by %borrower%. Playing now will send %borrower% a notice that it's time to quit."
"SteamUI_JoinDialog_SharedLicenseLocked_BorrowerText" "This shared game is currently unavailable. Please try again later or buy this game for your own library."
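Strings like these are templates, with tokens such as %borrower% filled in at display time. As a rough illustration only, here is how that kind of substitution might work; the render function below is hypothetical, not Valve's actual code:

```python
import re

def render(template, values):
    # Replace %token% placeholders with supplied values,
    # leaving any unknown tokens intact.
    return re.sub(r"%(\w+)%", lambda m: values.get(m.group(1), m.group(0)), template)

msg = ("Just so you know, your games are currently in use by %borrower%. "
       "Playing now will send %borrower% a notice that it's time to quit.")
print(render(msg, {"borrower": "Gordon"}))
# Just so you know, your games are currently in use by Gordon. Playing now
# will send Gordon a notice that it's time to quit.
```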
Sure, this whole game DRM issue has been flipping tables around the industry. Microsoft tried permitting users to share games with their family, with about the worst PR possible, and eventually needed to undo that decision. Users would like flexible licensing schemes, but the content industry (including platform owners like Microsoft, Nintendo, and Sony, who receive license fees from game sales) is unwilling to cooperate unless it is assured that users are honest.
Of course, what usually happens is honest users get crapped on and pirates enjoy a better experience, after initial setup.
From a high-level view there is not much difference between Steam and the proposed Xbox One, but the details diverge in a number of ways. The obvious difference is Steam's offline mode, but probably the larger one is trust. Valve has demonstrated a lot of good faith to their customers; where Microsoft shuts down access to content people paid for, Valve has shown intentions of both long-term support and consideration for the user's experience.
Ultimately, I feel as if DRM is not a necessary evil, but while it exists at least there are companies such as Valve who earn trust and wield DRM for users as well as against them. I expect that some day the industry will turn against DRM, whether willingly, by legal intervention, or because companies like cdp.pl use DRM-free as a promotional tool and nibble their way to dominance.
And yes, despite the fact that this will be confused with bias: if you have proven yourself untrustworthy before, you will get away with less later, regardless of your intentions.
Subject: Editorial, General Tech | June 19, 2013 - 06:16 PM | Scott Michaud
Tagged: DRM, The Witcher 3, GOG
cdp.pl, formerly CD Projekt, has been one of the last holdouts against DRM. Founders of GOG.com and developer/publisher of The Witcher franchise, they offer a DRM-free platform for users to purchase games. Sure, the games are usually good, old ones, aptly enough, but they are confident enough to include their most ambitious titles, The Witcher and The Witcher 2.
With The Witcher 3, we will see the title launch without DRM on GoG, trusting their users will purchase the title and be honest.
Apparently, the game will have a world slightly larger than Skyrim.
Hopefully, with very little empty space.
I have long been a proponent of DRM-free media, as you could probably tell. I believe that DRM-free titles end up netting more sales than the same title would with encryption; even if that were not true, DRM harms society more than enough to justify doing away with it. Sure, we all know unapologetic jerks exist and they are, indeed, jerks. But just because these jerks exist does not mean your company should, or successfully will, be the alpha a-hole of the a-hole food-chain. Chances are you will just upset your actual customers, now former customers. There are reasons why I never purchased (never pirated either; I just flat-out ignored the entire franchise's existence) another Crysis title after the first one's SecuROM debacle wrecked my camcorder's DVD-authoring software.
So, when The Witcher 3 comes out, back it up on your external hard drive and maybe even keep a copy on your home theater PC. Most importantly, buy it... sometime in 2014.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 17, 2013 - 03:16 AM | Scott Michaud
Tagged: xbox one, microsoft, ea, E3 13, E3
Update: Microsoft denies the statements from their support account... but this is still one of the major problems with DRM and closed platforms in general. Stuff like this is exactly what you grant them the power to do.
Consumers, whether they acknowledge it or not, fear the control that platform holders have over their content. It was hard for many to believe that having your EA account banned for whatever reason, even a dispute with a forum moderator, forfeited your license to the games you play through that account. Sounds like another great idea for Microsoft to steal.
@dohertymark If your account is banned, you also forfeit the licenses to any games that have licenses tied to it as listed in the ToU. ^AC
— Xbox Support (@XboxSupport1) June 14, 2013
Not stopping there, later on in the thread they were asked what would happen in the event of a security breach. You know, recourse before destroying access to possibly thousands of dollars of content.
@KillerRamen Ensure your account security features are enabled, and security proofs details are correct. ^ML
— Xbox Support (@XboxSupport1) June 15, 2013
While @XboxSupport1 is not a "verified account", the main @XboxSupport is, and it acknowledges ownership of the satellite account in its background image.
Honestly, there shouldn't be any doubt that these actually are Microsoft employees.
At this point, we have definitely surpassed absurdity. Sure, you typically need to do something fairly bad for Microsoft to stop charging you for Xbox Live. But removing access to your entire library of games reads, to me, as an attempt to deter cheating and the hardware-modding community.
Great, encourage spite from the soldering irons, that works out well.
Don't worry, enthusiasts, you know the PC loves you.
Gaming as a form of entertainment is fundamentally different from gaming as a form of art. When content is entertainment, its message touches you without any intrinsic value and can be replaced with similar content. Sometimes, though, a certain piece of content has specific value to society in itself. It is at those times that we should encourage efforts by organizations such as GOG, Mozilla and the W3C, Khronos, and many others. Without help, it could be extremely difficult or impossible for content to be preserved for future generations and future civilizations.
It does not even need to get in the way of the industry and its attempt to profit from the gaming medium; a careless industry, on the other hand, can certainly get in the way of our ability to have genuine art. After all, this is the main reason why I am a PC gamer: the platform allows entertainment to co-exist with communities who support themselves when the official channels do not.
That is, of course, unless Windows learns a little something from the Xbox. I guess don't get your Windows Store account banned in the future?
Subject: Editorial, General Tech, Processors | June 15, 2013 - 07:02 PM | Scott Michaud
Tagged: Intel, Ivy Bridge-E, Haswell-E
In my analysis of the recent Intel Computex keynote, I noted that the displayed confidence came across more as repressed self-doubt. It did not seem, to me, like Intel wants to abandon the high-end enthusiast but rather to catch up with their low-power, high-efficiency competitors; they simply know they are secure in the enthusiast market. Of course, we could see mid-range choices dwindle and prices stagnate, but I doubt that Intel wants to exit the enthusiast market despite their silence about Ivy Bridge-E.
All Images, Credit: VR-Zone
And now Intel wants to return some confidence to their high-end consumers: they are not slowing down!!
VR-Zone, the site which published Ivy Bridge-E's lazy release roadmap, is also the one suggesting Haswell-E will come before mainstream Broadwell offerings. Once again, all is right with the world. Slated for release around holiday 2014, just a year after Ivy Bridge-E, Haswell-E will arrive alongside the X99 chipset. Instead of Broadwell, the back-to-school window of 2014 will be filled by a refresh of 22nm Haswell products with a new 9-series chipset.
Seriously, it's like watching the face of Intel's Tick-Tock while a repairman is tweaking the gears.
In terms of specifications, Haswell-E will come in 8- and 6-core offerings with up to 20MB of cache. Apart from the inclusion of DDR4 support, the main advantage of Haswell-E over the upcoming Ivy Bridge-E is supposed to be raw performance; VR-Zone estimates up to 33-50% better computational strength. A depressingly novel area of improvement as of late...
Lastly, with the recent discussion of the awkwardly hobbled K-series parts, our readers might be happy to know that all Haswell-E parts will be unlocked for overclocking. This, again, leads me to believe that Intel is not hoping to suffocate the enthusiast market but rather to sort their users: mid-range consumers will take what they are given and, if they object, be sent on the bus to Funk-E town.
Note, while the headlining slide definitively says "All Processors Unlocked"...
... this slide says "For K and Extreme series products." I will assume the latter is out of date?
This raises the question: what do our readers think about that potential strategy? It could lead to mainstream performance products being pushed down into BGA territory, but it cements the existence of an enthusiast platform.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 11, 2013 - 04:06 AM | Scott Michaud
Tagged: wwdc 13, MacBook Air, Mac Pro, apple
Sometimes our "Perspective" is needed on Apple announcements because some big points just do not get covered by the usual sources. Other times, portions of the story can be relevant to our readers. This is one of those days where both are true. Either side should review our thoughts and analysis of Apple's recent ultrabook and, especially, their upcoming desktop offerings.
The MacBook Air has been, predictably, upgraded to Intel's Haswell processors. Battery life is the first obvious benefit of the CPU, and that has been well reported. The 11-inch MacBook Air gains an extra four hours of battery life, now lasting up to 9 hours between charges. The extra space in the 13-inch MacBook Air allows it to last 12 hours between charges.
Less discussed, both MacBook Airs will contain Intel's GT3 iGPU, branded Intel HD 5000. You cannot get Intel HD 5000 graphics without selecting a BGA part, which is soldered in place rather than socketed. While there are several better solutions from competing GPU vendors, Apple will have one of the first shipping implementations of Haswell's top graphics processor, said to offer double the performance of previous-generation Ivy Bridge graphics at a fraction of the power consumption.
Also included in the MacBook Air are an 802.11a/b/g/n/ac WiFi adapter and Bluetooth 4.0. Apple is not typically known to introduce new standards and often lags severely behind what is available on the PC unless they had a hand in creating the technology; USB 3.0 being the obvious and recent example of that lag.
The specifications are somewhat customizable; the user is able to select between an i5 and an i7 processor, 4GB or 8GB of RAM, and a 128, 256, or 512GB SSD. It shipped the day it was announced, with base prices of $999 for an entry-level 11-inch and $1099 for an entry-level 13-inch.
But now we move on to the dying industry, desktop PCs, where all innovation has died unless it is to graft a touch interface to anything and everything.
"Can't innovate any more, my ass", grunts Phil Schiller, on the keynote stage.
Whether you like it, or think "innovation" is the best word, it's a legitimate new design some will want.
While the new Mac Pro is not a system that I would be interested in purchasing, for issues I will outline soon, these devices are what some users really want. I have been a very strong proponent of OEM devices as they highlight the benefit of the PC industry: choice. You can purchase a device, like the new Mac Pro, from a vendor; alternatively, you can purchase the components individually to assemble yourself and save a lot of money; otherwise, you can hire a small business computer store or technician.
We need more companies, like Apple, to try new devices and paradigms for workstations and other high-performance machines. While it is less than ideal for Apple to be the one coming up with these redesigns, since Apple's platform encourages applications to be vendor-specific (to run only on a Mac), it can still benefit the PC industry by demonstrating that life and demand still exist; trying something new could reap large benefits. Not everyone wants a full ATX case with discrete components, but many still want workstation performance, and that is okay.
Now when it comes to actual specifications, the typical coverage glossed over what could be easily approximated by a trip to Wikipedia and Google. Sure, some may have been in a rush within the auditorium, but still.
The specifications are:
- Intel Xeon E5-2600 V2-class CPU, Ivy Bridge-E, 12 cores max (suggests single-socket)
- 4-channel DDR3 ECC RAM, apparently 4 DIMMs, which suggests 4x16GB (max)
- Dual FirePro GPUs, 4096 total shaders with 2x6GB of GDDR5
- Pretty clearly based on FirePro W9000
- Seems to be slightly underclocked, losing about 0.5 Teraflop per GPU.
- PCIe SSD
- Thunderbolt 2, USB3.0, and WiFi ac (+ a/b/g/n??), Bluetooth 4.0
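That underclock estimate is easy to sanity-check. Assuming the reference FirePro W9000's 2048 stream processors at 975 MHz, and the usual GCN peak of two single-precision FLOPs per shader per cycle (both assumptions, not Apple-published figures):

```python
# Single-precision peak = shaders * 2 FLOPs/cycle * clock (assumed GCN formula).
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

w9000 = tflops(2048, 0.975)     # reference FirePro W9000
print(round(w9000, 2))          # ~3.99 TFLOPS per GPU

# "Losing about 0.5 Teraflop per GPU" would imply a clock near:
target = w9000 - 0.5
clock = target * 1000.0 / (2048 * 2)
print(round(clock * 1000))      # ~853 MHz (speculative)
```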
The downside is that basically anything you wish to add to the Mac Pro must be attached through Thunderbolt 2, Bluetooth 4.0, or USB 3.0. When you purchase an all-in-one custom design, you forfeit the ability to reach in and modify the components. There is also no mention of pricing, and for a computer with this parts list you should expect a substantial invoice even without "The Apple Tax"; but that is not the point of purchasing a high-end workstation. Apple certainly put in as close to the best-of-the-best as they could.
Now could people stop claiming the PC is dead and work towards sustaining it? I know people love stories of jarring industry shifts, but this is ridiculous.
Subject: Editorial, Processors | June 10, 2013 - 10:53 AM | Ryan Shrout
Tagged: SimCity, Richland, giveaway, contest, APU, amd, a10-6800k
Odd, turns out I found two brand new AMD A10-6800K Richland APUs sitting on my desk this morning. I called AMD to ask what this was all about and they said that if I didn't need them, I might as well give them away to our readers.
"Oh, and throw in a free copy of the new SimCity while you're at it," they told me.
Who am I to argue?
So let's have a giveaway!
We are handing out two AMD A10-6800K "Richland" APUs for you to build a brand new PC around, each including a key for SimCity. If you haven't read Josh's review of the A10-6800K APU, you should definitely do so; it will help educate you on exactly what you are getting - for FREE.
To enter, I need you to leave a comment on this very news post below telling us what you would build with a brand new A10 APU - you don't have to be registered to do so but we'd sure like it if you were. (Make sure you leave your correct email address so I can get in touch with you if you win.) Also, feel free to stop by the PC Perspective YouTube channel and either give our videos a gander or subscribe. I think we put out some great content there and we'd like more of you to see it.
I will pick one winner on June 17th and another on June 24th so you have two separate weeks to potentially win!
A big thanks goes out to AMD for supplying the APUs and copies of SimCity for this giveaway. Good luck!!
Subject: Editorial, General Tech, Graphics Cards, Shows and Expos | June 10, 2013 - 02:49 AM | Scott Michaud
Tagged: Ultra, geforce titan, computex
So long to Computex 2013, we barely knew thee. You poured stories all over our news feed for more than a whole week. What say you, another story for the... metaphorical road... between here... and... Taipei? Okay, so the metaphorical road is bumpy and unpaved, work with me.
It was substantially more difficult to decipher the name of a video card a number of years ago. Back then, products would be classified by their model numbers and often assigned a suffix like: "Ultra", "Pro", or "LE". These suffixes actually meant a lot, performing noticeably better (or maybe worse) than the suffix-less number and possibly even overlapping with other number-classes.
Just when they were gone long enough for us to miss them, the suffixes might make some measure of a return. On the show floor, Colorful exhibited the NVIDIA GeForce GTX Titan Ultra Edition. This card uses a standard slightly-disabled GK110-based GeForce GTX Titan GPU, with the usual 2688 CUDA cores, and 6GB of GDDR5. While the GK110 chip has potential for 2880 CUDA cores, NVIDIA has not released any product (not even Tesla or Quadro) with more than 2688 CUDA cores enabled. Colorful's Titan Ultra and the reference Titan are electrically identical; this "Ultra" version just adds a water block for a cooling system and defaults to some amount of a factory overclock.
But, this is not the first time we have heard of a Titan Ultra...
Back in April, ExtremeTech found a leak for two official products: the GTX Titan LE and the GTX Titan Ultra. While the LE would be slightly stripped down compared to the full GTX Titan, the GTX Titan Ultra would be NVIDIA's first release of a GK110 part without any CUDA cores disabled.
So if that rumor ends up being true, you could choose between Colorful's GTX Titan Ultra with its partially disabled GK110 based on the full GTX Titan design; or, you could choose the reference GTX Titan Ultra based on a full GK110 GPU unlike the partially disabled GK110 on the full GTX Titan.
If you are feeling nostalgic... that might actually be confusion... as this is why suffixes went away.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 6, 2013 - 08:46 PM | Scott Michaud
Tagged: xbox one, E3 13, E3
So heading up to E3, Microsoft decided to drop their DRM bombshell so it would get buried over the next couple of days. In terms of permissiveness, the Xbox One is not nearly as bad as feared; of course, it is still terrible in certain ways.
Microsoft will allow games to be played offline on the Xbox One... for 24 hours. If your internet connection has been offline for longer than that period (unclear whether the timer starts when internet goes out or from last update) then your system will be locked to live TV and disc-based movies. Games and apps, even ones which should have no online functionality, will cease to function until you reconnect with Xbox servers.
This also means that if the Xbox servers have an outage lasting between 24 hours and "taken offline forever", all gaming and apparently apps will cease to function on the Xbox One.
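The policy, as described, boils down to a single timestamp comparison. This sketch is purely illustrative of the stated rule, not Microsoft's actual logic; the ambiguity noted above maps to which timestamp you feed in (the moment connectivity dropped versus the last successful check-in):

```python
from datetime import datetime, timedelta

OFFLINE_WINDOW = timedelta(hours=24)

# Hypothetical model of the stated policy: games lock once the console
# has gone longer than the offline window without authenticating.
def games_locked(last_checkin, now):
    return now - last_checkin > OFFLINE_WINDOW

now = datetime(2013, 6, 8, 12, 0)
print(games_locked(now - timedelta(hours=12), now))  # False: still inside the window
print(games_locked(now - timedelta(hours=30), now))  # True: locked to TV and disc movies
```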
It's like if Wall-E grew a Freddie Mercury
But at least they will allow some level of used-game transfer... if the publisher agrees. Check out this statement from Microsoft Studios:
In our role as a game publisher, Microsoft Studios will enable you to give your games to friends or trade in your Xbox One games at participating retailers. Third party publishers may opt in or out of supporting game resale and may set up business terms or transfer fees with retailers. Microsoft does not receive any compensation as part of this. In addition, third party publishers can enable you to give games to friends. Loaning or renting games won’t be available at launch, but we are exploring the possibilities with our partners.
So this will be an interesting experiment: how will revenue and profitability be affected for game publishers who deny used game sales? I honestly expect that used game sales actually promote the purchasing of more games and that initiatives to limit used game transfers will reduce user engagement. Of course Microsoft is now taking all of the flak from Sony, who may or may not be considering the same practice, but I am sure at least Microsoft is hoping that everyone will forget this when shiny new trailers erase the collective gamer memory.
In return, however, Microsoft is being fairly permissive when it comes to how many users can be licensed on a single disc. Up to ten family members are allowed access to your collective library.
And, after all, it should not be a surprise that a console game disappears when Microsoft shuts down their servers: consoles were always designed to be disposable. I have been proclaiming that for quite some time. The difference is now, people cannot really deny it.
Subject: Editorial, General Tech, Shows and Expos | June 6, 2013 - 05:42 PM | Scott Michaud
Tagged: unreal engine 4, ue4, E3 13, E3, computex
We are in the bleed-through between the Computex and E3 media windows, and this news fits somewhat into both. Unreal Engine 4 is coming, and I expect we will see one or more demos and UE4-powered titles over the next week. In fact, I would be fairly shocked if we do not see the end of the Elemental Demo in the Xbox One E3 keynote. We may also see Unreal Engine 4 running on mobile devices, and maybe even HTML5, at some point throughout the tradeshow, either canonically through Epic or via a licensee product.
This morning, Epic opened the Unreal Engine 4 Integrated Partners Program (IPP). Of course, they already have a number of members, most of which were partners with Unreal Engine 3.
The founding IPP partners are:
Wwise from Audiokinetic
- Manages large databases of sound effects and voice-overs
- Manages subtitles and multiple dubbings of voice clips
Autodesk Gameware from Autodesk
- Contains multiple packages including Beast, Navigation, and Scaleform
- Scaleform is a Flash rendering engine for HUDs, menus, etc., developed using Flash Professional in 2D or 3D. It is what StarCraft II, Mass Effect, and Borderlands use.
- Beast is a lighting toolkit for global illumination, radiosity, etc.
- Navigation is an AI solver, predominantly for pathfinding.
Simplygon from Donya Labs
- Reduces the polygon count of models so they consume fewer processing resources, especially as they get further away from the camera.
Enlighten from Geomerics
- Another Global Illumination solver, most popular usage being Battlefield 3.
SpeedTree for Games from IDV
- Makes a bunch of efficient trees so studios do not need to hire as many minimum wage peons.
Intel Threading Building Blocks (TBB) from Intel
- Helps developers manage C++ threading for multicore systems.
- Deals with memory management and scheduling tasks
morpheme from NaturalMotion
- Animation and physics software for designers to create animations
- Works with NVIDIA PhysX
euphoria from NaturalMotion
- Simulates animations based on driving conditions via the CPU, most popular usage being GTA IV.
PhysX and APEX from NVIDIA
- You probably know this one.
- GPU-based rigid body, soft body, fluid, and cloth solvers.
- Allows for destructible environments and other complex simulations.
Oculus Rift from Oculus VR
- You probably also know this one, especially if you keep up with our Video Perspectives.
- Head-mounted display with motion tracking for VR.
Bink Video from Rad Game Tools
- ... is not included! Just kidding, that stuff'll survive a nuclear apocalypse.
- Seriously, check in just about any DirectX or OpenGL game's credits if it includes pre-rendered video cutscenes or video-textures.
- I'll wait here.
- In all seriousness, Rad Game Tools' middleware has been licensed in over 15,500 titles. Its ubiquity has become something of a meme among game programmers, so its inclusion should be no surprise.
Telemetry Performance Visualizer from Rad Game Tools
- Allows developers to see graphs of what their hardware is working on over time.
- Helps developers know what benefits the most from optimization.
RealD Developer Kit (RDK) from RealD
- Helps game developers create stereoscopic 3D games.
Umbra 3 from Umbra Software
- Determines what geometry can be seen by the player and what should be unloaded to increase performance.
- Sits between artists and programmers so the former does not need to think about optimization, and the latter does not need to claw their eyes out.
IncrediBuild-XGE from Xoreax
- Apparently farms out tasks to idle PCs on your network.
- I am not sure, but I think it is mostly useful for creating a pre-render farm at a game studio for light-baking and such.
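Several of the packages above boil down to the same engineering idea: split expensive work into independent tasks and schedule them across cores or machines. As a loose illustration only (Intel TBB is a C++ template library; this Python sketch mirrors the task-parallel pattern, not TBB's actual API):

```python
# A loose Python analogy for the task-parallel pattern Intel TBB gives
# C++ developers: divide a big job into independent chunks and let a
# pool of workers process them. (Real TBB adds work stealing, cache-
# aware partitioning, and scalable allocation; none of that is here.)
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for real per-task work (e.g. baking one slice of a lightmap).
    return sum(x * x for x in chunk)

data = list(range(10_000))
chunks = [data[i:i + 1_000] for i in range(0, len(data), 1_000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(process_chunk, chunks))

total = sum(partials)  # same result as the serial loop, computed in parallel
```

The appeal to studios is exactly this division of labor: the engine programmer writes `process_chunk` once and lets the middleware worry about keeping every core busy.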
We still have a little while until E3, so we do not know exactly how it will unfold, but I fully expect Unreal Engine 4 to be a recurring theme over the next week. Keep coming back to PC Perspective, because you know we have a deep interest in where Epic is headed.
Subject: Editorial, General Tech, Motherboards, Processors | June 4, 2013 - 10:40 AM | Ryan Shrout
Tagged: z87, video, overclocking, live, i7-4770k, haswell, ASUS ROG, asus
While we run around with our hair on fire trying to get ready for the Intel Haswell and Z87 product launch this weekend, I wanted to let everyone know about a live stream event we will be holding on Tuesday, June 4th. JJ from ASUS, a crowd favorite for sure, will be joining us LIVE in studio to talk all about the new lineup of ASUS Z87 motherboards. We'll also discuss performance and overclocking capabilities of the new processor and platform.
ASUS Z87 and Haswell Live Stream
10am PT / 1pm ET - June 4th
Be sure you stop by and join in the show! Questions will be answered, prizes will be given out and fun will be had! Who knows, maybe we can break some stuff live as well?? On hand to give away to those of you joining the live stream, we'll have these prizes:
- 2 x ASUS Z87 Motherboards
- 1 x ASUS Graphics card
Methods for winning will be decided closer to the event, but if you are watching live, you'll be included. And we'll ship anywhere in the world!
ASUS and I also want the event to be interactive, so we want your questions. We'll of course be paying attention to the chat room on our live page, but you'll have better luck if you submit your questions about the ASUS Z87 products and Haswell processors beforehand, in the comments section below. You don't have to register to ask, and we'll have the ability to read them ahead of time!
I'll update this post with more information after the reviews and stories start to hit, so keep an eye here for more details!!
Subject: Editorial, General Tech | June 4, 2013 - 03:44 AM | Scott Michaud
Tagged: WCS, starcraft 2, HoTS
A little eye-rest before another barrage of Computex news...
Blizzard took over the official StarCraft II tournament scene as of last year. The goal was to create a unified ranking system across every tournament and help participants deal with scheduling, a problem in recent years. Throughout the entire year, Blizzard is hosting the 2013 StarCraft II World Championship Series. They seem to like breaking rankings into seasons: the 2013 series alone will incorporate three of them, leading to the year's grand finals in November.
One year per series; three seasons and a grand finals per year; three regional tournaments and a finale per season. This season's finals will take place this weekend, June 8th and 9th, in South Korea.
Tournaments in Europe, Korea, and North America chose the 16 competitors for the 2013 Season 1 Finals this weekend in Korea. The top five competitors in each tournament (top six for Korea) earned their invite. In all: 3 Protoss, 5 Terrans, and 8 Zerg will be participating. I guess their hearts are only half of the swarm.
If the regional matches were any indication, the seasonal finals should be a very entertaining bridge between Computex coverage and E3 2013. Players are getting much better at the game mechanics while still being able to surprise their opponents, and even the audience, with unusual strategies. Players exploit windows of weakness in their opponents with a moment of strength; the entertainment mostly comes from watching each player attempt to delay or lengthen those windows, all while hiding their own weak periods in times when the opponent cannot reasonably exploit them.
What are your opinions of "eSports"? Good concept, bad name?
Subject: Editorial, General Tech, Systems | May 29, 2013 - 07:16 PM | Scott Michaud
Tagged: windows blue, Windows 8.1, windows, microsoft
Personally, I cannot get too worked up about the user experience quirks inherent to Windows modernization; the wedge slowly being shoved between the user and their machine is far too concerning. No matter how they modify the interface, restricting what users and developers can install and create on their machine is a deal breaker. But, after that obligatory preface reminding people not to get wound up in UX hiccups and become complacent about the big issues, Windows Blue will certainly address many of those UX hiccups.
As we reported last month, boot-to-desktop and the Start Button were planned for inclusion with Windows 8.1. At the time, sources were relentless in emphasizing: "Until it ships, anything can change."
Images courtesy of Paul Thurrott.
Mary Jo Foley has gathered quite a few details since then. Firstly, the option to boot directly to desktop will be there; from the sounds of it, it will be disabled by default but not exclusive to Enterprise SKUs. This is somewhat promising, as it would make Microsoft slightly less likely to kill support for the desktop (and, by extension, x86 applications) if they feel pressure to support it. Still, assuming because "it makes sense" is a bad way to conduct business.
Also available is the Start Button, seen in higher quality above; as far as we know, it will be enabled by default. Its functionality will be to bring up the Start Screen or, alternatively, a new All Apps screen visible at ZDNet. Now this has me interested: while I actually like the Start Screen, a list of apps should provide functionality much closer to the Start Menu than Microsoft was previously comfortable with. The Start Screen used to make desktop applications feel less at home than modern apps; this new interface appears friendlier to the desktop. While probably still jarring, it looks to make finding desktop applications easier and quickly gets out of the way of your desktop experience.
According to Paul Thurrott, those who wish to personalize the Start Screen will have the option to share their desktop wallpaper with it. For tasteful backgrounds, like the one above, I can see this being of good use.
Just please, do not grief someone with a background full of fake tiles.
As a final note, there is still no word about multiple monitor support for "Modern Apps". If you have tried to use them in the past, you know what I am talking about: basically only one at a time, it will jump between monitors if you bring up the Start Screen, and so forth.
Subject: Editorial, General Tech, Cases and Cooling | May 29, 2013 - 02:03 PM | Scott Michaud
Tagged: Windows key, mouse, microsoft, I Hate This Key
Has this ever happened to you while playing a shooter? You need to get to a position, so you mash the Alt key to sprint and... aw crap, I hit the Windows key... well, now I am dead. Have you ever considered purchasing software or a gaming keyboard which allows you to disable that button?
Have you ever considered purchasing a mouse which also has that button to give both hands something to fear?
Definitely not a member of their Sidewinder product line.
Okay, so I should be fair: the Microsoft Sculpt Comfort mouse is not designed for gaming, and Windows 8-style user experiences revolve heavily around the Start button. The mouse button is also more useful than a redundant Windows key: the blue pad has swipe functionality for extra commands. According to its product page, slide gestures register with the computer as mouse buttons 4 and 5.
So you can probably bind them to game functions, if you feel daring.
But, in the end, I still need to congratulate Microsoft for trying to innovate in computer hardware. This is more than just grafting touch functionality onto a mouse surface, as both Apple and Microsoft have tried in the past; it tries to make the classical mouse experience better. I doubt it is for most of our audience, but not everything needs to be.
Subject: Editorial | May 27, 2013 - 05:08 PM | PCPer Staff
The Dell Inspiron 15R is a good choice for anyone looking for a reasonably powerful and lightweight laptop. It is powered by a 1.8GHz Core i5-3537U with 8GB RAM, a 1TB HDD, and an integrated DVD burner, plus a 15.6" 1366 x 768 LED-backlit LCD driven by the i5's HD 4000 graphics. Not exactly a gaming PC, but at 4.9lbs it is an easy way to bring your work with you wherever you go with more processing power than a tablet will offer.
Dell Home is offering the 3rd generation 15.6" Inspiron 15R 5521 Core i7 Ivy Bridge laptop with 8GB RAM, 1TB hard drive, and touchscreen for $799.99 with free shipping. Use the $289 instant savings and the extra $100 coupon code JS3KG6045QZ2BR to reach the final price.
Subject: Editorial, General Tech, Systems | May 27, 2013 - 03:08 AM | Scott Michaud
Tagged: xbox one, ps4, consolitis, consoles
So, as a Wired editorial states it: hardcore console gamers don't want much, just the impossible. They want a "super-powered box" tethered to their TV; they want the blockbuster epics and innovative indie titles; they want it to "just work" for what they do. The author, Chris Kohler, wrote his column to demonstrate how this is, and has for quite some time been, highly unprofitable.
I think the bigger problem is that the console manufacturers want the impossible.
Console manufacturers have one goal: get their platform in your house and require their hand be in the pocket of everything you do with it. They need to make an attractive device for that to be true, so they give it enough power to legitimately impress the potential buyer and price it low enough to catch the purchasing impulse. Chances are this involves selling the box under cost at launch and for quite some time after.
But, if all of this juicy control locks the user into overspending in the long run, then it is worth it...
But Microsoft should be thankful that I cost them money to acquire as a customer.
Well, looking at the Wired article, not only are console gamers ultimately overspending: it is still not enough! Consoles truly benefit no one! The console manufacturers are not doing any more than maybe breaking even, at some point, eventually, down the line, they hope. Microsoft and Sony throw obnoxious amounts of money against one another in research, development, and marketing. Redundant technologies are developed to pit against their counterparts, with billions spent on marketing to prove why either choice is better.
All of this money is spent to corral users into a more expensive experience where they can pocket the excess.
Going back to the editorial's claims: with all of this money bleeding out, Microsoft wants to appeal more broadly and compensate the loss with more cash flowing in. Sure, Microsoft has wanted a foothold in the living room for decades at this point, but the Xbox division bounces between profitability and huge losses; thus, they want to be an entertainment hub, if just for the cash alone.
But think back to the start: these troubles are not because it is impossible to satisfy hardcore gamers. These troubles exist because Microsoft and Sony cannot generate revenue from their acquired control faster than they bleed capital trying to acquire that control, or at best only barely fast enough.
The other solution, which I have felt for quite some time is the real answer (hence why I am a PC gamer), is for a large group of companies to create an industry body that governs an open standard. Each company can make a substantial profit by focusing on a single chunk of the platform -- selling graphics processors, maintaining a marketplace, or what-have-you -- by leveraging the success of every other chunk.
This model does work, and it is the basis for one of humanity's most successful technology products: the internet.
As a side note: this is also why PC gaming was so successful... Microsoft, developers, Steam/GOG/other marketplaces, and hardware vendors were another version of this... although Microsoft had the ability to override them and go in whatever direction they wanted. They didn't, until Windows RT.
And the internet might even be the solution. The web browser is capable, today, of providing amazing gaming experiences and it does not even require a plugin. It is getting more powerful, even faster than the rate at which underlying hardware has evolved.
To end on an ironic note, that makes a web browser more capable of offline play than our current understanding of the Xbox One (and Sony has said nothing either way, for that matter).
I guess the takeaway message is: love the web browser, it "just works".
Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM | Scott Michaud
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games
Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months following E3 while the demo was being ported to the console hardware. The most noticeable differences were in the severely reduced particle counts and the non-existent fine lighting details; of course, Epic pumped the contrast in the PS4 version which masked the lack of complexity as if it were a stylistic choice.
Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware at 100% perfect utilization.
Now that we know both the PS4 and, recently, the Xbox One: it is time to dissect more carefully.
A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PCs on the market. While there are many ways to interpret that statement, in terms of raw performance it is not valid.
To the best of our current knowledge, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU alongside an AMD GPU containing 18 GCN compute units, for a total of 1152 shader units. Without knowing driving frequencies, this chip should be slightly faster than the Xbox One's 768 shader units within 12 GCN compute units. Sony claims the PS4 has nearly 2 teraFLOPs of total theoretical performance, and the Xbox One is almost certainly slightly behind that.
Back in 2011, the Samaritan Demo was created by Epic Games to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said the demo would theoretically require 2.5 teraFLOPs of performance for 30 FPS at true 1080p; ultimately, it ran on a PC with a single GTX 680, approximately 3.09 teraFLOPs.
This required performance (again, approximately 2.5 teraFLOPs) is higher than what is theoretically possible for either console, both of which sit under 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
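The back-of-the-envelope math behind those figures is straightforward: a GCN shader unit can retire two floating-point operations per clock (a fused multiply-add), so peak throughput is shaders x 2 x clock speed. Here is a sketch using the shader counts above; the 800 MHz console clock is an assumption based on rumors at the time, not a confirmed figure:

```python
def peak_tflops(shader_units, clock_ghz, flops_per_clock=2):
    """Theoretical peak = shader units x FLOPs-per-clock x clock (GHz), in TFLOPs."""
    return shader_units * flops_per_clock * clock_ghz / 1000.0

# Shader counts from the article; 0.8 GHz is an assumed (rumored) console clock.
ps4 = peak_tflops(1152, 0.8)        # ~1.84 TFLOPs -- "nearly 2"
xbox_one = peak_tflops(768, 0.8)    # ~1.23 TFLOPs -- slightly behind
gtx_680 = peak_tflops(1536, 1.006)  # ~3.09 TFLOPs, matching the figure above
```

Even granting zero overhead, both consoles land well under the ~2.5 TFLOPs Epic asked for with Samaritan.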
Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.
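To put the sub-native-resolution trick in numbers, a quick sketch of how many pixels each render target actually shades:

```python
# Pixel counts for common render targets; dropping from 1080p to 720p
# shades less than half as many pixels per frame.
full_1080p = 1920 * 1080   # 2,073,600 pixels
hd_720p = 1280 * 720       #   921,600 pixels

ratio = hd_720p / full_1080p  # ~0.44: 720p covers ~44% of 1080p's pixels
savings = 1 - ratio           # ~56% of per-pixel shading work freed up
```

That freed-up ~56% of per-pixel work is what developers reinvest in particles, lighting, and other effects.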
But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.
Update, 5/24/2013: Mark Rein of Epic Games responds to the statement made by Rajat Taneja of EA. While we do not know his opinion on consoles... we know his opinion on EA's opinion:
— Mark Rein (@MarkRein) May 23, 2013
Subject: Editorial, Graphics Cards | May 23, 2013 - 12:08 PM | Ryan Shrout
Tagged: video, live, gtx 780, gk110
Missed the LIVE stream? You can catch the video replay of it below!
Hopefully by now you have read over our review of the new NVIDIA GeForce GTX 780 graphics card that launched this morning. Taking the GK110 GPU, cutting off some more cores, and setting a price point of $650 will definitely create some interesting discussion.
Join me today at 2pm ET / 11am PT as we discuss the GTX 780, our review and take your questions. You can leave them in the comments below, no registration required.