Subject: General Tech, Shows and Expos | September 23, 2013 - 09:38 PM | Scott Michaud
Tagged: JavaOne, JavaOne 2013, gpgpu
Are the enterprise users still here? Oh, hey!
GPU acceleration throws a group of many similar calculations at thousands of simple cores. That architecture makes GPUs very cheap and power efficient for the amount of work they accomplish. Gamers, obviously, enjoy the efficiency at tasks such as shading pixels on a screen or transforming thousands of vertex positions. The technology has since grown beyond graphics, and enterprise and research applications have been taking notice over the years.
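The pattern a GPU exploits is "one simple operation applied to many independent elements". As a rough illustration only (nothing from the talk; the class and method names are mine), the same idea can be sketched on the CPU with a Java parallel stream:

```java
import java.util.stream.IntStream;

public class DataParallel {
    // GPU-style work is one simple operation over many independent
    // elements. Here each index is processed independently, so the
    // runtime is free to spread the work across available cores.
    static float[] offsetAll(float[] in, float offset) {
        float[] out = new float[in.length];
        IntStream.range(0, in.length)
                 .parallel()
                 .forEach(i -> out[i] = in[i] + offset); // independent per-element work
        return out;
    }

    public static void main(String[] args) {
        float[] vertices = new float[100_000];
        for (int i = 0; i < vertices.length; i++) vertices[i] = i * 0.001f;
        float[] shifted = offsetAll(vertices, 1.5f);
        System.out.println(shifted[0] + " .. " + shifted[shifted.length - 1]);
    }
}
```

Because no element depends on any other, the same loop maps naturally onto thousands of GPU cores instead of a handful of CPU threads.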
The GPU discussion, specifically, starts around the 16-minute mark.
Java, a friend of scientific and "big-data" developers, is also evolving in a few directions including "offload".
IBM's CTO of Java, John Duimovich, discussed a few experiments in optimizing the platform for new hardware. Sorting arrays, a common task, saw between a 2-fold and a 48-fold increase in performance. Even including the latency of moving data and initializing GPU code, a 32,000-entry array took less than 1.5ms to sort, compared to about 3ms on the CPU. The sample code was written in CUDA.
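IBM's sample code is not public as far as I know, but the CPU side of that comparison is easy to reproduce. A minimal sketch (class and method names are mine), timing a 32,000-entry sort the way the ~3ms figure describes:

```java
import java.util.Arrays;
import java.util.Random;

public class SortBaseline {
    // Sort a copy of the input and return the elapsed time in
    // nanoseconds; copying keeps the source array reusable between runs.
    static long timeSortNanos(int[] data) {
        int[] copy = Arrays.copyOf(data, data.length);
        long start = System.nanoTime();
        Arrays.sort(copy);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        int[] data = new Random(42).ints(32_000).toArray();
        // Warm up the JIT first, or the initial compile dominates the timing.
        for (int i = 0; i < 50; i++) timeSortNanos(data);
        System.out.printf("32,000-entry sort: %.2f ms%n", timeSortNanos(data) / 1e6);
    }
}
```

On a modern desktop CPU this should land in the low single-digit milliseconds after JIT warm-up, in line with the figure quoted on stage; the GPU numbers additionally include data transfer and kernel initialization.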
The goal of these tests is, as far as I can tell, to (eventually) use specialized hardware automatically for Java's many built-in libraries. The pitch is free performance. Of course, there is only so much you can get for free. Still, optimizing the few usual suspects is an obvious advantage, especially if it simply redirects common calls to existing, better-suited libraries.
Hopefully they choose to support more than just CUDA whenever they take it beyond experimentation. The OpenPOWER Consortium, responsible for many of these changes, currently consists of IBM, Mellanox, TYAN, Google, and NVIDIA.
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | September 16, 2013 - 09:15 PM | Scott Michaud
Tagged: Steam Box, LinuxCon, Gabe Newell
Valve Software, as demonstrated a couple of days ago, still believes in Linux as the future of gaming platforms. Gabe Newell discussed the situation at LinuxCon this morning, which was streamed live over the internet (and which I transcribed after the teaser break at the bottom of the article). Someone decided to rip the stream (not the best quality, but good enough) and put it on YouTube. I found it and embedded it below. Enjoy!
Gabe Newell highlights, from the seventh minute straight through to the end, why proprietary platforms look successful and how they (sooner-or-later) fail by their own design. Simply put, you can control what is on it. Software you do not like, or even their updates, can be stuck in certification or even excluded from the platform entirely. You can limit malicious software, at least to some extent, or even competing products.
Ultimately, however, you limit yourself by not feeding into the competition of the crowd.
If you wanted to get your cartridge made you bought it, you know, FOB in Tokyo. If you had a competitive product, miraculously, your ROMs didn't show up until, you know, 3 months after the platform holder's product had entered market and stuff like that. And that was really where the dominant models for what was happening in gaming ((came from)).
But, not too surprisingly, open systems were advancing faster than the proprietary systems had. There used to be these completely de novo graphics solutions for gaming consoles and they've all been replaced by PC-derived hardware. The openness of the PC as a hardware standard meant that the rate of innovation was way faster. So even though, you would think, that the console guys would have a huge incentive to invest in it, they were unable to be competitive.
Microsoft attempts to exert control over its platform with modern Windows and is met by a year-over-year regression in PC sales; at the same time, PC gaming is the industry's hotbed of innovation and is booming as a result. In a time of declining PC hardware sales, Steam saw 76% growth (unclear, but it sounds like revenue) over last year.
Valve really believes the industry will shift toward a model with little divide between creator and consumer. The community has been "an order of magnitude" more productive than the actual staff of Team Fortress 2.
Does Valve want to compete with that?
This will only happen with open platforms. Even the consoles, with systems sold below parts and labor costs to exert control, have learned to embrace the indie developer. The next-gen consoles market indie developers, prior to launch, seemingly more than the industry behemoths, and that includes their own titles. They open their platforms a little bit, but it might still not be enough to hold off the slow and steady advance of PC gaming, be it through Windows, Linux, or even web standards.
Speaking of which, Linux and web standards are often criticized for being fragmented. Gabe Newell, intentionally or not, claimed that proprietary platforms are more fragmented. Open platforms have multiple bodies pushing and pulling the blob, but it all tends to flow in the same direction. Proprietary platforms are lean bodies with full control over where they go; there are just many of them. You have a dominant platform and a few competitors for each sector: phones and tablets, consoles, desktops, and so forth.
He noted each has a web browser and, because the web is an open standard, is the most unified experience across devices of multiple sectors. Open fragmentation is small compared to the gaps between proprietary silos across sectors. ((As a side note: Windows RT is also designed to be one platform for all platforms but, as we have been saying for a while, you would prefer an open alternative to all RT all the time... and, according to the second and third paragraphs of this editorial, it will probably suffer from all of the same problems inherent to proprietary platforms anyway.))
Everybody just sort of automatically assumes that the internet is going to work regardless of wherever they are. There may be pluses or minuses of their specific environment but nobody says, "Oh I'm in an airplane now, I'm going to use a completely different method of accessing data across a network". We think that should be more broadly true as well. That you don't think of touch input or game controllers or living rooms as being things which require a completely different way for users to interact or acquire assets or developers to program or deliver to those targets.
Obviously if that is the direction you are going in, Linux is the most obvious basis for that and none of the proprietary, closed platforms are going to be able to provide that form of grand unification between mobile, living room, and desktop.
Next week we're going to be rolling out more information about how we get there and what are the hardware opportunities that we see for bringing Linux into the living room and potentially pointing further down the road to how we can get it even more unified in mobile.
Well, we will certainly be looking forward to next week.
Personally, for almost two years I have found it weird how Google, Valve, and Apple (if the longstanding rumors were true) were simultaneously pushing wearable computing, living room boxes (Steam Box/Apple TV/Google TV), and content distribution. I would not be surprised, in the slightest, if Valve added media functionality to Steam and Big Picture to secure a spot in the iTunes and Play Store market.
As for how wearables fit in? I could never quite figure that out but it always felt suspicious.
Subject: Processors, Shows and Expos | September 10, 2013 - 02:47 PM | Ryan Shrout
Tagged: idf, idf 2013, Intel, keynote, live blog
We are preparing for the second day of keynotes at IDF so sign up below to get a reminder for our live blog! After the first keynote saw the introduction of Intel Quark SoCs, showcases of the first 14nm Broadwell processor and a 22nm LTE smartphone, day 2 could be even more exciting!
The event starts at 9am PT / 12pm ET on Wednesday the 11th!
Subject: Processors, Shows and Expos | September 10, 2013 - 02:31 PM | Ryan Shrout
Tagged: quark, Intel, idf 2013, idf
In a very interesting and surprising announcement at the first Intel Developer Forum keynote this morning, Intel's new CEO Brian Krzanich showed the first samples of Quark, a new SoC design that will enter smaller devices than even Atom can reach.
Quark is the family name for the new line of SoCs that are open, synthesizable, and supported with industry-standard software. An open SoC is simply one that allows third-party IP integration with the processor cores, while a synthesizable one can be moved and migrated to other production facilities. This opens up Intel to take Quark outside of its own fabrication facilities (though Krzanich said during Q&A that they would prefer not to) and allows partners to more easily integrate their own silicon with the Quark DNA. Intel had previously announced that Atom would be able to integrate with third-party IP, but that seems to have been put on the back burner in favor of this.
Quark will not be an open core design in the same way that ARM's core can be, but instead Intel is opening up the interface fabric for direct connection to computing resources.
The Quark SoC is square in the middle
Krzanich showed off a chip on stage that is 1/5 the size of Atom at 1/10 the power (though I am not sure whether the comparison refers to Clover Trail or Bay Trail). That puts it in a class of products that only ARM-based designs have been able to reach until now, and Intel displayed both reference systems and wearable designs.
UPDATE: Intel later clarified with me that the "1/5 size, 1/10 power" is for a Quark core against an Atom core at 22nm. It doesn't refer to the entire SoC package.
Intel hasn't yet told us what microarchitecture Quark is based on, but if I were a betting man I would posit that it is related to the Silvermont design we are looking at in Bay Trail, with a cut-down feature set. Using any other existing design from Intel would result in higher-than-desired power consumption and die size, but it could also be another ground-up architecture.
I'll be poking around IDF for more information on Quark going forward but for now, it appears that Intel is firmly planting itself on a collision course with ARM and Qualcomm.
UPDATE 1: I did get some more information from Intel on the Quark SoC. It will be the first product based on the 14nm manufacturing process and is a 32-bit, single core, single thread chip based on a "Pentium ISA compatible CPU core." This confirms that it is an x86 processor though not exactly what CPU core it is based on. More soon!
Subject: Processors, Shows and Expos | September 10, 2013 - 11:02 AM | Ryan Shrout
Tagged: live blog, keynote, Intel, idf 2013, idf
UPDATE: You can see the replay of our live blog below from Day 1 of IDF but be sure you head over to the Day 2 Live Blog page to set a reminder! Join us on Wednesday at 9am PT / 12pm ET!!
Today is the beginning of the 2013 Intel Developer Forum in San Francisco! Join me at 9am PT for the first of three live blogs of the main Intel keynotes where we will learn what direction Intel is taking on many fronts!
Subject: General Tech, Systems, Shows and Expos | September 4, 2013 - 12:11 PM | Scott Michaud
Tagged: zenbook, ifa 2013, asus
How about some Ultrabooks? We got Ultrabooks. Thin, light, and brushed metal with ASUS' characteristic circular pattern. They are proud of that design, proud enough to cover the top of the lid with a layer of Corning Gorilla Glass 3 to protect it from scratches and gouges. It feels a little absurd to say, but covering metal in glass for increased durability seems to make sense and could help your premium laptop look new for longer.
These 13.3-inch laptops come in two resolutions: 1920x1080 Full HD is the lower offering, with 2560x1440 for higher-end tastes. Both monitors are IPS-based with 10-point multi-touch capabilities.
The raw specifications for the UX301 are:
- Intel Core i5-4200U or i7-4500U or i7-4558U
- 4GB or 8GB DDR3L
- Intel HD 5100 Graphics
- SATA 3 SSD (up to 512GB RAID0)
- 13.3-inch 2560 x 1440 WQHD or 1920 x 1080 Full HD IPS multitouch
- 802.11ac (dual-band), Bluetooth 4.0
- Mini DisplayPort, Micro-HDMI 1.4
- 2x USB3.0, 3.5mm headphone/mic, SD card reader
- Windows 8 or Windows 8 Pro
For the UX302:
- Intel Core i5-4200U or i7-4500U
- 4GB DDR3L
- Intel HD 4400 or NVIDIA GeForce GT 730M (2GB)
- Up to a 750GB HDD with 16 GB SSD cache
- 1920 x 1080 Full HD IPS multitouch
- 802.11ac (dual-band), Bluetooth 4.0
- Mini DisplayPort, HDMI 1.4
- 3x USB 3.0, 3.5mm headphone/mic, SD card reader
- Windows 8 or Windows 8 Pro
The UX301 seems to be the more premium device, with obviously higher specs (at least in its options), despite being almost 2mm thinner (15.5mm vs. 17.2mm for the UX302) and about a quarter of a pound lighter (1.38kg vs. 1.5kg for the UX302). Both models are listed with a 50W battery, which I assume means 50Wh, since watts are not a unit of electrical storage, but I am not entirely sure.
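The unit matters because watt-hours, not watts, determine runtime. A quick sketch of that arithmetic (the 50Wh capacity reading and the 10W load are my assumptions, not ASUS figures):

```java
public class BatteryMath {
    // Watt-hours measure stored energy; watts measure instantaneous draw.
    // Runtime is simply capacity divided by average power draw.
    static double runtimeHours(double capacityWh, double averageDrawW) {
        return capacityWh / averageDrawW;
    }

    public static void main(String[] args) {
        // Assuming the listed "50W" really means a 50Wh pack: at a
        // hypothetical 10W ultrabook load, that works out to 5 hours.
        System.out.printf("%.1f hours%n", runtimeHours(50.0, 10.0));
    }
}
```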
No information about pricing and availability has been released.
Subject: General Tech, Mobile, Shows and Expos | September 4, 2013 - 11:34 AM | Scott Michaud
Tagged: TF701T, ifa 2013, asus
Among the ASUS announcements is their new Transformer Pad TF701T. Being a Transformer Pad, the TF701T is an Android Tablet which can be used alone or docked in a keyboard for extra battery life and USB 3.0 support -- or, of course, for a keyboard. The touch display is IPS-based, 10.1", and with a native resolution of 2560x1600.
The other raw specifications include:
- NVIDIA Tegra 4 T40X quad-core SoC.
- 2GB DDR3L RAM
- WiFi B/G/N (dual-band) with Miracast support
- Bluetooth 3.0+EDR
- Speaker with ASUS "SonicMaster" technology
- 32GB and 64GB options
- MicroSDXC port on tablet if you need more storage.
- SDXC port on the dock if you need even more storage or, of course, to load pictures from a camera
- USB 3.0 (on dock).
- 3.5mm headphone/microphone jack.
I find it somewhat interesting that ASUS listed the Tegra 4 "T40X". It seems odd to declare a specific model when, unless I completely missed something, Tegra 4 has not been announced as binned into multiple SKUs. This might suggest Tegra 4 will have more options than simply, "Get Tegra 4 or wait for Tegra 4i with the built-in Icera modem". Then again, it could be another case of over-description. Either way, it is something we will watch closely and report further on.
Pricing and availability information has not yet been released.
Subject: General Tech, Shows and Expos | August 23, 2013 - 11:31 PM | Scott Michaud
Tagged: battlefield 4, gamescom
Battlefield 4 is still at Gamescom, possibly the last big trade show before launch, and DICE is still trickling out news. They just released their "Levolution" teaser, which is a cheesy name, and I will not pretend to ignore that, but an interesting premise for level design. In keeping with the original premise of Battlefield, levels are designed with too many options to fully consider, leading to the characteristic "Battlefield Moments".
A Battlefield Moment is when something amazing, genius, or plain stupid happens in a multiplayer match and draws your attention.
Some popular examples include:
- The airborne murdering of pilots.
- Hijacking jets semi-automatically.
- Getting to the choppah.
- Sure, you can die instead.
- Indecisive snipers.
Of course, captures of these moments have become much more common since Bad Company 2, simply due to how accessible video editing is. Battlefield 2 also had its moments. A personal favorite was dropping anti-tank mines, while parachuting, atop a heavy tank; it turns out the fall took long enough for them to arm. I also enjoy proclaiming myself "The Kool-Aid Man" before crashing a jeep through a wall to run down someone in my Teamspeak channel.
DICE wants to show off their engine technology by including many user-triggered events, big and small, to give players options. You can destroy cover, lower doors, and otherwise succeed with out-of-the-box ideas. Check out the video, above, for more examples.
I am pretty sure the marketing people like showing off video of a skyscraper crumbling, too.
Battlefield 4 will launch October 29th in North America.
Subject: General Tech, Shows and Expos | August 23, 2013 - 03:39 PM | Scott Michaud
Tagged: battlefield 4, gamescom
EA TV conducted a broadcast of Battlefield 4, spectating gameplay from the attendees of Gamescom. Almost 13 minutes of footage were posted to YouTube and, because our readers are awesome, you get to see it here. Destructibility exceeding even Bad Company 2 is on display in this map, Paracel Storm, as Russia and China fight over an island cluster.
... And the observer tools look interesting, too!
The hook of the map, at least one of them, is a windmill out in the ocean anchoring a destroyed frigate. Once the windmill is destroyed the wreckage is released, runs aground near a control point, and destroys a multi-level structure.
The video also announced another gametype, "Obliteration", which seems somewhat like a play on the Rush mode from Bad Company and Assault from Halo 2. Each team has 3 "MCOM" points which can be destroyed by planting a randomly spawning bomb. An interesting mashup.
I am, still, slightly upset that we have yet to see free bombing since Battlefield 2. Rumors have circulated, due to the promotional images from the China Rising DLC, that bombers will make a reappearance in at least that DLC. The bomber mechanic lets you either carpet bomb from the sky to clear predictable attack paths or, on the ground, lure enemies into bombing paths while coordinating with teammates in the sky.
Of course this is less relevant outside of the clan skirmish sphere, but still fun with a handful of friends on a public server.
What are your thoughts on the video? What are you looking forward to in Battlefield 4? What are you hoping for in Battlefield 4?
Stay tuned in case anything more comes out from Gamescom. Battlefield 4 launches October 29th with a Beta coming earlier that month.
Subject: General Tech, Shows and Expos | August 7, 2013 - 03:52 PM | Scott Michaud
Tagged: DOTA 2, valve
Valve has just commenced the third iteration of its giant DOTA 2 tournament, The International 3, with the first match (between teams Na'Vi and Orange) setting up and taking place as I type.
The prize pool, starting at 1.6 million dollars, is increased by $2.50 for each $9.99 tournament Compendium sold. Almost 500,000 were sold by this point, yielding a purse of $2.84 million USD; the prize for first place, alone, is currently $1.42 million USD. This continues The International's trend of being among the most lucrative video game tournaments, almost doubling Blizzard's 2013 StarCraft II World Championship Series, which also runs this weekend.
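The arithmetic checks out: roughly 496,000 Compendium sales (my estimate of "almost 500,000") reproduce both quoted figures. A quick sketch:

```java
public class PrizePool {
    // Figures from the article: a $1.6M base pool, with $2.50 of each
    // $9.99 Compendium sale added to it.
    static final double BASE_POOL = 1_600_000.0;
    static final double PER_COMPENDIUM = 2.50;

    static double pool(int compendiumsSold) {
        return BASE_POOL + PER_COMPENDIUM * compendiumsSold;
    }

    public static void main(String[] args) {
        // ~496,000 sales lands on the quoted $2.84M purse, with half
        // ($1.42M) going to first place.
        System.out.printf("Pool at 496,000 sold: $%,.0f%n", pool(496_000));
        System.out.printf("First place (half):   $%,.0f%n", pool(496_000) / 2);
    }
}
```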
The live Twitch stream, ignoring fans at the venue and those cheering from a variety of pubs, is currently serving 154,000 viewers. Check out their website to watch live, check the schedule, and view past results. The tournament will continue until Sunday. The first match will begin just a minute or two after this publishes.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | July 24, 2013 - 05:15 PM | Scott Michaud
Tagged: Siggraph, kepler, mobile, tegra, nvidia, unreal engine 4
SIGGRAPH 2013 is wrapping up in the next couple of days but, now that NVIDIA has removed the veil surrounding Mobile Kepler, people are chatting about what will follow Tegra 4. Tim Sweeney, founder of Epic Games, contributed a post to NVIDIA Blogs on the ways certain attendees can experience Unreal Engine 4 at the show. As it turns out, NVIDIA engineers have displayed the engine both on Mobile Kepler and, behind closed doors, on desktop PCs.
Not from SIGGRAPH, this is a leak from, I believe, GTC late last March.
Also, this is Battlefield 3, not Unreal Engine 4.
Tim, obviously taking the developer standpoint, is very excited about OpenGL 4.3 support within the mobile GPU. In all, he did not say too much of note. Epic is targeting Unreal Engine 4 at a broad range of platforms: mobile, desktop, console, and, while absent from this editorial, web standards. Each of these platforms is settling on the same set of features, albeit with huge gaps in performance, allowing developers to focus on a scale of performance instead of a flowchart of capabilities.
Unfortunately for us, there have yet to be leaks from the trade show. We will keep you up-to-date if we find any, however.
Subject: General Tech, Systems, Shows and Expos | June 28, 2013 - 03:11 AM | Scott Michaud
Tagged: Windows 8.1, MakerBot, BUILD 2013, BUILD
Even Microsoft believes that 3D printing is a cool movement.
Windows 8.1 will include native support for 3D printers, CNC machines, and laser cutting devices. According to a stage demo at BUILD, Microsoft expects that printing in 3D will be as easy as printing in 2D. It might be hard to think of more than a few practical applications for a home user to have access to such hardware, but people often do not realize when they are avoiding something that could have been easily solved with the right tools. The support includes:
- A standardized driver model for 3D Printers, CNC machines, and laser cutting devices
- APIs for apps to interface with the above drivers
- Device apps and extensions through the Windows Store
- Job spooling and queuing
- Easy ways to query the device and its capabilities
The reliance upon the Windows Store might tell the larger tale. It appears that Microsoft is giving the nod to the maker community, not out of excitement, but to enable app developers to interface with these devices. Could the "modern" Windows APIs provide enough flexibility for 3D printing apps to exist without Microsoft's support? What about the next classification of peripherals?
All pondering aside...
The demo involved Antoine Leblond, of Microsoft, printing a vase from a MakerBot Replicator 2. According to TechCrunch, MakerBot will not only invade Windows 8.1 but also be stocked at Microsoft Stores. This is a solid retail win for the maker movement, giving users a chance to throw one of these in the back seat of the car and drive it home from the store.
Subject: General Tech, Mobile, Shows and Expos | June 28, 2013 - 02:05 AM | Scott Michaud
Tagged: BUILD, BUILD 2013, internet explorer, IE11, Windows 7
Windows 8.1 will be bundled with Microsoft's latest web browser, Internet Explorer 11. The line of browsers, starting with Internet Explorer 9, offers very competent releases which approach and eclipse many competitors. Microsoft has made some errors since then, breaking standards for personal gain, but their recent efforts in supporting the W3C, and even arch-nemesis Khronos, display genuine good faith.
HTML5 Developer Tools rivalling even Mozilla and Google
But Windows XP never surpassed Internet Explorer 8, and apart from glitch and vulnerability fixes, Windows 7 is in almost exactly the same state as the day Windows 8 shipped. Internet Explorer 10 made it to the platform, late and reluctantly, along with severely neutered back-ports of Windows 8 DirectX enhancements. The platform update was welcome, but lacks the importance of a full service pack.
More importantly, the hesitation to bring IE10 to Windows 7 suggested that it would be the last first-party web browser the platform would see.
Not true, apparently. During their BUILD conference, Engadget claims to have spoken with a Microsoft representative who confirmed that Windows 7 will receive the latest Internet Explorer. This is good news for every user of IE and for every web designer whose cool WebGL implementation is held back by browser market share concerns.
Honestly, my main concern is with the future of Internet Explorer, 12 and beyond. I am encouraged by Microsoft's recent effort, but with Windows RT demanding that every browser be built atop Internet Explorer, it had better keep up or we are screwed. The web browser might be our operating system in the near future; applications should not be held back by the least of all possible platforms, be it Internet Explorer, WebKit, or any other dominant browser.
Of course, I should note that Engadget was not specific about their source, so a grain of salt would be wise until it ships.
Subject: General Tech, Shows and Expos | June 25, 2013 - 07:41 PM | Scott Michaud
Tagged: Windows 8.1, BUILD 2013, BUILD, Blue
June 26th will be the first time for the general public to, legitimately, experience the preview build of Windows 8.1, Windows Server 2012 R2, and the all blue'd-up Windows Server Essentials. This preview will be available through the Windows Store, but will not be upgradable to the final code.
Again, this preview will require, basically, Windows re-installation when final code hits.
Disclaimer aside, MSDN and TechNet subscribers got the updates for the server operating systems ahead of time. Included with this release is a feature dubbed "Desktop Experience", which provides the same interface as Windows 8.1.
Ars Technica played around with the updates and compiled several captioned screenshots into a carousel. Check them out. Of course, things could change, but this should be a better indication than -- for instance -- Windows 8 Developer Preview.
The final version of Windows 8.1 is expected to be released at some point later this year. The update will be free for current Windows 8 users.
Subject: General Tech, Cases and Cooling, Shows and Expos | June 22, 2013 - 01:43 PM | Scott Michaud
Tagged: noise cancellation, noctua, computex 2013, computex
Update (June 22-2013, 4:43pm EDT): I was contacted by Noctua about the TDP ratings... quoting from their email:
As for the question regarding the TDP rating of the original NH-D14, I'd like to stress that the cooler can *easily* handle any 130W CPU! Our D14 is renowned to be among the best performing heatsinks for overclocking on the market and many users have pushed their CPUs well beyond 250W using this cooler.
Noctua apparently does not like including TDP values for its coolers because performance varies heavily with conditions (such as, of course, room and case temperature). That makes sense; customers are then better served looking at reviews to see what overclocks were achieved with a comparable system.
Yes, I know Computex is long over, but I missed something that I want to cover.
Noctua has been teasing active noise cancellation (ANC) for their CPU coolers for quite some time now; Tim published his brief thoughts, 13 months ago, on their press release leading up to Computex 2012. The prototype, this year, is a full unit rather than the fan from last year.
This design is a modified NH-D14 cooler with added technology from RotoSub AB, sampling its own noise and destructively interfering with it. According to Noctua, this will be the first ANC cooling unit for a CPU. The plan, as their press release suggests, is to release a cooler named "R-ANC" after its (R)otoSub (A)ctive (N)oise (C)ancellation technology. To me, this seems like a confusing name, as it breaks from their existing scheme and limits naming for future models based on the technology. Personally, I would have preferred "NH-D14R" or "NH-D14ANC", but alas, I am not a marketer.
Also, in the process of researching for this article, I have been unable to find a canonical TDP-rating for this device. I was not too surprised to have a difficult time finding it for this unreleased product, but TDP is even omitted from the established, albeit louder, default NH-D14. Some sources claim this cooler can support an Intel i7 Extreme processor, which typically requires a 130W thermal dissipation; other sources say you should be somewhat cautious with this cooler with CPUs >95W TDP; some even claim it is great for air-only overclocking. Rolling all of these sources together, assuming a kernel of truth in each, I would assume this cooler (and, by extension, its upcoming R-ANC variant) would be good for decent air-only overclocks until you reach the -E series.
But, grain of salt, have some.
No word of pricing, but Noctua believes they will have it available spring/summer of next year. For some reference, the default NH-D14 can be found for about $75-$100; expect the R-ANC to be slightly north of that.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 17, 2013 - 03:16 AM | Scott Michaud
Tagged: xbox one, microsoft, ea, E3 13, E3
Update: Microsoft denies the statements from their support account... but this is still one of the major problems with DRM and closed platforms in general. It is stuff like this that you enable them to do.
Consumers, whether they acknowledge it or not, fear for the control that platform holders have over their content. It was hard for many to believe that having your EA account banned for whatever reason, even a dispute with a forum moderator, forfeited your license to games you play through that EA account. Sounds like another great idea for Microsoft to steal.
@dohertymark If your account is banned, you also forfeit the licenses to any games that have licenses tied to it as listed in the ToU. ^AC
— Xbox Support (@XboxSupport1) June 14, 2013
Not stopping there, later on in the thread they were asked what would happen in the event of a security breach. You know, recourse before destroying access to possibly thousands of dollars of content.
@KillerRamen Ensure your account security features are enabled, and security proofs details are correct. ^ML
— Xbox Support (@XboxSupport1) June 15, 2013
While @XboxSupport1 is not itself a "verified account", @xboxsupport is, and the verified account acknowledges ownership of the support account in its background image.
Honestly, there shouldn't have been any doubt that these actually are Microsoft employees.
At this point, we have definitely surpassed absurdity. Sure, you typically need to do something fairly bad to have Microsoft stop charging you for Xbox Live. Removing access to your entire library of games, to me, is an attempt to limit cheating and the hardware modding community.
Great, encourage spite from the soldering irons, that works out well.
Don't worry, enthusiasts, you know the PC loves you.
Gaming as a form of entertainment is fundamentally different than gaming as a form of art. When content is entertainment, its message touches you without any intrinsic value and can be replaced with similar content. Sometimes a certain piece of content, itself, has specific value to society. It is these times where we should encourage efforts by organizations such as GoG, Mozilla and W3C, Khronos, and many others. Without help, it could be extremely difficult or impossible for content to be preserved for future generations and future civilizations.
It does not even need to get in the way of the industry and its attempt to profit from the gaming medium; a careless industry, on the other hand, can certainly get in the way of our ability to have genuine art. After all, this is the main reason why I am a PC gamer: the platform allows entertainment to co-exist with communities who support themselves when the official channels do not.
Of course, unless Windows learns a little something from the Xbox. I guess do not get your Windows Store account banned in the future?
Subject: General Tech, Shows and Expos | June 14, 2013 - 04:06 AM | Scott Michaud
Tagged: E3, E3 13, ea, dice
How could I resist?
I was surprised: the EA keynote, usually an event which dances past, carefully not leaving anything like "an impression" on its way out, stuck with me more than any other keynote. Sure, throughout the EA Sports segment I was cleaning my "office" and only modestly paying attention, but I feel that DICE swept the show when they appeared. This, and the rest of the week, brought good, bad, and awesome news for us PC gamers.
You have probably seen the Battlefield 4 multiplayer demo by this point. We linked to it, we discussed it. It seems like the destructibility found in the Battlefield 3 single player campaign was absent from the multiplayer not because of a technical reason but rather a design decision. Sure, we can see the radio tower collapse, but building destruction was quite simplified even when compared to Bad Company 2.
The skyscraper collapse seems like a legitimate part of the game this time around, not just a baloney promotional piece. When the building collapses, you can see the control point disappear from the mini-map in the bottom-left corner of the HUD. That gameplay element required quite a bit of design thought; even Bad Company 2 made buildings with Conquest flags indestructible. Maybe the harsh limitations on Battlefield 3's destructibility were more about keeping unified gameplay between the PC and the 24-player-limited consoles?
Sadly, during E3 we found out that mod support will not be available for Battlefield 4. I must compliment DICE GM Karl-Magnus Troedsson for his blunt honesty. It would be much simpler to kick your feet and say "wait and see" about something you know will never see the light of day; instead, he gave us a straight answer. Sure, he said the engine is not ready for a public release, but even then he admitted that it was not for our benefit: they do not have a good idea of what boundaries they want to allow modders to access. While disappointing, at least it does not have the condescending tone we experienced with Bad Company 2 and Battlefield 3 mod support requests.
Karl-Magnus Troedsson, DICE GM: We get that question a lot. I always answer the same thing, and then the community calls me bad names. We get the feedback, we understand it. We also would like to see more player-created content, but we would never do something like this if we feel we couldn’t do this 100 percent. That means we need to have the right tools available, we need to have the right security around this regarding what parts of the engine we let loose, so to say. So for BF4 we don’t have any planned mod support, I have to be blunt about saying that. We don’t.
Moving on, though. As we know, Disney decided that LucasArts properties would be best left to the hands at EA. The internet simultaneously joy-teared at the thought of a Star Wars Battlefront title developed by DICE. Sure enough, Star Wars: Battlefront 3 is a thing, and it will be developed using the Frostbite 3 engine.
Still no word on an Indiana Jones title based on Mirror's Edge. Heh heh heh.
Oh by the way, the announcement I am, by far, most excited for is Mirror's Edge. I absolutely loved the first game, despite its terrible dialog, for how genuine and intrinsically valuable it felt. It gave the impression of a passion project, both in gameplay and in narrative theme. Thankfully, the game is being developed and it will come to the PC.
We also found out that Mirror's Edge is planned to be an "open world action adventure title". Normally that would scare me, but that was what we were expecting of the first Mirror's Edge before its linear bait-and-switch.
Cannot tell if good or bad... but we will see at some point in the future.
Subject: General Tech, Systems, Shows and Expos | June 13, 2013 - 04:17 AM | Scott Michaud
Tagged: E3, E3 13, dell, alienware, alienware x51
The launch of Haswell led to many new product launches, and so did E3. The overlap? The Alienware X51 gaming desktop has been refreshed with some very compelling components at a surprisingly compelling price.
Unfortunately, there is a slight difference between the Canadian and American offerings; it is not a case of one citizen paying more than another, however, as things are shuffled around rather than outright better. Our Canadian readers start at a base price of $1499.99, while Americans start at $1449.99. Americans can spend an extra $100 to upgrade their DVD drive to a Blu-Ray drive; Canadians get Blu-Ray by default. Therefore, if you want a Blu-Ray drive, it is $50 cheaper to be Canadian; otherwise, it is $50 cheaper to be American.
Whether you are Canadian or American, I would personally recommend spending the extra $100 to upgrade your RAM from 8GB to 16GB. Sure, 8GB is a lot, but the extra can go a long way, especially given the direction web browsers have been heading. Either way, you also have the option of spending $300 on a 256GB SSD, albeit at the expense of, beyond the $300, downgrading the 2TB HDD to a slower, 5400 RPM 1TB drive.
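For anyone who wants to double-check the regional math, here is a quick sketch. Prices are in cents to avoid floating-point rounding, and the function name is mine, not anything from Dell's configurator.

```python
# Base prices in cents (US base ships with a DVD drive; CA base
# includes Blu-Ray by default, per the configurations above).
US_BASE = 144999
CA_BASE = 149999
BLURAY_UPGRADE = 10000  # US-only $100 option

def x51_price(region, bluray=False):
    """Return the configured X51 price in cents for the given region."""
    if region == "CA":
        return CA_BASE  # Blu-Ray is included either way
    return US_BASE + (BLURAY_UPGRADE if bluray else 0)

# Without Blu-Ray, the American build is $50 cheaper...
print(x51_price("CA") - x51_price("US"))               # 5000 cents = $50
# ...but with Blu-Ray, the Canadian build is $50 cheaper.
print(x51_price("US", bluray=True) - x51_price("CA"))  # 5000 cents = $50
```

Either way, the gap between the two countries works out to exactly $50 in one direction or the other.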
In all, this actually looks quite compelling for someone who wishes to have a console-esque form-factor near their TV. Unfortunately there are currently no Ubuntu-based options for this X51, although you may freely ($0) choose between Windows 7 Home Premium 64-bit and Windows 8 64-bit.
Subject: General Tech, Graphics Cards, Processors, Shows and Expos | June 13, 2013 - 02:26 AM | Scott Michaud
Tagged: E3, E3 13, amd
The Electronic Entertainment Expo (E3) is the biggest event of the year for millions of gamers. The majority of coverage ends up gawking over the latest news out of Microsoft, Sony, or Nintendo, and we certainly will provide our insights in those areas if we believe they have been insufficiently explained, but E3 is a big time for PC gamers too.
5 GHz and unlocked to go from there.
AMD, specifically, has a lot to say this year. In the year of the next-gen console reveals, AMD provides the CPU architecture for two of the three devices and has also designed each of the three GPUs. That just leaves a slight win for IBM, which is responsible for the Wii U's main processor, for whatever that is worth. Unless a Steam Box comes to light without ties to AMD, it is about as close to a clean sweep as any hardware manufacturer could get.
But for the PCs among us...
If you watched the EA press conference, you probably saw lots of sports. If you stuck around after the sports, you probably saw Battlefield 4 being played by 64 players on stage. AMD has been pushing very strongly for developer relations over the last year. DICE, formerly known as an NVIDIA-friendly developer, did not exhibit Battlefield 4 "The Way It's Meant to be Played" at the EA conference. According to one of AMD's Twitter accounts:
— AMD Radeon Graphics (@AMDRadeon) June 12, 2013
On the topic of "Gaming Evolved" titles, AMD is partnering with Square Enix to optimize Thief for GCN and A-Series APUs. The press release specifically mentioned Eyefinity and CrossFire support along with a DirectX 11 rendering engine; of course, the enhancements with real, interesting effects are the seemingly boring ones they do not mention.
The last major point from their E3 event was the launch of their 5 GHz FX processors. For more information on that part, check out Josh's thoughts from a couple of days ago.
Subject: Mobile, Shows and Expos | June 12, 2013 - 08:47 PM | Ryan Shrout
Tagged: E3, razer, blade, haswell, gtx 765m, geforce
With the launch of Intel's Haswell processor, accessory maker-turned notebook vendor Razer announced a pretty slick machine, the Blade. Based on a quad-core, 37 watt Core i7 Haswell CPU and a GeForce GTX 765M GPU, the Razer Blade packs a lot of punch.
It also includes 8GB of DDR3-1600 memory, an mSATA SSD, and a 14-inch 1600x900 display. The design of the unit looks very similar to that of the MacBook Pro, but the black metal finish is a really attractive style change.
The embedded battery is fairly large at 70 Whr, and Razer claims this will equate to six hours of battery life under non-gaming workloads. With a weight just barely creeping past 4 lbs, it seems the Razer Blade is both portable and powerful.
The price tag starts at $1799 so you won't be able to pick one of these up on the cheap, but for users like me that are willing to pay a bit more for performance and style in a slim chassis, the Blade seems like a very compelling option. There are a lot of questions left to answer on this notebook including the thermal concerns of packing that much high frequency silicon into a thin and light form factor. Does the unit get hot in bad places? Can the screen quality match the performance of Haswell + Kepler?
We are working with Razer to get a unit in very soon to put it to the test, and I am looking forward to answering whether we have found the best gaming portable on the market.