A new era for computing? Or just a bit of catching up?
Early Tuesday, at 2am for viewers in eastern North America, Intel delivered its Computex 2013 keynote to officially kick off Haswell. Unlike ASUS the night prior, Intel did not announce a barrage of new products; the purpose was to promote future technologies and the new products of its OEM and ODM partners. In all, a fairly wide variety of topics was discussed.
Intel carried on with the computational-era analogy: the '80s were dominated by mainframes; the '90s were predominantly client-server; and the 2000s brought the internet to the forefront. While true, Intel did not explicitly mention that each era never actually died but rather bled into the next: we still use mainframes, especially in cloud infrastructure; we still use client-server; and just about no one would argue that the internet has been displaced, despite its struggle against semi-native apps.
Intel believes that we are currently in the two-in-one era, by which they probably mean "multiple-in-one" given devices such as the ASUS Transformer Book Trio. They created a tagline, almost a mantra, illustrating their vision:
"It's a laptop when you need it; it's a tablet when you want it."
But before elaborating, they wanted to discuss their position in the mobile market. They believe they are becoming a major player there, with key design wins and performance that beats some incumbent systems-on-a-chip (SoCs). The upcoming Silvermont architecture aims to fill the gaps below Haswell, driving smartphones and tablets and stretching upward to include entry-level notebooks and all-in-one PCs. The architecture promises to scale from roughly three times the performance of the previous generation to equivalent performance at about a fifth of the power.
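That scaling claim is an envelope, not a single number, and a little arithmetic makes it concrete. A minimal sketch (all baseline figures below are made up for illustration, not Intel's numbers):

```python
# Intel's claim, relative to the previous Atom generation, is an envelope:
# up to ~3x the performance at the same power, or the same performance at
# roughly 1/5 the power. The baseline values here are purely hypothetical.
def silvermont_envelope(base_perf, base_power_watts):
    """Return the two extremes of the claimed scaling envelope."""
    high_perf = (base_perf * 3.0, base_power_watts)   # 3x perf, same power
    low_power = (base_perf, base_power_watts / 5.0)   # same perf, 1/5 power
    return high_perf, low_power

# Hypothetical baseline: 1.0 performance unit at 2.0 W.
print(silvermont_envelope(1.0, 2.0))  # ((3.0, 2.0), (1.0, 0.4))
```

Real parts would land somewhere between those two corners depending on the target device.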
Ryan discussed Silvermont last month; be sure to give his thoughts a browse for more depth.
Subject: Editorial, General Tech, Motherboards, Processors | June 4, 2013 - 10:40 AM | Ryan Shrout
Tagged: z87, video, overclocking, live, i7-4770k, haswell, ASUS ROG, asus
While we run around with our hair on fire trying to get ready for the Intel Haswell and Z87 product launch this weekend, I wanted to let everyone know about a live stream event we will be holding on Tuesday, June 4th. JJ from ASUS, a crowd favorite for sure, will be joining us LIVE in studio to talk all about the new lineup of ASUS Z87 motherboards. We'll also discuss performance and overclocking capabilities of the new processor and platform.
ASUS Z87 and Haswell Live Stream
10am PT / 1pm ET - June 4th
Be sure to stop by and join in the show! Questions will be answered, prizes will be given out, and fun will be had! Who knows, maybe we can break some stuff live as well? On hand to give away to those of you joining the live stream, we'll have these prizes:
- 2 x ASUS Z87 Motherboards
- 1 x ASUS Graphics card
Methods for winning will be decided closer to the event, but if you are watching live, you'll be included. And we'll ship anywhere in the world!
ASUS and I also want the event to be interactive, so we want your questions. We'll of course be paying attention to the chat room on our live page, but you'll have better luck if you submit your questions about the ASUS Z87 products and Haswell processors beforehand, in the comments section below. You don't have to register to ask, and we'll be able to read through them before the show!
I'll update this post with more information after the reviews and stories start to hit, so keep an eye here for more details!!
Subject: Editorial, General Tech | June 4, 2013 - 03:44 AM | Scott Michaud
Tagged: WCS, starcraft 2, HoTS
A little eye-rest before another barrage of Computex news...
Blizzard took over the canonical StarCraft II tournament scene as of last year. The goal was to create a unified ranking system across every tournament and to help participants deal with scheduling, a problem in recent years. Throughout the entire year, Blizzard is hosting the 2013 StarCraft II World Championship Series. They seem to like breaking rankings into seasons, and the 2013 series alone will incorporate three of them, leading to the year's grand finals in November.
One year per series; three seasons and a grand finals per year; three regional tournaments and a finale per season. This season's finals will take place this weekend, June 8th and 9th, in South Korea.
Tournaments in Europe, Korea, and North America chose the 16 competitors for the 2013 Season 1 Finals this weekend in Korea. The top five competitors in each tournament (top six for Korea) earned their invite. In all: 3 Protoss, 5 Terrans, and 8 Zerg will be participating. I guess their hearts are only half of the swarm.
If the regional matches were any indication, the seasonal finals should be a very entertaining bridge between Computex coverage and E3 2013. Players are getting much better at the game's mechanics while still being able to surprise their opponents, and even the audience, with unusual strategies. Players exploit windows of weakness in their opponents with moments of strength; the entertainment mostly comes from watching each player attempt to delay or lengthen those windows, all while shifting their own weak periods to times when the opponent is unable to reasonably exploit them.
What are your opinions of "eSports"? Good concept, bad name?
Subject: Editorial, General Tech, Systems | May 29, 2013 - 07:16 PM | Scott Michaud
Tagged: windows blue, Windows 8.1, windows, microsoft
Personally, I cannot get too worked up about the user-experience quirks inherent to Windows' modernization; the wedge slowly being shoved between users and their machines is far more concerning. No matter how Microsoft modifies the interface, restricting what users and developers can install and create on their own machines is a deal breaker. But, after that obligatory preface reminding people not to get so wound up in UX hiccups that they grow complacent about the big issues: Windows Blue will certainly address many of those UX hiccups.
As we reported last month, boot-to-desktop and the Start Button were planned for inclusion in Windows 8.1. At the time, sources were relentless in emphasizing: "Until it ships, anything can change."
Images courtesy of Paul Thurrott.
Mary Jo Foley has gathered quite a few details since then. First, the option to boot directly to the desktop will be there; from the sounds of it, it will be disabled by default but not exclusive to Enterprise SKUs. This is somewhat promising, as Microsoft would be slightly less likely to kill support for the desktop (and, by extension, x86 applications) if they feel pressure to keep it prominent. Still, assuming something because "it makes sense" is a bad way to conduct business.
Also returning, seen in higher quality above, is the Start Button, which will be enabled by default as far as we know. Its function will be to bring up the Start Screen or, alternatively, a new All Apps screen visible at ZDNet. Now this has me interested: while I actually like the Start Screen, a list of apps should provide functionality much closer to the Start Menu than Microsoft was previously comfortable with. The Start Screen used to make desktop applications feel less at home than modern apps; this interface appears as though it would feel more at home on the desktop. While probably still jarring, it looks to make finding desktop applications easier and to get out of the way of your desktop experience quickly.
According to Paul Thurrott, those who wish to personalize the Start Screen will have the option to share their desktop wallpaper with it. For tasteful backgrounds, like the one above, I can see this being of good use.
Just please, do not grief someone with a background full of fake tiles.
As a final note, there is still no word about multiple monitor support for "Modern Apps". If you have tried to use them in the past, you know what I am talking about: basically only one at a time, it will jump between monitors if you bring up the Start Screen, and so forth.
Subject: Editorial, General Tech, Cases and Cooling | May 29, 2013 - 02:03 PM | Scott Michaud
Tagged: Windows key, mouse, microsoft, I Hate This Key
Has this ever happened to you while playing a shooter? You need to get to a position, so you mash the Alt key to sprint and... aw crap, I hit the Windows key... well, now I am dead. Have you ever considered purchasing software or a gaming keyboard which allows you to disable that button?
Have you ever considered purchasing a mouse which also has that button to give both hands something to fear?
Definitely not a member of their Sidewinder product line.
Okay, so I should be fair: the Microsoft Sculpt Comfort mouse is not designed for gaming, and Windows 8-style user experiences revolve heavily around the Start button. A dedicated mouse button is also more useful than a redundant Windows key, and the blue pad supports swipe gestures for extra functions. According to its product page, the slide gestures are presented to the computer as mouse buttons 4 and 5.
So you can probably bind them to game functions, if you feel daring.
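Since the gestures surface as ordinary extra buttons, remapping them is just a lookup table in whatever tool or game handles raw input. A minimal sketch, with an entirely hypothetical gesture-to-button mapping and binding names (the product page does not say which swipe maps to which button code):

```python
# Hypothetical: assume the swipes arrive as extra mouse buttons 4 and 5,
# the codes normally used for browser back/forward. Which swipe maps to
# which code is a guess for illustration.
GESTURE_BUTTONS = {4: "swipe_down", 5: "swipe_up"}

def handle_button(code, bindings):
    """Dispatch a raw button code to a user-defined action, if bound."""
    gesture = GESTURE_BUTTONS.get(code)
    return bindings.get(gesture, "unbound")

# Hypothetical in-game bindings:
bindings = {"swipe_up": "reload", "swipe_down": "melee"}
print(handle_button(5, bindings))  # -> reload
print(handle_button(3, bindings))  # -> unbound (middle click left alone)
```

In practice a game that already supports "Mouse 4" and "Mouse 5" in its keybinding menu would pick these up with no extra work.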
But, in the end, I still want to congratulate Microsoft for trying to innovate in computer hardware. This is more than just grafting touch functionality onto a mouse surface, as both Apple and Microsoft have tried in the past; it tries to make the classic mouse experience better. I doubt it is for most of our audience, but not everything needs to be.
Subject: Editorial | May 27, 2013 - 05:08 PM | PCPer Staff
The Dell Inspiron 15R is a good choice for anyone looking for a reasonably powerful and lightweight laptop. It is powered by a 1.8GHz Core i5-3537U and has 8GB of RAM, a 1TB HDD, an integrated DVD burner, and a 15.6" 1366 x 768 LED-backlit LCD driven by the i5's HD 4000 graphics. Not exactly a gaming PC, but at 4.9 lbs it is an easy way to bring your work with you wherever you go with more processing power than a tablet will offer.
Dell Home is offering the 3rd-generation 15.6" Inspiron 15R 5521 Core i7 Ivy Bridge laptop with 8GB RAM, 1TB hard drive, and touchscreen for $799.99 with free shipping. Use the $289 instant savings and the extra $100 coupon code JS3KG6045QZ2BR to get the final price.
Subject: Editorial, General Tech, Systems | May 27, 2013 - 03:08 AM | Scott Michaud
Tagged: xbox one, ps4, consolitis, consoles
So, as a Wired editorial puts it: hardcore console gamers don't want much, just the impossible. They want a "super-powered box" tethered to their TV; they want blockbuster epics and innovative indie titles; they want it to "just work" for what they do. The author, Chris Kohler, wrote his column to demonstrate how this is, and has been for quite some time, highly unprofitable.
I think the bigger problem is that the console manufacturers want the impossible.
Console manufacturers have one goal: get their platform into your house and ensure their hand is in the pocket of everything you do with it. They need an attractive device for that to happen, so they give it enough power to legitimately impress the potential buyer and price it low enough to catch the purchasing impulse. Chances are this involves selling the box under cost at launch and for quite some time after.
But, if all of this juicy control locks the user into overspending in the long run, then it is worth it...
But Microsoft should be thankful that I cost them money to be acquired as a customer.
Well, looking at the Wired article, not only are console gamers ultimately overspending: it is still not enough! Consoles truly benefit no one! The console manufacturers are doing little more than maybe breaking even, at some point, eventually, down the line, they hope. Microsoft and Sony throw obnoxious amounts of money at one another in research, development, and marketing. Redundant technologies are formed to pit against their counterparts, with billions spent in marketing to prove why either choice is better.
All of this money is spent to corral users into a more expensive experience where they can pocket the excess.
Going back to the editorial's claims: with all of this money bleeding out, Microsoft wants to appeal more broadly and compensate for the loss with more cash flowing in. Sure, Microsoft has wanted a foothold in the living room for decades at this point, but the Xbox division bounces between profitability and huge losses; thus, they want to be an entertainment hub, if just for the cash alone.
But think back to the start: these troubles are not because it is impossible to satisfy hardcore gamers. They exist because Microsoft and Sony cannot generate revenue from their acquired control faster than they bleed capital trying to acquire that control; at best, they manage only barely fast enough.
The other solution, which I have long felt is the real answer (hence why I am a PC gamer), has a large group of companies create an industry body that governs an open standard. Each company can make a substantial profit by focusing on a single chunk of the platform -- selling graphics processors, maintaining a marketplace, or what-have-you -- while leveraging the success of every other chunk.
This model does work, and it is the basis for one of humanity's most successful technology products: the internet.
As a side note: this is also why PC gaming was so successful... Microsoft, developers, Steam/GoG/other marketplaces, and hardware vendors were another version of this... albeit Microsoft had the ability to override them and go in whatever direction they wanted. They didn't, until Windows RT.
And the internet might even be the solution. The web browser is capable, today, of providing amazing gaming experiences and it does not even require a plugin. It is getting more powerful, even faster than the rate at which underlying hardware has evolved.
To end on an ironic note, that makes a web browser more capable of offline play than our current understanding of the Xbox One (and Sony has said nothing either way, for that matter).
I guess the takeaway message is: love the web browser, it "just works".
Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM | Scott Michaud
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games
Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental demo. We noted how the quality seemed to have dropped in the eight months following E3, while the demo was being ported to the console hardware. The most noticeable differences were the severely reduced particle counts and the non-existent fine lighting details; of course, Epic pumped up the contrast in the PS4 version, which masked the lack of complexity as if it were a stylistic choice.
Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware at 100% perfect utilization.
Now that we know both the PS4 and, recently, the Xbox One, it is time to dissect more carefully.
A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PC on the market. While there are many ways to interpret that statement, in terms of raw performance it is not valid.
To the best of our current knowledge, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU with an AMD GPU containing 18 GCN compute units, for a total of 1152 shader units. Even without knowing clock frequencies, this chip should be slightly faster than the Xbox One's 768 shader units across 12 GCN compute units. Sony claims the PS4 has a theoretical 2 teraFLOPs of performance in total, and the Xbox One is almost certainly slightly behind that.
Back in 2011, Epic Games created the Samaritan demo to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, back in 2011, that the demo would theoretically require 2.5 teraFLOPs of performance to run at 30 FPS at true 1080p; ultimately it ran on a PC with a single GTX 680, good for approximately 3.09 teraFLOPs.
This required performance, again approximately 2.5 teraFLOPs, is higher than what is theoretically possible on the consoles, which sit below 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
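The back-of-the-envelope math is simple: for GCN-style GPUs, theoretical single-precision throughput is shader count, times two operations per clock (a fused multiply-add), times clock speed. A minimal sketch (the console clock speeds below are assumptions, since neither had been officially confirmed at the time; the GTX 680 figure uses its known 1006 MHz boost clock):

```python
# Theoretical single-precision throughput: shaders x 2 ops/clock (FMA) x clock.
# Console clocks of ~800 MHz are assumptions, not confirmed specifications.
def theoretical_tflops(shader_units, clock_ghz, ops_per_clock=2):
    """Peak single-precision TFLOPs for a given shader count and clock."""
    return shader_units * ops_per_clock * clock_ghz / 1000.0

ps4 = theoretical_tflops(1152, 0.800)       # ~1.84 TFLOPs (assumed clock)
xbox_one = theoretical_tflops(768, 0.800)   # ~1.23 TFLOPs (assumed clock)
gtx_680 = theoretical_tflops(1536, 1.006)   # ~3.09 TFLOPs at 1006 MHz boost

print(ps4, xbox_one, gtx_680)
```

Even granting both consoles generous clocks, neither clears the roughly 2.5 teraFLOPs Samaritan target, which is the point of the comparison.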
Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.
But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.
Update, 5/24/2013: Mark Rein of Epic Games responded to the statement made by Rajat Taneja of EA. While we do not know his opinion of the consoles... we know his opinion of EA's opinion:
— Mark Rein (@MarkRein) May 23, 2013
Subject: Editorial, Graphics Cards | May 23, 2013 - 12:08 PM | Ryan Shrout
Tagged: video, live, gtx 780, gk110
Missed the LIVE stream? You can catch the video replay of it below!
Hopefully by now you have read over our review of the new NVIDIA GeForce GTX 780 graphics card that launched this morning. Taking the GK110 GPU, cutting off some more cores, and setting a price point of $650 will definitely create some interesting discussion.
Join me today at 2pm ET / 11am PT as we discuss the GTX 780, our review and take your questions. You can leave them in the comments below, no registration required.
Subject: Editorial, General Tech | May 22, 2013 - 01:53 AM | Scott Michaud
Tagged: antivirus, antimalware
They might be a good means of guarding against momentary lapses of judgment, but security is not equivalent to an antivirus package. You always need to consider how much your system is exposed to untrusted, and even unsolicited, data. Any software which accepts untrusted data presents some attack surface with potential vulnerabilities.
This, inherently, includes software which accepts data to scan it for malware.
Last week was host to Patch Tuesday, and one of its many updates fixed a vulnerability in Microsoft's Malware Protection Engine (MPE). The affected code is only present in applications which run the 64-bit version of the engine. For home users, those applications are Microsoft Security Essentials (x86-64), the Microsoft Malicious Software Removal Tool (x86-64), and all varieties of Windows Defender (x86-64). For enterprise users, MPE is also part of the Forefront and Endpoint applications and suites.
Despite the irony, I will not beat up on Microsoft. As far as I know, vulnerabilities like these are semi-frequently patched in basically every antimalware application. At the very least, Microsoft declares and remedies problems with reasonable and appropriate policies; they could just as easily have buried this fix and pushed it out silently or, worse, waited until it became actively exploited in the wild, or beyond.
But, and I realize I am repeating myself at this point, the biggest takeaway from this news is that you cannot let the mere presence of an antivirus suite make you complacent. No scanner will detect everything, and some might even be the way in.