CEO Jen-Hsun Huang Sells Windows RT... A Little Bit.

Subject: Editorial, General Tech, Processors, Shows and Expos | March 20, 2013 - 06:26 PM |
Tagged: windows rt, nvidia, GTC 2013

NVIDIA develops processors, but without an x86 license they can only power ARM-based operating systems. When it comes to Windows, that means Windows Phone or Windows RT. The latter segment of the market has seen disappointing sales according to multiple OEMs, a shortfall Microsoft blames on those same OEMs, but the jolly green GPU company is not crying doomsday.

surface-cover.jpg

NVIDIA just skimming the Surface RT, they hope.

As reported by The Verge, NVIDIA CEO Jen-Hsun Huang was optimistic that Microsoft would eventually let Windows RT blossom. He noted how Microsoft very often "gets it right" at some point when they push an initiative. And it is true, Microsoft has a history of turning around perceived disasters across a variety of devices.

They also have a history of, as they call it, "knifing the baby."

I think there is a very real fear for some that Microsoft could consider Intel's latest offerings good enough to stop pursuing ARM. Of course, the more they pursue ARM, the more their business model will rely upon the-interface-formerly-known-as-Metro and likely all of its certification politics. As such, I think it is safe to say that I am watching the industry teeter on a fence with a bear on one side and a pack of rabid dogs on the other. On the one hand, Microsoft jumping back to Intel would allow them to perpetuate the desktop and all of the openness it provides. On the other hand, even if they stick with Intel they will likely just kill the desktop anyway, for the sake of user confusion and the security benefits of certification. We might just have fewer processor manufacturers when they do that.

So it could be that NVIDIA is confident that Microsoft will push Windows RT, or it could be that NVIDIA is pushing Microsoft to continue to develop Windows RT. Frankly, I do not know which would be better... or more accurately, worse.

Source: The Verge

Introducing PC Perspective Podcast Bingo!

Subject: Editorial | February 27, 2013 - 02:26 PM |
Tagged: Podcast Bingo, Bingo

It's Bumpday! No, no I'm not reviving that, at least not yet.

But it is Wednesday and as such we will be gathering for another fine PC Perspective Podcast. As always, if you wish to join us: head down to pcper.com/live or click on the little radio tower in the upcoming events box on the right side of your screen.

Yes, I know, there are two P's. Deal with it.

Starting this week, we will have a new activity for our viewers to follow along with. Play along with the first official PC Perspective Podcast Bingo Card! Keep track of every episode we mention “Arc Welding” by putting us to task and marking off the square.

Be sure to call out the spaces you mark off and whatever Bingo patterns you manage to make out of it.

Our IRC room during the podcast (it's usually active 24/7) is: irc.mibbit.net #pcper

Of course this is just for fun – but fun is fun!

Unreal Engine 4 Demo for PS4, Reduced Quality?

Subject: Editorial, General Tech, Systems | February 26, 2013 - 08:07 PM |
Tagged: ps4, unreal engine 4

Unreal Engine 4 was present at Sony's Playstation 4 press conference, but that is no surprise. Epic Games has been present at several keynotes for new console launches. Last generation, Unreal Engine 3 kicked off both Xbox 360 and PS3 with demos of Gears of War and Unreal Tournament 2007, respectively. The PS4 received a continuation of the Elemental Demo first released at the end of E3 last June.

All I could think about when I watched it was, “This looks pretty bad. What happened?”

If you would like to follow along at home, both demos are available on Youtube:

As you can see from the animated GIF above, particle count appears to have been hit the worst. The PC version appears to render an order of magnitude or two more particles than the PS4; on the PS4 there are no particle effects around the eyes of the statue at all, and whole segments of particles are simply not rendered.

UE4_2_PCvPS4.jpg

In this screenshot, downsampled to 660x355, the loss of physical detail is even more apparent. The big cluster of particles near the leg is not present in the PS4 version, and the cluster that remains is nowhere near as densely packed.

And the lighting, oh the lighting.

On the PS4 everything looks much higher in contrast, without a lot of the subtle lighting information. This loss of detail is most apparent in the volcano smoke and the glow of the hammer, but it is also obvious in the character model when viewed in the video.

Despite the 8GB of RAM, some of the textures also appear to be of lower resolution. Everything has much more of a plastic look to it.

While PCs will continue to look better, at least high-end PC gaming should remain within the realm of scalability for quite some time. We have been hampered by being so far ahead of the consoles that it was simply not feasible to make full use of the extra power. That, at least, looks set to change.

Happy 0th Birthday Firefox OS

Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | February 26, 2013 - 04:19 AM |
Tagged: Firefox OS, mozilla, firefox, MWC, MWC 13

Mobile World Congress is going on at Barcelona and this year sees the official entry of a new contender: Firefox OS.

Mozilla held their keynote speech the day before the official start to the trade show. If there is anything to be learned from CES, it would be that there is an arms race to announce your product before everyone else steals media attention while still being considered a part of the trade show. By the time the trade show starts, most of the big players have already said all that they need to say.

firefoxos.jpg

If you have an hour to spare, you should check it out for yourself. The whole session was broadcast and recorded on Air Mozilla.

The whole concept of Firefox OS, as I understand it, is to open up web standards such that a completely functional mobile operating system can be built from them. Specific platforms do not matter; the content will all conform to a platform of standards which anyone would be able to adopt.

I grin for a different reason: should some content exist in the future that is intrinsically valuable to society, its reliance on an open-based platform will allow future platforms to carry it.

Not a lot of people realize that iOS and Windows RT disallow alternative web browsers. Sure, Google Chrome the app exists for iOS, but it is really a re-skinned Safari. Any web browser in the Windows Store must use Trident as its rendering engine by mandate of Microsoft's certification rules. This allows the platform developer to be choosy about which standards they wish to support. Microsoft has been very vocally against any web standard backed by Khronos, and you cannot install another browser if you run across a web application requiring one of those packages.

When alternatives such as Firefox OS exist, developers are prompted to try new things. The alternative platforms promote standards which generate these new applications and push the market leaders to implement those standards too.

And so we creep ever-closer to total content separation from platform.

Source: Mozilla

Next Generation Consoles Likely Not Compatible

Subject: Editorial, General Tech | February 16, 2013 - 02:08 AM |
Tagged: consoles, consolitis, pc gaming

If you really enjoy an Xbox or Playstation game, better hope your console does not die: it is likely that nothing else will play it. This news comes from a statement made by Blake Jorgensen, CFO of Electronic Arts. Clearly EA is a trusted partner of all console developers and not just an anonymous tipster.

5-depressing.png

You mean, Devil May Stop Crying?

I tend to rant about this point quite often. For a market so devoted to the opinion that video games are art, the market certainly does not care about its preservation as art. There is always room for consumable and even disposable entertainment, but the difference with art is that it cannot be substituted with another piece of content.

There would be a difference if someone magically replaced every copy of Schindler’s List, including the vaulted masters, with The Boy in the Striped Pajamas. I can safely assume that the vast majority of the audience for either film was not just browsing the Holocaust movie genre; I would expect the viewer was seeking out one or the other for a specific reason.

This is incompatible with the console ecosystem by design. The point of the platform is to be disposable, and its content is along for the ride while it lasts. Manufacturers often sell the console for less than its parts and labor cost; research, development, and marketing cost money regardless. The business model is to eliminate as many big fees as possible and then jack up the price of everything else, ten bucks here and there. Over time you are not given a bargain; over time you give them more than they made you think you saved. They then spend this extra money keeping content exclusively under their control, not yours. Also, profits... give or take.

Again, there is always room for consumable entertainment. The consoles are designed to be very convenient, but not cheap and not suitable for timeless art. Really, the only unfortunate element is how these impairments are viewed as assets, while examples such as this one dance around in the background, largely shrugged off and never pieced together.

As for your favorite game? Who knows, maybe you will get lucky and it will be remade on some other platform for you to purchase again. If you are really lucky, it might even be available on the PC.

Source: Ars Technica

Some Stakeholders Yell, "Oh... DELL No!"

Subject: Editorial, General Tech | February 16, 2013 - 01:19 AM |
Tagged: dell

There have been some groups opposed to the planned deal to cease publicly trading Dell and release their shares. It would seem that for many, a short-term payout of 25 percent over trading price is insufficient and, they believe, undervalues the company. I mean, the price is totally not derived from the value you gave it when you just finished trading stocks at 80 percent of what Dell is offering you or anything. Yes, I am making a joke: some investors were almost definitely going long on Dell. I still suspect that some are just playing hardball, hoping that a quarter on the dollar raise is just a starting bid.

Buckle in; I will separate stockholders' opinions into two categories: investment firms and employees.

dell.jpg

Ars Technica clearly had football on the mind when they wrote a very Super Bowl-themed editorial. Early in the month, Southeastern Asset Management sent a letter to Dell management expressing their intent to vote against a deal to go private. The investment firm controls 8.5 percent of Dell, which means their opinion has a fair amount of sway. A short few days later, T. Rowe Price stepped up to likewise oppose the deal. This firm owns 4.4 percent of Dell, which means that combined they have roughly a 13 percent vote.

Factor in a bunch of smaller investors and you are looking at almost a fifth of the company wanting to keep it public. That combined voting power slightly overtakes the 16 percent control owned by Michael Dell and could hamper the festivities.
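If you want to check the voting math, it is simple addition. A minimal sketch using the article's figures; the smaller-investor share is a rough number implied by "almost a fifth", not a disclosed stake:

```python
# Publicly reported stakes in Dell, as percentages of outstanding shares.
southeastern = 8.5       # Southeastern Asset Management
t_rowe_price = 4.4       # T. Rowe Price
smaller_investors = 6.0  # rough estimate implied by "almost a fifth" (assumption)
michael_dell = 16.0      # Michael Dell's reported control

major_opposition = southeastern + t_rowe_price
total_opposition = major_opposition + smaller_investors

print(f"Major firms opposed:        {major_opposition:.1f}%")   # 12.9%
print(f"Estimated total opposition: {total_opposition:.1f}%")   # 18.9%
print(f"Overtakes Michael Dell:     {total_opposition > michael_dell}")  # True
```

Even with a conservative guess for the smaller holders, the opposition edges out Michael Dell's own stake, which is exactly why these letters matter.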

Employees, meanwhile, are upset all the same. Again according to Ars Technica's vigilant coverage, some employees were forced to sell stock acquired as part of their 401(k) at $9 per share, substantially lower than the $13.65 being offered to investors.

There are several other ways which employees get their stake in the company reduced or hampered, but I would direct you to the Ars Technica article so I do not butcher any details.

Unfortunately these sorts of practices are fairly commonplace when it comes to investment deals. It would appear as if this deal trots on common ground instead of taking the high road.

God, I hate mixed metaphors.

Source: Ars Technica

It's More than Windows Feeling "Blue"

Subject: Editorial, General Tech | February 9, 2013 - 02:49 AM |
Tagged: windows blue

Could the sadness Microsoft feels with their OEM partners make the whole company feel just a little Blue?

I have been thinking about this while reading the latest news from Mary Jo Foley over at ZDNet. This has not been the first time that we have mentioned the color. Blue was, and still is, a codename for the first major feature-update of Windows 8. What we learned is that now it seems that “Blue” covers much more.

As many know, Microsoft has shifted their branding into four color-coded divisions: blue is for Windows; red is for Office; green is for Xbox; and yellow has yet to be disclosed. As far as we know, the Windows division encompasses Windows Phone, Internet Explorer, official apps, and so forth. Apparently “Blue”, the codenamed update, will start Microsoft on an annual update schedule for the Windows division. This means that Internet Explorer, the Mail, Calendar, and Bing apps, and other “Windows Services” such as SkyDrive and Hotmail will all shift to the yearly cadence.

WinSetup13.PNG

As I read Mary Jo's article, I focused on a point buried late in the second act of the column:

Instead of RTMing a new version of Windows once every three or so years, and then hoping/praying OEMs can get the final bits tested and preloaded on new hardware a few months later, Microsoft is going to try to push Blue out to users far more quickly, possibly via the Windows Store, my contact said.

While I have speculated about Microsoft's desire to shift business models to a subscription service for quite some time, I had not considered OEM partners as a prominent reason. Microsoft has been wrestling with their manufacturers; that has recently been made obvious. The release of a new operating system drives users to go out and purchase new hardware. The PC industry bounces forward with software and hardware enhancements chained in lockstep to the three-year Windows cycle, even the enthusiast market to some extent.

Perhaps Microsoft is trying to let the hardware itself drive the market. Instead of pushing the industry forward in big leaps, would it be possible that Microsoft wants the hardware to evolve and a new version of Windows to be there waiting for it?

Source: ZDNet

Psychonauts 2 Fell Through. Oh Well.

Subject: Editorial, General Tech | February 5, 2013 - 05:44 PM |
Tagged: Psychonauts, Notch, Tim Schafer

You cannot knock Tim Schafer: he abides by “Shut up and take my money”.

Last year we reported on the public negotiations between the heads of Mojang and Double Fine for a potential sequel to Psychonauts. The game was supposed to take “a couple” of million to make, which was later clarified to at least $13 million USD. This prompted the famous response from Notch, “Yeah, I can do that.”

Some time later, the deal fell by the wayside.

psychonauts.jpg

A storm never came that day; barely a ripple brushed against his wooden canoe.

Recently Notch was on Reddit and commented on the status of the sequel. The final budget ended up around $18 million USD, which was beyond what Notch felt comfortable investing. It was not for a lack of funds, however; Markus stated that he simply did not have the time to be involved in an $18 million deal.

The biggest point I would like to make is how little damage was caused by discussing this out in the open: the game fell through, at least for the moment, and no effigies were burnt. We might be approaching a time and an industry where these sorts of discussions will not need to be performed in strict secrecy.

Congratulations to Markus Persson and Tim Schafer for being brave or eccentric enough to trust the internet. We are sorry it didn't work out, but wish you luck in the future.

Source: PC Gamer

Dell Goes Private, Microsoft Loans Some Help

Subject: Editorial, General Tech, Systems, Mobile | February 5, 2013 - 05:10 PM |
Tagged: dell

Dell, dude, you're getting a Dell!

So it is official: Dell is going private. Michael Dell, CEO, as well as Silver Lake, MSD Capital, several banks, and Dell itself will buy back stock from investors at 25% above the January 11th trading price. The whole deal is worth $24.4 billion USD.

dell.jpg

Going private allows the company to make big shifts in its business without answering to investors on a quarterly basis. We can see how being publicly traded seems to hinder businesses once they grow beyond the point where a cash infusion helps. Even Apple finds it necessary to keep an absolutely gigantic pile of cash to play with, only recently paying dividends to investors.

Also contributing to the buyback, as heavily reported, is a $2 billion USD loan from Microsoft. While it sounds like a lot in isolation, it is only just over 8% of the whole deal. All you can really pull from it is that Microsoft supports Dell in this decision and is putting its money where its intentions are.
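The proportions are easy to verify. A quick sketch; note that the $13.65 per-share figure comes from reports on the buyout offer, and the January 11th price is back-calculated from the stated 25% premium rather than quoted directly:

```python
# Microsoft's loan as a share of the total buyout.
deal_value = 24.4e9      # total deal value, USD
microsoft_loan = 2.0e9   # Microsoft's reported loan, USD

share = microsoft_loan / deal_value
print(f"Microsoft's slice of the deal: {share:.1%}")  # 8.2%

# The reported $13.65/share offer is a 25% premium, implying a pre-deal price of:
offer = 13.65
implied_price = offer / 1.25
print(f"Implied January 11th price: ${implied_price:.2f}")  # $10.92
```

In other words, the loan is real support but hardly a controlling interest, which fits the reading that Microsoft is backing Dell's decision rather than buying into it.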

Source: The Verge

Win FREE Stuff! Seasonic M12II 850 watt and 750 watt PSU up for grabs!!

Subject: Editorial, Cases and Cooling | February 5, 2013 - 12:55 PM |
Tagged: seasonic, PSU, power supply, m12ii, giveaway, contest

We know that our readers love to win free stuff, and who can blame you?  Building PCs can sometimes be a burden on your wallet and we do our best to help that by showing you the best deals and occasionally having contests like this!

Our good friends at Seasonic wanted to offer up a couple of power supplies for our community and we were obviously excited to facilitate!  Here are the goods you can win:

mk12ii-850.jpg

Seasonic M12II SS-850AM 850 watt Power Supply ($130 value - Newegg Link)

mk12ii-750.jpg

Seasonic M12II SS-750AM 750 watt Power Supply ($120 value - Newegg Link)

The Seasonic M12II Bronze series has been an ever-popular power supply line in the semi-modular category. Now Seasonic extends the series by introducing the M12II Bronze 650/750/850 to provide consumers a larger selection in the entry-level 80 PLUS Bronze certified category.

The new M12II Bronze models include a full set of protection features (OCP, OPP, OTP, OVP, SCP & UVP) and meet worldwide safety and environmental standards. The all-new M12II-650/750/850 units are the new leaders in the 80 PLUS Bronze category and another great addition to the Seasonic retail power supply family.

If you are looking to build a new PC or upgrade your current system, either of these two power supplies will make a great backbone for all the other components. 

How do you win? 

  1. Visit your favorite PC Perspective pages like our YouTube channel, Facebook page and Twitter account.  You should subscribe, like and follow us, you know...if you want to.  We'd appreciate it!
  2. Also, stop by the Seasonic Facebook page and give it a look - they are always posting contests and giveaways there!!
  3. Leave a comment here on this post telling us what you would be able to do better if your system was powered by one of these power supplies!

We'll pick a winner on Wednesday the 13th of February, so get your entries in NOW!  A big thank you goes out to Seasonic for supporting PC Perspective and for supporting our loyal readers!

Source: Seasonic

Microsoft Likes That Modern Will Not Get Them Sued: Compatibility Website "modern.IE" Launches

Subject: Editorial, General Tech | February 2, 2013 - 06:23 PM |
Tagged: webkit, w3c, microsoft, internet explorer, html5

Microsoft has been doing penance for its sins against web developers of the past two decades. The company does not want developers to target specific browsers, preferring that they use W3C implementations of features where available.

What an ironic turn of events.

Microsoft traditionally fought web standards, forcing developers to implement ActiveX and filters to access advanced features such as opacity. Web developers would program their websites multiple times to account for the... intricacies... of Internet Explorer when compared to virtually every other browser.

ie-easier.png

Now Google and Apple, rightfully or otherwise (respectively, trollolol), are rapidly gaining in popularity. This leads to websites implementing features exclusively for WebKit-based browsers. Internet Explorer is no longer the browser that gets targeted for advanced effects. If there is Internet Explorer-specific code in a site, it usually consists of workarounds for earlier versions of the browser, which only muck up Microsoft's recent standards compliance by feeding it non-standard junk.

It has been an uphill battle for Microsoft to push users to upgrade their browsers and web developers to upgrade their sites. “modern.IE” is a service which checks for typical incompatibilities and allows for developers to test their site across multiple versions of IE.

Even still, several web technologies are absent from Internet Explorer because they have not been adopted by the W3C. WebGL and WebCL seek to make the web browser a high-performance platform for applications. Microsoft has been vocal about not supporting these Khronos-backed technologies on the grounds of security. Instead of building out the web browser as a cross-platform application platform, Microsoft is pushing hard to keep its app marketplace from being ignored.

I am not sure what Microsoft should fear most: that their app marketplace will be smothered by their competitors, or whether they only manage to win the battle after the war changes theaters. You know what they say, history repeats itself.

Source: Ars Technica

ST Ericsson Shows off First FD-SOI Product

Subject: Editorial | January 16, 2013 - 09:41 PM |
Tagged: ST Ericsson, planar, PD-SOI, L8580, FinFET, FD-SOI, Cortex A9, cortex a15, arm

SOI has been around for some time now, but in partially depleted form (PD-SOI). Quite a few manufacturers have utilized PD-SOI for their products, such as AMD and IBM (probably the two largest producers of SOI-based parts). Oddly enough, Intel has shunned SOI wafers altogether. One would expect Intel to spare no expense to have the fastest semiconductor-based chips on the market, but SOI did not provide enough advantages for the chip behemoth to outweigh the nearly 10% increase in wafer and production costs. There were certainly quite a few interesting properties to PD-SOI, but Intel was able to find ways around bulk silicon's limitations. These non-SOI improvements include stress and strain, low-K dielectrics, high-K metal gates, and now 3D FinFET technology. Intel simply did not need SOI to achieve the performance they were looking for while still using bulk silicon wafers.

stlogo.jpg

Things started looking a bit grim for SOI as a technology a few years back.  AMD was starting to back out of utilizing SOI for sub-32 nm products, and IBM was slowly shifting away from producing chips based on their Power technology.  PD-SOI’s days seemed numbered.  And they are.  That is ok though, as the technology will see a massive uptake with the introduction of Fully Depleted SOI wafers.  I will not go into the technology in full right now, but expect another article further into the future.  I mentioned in a tweet some days ago that in manufacturing, materials are still king.  This looks to hold true with FD-SOI.

Intel had to utilize 3D FinFETs at 22 nm because they simply could not get the performance out of bulk silicon and planar structures. There are advantages and disadvantages to these structures. The advantage is that better power characteristics can be attained without using exotic materials, all the while keeping bins high; the disadvantage is the increased complexity of wafer production. It is arguable that the increase in complexity completely offsets the price premium of an SOI-based solution. We have also seen with Intel's process that while power consumption is decreased compared to the previous 32 nm process, the switching performance vs. power consumption is certainly not optimal. Hence we have not seen Intel release Ivy Bridge parts clocked significantly faster than last-generation Sandy Bridge chips.

FD-SOI with planar structures at 22 nm and 20 nm promises to improve power characteristics compared to bulk/FinFET. It also looks to improve overall power vs. clockspeed. In a nutshell this means better power consumption as well as a jump in clockspeed compared to previous generations. Gate-first designs using FD-SOI could be very good, but industry analysts say that gate-last designs could be “spectacular”.

SOIConsortiumFDSOIBulk.jpg

So what does this have to do with ST Ericsson? They are one of the first companies to show a product based on 28 nm FD-SOI technology. The ARM-based NovaThor L8580 is a dual Cortex A9 design with an IMG SGX544 graphics portion. At first glance we would think that ST is behind the ball, as other manufacturers are releasing Cortex A15 parts which improve IPC by a significant amount. Then we start digging into the details.

The fastest Cortex A9 designs that we have seen so far have been clocked around 1.5 GHz.  The L8580 can be clocked up to 2.5 GHz.  Whatever IPC improvements we see with A15 are soon washed away by the sheer clockspeed advantage that the L8580 has.  While it has been rumored that the Tegra 4 will be clocked up to 2 GHz in tablet form, ST is able to get the L8580 to 2.5 GHz in a smartphone.  NVIDIA utilizes a 5th core to improve low power performance, but ST was able to get their chip to run at 0.6v in low power mode.  This decrease in complexity combined with what appears to be outstanding electrical and thermal characteristics makes this a very interesting device.
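To see how clockspeed can wash out an IPC advantage, consider a very rough single-threaded model: relative performance is approximately IPC times clock. The IPC uplift and the smartphone-class A15 clock below are illustrative assumptions for the sketch, not measured values:

```python
# Very rough single-threaded model: relative performance ~ IPC x clock.
a9_ipc = 1.0     # normalize Cortex A9 IPC to 1.0
a15_ipc = 1.4    # assume ~40% IPC uplift for the Cortex A15 (illustrative)

l8580 = a9_ipc * 2.5       # NovaThor L8580: Cortex A9 at 2.5 GHz in a smartphone
a15_phone = a15_ipc * 1.7  # a smartphone-class A15 at ~1.7 GHz (assumption)

print(f"L8580 relative perf:     {l8580:.2f}")      # 2.50
print(f"A15 phone relative perf: {a15_phone:.2f}")  # 2.38
```

Under these assumptions the clockspeed gap roughly cancels the A15's IPC advantage, which is the crux of the argument; real workloads, memory subsystems, and actual shipping clocks will move the numbers in either direction.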

The Cortex A9 cores are not the only part to see an improvement in clockspeed and power consumption. The well-known and extensively used SGX544 graphics portion runs at 600 MHz in a handheld device, clocked around 20% faster than other comparable parts.

L8580.jpg

When we add all these things together we have a product that appears to be head and shoulders above current parts from Qualcomm and Samsung. It also appears that these parts are comparable to, if not slightly ahead of, the announced next generation of parts from the Cortex A15 crowd. Does it stand to reason, then, that ST Ericsson will run away with the market and be included in every new handheld sold from now until the first 22/20 nm parts are released? Unfortunately for ST Ericsson, this is not the case. If there is an Achilles' heel to the L8580, it is production capability. ST Ericsson started production on FD-SOI wafers this past spring, but it was processing hundreds of wafers a month vs. the thousands required for full-scale production. We can assume that ST Ericsson has improved this situation, but they are not exactly a powerhouse when it comes to manufacturing prowess. They simply do not seem to have the FD-SOI production capacity to handle orders from more than a handful of cellphone and tablet manufacturers.

ST Ericsson has a very interesting part, and it certainly looks to prove the capabilities of FD-SOI when compared to competing products produced on bulk silicon. The NovaThor L8580 will gain some new customers with its combination of performance and power characteristics, even though it is using the “older” Cortex A9 design. FD-SOI has certainly caught the industry's attention. There are more FD-SOI factoids floating around that I want to cover soon, but those will have to wait. For the time being ST Ericsson is on the cutting edge of SOI, and their proof-of-concept L8580 seems to have exceeded expectations.

Source: ST Ericsson

CES 2013: Valve Announces Steam Box Not Announced in '13

Subject: Editorial, General Tech, Systems, Shows and Expos | January 9, 2013 - 02:44 PM |
Tagged: CES, ces 2013, valve, Steam Box

CES opened with excitement from Xi3 Corporation and their announcement of the Piston. When Gabe Newell spoke with Kotaku at the VGAs, he said that we would see Big Picture PCs this year. With word that Xi3 received funding from Valve, some of us, myself included, wondered if this Piston was Valve’s “Nexus” Steam Box.

Ben Krasnow, hardware engineer at Valve, comments in the video below about whether we have seen Valve’s canonical Steam Box or if there are any planned announcements for 2013.

Thank you Ben.

There seem to be some discrepancies between statements from Krasnow and Valve Managing Director Gabe Newell. The major deviation concerns whether the official Steam Box will be based on Linux or another operating system. When interviewed by The Verge, Gabe Newell claimed in no uncertain terms that the official box will be based on Linux:

We’ll come out with our own and we’ll sell it to consumers by ourselves. That’ll be a Linux box, [and] if you want to install Windows you can. We’re not going to make it hard. This is not some locked box by any stretch of the imagination.

Krasnow, in an email discussion with Engadget, was somewhat more timid about future plans. The Engadget article was published on the same day as The Verge interview, which makes neither position particularly out of date. His statement:

"The box might be linux-based, but it might not," he continued. "It's true that we are beta-testing Left for Dead 2 on Linux, and have also been public about Steam Big Picture Mode. We are also working on virtual and augmented reality hardware, and also have other hardware projects that have not been disclosed yet, but probably will be in 2013."

At the same time, this might all become irrelevant very quickly. As reported yesterday, Gabe Newell in the very same interview seemed to strongly suggest that post-Kepler GPUs will bring virtualization to the consumer market. If that is the case, then the only barrier between Linux and Windows would be for a company to provide a user-friendly virtual machine. Having one or the other as your host operating system would not particularly matter if you could run gaming applications from the other platform.

Coverage of CES 2013 is brought to you by AMD!

PC Perspective's CES 2013 coverage is sponsored by AMD.

Follow all of our coverage of the show at http://pcper.com/ces!

Source: Engadget

Also Not CES 2013: RIP Messenger, 7/22/1999 - 3/15/2013

Subject: Editorial, General Tech | January 9, 2013 - 04:08 AM |
Tagged: MSN, MSN Messenger

Windows Live Messenger, formerly MSN Messenger, almost lived to see its fourteenth birthday.

The messaging service will continue to live on in our hearts, minds, and China. A butterfly flapped its wings and stirred up a storm of trouble for AOL with its superior webcam capabilities. The purchase of a competing service by Microsoft should have come as a well Timed Warner of an impending fate similar to which AIM suffered.

MSN_Messenger_logo.png

We should remember Messenger as a fighter who swam against the Windows Live Waves. While it now looks down on us from the Skype I imagine it is at peace. It knows that it did all that it could. It knows that Skype fairly beat it at its own game.

Getting progressively worse with each and every version.

So long MSN! My Trillian contact list will feel emptier without you! But please do not be offended by my eulogy, as I say this in the utmost respect: Yahoo!

Coverage of CES 2013 is brought to you by AMD!

PC Perspective's CES 2013 coverage is sponsored by AMD.

Follow all of our coverage of the show at http://pcper.com/ces!

Source: Forbes

CES 2013: The Verge Interviews Gabe Newell About the Steam Box. Valve's Director Hints Post-Kepler GPUs Can Be Virtualized!

Subject: Editorial, General Tech, Graphics Cards, Networking, Systems, Shows and Expos | January 8, 2013 - 11:11 PM |
Tagged: valve, gaben, Gabe Newell, ces 2013, CES

So the internet has been in a roar about the Steam Box, and it will probably eclipse Project Shield as the topic of CES 2013. The Verge scored an interview about the company's hardware future and got more than it asked for.

Gaben.jpg

Now if only he had discussed potential launch titles.

Wow! That *is* a beautiful knife collection.

The point which stuck with me most throughout the entire interview was directed at Valve’s opinion of gaming on connected screens. Gabe Newell responded,

The Steam Box will also be a server. Any PC can serve multiple monitors, so over time, the next-generation (post-Kepler) you can have one GPU that’s serving up eight simulateneous [sic] game calls. So you could have one PC and eight televisions and eight controllers and everybody getting great performance out of it. We’re used to having one monitor, or two monitors -- now we’re saying lets expand that a little bit.

This is pretty much confirmation, assuming no transcription errors on the part of The Verge, that Maxwell will support the virtualization features of GK110 and bring them mainstream. It also makes NVIDIA Grid far more sensible in the long term. Perhaps NVIDIA will provide some flavor of Grid server for households directly?

The concept gets me particularly excited. One of the biggest wastes of money in the tech industry is purchasing redundant hardware. Consoles are a perfect example: not only is the system redundant next to your other computational device, which is usually at worst a $200 GPU away from a completely better experience, but you also pay for software tied to that redundant platform, software which will eventually disappear along with it. In fact, many people own multiple redundant consoles because the software they want is not localized to just one system, so they need redundant redundancies. Oy!

A gaming server should make the redundancy argument more obvious. If you need extra interfaces, then you should only need to purchase the extra interfaces. Share the number crunching, and keep only that shared hardware up to date.
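
The redundancy argument above boils down to simple arithmetic. As a back-of-the-envelope illustration (all prices below are hypothetical placeholders, not figures from the article):

```python
# Hypothetical cost comparison: every player buying a full, redundant
# console versus one shared gaming server plus cheap "interfaces"
# (screens and controllers). All prices are illustrative placeholders.

def redundant_cost(console_price: float, num_players: int) -> float:
    """Every player buys a full, redundant console."""
    return console_price * num_players

def shared_cost(server_price: float, interface_price: float,
                num_players: int) -> float:
    """One server does the number crunching; players only buy interfaces."""
    return server_price + interface_price * num_players

players = 4
consoles = redundant_cost(console_price=400.0, num_players=players)
shared = shared_cost(server_price=1000.0, interface_price=60.0,
                     num_players=players)

print(f"{players} redundant consoles: ${consoles:.2f}")   # 1600.00
print(f"1 server + {players} interfaces: ${shared:.2f}")  # 1240.00
```

The gap widens as players are added, since the shared model only pays the interface price per additional seat while its "number cruncher" is upgraded once for everyone.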

Also check out the rest of the interview over at The Verge. I decided just to cover a small point with potentially big ramifications.


Source: The Verge

CES 2013: Valve Talks Piston, Better Listen. Steam Box!

Subject: Editorial, General Tech, Systems, Shows and Expos | January 8, 2013 - 02:06 PM |
Tagged: Xi3, valve, trinity, Steam Box, ces 2013, CES, amd

Going from a failed Kickstarter to Valve’s premier console? Sounds like a good anecdote to tell.

Valve has finally discussed the Steam Box in more concrete details. Get ready for some analysis; there are a bunch of hidden stories to be told. We will tell them.

Update for clarity: As discussed in IRC, this was technically an Xi3 announcement rather than an official Valve announcement. That said, Valve will have the device at their booth, and Valve funded Xi3.

Another update for new information: It turns out this is not the Valve-official device. Ben Krasnow, a Valve hardware engineer, has stated that the official Steam Box is not planned to be announced in 2013. What we will see this year are third-party implementations, and that should be it. News story to follow.

ValvePiston.jpg

Image by Engadget

As everyone is reporting, Valve hired Xi3 Corporation to develop the Steam Box under the codename Piston. Xi3 Corporation was founded in 2010 and revealed its first product at CES two years ago. In late September, Xi3 launched an unsuccessful Kickstarter to fund its latest designs: the X7A and the X3A.

The X7A Modular Computer is the most interesting, as it seems to be what the Piston is based on. Regardless of the Kickstarter's failure, Valve still reached out to Xi3 Corporation, chequebook in hand. According to the Kickstarter page, the X7A has the following features:

  • 64-bit quad-core x86 processor up to 3.2 GHz with 384 graphics shader cores.
    • My personal best guess is the AMD A10-4600M Trinity APU.
  • 8 GB of DDR3 RAM
  • 1 TB of “Superfast” solid state memory
  • Four USB 3.0 ports
  • Four USB 2.0 ports
  • Four eSATAp ports
  • Gigabit Ethernet
  • 40 W under load
  • “Under $1000”, although that includes the 1 TB of SSD storage.
    • Also, Valve could take a loss, because Steam has no problem with attach rate.

The key piece of information is the 40 W rating. According to Engadget, who went hands-on with the Valve Piston, it too is rated for 40 W under load. This makes it quite likely that the core specifications of the Piston closely match those of the Kickstarter design.

Benchmarks for the Radeon HD 7660G, the Trinity APU's integrated GPU, have it running Far Cry 3 at around 34 FPS on low settings and Black Ops 2 at 42 FPS on medium. That said, with a specific hardware platform to target, developers will be able to optimize further.

During the SpikeTV VGAs, Gabe Newell stated in an interview with Kotaku that third parties would also make “Steam Boxes”. They are expected to be available at some point in 2013.


Source: Engadget

CES 2013: ZOTAC Has a New ZBOX mini-PC

Subject: Editorial, General Tech, Systems, Shows and Expos | January 7, 2013 - 02:00 PM |
Tagged: zotac, nuc, ces 2013, CES

Zotac-ZBOXbanner.jpg

If you were interested in the Intel NUC review from mid-December then you might be interested in its competitors.

ZOTAC has been making small form factor PCs for three years at this point. This third iteration pairs an NVIDIA GeForce GT 610 graphics card with a 2nd Generation Intel Core processor. With the ZBOX you can stream video and other content over dual Gigabit Ethernet or dual external Wi-Fi antennas. Unlike Intel, ZOTAC is making a big deal about the cooling capabilities of its new chassis.

Zotac-ZBOX2.jpg

They will also keep their 2nd generation ZBOX chassis available, presumably for those who would be upset about a 7 mm increase in size, with Intel HD 4000 graphics. I could not find any discussion of price or release date, however.

Press release after the break.


Source: ZOTAC

CES 2013: NVIDIA Grid to Fight Gaikai and OnLive?

Subject: Editorial, General Tech, Graphics Cards, Systems, Mobile, Shows and Expos | January 7, 2013 - 01:07 AM |
Tagged: CES, ces 2013, nvidia

The second act of the NVIDIA keynote speech re-announced their Grid cloud-based gaming product first mentioned back in May during GTC. You have probably heard of its competitors, Gaikai and OnLive. The mission of these services is to have all of the gaming computation done in a server somewhere and allow the gamer to log in and just play.

phpzXHEk9IMG_8955.JPG

The NVIDIA Grid is their product top-to-bottom. Even the interface was created by NVIDIA and, as they laud, rendered server-side on the Grid itself. It was demonstrated streaming directly to an LG smart TV and to Android tablets. A rack will contain 20 servers and 240 GPUs, for a total of 200 teraflops of computational power. Each server will initially be able to support 24 players, which is interesting given the last year of NVIDIA announcements.

Last year, during the GK110 announcement, Kepler was said to support hundreds of clients accessing a single server for professional applications. It seems only natural that Grid would benefit from that advancement, but it apparently does not. With a limit of 24 players per box, equating to a maximum of two players per GPU, it seems odd that such a cap would be in place. The benefit of stacking multiple players per GPU is that you can achieve better-than-linear scaling in the long tail of games.
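
The per-GPU figure follows directly from NVIDIA's stated rack numbers above; a quick sanity check:

```python
# Sanity-check the NVIDIA Grid rack figures quoted above:
# 20 servers and 240 GPUs per rack, 200 TFLOPS total, 24 players per server.

servers_per_rack = 20
gpus_per_rack = 240
tflops_per_rack = 200
players_per_server = 24

gpus_per_server = gpus_per_rack // servers_per_rack      # 12 GPUs per server
players_per_gpu = players_per_server / gpus_per_server   # 2.0 players per GPU
tflops_per_gpu = tflops_per_rack / gpus_per_rack         # ~0.83 TFLOPS per GPU

print(gpus_per_server, players_per_gpu, round(tflops_per_gpu, 2))
```

Two players on a roughly 0.83-teraflop slice of the rack is a long way from the "hundreds of clients" GK110 was pitched at, which is what makes the cap look conservative.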

Then again, all they need to do is solve the scaling problem before they have a problem with scaling their service.


HTML5 Games: The Legacy of PC Gaming?

Subject: Editorial, General Tech, Mobile | December 30, 2012 - 04:48 PM |
Tagged: webgl, w3c, html5

I use that title in quite a broad sense.

I ran across an article on The Verge which highlighted the work of a couple of programmers porting classic real-time strategy games to the web browser. Command & Conquer and Dune II, two classics of PC gaming, are now available online for anyone with a properly standards-compliant browser.

CandCHTML5.png

These games, along with the Sierra classics I wrote about last February, are not just a renaissance of classic PC games: they preserve them. It is up to the implementer to follow the standard, not the standards body to approve implementations. So long as someone still makes a browser which can access a standards-based game, the game can continue to be supported.

A sharp turn from what we are used to with console platforms, right?

I have been saying this for quite some time now: Blizzard and Valve tend to support their games much longer than console manufacturers support their whole platforms. You can still purchase at retail, and they still manufacture, the original StarCraft. The big fear over “modern Windows” is that backwards compatibility will be ended and all applications would need to be certified by the Windows Store.

When a game is programmed for the browser -- yes, even hosted offline on local storage -- those worries disappear. The exceptions are iOS and Windows RT, which only allow you to use Safari or Trident (IE10+), respectively, and thus still leave you solely at the platform owner's mercy to follow standards.

Still, as web standards approach native applications in features and performance, we will have a venue for artists to create and preserve their work for later generations to experience. The current examples might be 2D and of the pre-Pentium era, but even now there are 3D shooters delivered from websites. There is even a ray tracing application built on WebGL (although that technically relies on both the W3C and Khronos standards bodies) that runs on a decent computer in plain-old Firefox or Google Chrome.

Source: The Verge

Phoronix on OpenCL Driver Optimization, NVIDIA vs. AMD

Subject: Editorial, General Tech, Graphics Cards | December 28, 2012 - 02:43 AM |
Tagged: opencl, nvidia, amd

The GPU is slowly becoming the parallel processing complement to your branching logic-adept CPU. Developers have been slow to adopt this new technology but that does not hinder the hardware manufacturers from putting on a kettle of tea for when guests arrive.

While the transition to GPGPU is slower than I am sure many would like, developers are rarely quick on the uptake of new technologies. The Xbox 360 was one of the first platforms where unified shaders became mandatory, and early developers avoided them by offloading vertex code to the CPU. On that note: how much software still gets released without multicore support?

7-TuxGpu.png

Phoronix, practically the arbiter of all Linux news, decided to put several GPU drivers and their manufacturers to the test. AMD was up first, and its results showed a pretty sizeable jump in performance around October of this year through most of the tests. The article on NVIDIA arrived two days later and saw performance trend basically nowhere since February's 295.20 release.

A key piece of information is that both benchmarks were performed with last-generation GPUs: the GTX 460 on the NVIDIA side, with the Radeon HD 6950 holding AMD's flag. You might note that 295.20 was the last tested driver released prior to the launch of Kepler.

These results seem to suggest that upon the launch of Kepler, NVIDIA did practically zero optimizations to their older "Fermi" architecture at least as far as these Linux OpenCL benchmarks are concerned. On the AMD side, it seems as though they are more willing to go back and advance the performance of their prior generation as they release new driver versions.

There are very few instances where AMD beats out NVIDIA in terms of driver support -- it is often a selling point for the jolly green giant -- but this appears to be a definite win for AMD.

Source: Phoronix