Imagination Technologies Unleashes Warrior MIPS P5600 CPU Core Aimed at Embedded and Mobile Devices

Subject: Editorial, General Tech, Networking, Processors, Mobile | October 18, 2013 - 10:45 PM |
Tagged: SoC, p5600, MIPS, imagination

Imagination Technologies, a company known for its PowerVR graphics IP, has launched its first Warrior P-series MIPS CPU core. The new core, called the P5600, is a 32-bit design based on the MIPS Release 5 ISA (Instruction Set Architecture).

The P5600 CPU core can perform 128-bit SIMD computations, provide hardware-accelerated virtualization, and address up to 1TB of memory via virtual addressing. While the MIPS Release 5 ISA provides for 64-bit computation, the P5600 core is 32-bit only and does not include the 64-bit portions of the ISA.
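As a rough illustration of what a 128-bit SIMD unit buys you, a single vector operation covers four 32-bit values at once. The sketch below is a generic example using GCC/Clang vector extensions rather than the P5600's own MIPS SIMD Architecture intrinsics (an assumption on my part, and not reproduced here); it only shows the lane-packing idea.

```cpp
// Minimal sketch of 128-bit SIMD: four 32-bit floats handled per operation.
// Uses GCC/Clang vector extensions for portability; a real P5600 build would
// target the MIPS SIMD Architecture instead (assumption, not shown here).
#include <cstdio>

typedef float v4sf __attribute__((vector_size(16)));  // 128 bits = 4 x float

int main() {
    v4sf a = {1.0f, 2.0f, 3.0f, 4.0f};
    v4sf b = {10.0f, 20.0f, 30.0f, 40.0f};
    v4sf c = a + b;  // one SIMD add covers all four lanes

    for (int i = 0; i < 4; ++i)
        std::printf("lane %d: %.1f\n", i, c[i]);
    return 0;
}
```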

Imagination Technologies Warrior MIPS P5600 CPU Core.png

The MIPS P5600 core can scale up to 2GHz in clock speed when used in chips built on TSMC's 28nm HPM manufacturing process (according to Imagination Technologies). Further, the Warrior P5600 core can be used in both processors and SoCs. As many as six CPU cores can be combined, managed by a coherence manager, and given access to up to 8MB of shared L2 cache. Imagination Technologies is aiming processors containing P5600 cores at mobile devices, networking appliances (routers, hardware firewalls, switches, and the like), and micro-servers.

MIPS-P5600-Coherent-multicore-system.png

A configuration of multiple P5600 cores with L2 cache.

I first saw a story on the P5600 over at The Tech Report, and found it interesting that Imagination Technologies was developing a MIPS processor aimed at mobile devices. It does make sense to see a MIPS CPU from the company, as it owns the MIPS intellectual property, and a CPU core is a logical step for a company with a large graphics IP and GPU portfolio. Developing its own MIPS CPU core would allow it to put together an SoC with its own CPU and GPU components. With that said, I found it interesting that the P5600 CPU core is being aimed at the mobile space, where ARM processors currently dominate. ARM is working to increase performance while Intel is working to bring its powerhouse x86 architecture to the ultra-low-power mobile space. Needless to say, it is a highly competitive market, and Imagination Technologies' new CPU core is sure to have a difficult time establishing itself among consumer smartphone and tablet SoCs. Fortunately, mobile chips are not the only processors Imagination Technologies is aiming the P5600 at. It is also offering up the MIPS Release 5-compatible core for use in processors powering networking equipment, very low power servers, and business appliances, where the MIPS architecture is more commonplace.

In any event, I'm interested to see what else Imagination Technologies has in store for its MIPS IP and where the Warrior series goes from here!

More information on the MIPS P5600 core can be found here.

Win a copy of Batman: Arkham Origins courtesy of NVIDIA

Subject: Editorial, General Tech, Graphics Cards | October 10, 2013 - 12:28 PM |
Tagged: podcast, nvidia, contest, batman arkham origins

UPDATE: We picked our winner for week 1 but now you can enter for week 2!!!  See the new podcast episode listed below!!

Back in August NVIDIA announced that they would be teaming up with Warner Bros. Interactive to include copies of the upcoming Batman: Arkham Origins game with select NVIDIA GeForce graphics cards. While that's great and all, wouldn't you rather get one for free next week from PC Perspective?

batmanao.jpg

Great, you're in luck!  We have a handful of keys to give out to listeners and viewers of the PC Perspective Podcast.  Here's how you enter:

  1. Listen to or watch episode #272 of the PC Perspective Podcast and listen for the "secret phrase" as mentioned in the show!
  2. Subscribe to our RSS feed for the podcast or subscribe to our YouTube channel.
  3. Fill out the form at the bottom of this podcast page with the "secret phrase" and you're entered!

I'll draw a winner before the next podcast and announce it on the show!  We'll give away one copy each week for the next two weeks!  Our thanks go to NVIDIA for supplying the Batman: Arkham Origins keys for this contest!!

No restrictions on winning, so good luck!!

Manufacturer: Scott Michaud

A new generation of Software Rendering Engines.

We have been busy with side projects, here at PC Perspective, over the last year. Ryan has nearly broken his back rating the frames. Ken, along with running the video equipment and "getting an education", developed a hardware switching device for Wirecast and XSplit.

My project, "Perpetual Motion Engine", has been researching and developing a GPU-accelerated software rendering engine. Now, to be clear, this is in very early development for the moment. The point is not to draw beautiful scenes. Not yet. The point is to show what OpenGL and DirectX do, and what limits are removed when you do the math directly.

Errata: BioShock uses a modified Unreal Engine 2.5, not 3.

In the above video:

  • I show the problems with graphics APIs such as DirectX and OpenGL.
  • I talk about what those APIs attempt to solve: finding color values for your monitor.
  • I discuss the advantages of boiling graphics problems down to general mathematics.
  • Finally, I prove the advantages of boiling graphics problems down to general mathematics.

I would recommend watching the video, first, before moving forward with the rest of the editorial. A few parts need to be seen for better understanding.

Click here, after you watch the video, to read more about GPU-accelerated Software Rendering.

Manufacturer: Microsoft

If Microsoft was left to their own devices...

Microsoft's Financial Analyst Meeting 2013 set the stage, literally, for Steve Ballmer's last annual keynote to investors. The speech promoted Microsoft, its potential, and its unique position in the industry. He proclaimed, firmly, their desire to be a devices and services company.

The explanation, however, does not befit either industry.

microsoft-ballmer-goodbye.jpg

Ballmer noted, early in the keynote, how Bing is the only notable competitor to Google Search. He wanted to make it clear, to investors, that Microsoft needs to remain in the search business to challenge Google. The implication is that Microsoft can fill the cracks where Google does not, or even cannot, and establish a business from that foothold. I agree. Proprietary products (which are not inherently bad by the way), as Google Search is, require one or more rivals to fill the overlooked or under-served niches. A legitimate business can be established from that basis.

It is the following, similar, statement which troubles me.

Ballmer later mentioned, in the same vein, how Microsoft is among the few making fundamental operating system investments. Like search, the implication is that operating systems are proprietary products which must compete against one another. This, albeit subtly, does not match their vision as a devices and services company. The point of a proprietary platform is to own the ecosystem, from end to end, and to derive your value from that control. The product is not a device; the product is not a service; the product is a platform. This makes sense to them because, from birth, they were a company which sold platforms.

A platform as a product is neither a device nor a service.

Keep reading to see what Microsoft... probably still cannot.

Gabe Newell LinuxCon Keynote. Announcement Next Week.

Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | September 16, 2013 - 06:15 PM |
Tagged: Steam Box, LinuxCon, Gabe Newell

Valve Software, as demonstrated a couple of days ago, still believes in Linux as the future of gaming platforms. Gabe Newell discussed this situation at LinuxCon this morning, which was streamed live over the internet (and which I transcribed; see the teaser break at the bottom of the article). Someone decided to rip the stream, not the best quality but good enough, and put it on YouTube. I found it and embedded it below. Enjoy!

Gabe Newell highlights, from the seventh minute straight through to the end, why proprietary platforms look successful and how they (sooner or later) fail by their own design. Simply put, you can control what is on your platform. Software you do not like, or even its updates, can be stuck in certification or excluded from the platform entirely. You can limit malicious software, at least to some extent, or even competing products.

Ultimately, however, you limit yourself by not feeding in to the competition of the crowd.

If you wanted to get your cartridge made you bought it, you know, FOB in Tokyo. If you had a competitive product, miraculously, your ROMs didn't show up until, you know, 3 months after the platform holder's product had entered market and stuff like that. And that was really where the dominant models for what was happening in gaming ((came from)).

But, not too surprisingly, open systems were advancing faster than the proprietary systems had. There used to be these completely de novo graphics solutions for gaming consoles and they've all been replaced by PC-derived hardware. The openness of the PC as a hardware standard meant that the rate of innovation was way faster. So even though, you would think, that the console guys would have a huge incentive to invest in it, they were unable to be competitive.

Microsoft attempts to exert control over their platform with modern Windows, which has been met by a year-over-year regression in PC sales; at the same time, PC gaming is the industry's hotbed of innovation and it is booming as a result. In a time of declining PC hardware sales, Steam saw 76% growth (unclear, but it sounds like revenue) over last year.

Valve really believes the industry will shift toward a model with little divide between creator and consumer. The community has been "an order of magnitude" more productive than the actual staff of Team Fortress 2.

Does Valve want to compete with that?

This will only happen with open platforms. Even the consoles, with systems sold below parts and labor costs to exert control, have learned to embrace the indie developer. The next-gen consoles market indie developers, prior to launch, seemingly more than the industry behemoths, and that includes their own titles. They open their platforms a little bit, but it might still not be enough to hold off the slow and steady advance of PC gaming, be it through Windows, Linux, or even web standards.

Speaking of which, Linux and web standards are oft criticized because they are fragmented. Gabe Newell, intentionally or unintentionally, claimed proprietary platforms are more fragmented. Open platforms have multiple bodies pushing and pulling the blob, but it all tends to flow in the same direction. Proprietary platforms each have a lean body with full control over where it goes; there are just many of them. You have a dominant platform and a few competing ones for each sector: phones and tablets, consoles, desktops, and so forth.

He noted that each has a web browser and that, because the web is an open standard, it is the most unified experience across devices from multiple sectors. Open fragmentation is small compared to the gaps between proprietary silos across sectors. ((As a side note: Windows RT is also designed to be one platform for all platforms but, as we have been saying for a while, you would prefer an open alternative to all RT all the time... and, according to the second and third paragraphs of this editorial, it will probably suffer from all of the same problems inherent to proprietary platforms anyway.))

Everybody just sort of automatically assumes that the internet is going to work regardless of wherever they are. There may be pluses or minuses of their specific environment but nobody says, "Oh I'm in an airplane now, I'm going to use a completely different method of accessing data across a network". We think that should be more broadly true as well. That you don't think of touch input or game controllers or living rooms as being things which require a completely different way for users to interact or acquire assets or developers to program or deliver to those targets.

Obviously if that is the direction you are going in, Linux is the most obvious basis for that and none of the proprietary, closed platforms are going to be able to provide that form of grand unification between mobile, living room, and desktop.

Next week we're going to be rolling out more information about how we get there and what are the hardware opportunities that we see for bringing Linux into the living room and potentially pointing further down the road to how we can get it even more unified in mobile.

Well, we will certainly be looking forward to next week.

Personally, for almost two years I found it weird how Google, Valve, and Apple (if the longstanding rumors were true) were each pushing wearable computing, Steam Box/Apple TV/Google TV, and content distribution at the same time. I would not be surprised, in the slightest, if Valve were to add media functionality to Steam and Big Picture and secure a spot in the iTunes and Play Store market.

As for how wearables fit in? I could never quite figure that out but it always felt suspicious.

Read on for our transcript of the keynote speech. Bear with us, it is a little bit rough.

Source: LinuxCon

Steam Family Sharing Is Different From Xbox One

Subject: Editorial, General Tech | September 11, 2013 - 05:31 PM |
Tagged: xbox one, valve, steam

I know there will be some comparison between the recent Steam Family Sharing announcement and what Microsoft proposed, to a flock of airborne tomatoes I might add, for the Xbox One. Steam integrates some level of copy discouragement through accounts which tie a catalog of content to an individual. This account, user name and password, tends to be more precious to the licensee than a physical disk or a nondescript blob of bits.

The point is not to prevent unauthorized copying, however; the point is to increase sales.

SteamRequestingAccess.jpg

Account information is used not just for authentication, but to add value to the service. If you log in to your account from a friend's computer, you have access to your content and it can be installed to their machine. This is slightly more convenient, given a fast internet connection, than carrying a DRM-free game on physical storage (unless, of course, paid licenses are revoked or something). Soon, friends will also be able to borrow your library when you are not using it, provided their devices are authorized by your account.
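As a minimal sketch of the sharing rule described above (my own illustration of the stated policy, not Valve's code or API), the decision boils down to two checks: the owner has authorized the friend's device, and the owner is not currently using the library.

```cpp
// Hypothetical sketch of the Family Sharing rule as described: a friend's device
// may borrow the owner's library only if the owner has authorized that device and
// the owner is not currently using the library. Not Valve's actual logic.
#include <cstdio>
#include <set>
#include <string>

struct OwnerAccount {
    std::set<std::string> authorizedDevices;  // devices the owner has approved
    bool currentlyPlaying = false;            // the owner takes priority over borrowers
};

bool canBorrowLibrary(const OwnerAccount& owner, const std::string& deviceId) {
    return owner.authorizedDevices.count(deviceId) > 0 && !owner.currentlyPlaying;
}

int main() {
    OwnerAccount owner;
    owner.authorizedDevices.insert("friends-desktop");

    std::printf("friend can play: %s\n",
                canBorrowLibrary(owner, "friends-desktop") ? "yes" : "no");  // yes

    owner.currentlyPlaying = true;  // owner starts a game of their own
    std::printf("friend can play: %s\n",
                canBorrowLibrary(owner, "friends-desktop") ? "yes" : "no");  // no
    return 0;
}
```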

Microsoft has a similar authentication system through Xbox Live. The Xbox One also proposed a sharing feature, with the caveat that every device would need to make a small, few-kilobyte internet check-in once every 24 hours.

The general public went mental.

The debate (debacle?) between online sharing and online restrictions saw fans of the idea point to the PC platform and how Steam has similar restrictions. Sure, Steam has an offline mode, but it is otherwise just as restrictive; Valve gets away with it, so Microsoft should too!

It is true, Microsoft has a more difficult time with public relations than Valve does with Steam. However, like EA with their Origin troubles, they have shown themselves to be less reliable than Valve over time. When a purchase is made on Steam, it has been kept available to the best of Valve's abilities. Microsoft, on the other hand, bricked the multiplayer and online features of each and every original Xbox title. Microsoft did a terrible job explaining how the policy benefits customers, and that is the declared reason for the backlash, but had they earned their customers' trust over the years then this might have just blown over. Even so, I find Steam Family Sharing to be a completely different situation from what we just experienced in the console space.

So then, apart from banked good faith, what is the actual difference?

Steam is not the only place to get PC games!

Games can be purchased at retail or from competing online services such as GOG.com. Customers who disagree with the Xbox One license have nowhere else to go. In the event that a game is available only with restrictive DRM, which many are, the publisher and/or developer holds responsibility. There is little stopping a game, like The Witcher 3, from being released DRM-free at launch, trusting the user to be ethical with their bits.

Unfortunately for the Xbox division, controlling the point of sale is how they expect to recover the cost of the subsidized hardware. Their certification and retail policies cannot be circumvented because that is their business model: lose some money acquiring customers who then have no choice but to give you money in return.

This is not the case on Windows, Mac, and Linux. It is easy to confuse Steam with "PC Gaming", however, due to how common it is. They were early, they were compelling, but most of all they were consistent. Their trust was earned and, moreover, is not even required to enjoy the PC platform.

Source: Steam
Subject: Editorial
Manufacturer: AMD

Retiring the Workhorses

There is an inevitable shift coming.  Honestly, this has been quite obvious for some time, but it has just taken AMD a bit longer to get here than many expected.  Some years back we saw AMD release their new motto, “The Future is Fusion”.  While many thought it somewhat interesting, if a bit trite, it actually foreshadowed the massive shift from monolithic CPU cores to their APUs.  Right now AMD’s APUs are doing “ok” in desktops and are gaining traction in mobile applications.  What most people do not realize is that AMD will be going all APU, all the time, in the very near future.

AMD_Fusion.jpg

We can look over the past few years and see that AMD has been headed in this direction for some time, but they simply have not had all the materials in place to make this dramatic shift.  To get a better understanding of where AMD is heading, how they plan to address multiple markets, and what kind of pressures they are under, we have to look at the two major non-APU markets that AMD is currently hanging onto by a thread.  In some ways, timing has been against AMD, not to mention available process technologies.

Click here to read the entire editorial!

 

PC Perspective Hardware Workshop 2013 @ Quakecon 2013 in Dallas, TX

Subject: Editorial, General Tech | August 6, 2013 - 11:30 AM |
Tagged: workshop, video, streaming, quakecon, prizes, live, giveaways

UPDATE: Did you miss the live event?  Check out the replay below!  Thanks to everyone that helped make it possible and see you next year!

It is that time of year again: another installment of the PC Perspective Hardware Workshop!  Once again we will be presenting on the main stage at Quakecon 2013, being held in Dallas, TX, August 1-4.

workshop_logo.png

Main Stage - Quakecon 2013

Saturday, August 3rd, 12:30pm CT

Our thanks go out to the organizers of Quakecon for allowing us and our partners to put together a show that we are proud of every year.  We love giving back to the community of enthusiasts and gamers that drive us to do what we do!  Get ready for 2 hours of prizes, games, and raffles, and the chances are pretty good that you'll take something out with you - really, they are pretty good!

Our thanks for this year's workshop logo goes to John Pastor!!

Our primary partners at the event are those that threw in for our ability to host the workshop at Quakecon and for the hundreds of shirts we have ready to toss out!  Our thanks to NVIDIA, Western Digital and Corsair!!

nvidia_logo_small.png

wdc_logo_small.png

corsair_logo_small.png

Live Streaming

If you can't make it to the workshop - don't worry!  You can still watch the workshop live on our page right here as we stream it over one of several online services.  Just remember this URL: http://pcper.com/workshop and you will find your way!

 

PC Perspective LIVE Podcast and Meetup

We are planning on hosting any fans that want to watch us record our weekly PC Perspective Podcast (http://pcper.com/podcast) on Wednesday evening in our meeting room at the Hilton Anatole.  I don't yet know exactly WHEN or WHERE it will be, but I will update this page accordingly on Wednesday, July 31st when we get the details.  You might also consider following me on Twitter for updates on that status as well.

After the recording, we'll hop over to the hotel bar for a couple drinks and hang out.  We have room for at least 50-60 people to join us in the room, but we'll still be recording if just ONE of you shows up.  :)

 

Prize List (will continue to grow!)

Continue reading to see the list of prizes for the workshop!!!

Qualcomm works out their ARMs and not their core?

Subject: Editorial, General Tech, Processors, Mobile | August 3, 2013 - 04:21 PM |
Tagged: qualcomm, Intel, mediatek, arm

MediaTek, do you even lift?

According to a Taiwan Media Roundtable transcript discovered by IT World, Qualcomm has no interest, at least at the moment, in developing an octo-core processor. MediaTek, their competitor, recently unveiled an eight-core ARM System on a Chip (SoC) whose cores can all be utilized at once. Most other eight-core mobile SoCs function as a fast quad-core paired with a slower, but more efficient, quad-core, with the most appropriate cluster chosen for the task.

qualcomm.png

Anand Chandrasekher of Qualcomm believes it is desperation.

So, I go back to what I said: it's not about cores. When you can't engineer a product that meets the consumers' expectations, maybe that’s when you resort to simply throwing cores together. That is the equivalent of throwing spaghetti against the wall and seeing what sticks. That's a dumb way to do it and I think our engineers aren't dumb.

The moderator, clearly amused by the reaction, requested a firm clarification that Qualcomm will not launch an octo-core product. A firm, but not clear, response was given: "We don't do dumb things". Of course they would not commit to swearing off eight cores for all eternity; at some point they may find core count to be their bottleneck, but that is not the case for the moment. They will also not discuss whether bumping the clock rate is the best option or whether they should focus on graphics performance. He is simply assured that they are focused on the best experience for whatever scenario each product is designed to solve.

And he is assured that Intel, his former employer, still cannot catch them. As we have discussed in the past: Intel is a company that will spend tens of billions of dollars, year over year, to out-research you if they genuinely want to play in your market. Even with his experience at Intel, he continues to take them lightly.

We don't see any impact from any of Intel's claims on current or future products. I think the results from empirical testers on our products that are currently shipping in the marketplace is very clear, and across a range of reviewers from Anandtech to Engadget, Qualcomm Snapdragon devices are winning both on experience as well as battery life. What our competitors are claiming are empty promises and is not having an impact on us.

Qualcomm has a definite lead, at the moment, and may very well keep ahead through Bay Trail. AMD, too, kept a lead throughout the entire Athlon 64 generation and believed they could beat anything Intel could develop. They were complacent, much as Qualcomm sounds currently, and when Intel caught up AMD could not float above the sheer volume of money trying to drown them.

Then again, even if you are complacent, you may still be the best. Maybe Intel will never get a Conroe moment against ARM.

Your thoughts?

Source: IT World

NVIDIA CloudLight: White Paper for Lighting in the Cloud

Subject: Editorial, General Tech | August 3, 2013 - 01:03 PM |
Tagged: nvidia, CloudLight, cloud gaming

Trust the cloud... be the cloud.

The executives on stage might as well have waved their hands while reciting that incantation during the announcement of the Xbox One. Why not? The audience would have just assumed Don Mattrick was trying to get some weird Kinect achievement on stage. You know, kill four people with one laser beam while trying to sink your next-generation platform in a ranked keynote. 50 Gamerscore!

cloudlight.png

Microsoft stated, during and after the keynote, that each Xbox One would have access to cloud servers for certain processing tasks. Xbox Live would be receiving enough servers such that each console could access three times its performance, at launch, to do... stuff. You know, things that are hard to calculate but are not too dependent upon latency. You know what we mean, right?

Apparently Microsoft did not realize that was a detail they were supposed to sell us on.

In the meantime, NVIDIA has been selling us on offloading computation to cloud architectures. We knew Global Illumination (GI) was a very complicated problem. Much of the last couple of decades has been spent progressively removing approximations to what light truly does.

CloudLight is their research project, presented at SIGGRAPH Asia and via Williams College, to demonstrate server-processed indirect lighting. In their video, each of the three effects is demonstrated at multiple latencies. The results look pretty good until about 500 ms, at which point the brightest points are noticeably in the wrong locations.

cloudlight-2.jpg

Again, the video is available here.

The three methods used to generate indirect lighting are: irradiance maps, where lightmaps are continuously calculated on a server and streamed via H.264; photons, where the server ray traces lighting for the scene as previous rays expire and streams only the most current photons to the clients that need them; and voxels, which stream fully computed frames to the clients. The most interesting part is that, as you add more users, server processing in most cases remains fairly constant.
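To make that amortization point concrete, here is a toy sketch, entirely my own illustration rather than anything from the CloudLight paper: the expensive indirect-lighting solve happens once per scene state on the server, and the same payload is then handed to every connected client, so adding clients mostly adds cheap sends rather than extra solves.

```cpp
// Toy sketch of the amortization argument (illustrative only, not NVIDIA's code):
// indirect lighting is solved once per frame on the server, then the identical
// payload is streamed to every client, keeping server cost roughly flat per user.
#include <cstdio>
#include <string>
#include <vector>

struct IndirectLightingPayload {
    int frameId;
    std::string data;  // stand-in for an H.264 irradiance map or photon batch
};

IndirectLightingPayload solveIndirectLighting(int frameId) {
    // The expensive global-illumination work happens here exactly once per frame,
    // regardless of how many clients are connected.
    return {frameId, "encoded indirect lighting for frame " + std::to_string(frameId)};
}

int main() {
    std::vector<std::string> clients = {"console", "laptop", "phone"};

    for (int frame = 0; frame < 3; ++frame) {
        IndirectLightingPayload payload = solveIndirectLighting(frame);  // one solve
        for (const auto& client : clients)                               // N cheap sends
            std::printf("send frame %d to %s\n", payload.frameId, client.c_str());
    }
    return 0;
}
```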

It should be noted, however, that each of the demonstrations in NVIDIA's video only moved the most intense lights slowly. I would expect something such as switching on a light in an otherwise dark room to create a "pop-in" effect if the indirect lighting lags too far behind the user's interaction or the instantaneous dynamic lights.

That said, for a finite number of instant switches, it would be possible for a server to render each possible result and have the client choose the appropriate lightmap (or the appropriate set of pixels from the same, large, lightmap). For an Unreal Tournament 3 mod, I was experimenting with using a Global Illumination solver to calculate lighting. My intention was to allow users to turn on and off a handful of lights in each team's base. As lights were shot out or activated by a switch, the shader would switch to the appropriate pre-rendered solution. I would expect a similar method to work here.
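Here is a minimal sketch of that selection logic, assuming a hypothetical setup where each toggleable light contributes one bit to a state mask and a GI solution has been pre-baked for every reachable combination; none of the names below come from the CloudLight paper or from my old mod code.

```cpp
// Hypothetical sketch: pick a pre-baked indirect-lighting solution based on which
// toggleable lights are currently on. With N switchable lights there are 2^N
// pre-rendered lightmaps (or tiles within one large lightmap), indexed by a bitmask.
#include <bitset>
#include <cstdio>
#include <vector>

struct Lightmap {
    unsigned mask;        // which lights were on when this solution was baked
    const char* texture;  // stand-in for the baked GI texture or tile offset
};

const Lightmap* selectLightmap(const std::vector<Lightmap>& baked, unsigned currentMask) {
    for (const auto& lm : baked)
        if (lm.mask == currentMask)
            return &lm;
    return nullptr;  // no solution was baked for this combination
}

int main() {
    // Two switchable lights in a base -> four baked solutions.
    std::vector<Lightmap> baked = {
        {0b00, "gi_all_off"}, {0b01, "gi_light0_on"},
        {0b10, "gi_light1_on"}, {0b11, "gi_all_on"},
    };

    unsigned state = 0b01;  // light 0 was already on
    state ^= 0b10;          // light 1 toggled by a player
    if (const Lightmap* lm = selectLightmap(baked, state))
        std::printf("lights %s -> use %s\n",
                    std::bitset<2>(state).to_string().c_str(), lm->texture);
    return 0;
}
```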

What other effects do you believe can withstand a few hundred milliseconds of latency?

Source: NVIDIA