Author:
Subject: Editorial
Manufacturer: iD

3+ Hours of discussion later...

IMG_2469.jpg

QuakeCon always opens with several hours of John Carmack talking about very technical things.  The keynote is nominally two hours but typically runs into the three to four hour range, and it was no different this time.  John certainly has the gift of gab when it comes to his projects, but unlike others his gab is chock full of useful information, often quite beyond the understanding of those in the audience.

 

RAGE-logo1.png

The first topic of discussion was last year’s Rage launch.  John was quite apologetic about how it went, especially in terms of PC support.  For a good portion of users out there, the game simply would not work due to driver issues on the AMD side.  The lessons they learned from Rage were tremendous.  iD simply cannot afford to release only two games in a decade; Rage took some six plus years of development.  Consider that Doom 3 was released in 2004, and we did not see Rage until Fall 2011.  The technology in Rage is a big step up due to the use of iD Tech 5, and the art assets of the title are very impressive.

iD also made some big mistakes in how they marketed the title.  Many people assumed it would be a title more in line with Bethesda’s Fallout 3, with a lot of RPG-type missions and storyline.  Instead of the 80 hour title one would expect, it was a 10+ hour action title.  So marketing needs to create a better representation of what the game entails.  They also need to stay a bit more focused on what they will be delivering, and be able to do so in a timely manner.

Read the rest of John's Keynote by clicking here.

Author:
Manufacturer: SiliconDust

An HTPC Perspective on home theater PC technology

We conducted a reader survey a few weeks ago, and one of the tech topics that received a surprising amount of interest was HTPC coverage. You, our awesome readers, wanted to know more about the hardware and software behind them. I’ll admit that I was enthusiastic about the prospect of talking HTPCs with you. As a relatively new entrant to that area of tech myself, I was excited to cover it and give you more coverage on a topic you wanted to see more of!

Today we won't be talking about home theater PCs in the sense of a computer in the living room AV rack (Ryan covered that earlier this week), but rather a related technology that makes the HTPC possible: the CableCARD-equipped TV tuner.

I will forewarn you that this article is quite a bit more informal than my usual writings, especially if you only follow my PC Perspective postings. In the future, it may not be that way, but I wanted to give some backstory and some personal thoughts on the matter to illustrate how I got into rolling my own DVR and why I’m excited about it (mainly: it saves money and is very flexible).

Preface/Background

Despite my previous attempts to “cut the cord” and use only Internet-based services for television, my girlfriend and I slowly but surely made our way back to cable TV. For about a year we survived on Netflix, Hulu, and the various networks’ streaming videos on their respective websites, but as the delays between a show’s airing and web streaming availability increased and Netflix Instant Streaming started losing content, the price of cable started to look increasingly acceptable.

She was probably the first to feel the effects of the lack of new content – especially with a newfound love for a rather odd show called True Blood. At some point thereafter, once she had caught up with as many seasons of various shows as Netflix offered, she broke down and ordered U-Verse. U-Verse is an interesting television delivery setup that uses Internet Protocol (IPTV). While we did have some issues at first with the Residential Gateway and signal levels, it was eventually sorted out and it was an okay setup. It offered a lot of channels – many in HD. In the end though, after the promotional period was up, it got very expensive to stay subscribed. Also, because it was IPTV, it was not as flexible as traditional cable when it came to adding extra televisions and DVR functionality. Further, the image quality of the HD streams, while much better than SD, was not up to par with the cable and satellite feeds I’ve seen.

I have been with Comcast for Internet service for about three years now, and I’ve been fairly happy with it. One day I saw a promotion for current subscribers: TV + Blast internet for $80, which was only about $20 more than I was paying each month for the Performance tier. Therefore, I decided to sign up for it. Only, I did not want to rent a Comcast box, so I went searching for alternatives.

Enter the elusive and never advertised CableCARD

It was during this search that I learned a great deal about CableCARDs and the really cool things they enable. Thanks to the FCC, cable television providers in the United States have to give their customers an option other than renting a cable box for a monthly fee – customers must be able to bring their own equipment if they wish (providers can still charge you for the CableCARD, but at a reduced rate, and not all cable companies charge a fee for them). But what is a CableCARD? In short, it is a small card that resembles a PCMCIA expansion card – a connector commonly found in older laptops (think Windows XP-era). It is paired with a CableCARD tuner and acts as the key to decrypt the encrypted television channels in your particular subscriber package. CableCARDs are added much like a customer-owned modem is: you give the cable company some numbers on the bottom of the card that act as a unique identifier. The cable company then connects that particular card to your account and sends it a profile of what channels you are allowed to tune into.

Cablecard_for_PC Perspective_Tim_Verry.jpg

There are some drawbacks, however. Mainly, On Demand does not work with most CableCARDs. Do note that this is not a CableCARD hardware issue, but a support issue on the cable company side. You could, at least in theory, get a CableCARD and tuner that could tune in On Demand content, but right now that functionality seems to be limited to some TiVos and the rental cable boxes (paradoxically, some of those are actually CableCARD-equipped). It’s an unfortunate situation, but here’s hoping that it is supported in the future.

Also, if you do jump into the world of CableCARDs, you will likely find yourself in a situation where you know more about them than the cable installer, as cable companies do not advertise them and only a small number of employees are trained on them. Don’t be too hard on the cable tech though; cable companies would rather rent you an (expensive) box, and a very small number of people actually know about the technology and need a tech to support it. I was lucky enough to get one of the “CableCARD guys” on my first install, but I’ve also gotten techs that had never seen one before, and it made for an interesting conversation piece as they diagnosed signal levels for the cable modem (heh).

Basically, patience is key when activating your CableCARD. If you opt for a self-install, I highly recommend asking around forums like DSLReports for the specific number(s) to call to reach the tier 2 techs at your provider who are familiar with CableCARDs. Even then, you may run into issues. For example, something went wrong with activation on the server side at Comcast during my install, so it took a couple of hours for them to essentially unlock all of my HD channels.

Continue reading to find out why I'm so excited about CableCARDs and home theater PCs!

Author:
Subject: Editorial
Manufacturer:

Introduction, Hardware To Look For

windows98.png

Every year the college I graduated from, Beloit College, publishes its not-that-famous “mindset list.” It’s a collection of one-liners, such as “Clint Eastwood is better known as a director than as Dirty Harry,” meant to humorously remind professors that the experiences of their generation are not the same as the generation about to show up in their classrooms.
 
I’ve sometimes felt a need for a similar reminder among gamers. Arcade classics like Pac-Man and DOS legends such as Prince Of Persia are often cited in conversations about old-school gaming, yet many gamers (including myself) never had the experience of playing these titles when they first hit store shelves. 
 
interstate76.png
 
I enjoyed a different generation of classics. My original copy of Interstate ’76 is nestled in a binder of old CDs. A boxed copy of Mechwarrior 2 sits on my bookshelf. I have Baldur’s Gate, Sid Meier’s Alpha Centauri, Total Annihilation 2, Starcraft, SimCity 2000, The Elder Scrolls: Daggerfall and Age of Empires II, to name a few. These were my formative gaming experiences. Some have always been with me – others, lost or destroyed, have been re-acquired from thrift stores for a few bucks each.
 
Yet I can’t play most of these games without buying them again (via a service like Good Old Games) or resorting to virtualization. The reliability of Windows’ compatibility mode is spotty to say the least.
Even if a game does run on my Windows 7 PC, something is missing. The controllers of yesterday usually don’t agree with – or can’t physically connect to – my modern desktop. The graphics, designed for the CRT era, often don’t translate well to a high-resolution LCD. Random bugs and errors can occur, stopping the games in their tracks.
 
I’ve finally decided that there is only one solution. If you want to run a game from the 1990s and enjoy it properly, you should have hardware that can play games from that era as originally intended. That means putting together a legacy gaming system.
 
This is something that I think anyone should be able to do without spending more than $150. But can you, and if so, is it worth your time?
 
 
Author:
Subject: Editorial
Manufacturer: AMD

Less Risk, Faster Product Development and Introduction

There have been quite a few articles lately about the upcoming Bulldozer refresh from AMD, but much of the information they contain is not new.  I have put together a few things that seem to have escaped a lot of these articles, hoping to shine a light on what I consider the most important aspects of these upcoming releases.  The positive thing most of these articles have achieved is increasing interest in AMD’s upcoming products, and what they might do for that company and the industry in general.

fx_chip.jpg

The original FX-8150 hopefully will be only a slightly embarrassing memory for AMD come Q3/Q4 of this year.

The current Bulldozer architecture that powers the AMD FX series of processors is not exactly an optimal solution.  It works, and seems to do fine, but it does not surpass the performance of the previous generation Phenom II X6 series of chips in any meaningful way.  Let us not mention how it compares to Intel’s Sandy Bridge and Ivy Bridge products.  It is not that the design is inherently flawed or bad, but rather that it was a unique avenue of thought that was not completely optimized.  The thinking is that AMD seems to have given up on matching the high single threaded performance that Intel has excelled at for some time.  Instead they are going for good single threaded performance, and outstanding multi-threaded performance.  To achieve this they had to rethink how to make the processor as wide as possible, keep the die size and TDP down to reasonable levels, and still achieve a decent amount of performance in single threaded applications.

Bulldozer was meant to address this idea, and its success is debatable.  The processor works, it shows up as an eight logical core processor, and it seems to scale well with multi-threading.  The problem, as stated before, is that it does not perform like a next generation part.  In fact, it is often compared to Intel’s Prescott, which was a larger chip on a smaller process than the previous Northwood processor, but did not outperform the earlier part in any meaningful way (except in heat production).  The difference is that Bulldozer is an entirely new architecture, whereas Prescott was a direct evolution of the Northwood lineage.  AMD has radically changed the way it designs processors.  Taking some lessons from the graphics arm of the company and their successful Radeon brand, AMD is applying that train of thought to processors.

Continue reading our thoughts on AMD, Vishera, and Beyond!!

Author:
Subject: Editorial
Manufacturer: Google

Introduction

Search engine giant Google took the wraps off its cloud storage service, Google Drive, this week. The service has been rumored for years, but it is (finally) official. In the interim, several competing services have emerged and even managed to grab significant shares of the market. Therefore, it will be interesting to see how Google’s offering stacks up. In this article, we’ll be taking Google Drive on a test drive from installation to usage to see if it is a worthy competitor to other popular storage services—and whether it is worth switching to!

How we test

In order to test the service, I installed the Google Drive desktop application (we’ll be taking a look at the mobile app soon) and uploaded a variety of media file types including documents, music, photos, and videos in numerous formats. The test system in question is an Intel Core i7 860 based system with 8GB of RAM and a wired Ethernet connection to the LAN. The cable ISP I used offers approximately two to three Mbps uploads (real-world speeds; 4 Mbps promised) for those interested.
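For a rough sense of what those upload speeds mean in practice, here is a quick back-of-the-envelope calculation (purely illustrative; the file size below is hypothetical, not one of the actual test files):

```python
# Rough upload-time estimate at a given line speed.
# Mbps is megabits per second, so divide by 8 for megabytes per second.

def upload_minutes(file_size_mb: float, speed_mbps: float) -> float:
    """Approximate minutes needed to upload a file of file_size_mb."""
    mb_per_second = speed_mbps / 8.0
    return file_size_mb / mb_per_second / 60.0

# A 700 MB video at the ~3 Mbps real-world rate mentioned above:
print(f"{upload_minutes(700, 3):.0f} minutes")  # roughly 31 minutes
```

In other words, bulk uploads of a large media library are an overnight affair on a typical residential cable connection.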

gd1.JPG
 

Overview

Google’s cloud service was officially unveiled on Tuesday, but the company is still rolling out activations for people’s accounts (my Google Drive account activated yesterday [April 27, 2012], for example). Drive now represents the single storage bucket for all your Google needs (Picasa, Gmail, Docs, App Inventor, etc.; although people can grandfather themselves into the cheaper Picasa online storage).

Old Picasa Storage vs New Google Drive Storage Plans

Storage Tier (old / new) | Old Plan Pricing (per year) | New Plan Pricing (per year)
20 GB / 25 GB | $5 | $29.88
80 GB / 100 GB | $20 | $59.88
200 GB | $50 | $119.88
400 GB | $100 | $239.88
1 TB | $256 | $599.88
2 TB | $512 | $1,199.88
4 TB | $1,024 | $2,399.88
8 TB | $2,048 | $4,799.88
16 TB | $4,096 | $9,599.88

(Picasa plans were so much cheaper; hold onto them if you're able to!)

The way Google Drive works is much like Dropbox: a single folder is synced between Google’s servers and the user’s local machine (sub-folders are fine to use, and they become the equivalent of "labels" on the Google side). The storage is available in several tiers, though the tier most people will be interested in is the free one. On that front, Google Drive offers 5GB of synced storage, 10GB of Gmail storage, and 1GB of Picasa Web Albums photo backup space. Beyond that, Google is offering nine paid tiers, from an additional 25GB of "Drive and Picasa" storage (and 25GB of Gmail email storage) for $2.49 a month up to 16TB of Drive and Picasa Web Albums storage with 25GB of Gmail email storage for $799.99 a month. The chart below details all the storage tiers available.

Google Drive Storage Plans
Storage Tier | Drive/Picasa Storage | Gmail Storage | Price (per month)
Free | 5GB / 1GB | 10GB | $0 (free)
25GB | 25GB (shared) | 25GB | $2.49
100GB | 100GB (shared) | 25GB | $4.99
200GB | 200GB (shared) | 25GB | $9.99
400GB | 400GB (shared) | 25GB | $19.99
1TB | 1TB (shared) | 25GB | $49.99
2TB | 2TB (shared) | 25GB | $99.99
4TB | 4TB (shared) | 25GB | $199.99
8TB | 8TB (shared) | 25GB | $399.99
16TB | 16TB (shared) | 25GB | $799.99

1024MB = 1GB, 1024GB = 1TB

The above storage numbers do not include the 5GB of free Drive storage, which is also applied to any paid tier.  The free 1GB of Picasa storage does not carry over to the paid tiers.

Even better, Google has not been stingy with their free storage. They continue to allow users to upload as many photos as they want to Google+ (though photos are resized to a maximum of 2048x2048 pixels). Google Documents stored in the Docs format also continue to not count towards the storage quota. Videos uploaded to Google+ that are under 15 minutes in length are likewise free from storage limitations. As for Picasa Web Albums (which also includes photos uploaded to Blogger blogs), any images under 2048x2048 and videos under 15 minutes in length do not count towards the storage quota either. If you exceed the storage limit, Google will still allow you to access all of your files, but you will not be able to create any new files until you delete enough to get back below the quota. The one exception to that rule is the “storage quota free” file types mentioned above; Google will still let you create and upload those. For Gmail storage, Google allows you to receive and store as much email as you want up to the quota. After you reach the quota, any new email will hard bounce and you will not be able to receive new messages.

In that same vein, Google’s paid tiers are not the cheapest but are still fairly economical. They are less expensive per GB than Dropbox, for example, but more expensive than Microsoft’s new SkyDrive tiers. One issue that many users face with online storage services is the size limit placed on individual files. While Dropbox places no limits (other than the overall storage quota) on individual file size, many other services do. Google offers a compromise in the form of a 10GB per-file size limit. While you won’t be backing up VirtualBox hard drives or drive image backups to Google, they’ll let you back up almost anything else (within reason).
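To make that per-GB comparison concrete, here is a small illustrative calculation using the Google Drive tier sizes and prices from the table above (the Dropbox and SkyDrive figures are not reproduced here):

```python
# Effective monthly cost per GB for a few of the Google Drive paid tiers,
# using the tier sizes and prices listed in the table above.
tiers_gb_to_price = {25: 2.49, 100: 4.99, 1024: 49.99, 16384: 799.99}

for size_gb, dollars in tiers_gb_to_price.items():
    print(f"{size_gb:>6} GB tier: ${dollars / size_gb:.3f} per GB per month")

# The 25GB tier works out to about $0.100/GB; every larger tier
# settles at roughly $0.049-$0.050/GB per month.
```

As the numbers show, pricing scales almost perfectly linearly past the smallest tier, so the larger tiers do not earn you a meaningful bulk discount.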

Continue reading for the full Google Drive preview.

Manufacturer: PC Perspective
Tagged: Malware

Infectious fear is infectious

PCMag and others have released articles based on a blog post from Sophos. The original post discussed how frequently malware designed for Windows is found on Mac computers. What these articles mostly demonstrate is that we really need to understand security: what it is, and why it matters. The largest threats to security are complacency and misunderstanding; users need to grasp the problem rather than have it buried under weak analogies and illusions of software crutches.

Your data and computational ability can be very valuable to people looking to exploit it.

The point of security is not to avoid malware, nor is it to remove it if you failed to avoid it. Those actions are absolutely necessary components of security -- do those things -- but they are not the goal of security. The goal of security is to retain control of what is yours. At the same time, be a good neighbor and make it easier for others to do the same with what is theirs.

Your responsibility extends far beyond just keeping a current antivirus subscription.

16-ShatteredWindows3.jpg

The problem goes far beyond throwing stones...

The distinction is subtle.

Your operating system is irrelevant. You could run Windows, Mac, Android, iOS, the ‘nixes, or whatever else. Every useful operating system has vulnerabilities and runs vulnerable applications. The user is also very often tricked into loading untrusted code directly, or into delivering it within data to a vulnerable application.

Blindly fearing malware -- such as what would happen if someone were to draw parallels to Chlamydia -- does not help you to understand it. There are reasons why malware exists; there are certain things malware is capable of, and certain things it is not.

The single biggest threat to security is complacency. Your information is valuable, and you are responsible for preventing it from being exploited.  The addition of a computer does not change the fundamental problem. Use the same caution on your computer and mobile devices as you would on the phone or in person. You would not leave your credit card information unmonitored on a park bench.

Read on to understand what malware is and what it could do.

Author:
Subject: Editorial
Manufacturer:

Introduction, Low-Power Computing Was Never Enjoyable

deadatom1.jpg

It was nearly five years ago that ASUS announced the first Eee PC model at Computex. That October, the first production version of what would come to be called a netbook, the ASUS Eee PC 4G, was released. The press latched on to the little Eee PC, making it the new darling of the computer industry. It was small, it was inexpensive, and it was unlike anything on the market.

Even so, the original Eee PC was a bit of a dead end. It used an Intel Celeron processor that was not suited for the application; it consumed too much power and accounted for a significant portion of the netbook’s production cost. If Intel’s Celeron had remained the only option, netbooks probably would not have made the leap from press darling to mainstream consumer device.

It turned out that Intel (perhaps unintentionally) had the solution – Atom. Originally built with hopes that it might power “mobile Internet devices,” it proved to be the netbook’s savior. It allowed vendors to squeeze out cheap netbooks with Windows and a proper hard drive.

At the time, Atom and the netbook seemed promising. Sales were great – consumers loved the cute, pint-sized, affordable computers. In 2009 netbook sales jumped by over 160% quarter-over-quarter while laptops staggered along with single-digit growth. The buzz quickly jumped to other products, spawning nettops, media centers and low-power all-in-one-PCs. There seemed to be nothing an Atom powered computer could not do.

Fast forward to today. Earlier this year, PC World ran an article asking if netbooks are dead. U.S. sales peaked in the first quarter of 2010 and have been nose-diving since then, and while some interest remains in other markets, only central Europe and Latin America have held steady. It appears the star that burned brightest has indeed burned the quickest. 

But why?

Continue reading our editorial on the problems with low power x86 processors...

Author:
Subject: Editorial
Manufacturer: AMD

Get Out the Microscope

AMD announced their Q1 2012 earnings last week, which turned out better than the previous numbers suggested. The bad news is that they posted a net loss of $590 million. That does sound pretty bad considering that their gross revenue was $1.59 billion, but there is more to the story than meets the eye. Of course, there are thoughts of “those spendthrift executives are burying AMD again,” but this is not the case. The loss lies squarely with the GLOBALFOUNDRIES equity and wafer agreements, which have been completely retooled.

500px-AMD_Logo.svg_.png

To get a good idea of where AMD stands in Q1, and for the rest of this year, we need to see how all these numbers actually sort out. Gross revenue is down 6% from the quarter before, which is expected due to seasonal pressures. This is right in line with Intel’s seasonal downturn, and in some ways AMD was affected slightly less than their larger competitor. They are down around 2% from the same quarter last year, and part of that can be attributed to the hard drive shortage that continued to affect the quarter.

The biggest news of the quarter was that AMD is no longer constrained by 32 nm availability. GLOBALFOUNDRIES was able to produce as many 32 nm parts as AMD needed, with yields continuously improving over the past two quarters. AMD seems very comfortable with where they are in terms of yields and availability for both the Bulldozer and Llano based product lines. AMD has in fact been ramping production of the upcoming Trinity based processors and has been shipping finished products to customers since mid Q1. They have also started shipping Brazos 2.0 parts to customers, and both Trinity and Brazos 2.0 will launch in mid Q2 of this year.
 
The CPU/APU World According to AMD
 
The mobile area has been one of tremendous growth for AMD, and in Q1 100% of its mobile shipments were APU products (both Llano and Brazos 1.0). AMD is very bullish about Trinity. They say that it offers around 50% more performance at the same TDP as the earlier Llano based processors. That 50% is a combination of both CPU and GPU performance, so do not expect massive jumps in CPU performance alone over current Llano based products at those TDPs. The big jump does appear to be in graphics, and AMD is certainly more than willing to hang their hat on that portion. With the latest Ivy Bridge IGPs still not able to match last year’s Llano, AMD feels that Trinity will truly leave Intel behind in terms of overall graphics performance. Trinity features a totally redesigned graphics portion which combines the VLIW4 architecture of the HD 6900 series with aspects of the new 7000 series of products.
 
Subject: Editorial, Storage
Manufacturer: Various
Tagged: ssd, Future, flash, Bleak, 2012

Overcoming Hurdles

A paper titled “The Bleak Future of NAND Flash Memory” was recently published jointly by the University of California and Microsoft Research. It has been picked up by many media outlets, which all seem to be beating the same morbid drum, spinning tales of a seemingly apocalyptic end to the reign of flash-based storage devices. While I agree with some of what the authors have to say, I have reservations about the methods upon which the paper is based.

TLC and beyond?

The paper kicks off by declaring steep increases in latency and drops in lifetime associated with increases in bits-per-cell. While this is true, flash memory manufacturers are not making large pushes to increase bits-per-cell beyond standard MLC (2 bits per cell) tech. Sure, some have dabbled in 3-bit MLC, also called Triple Level Cell (TLC), which is a bit of a misnomer since storing three bits in a cell actually requires eight voltage level bands, not three as the name implies. Moving from SLC to MLC doubles density, but the diminishing returns increase sharply after that – MLC to TLC only increases capacity by another 1.5x, yet sees a 2-4x reduction in performance and endurance. In light of this, there is little demand for TLC flash, and where there is, the use cases make clear it is not meant for anything beyond light duty. There's nothing wrong with the paper going down this road, but the reality is that increasing bits per cell is not the envelope being pushed by the flash memory industry.
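The underlying math helps show why the returns diminish so quickly: a cell storing n bits must reliably distinguish 2^n voltage bands within the same cell, so each added bit doubles the precision required while adding only one bit of capacity. A minimal sketch:

```python
# Voltage bands required versus capacity gained as bits-per-cell grows.
# Capacity grows linearly with bits, but the number of voltage bands
# the controller must distinguish grows exponentially (2 ** bits).
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
    print(f"{name}: {bits} bit(s)/cell -> {2 ** bits} voltage bands")

# SLC -> MLC doubles capacity (1 -> 2 bits) for 2 -> 4 bands;
# MLC -> TLC adds only 1.5x capacity (2 -> 3 bits) while the band
# count doubles again from 4 to 8, squeezing the margin between levels.
```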

paper-lifetime.png

Wait a second – where is 25nm MLC?

Looking at the above, we see a glaring omission – 25nm MLC flash, which has been around for close to two years now and constitutes the majority of flash memory parts currently in production. SLC was also omitted, but I can see the reason for this – it’s hard to get your hands on 25nm SLC these days. Why? Because MLC technology has been improved upon to the point where ‘enterprise MLC’ (eMLC) is rapidly replacing SLC, despite the supposed reduction in reliability and endurance compared to SLC. The reasons for this are simple, and are completely sidestepped or otherwise overlooked by the paper:

  • SSD controllers employ write combining and wear leveling techniques (see the sketch after this list).
  • Some controllers even compress data on the fly to further reduce writes and provisioning overhead.
  • Controller-level error correction (ECC) has improved dramatically with each process shrink.
  • SSD controllers can be programmed to compensate for the drift of data stored in a cell (eMLC).
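As a rough illustration of the first point, here is a toy wear-leveling sketch (a hypothetical algorithm for illustration only, not any vendor's actual controller logic, which also involves garbage collection, over-provisioning, and hot/cold data separation):

```python
# Toy wear leveling: repeated writes to one logical block are spread
# across physical blocks so no single group of cells wears out early.

class ToyWearLeveler:
    def __init__(self, physical_blocks: int):
        self.erase_counts = [0] * physical_blocks
        self.mapping = {}  # logical block -> current physical block

    def write(self, logical_block: int) -> int:
        # Direct each write at the least-worn physical block.
        target = min(range(len(self.erase_counts)),
                     key=lambda b: self.erase_counts[b])
        self.erase_counts[target] += 1
        self.mapping[logical_block] = target
        return target

wl = ToyWearLeveler(physical_blocks=4)
for _ in range(8):
    wl.write(logical_block=0)   # hammer a single logical block
print(wl.erase_counts)          # wear spreads evenly: [2, 2, 2, 2]
```

Even this naive version shows why raw cell endurance numbers do not translate directly into drive lifetime: the controller sits between the host's write pattern and the flash itself.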

Continue reading our editorial on the not-so-bleak future of NAND Flash Memory!!!

Author:
Subject: Editorial
Manufacturer: NVIDIA

Quarter Down but Year Up

Yesterday NVIDIA released their latest financial results for Q4 2012 and FY2012.  There was some good and bad mixed in the results, but overall it was a very successful year for NVIDIA.

Q4 saw gross revenue of $953.2 million US with net income of $116 million US.  This is about $53 million less in gross revenue and $62 million less in net income compared to the previous quarter.  There are several reasons why this happened, but the majority of it appears to be due to the hard drive shortage affecting add-in sales.  Simply put, the increase in hard drive prices caused most OEMs to take a hard look at the price points of the entire system, and oftentimes they would cut out the add-in graphics and just use integrated.

tegra3.jpg

Tegra 3 promises a 50% increase in revenue for NVIDIA this coming year.

Two other reasons for the lower than expected quarter stem from the start of the transition to 28 nm products based on Kepler.  NVIDIA is ramping up production on 28 nm and slowing down 40 nm.  Yields on 28 nm are not where they expected them to be, and there is also a shortage of wafer starts for that line.  This had a pretty minimal effect overall on Q4, but it will be one of the prime reasons why revenue looks like it will be down in Q1 2013. 

Read the rest of the article here.

Author:
Subject: Editorial
Manufacturer:
Tagged: ultrabook, Intel, CES

Introduction, Thin Is Flimsy

ultrabooked1.jpg

If there is anything that can be pointed to as “the” thing CES was about, it is the ultrabook. These thin and portable laptops were presented by Intel with all the finesse of a sledgehammer. Intel’s message is clear: ultrabooks are here, and you’re going to like them.

Such a highly coordinated effort on the part of Intel is unusual. Sure, they’ve pushed industry standards before, but the company’s efforts have usually been focused on a specific technology, like USB. The last time Intel put serious effort into changing how system builders constructed their systems was its push for the BTX form factor. 

BTX was an attempt to address problems the company was having with its Pentium 4 processors, which tended to consume a lot of power and therefore run hot. The push for the ultrabook is also an attempt to address a (perceived) problem. In this case the issue at hand is portability, both in terms of physical system size and battery endurance. 

Intel announced some interesting new smartphone and tablet reference designs at CES 2012. These are signs that the company is making headway in this area. But the products based on those reference designs aren’t out yet, and it will probably take a few years for Intel to gain significant market share even if it does manage to offer x86 processors that can beat ARM in smartphones and tablets. In the meantime, Intel needs to provide slim, responsive and portable systems that can distract consumers from tablets.

So we have the ultrabook. 

Continue reading our editorial on ultrabooks and the pros and cons associated with their push into the market!!

Author:
Subject: Editorial
Manufacturer: AMD

Q4-2012 In a Nutshell

Tis the reporting season.  Yes, that time of year when some of the major players in the computing world get together and tell us all how well they did this past quarter.  Ok, so they do not necessarily get together to announce results, but they sure time them that way.  Today was AMD’s turn (and Apple’s), and the results were not nearly as positive as what Intel had to offer a few days ago.

800px-AMD_Logo.svg_.png

Q4 2011 was flat in terms of revenue compared to Q3.  The company had gross revenue of $1.69 billion and posted a net loss of $177 million.  That loss is not necessarily as bad as it looks, but more on that later.  Margins rose to 46%, which is still a far cry from Intel’s 65% for the past quarter.  Gross revenue was up 2% from last year, which, considering the marketplace and Intel’s dominance, is a solid win for AMD.

When we look at non-GAAP results, AMD had a net income of $138 million.  The difference between those two numbers (a loss vs. a nice profit) is that the loss came from one-time writeoffs.  AMD has lowered its stake in GLOBALFOUNDRIES to 8.8%, and in so doing incurred a hefty charge.  This is not so much money lost as it is lost value in the company.

Click to read the rest of this article here.

Author:
Subject: Editorial
Manufacturer: Intel

I got your $13.9 Billion over here...

Intel had a record quarter.  Are we tired of hearing that yet?  I guess that depends on whom a person is investing with.  Earlier this quarter Intel warned that their results could be negatively affected by the current hard drive shortage.  Apparently this was a factor, but it did not stop Intel from having a record quarter.

intel_logo.jpg

Q4 2011 turned out to be gangbusters for Intel.  They reported gross revenue of $13.9 billion, significantly higher than the $13.74 billion analysts were predicting.  Net income came in at $3.4 billion with an impressive 65.5% gross margin.  The overall year was also record setting, at $54 billion gross revenue and $12.9 billion net income.  For comparison, AMD had gross revenue of about $6.8 billion and net income of around $300 million.  2010 was a record year for Intel in that they surpassed $40 billion in revenue for the first time in the company’s history, and this year saw revenue over $10 billion higher.  Intel is certainly hitting their stride, and they do not look to slow down anytime soon.

Read the rest of the article here.

Manufacturer: PC Perspective

Introduction: Griefing the grieving

PC gaming has been on its deathbed for years -- if you believe the countless debates that have raged over the last decade. The drum beat roared from the masses: “Why game on the PC anymore when you could just buy a console?” The conversation focused on the attack and defense of the PC as a viable platform at all, let alone the platform of choice. The question that swarms naggingly through my brain is quite the opposite: “In the long run, why game on a console?” The notion that consoles are better than PCs, despite the PC receiving just a fraction of the support that consoles do, is about to die; console supporters are in various stages of grief.

1-game-over.png

U mad Mario Bros.?

I am an avid, though this editorial may suggest livid, video game supporter. My first exposure to video gaming was mixed between the Nintendo Entertainment System and the family 80286. I have equally fond memories with the keyboard as with the gamepad. The balance between console and PC was level throughout my life until just a few years ago when I carefully thought the situation over. The PC is now my platform of choice.

Continue reading our editorial: The Five Stages of Griefing: Death of the Consoles!!

Author:
Manufacturer: Antec

Welcome to The Inside Perspective

Welcome!  The Inside Perspective is a new series on PC Perspective that will take you further behind the scenes of the industry than we typically go, with interviews of industry personnel.  I hope you find these discussions both informative and eye-opening about the world behind the reviews, the press releases, and the marketing speak that everyone sees on a daily basis.  http://pcper.com/tip

 

The Inside Perspective 001 - Antec's Jessie Lawrence

Antec was at one time the leader in enthusiast and gamer case design.  Much of what the company built years ago, like the P180 series, has endured years of modifications to remain relevant, and those chassis remain some of the favorites of our staff.  However, in the past several years the case world has seen new entrants, new designs, and new feature sets that Antec has been slow to adopt.  As a result, they have fallen from the leader's position in the minds of enthusiasts, even if not in sales or revenue.

tip_antec.png

With the release of the new P280 and Eleven Hundred cases (the former of which we have already reviewed), Antec is hoping to make a resurgence in the enthusiast market.  And while I definitely was impressed with the P280, there is still some work to do; luckily, Antec knows it and is moving forward in the right direction. 

Recently, Antec's Communications Manager Jessie Lawrence stopped by the office and sat down for a short interview.  We asked him about the new case designs, what he likes about working in this cut-throat industry, and how Antec plans to stem the tide against the competition.

 

While we are getting The Inside Perspective started, we are including it in our PC Perspective Podcast RSS feed.  The video version, which we would encourage you to check out, is embedded below via YouTube.  You can find all of our episodes of The Inside Perspective at http://pcper.com/tip

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file
  • Video - See YouTube embed below (RSS to come soon!)

Host: Ryan Shrout

Guest: Jessie Lawrence, Antec

Program Length: 13:49

Program Schedule:

  1. 01:08 Introduction
  2. 02:15 Questions about new P280 and Eleven Hundred Cases
  3. 04:40 Target audience for each new case
  4. 05:50 Background with Jessie Lawrence
  5. 09:20 Memorable moments in the industry
  6. 11:20 How can Antec return to enthusiast roots?
  7. 13:45 Conclusion

Author:
Subject: Editorial
Manufacturer: AMD

MIA or Simply Retired?

It is awfully hard to deny the value proposition of the AMD HD 6970 graphics card.  The card overall matches (and sometimes exceeds) the NVIDIA GTX 570 at a slightly lower price, it has 2 GB of frame buffer, and AMD is consistently improving not just gaming performance for the new VLIW4 architecture but also its GPGPU support.  Throw in a more manageable power draw and pretty low heat production for a top end card, and note that it is also the fastest single GPU card when it comes to bitcoin mining.  With all of these positives, why hasn’t everyone gone out to buy one?  Simple: they are hard to come by anymore.

question_6970.jpg

¿Dónde están las tarjetas gráficas?

Throughout the winter and spring of this year, the HD 6970 was an easy card to acquire.  Prices were very reasonable, supply seemed ample, and most every manufacturer had one in a configuration that would appeal to a lot of people.  The HD 6950 was also in great supply, and it came in a few unique configurations that add more for the money than the reference design.  This summer, however, saw the pool of HD 6970 cards dry up, not to mention the complete disappearance of HD 6990 cards from retail.

Continue reading about where all the Radeon HD 6970s have gone!!

Carmack Speaks

Last week we were in Dallas, Texas covering QuakeCon 2011 as well as hosting our very own PC Perspective Hardware Workshop.  While we had over 1100 attendees at the event and had a blast judging the case mod contest, one of the highlights is always getting to sit down with John Carmack and pick his brain about topics of interest.  We got about 30 minutes of John's time over the weekend and pestered him with questions about the GPU hardware race, how Intel's integrated graphics (and AMD Fusion) fit into the future of PCs, the continuing debate about ray tracing, rasterization, voxels and infinite detail engines, key technologies for PC gamers like multi-display engines, and a lot more!

One of our most read articles of all time was our previous interview with Carmack that focused a lot more on the ray tracing and rasterization debate.  If you never read that, much of it is still very relevant today and is worth reading over. 

carmack201c.png

This year, though, John has come full circle on several things, including ray tracing, GPGPU workloads, and even the advantages that console hardware has over PC gaming hardware.

Continue reading to see the full video interview and our highlights from it!!

Author:
Subject: Editorial
Manufacturer: NVIDIA

Boring but Profitable

NVIDIA posted their latest results for the quarter ending July 31, 2011.  Unlike this time last year, NVIDIA posted a strong quarter.  Technically this is Q2 FY 2012 for NVIDIA.  Last year’s results were pretty dismal, with $811 million in gross revenue and a net loss of $140 million.  This quarter was much stronger, with $1.016 billion in gross revenue and a healthy $150 million in net income.

This quarter was also up sequentially from last quarter’s $962 million gross revenue and $135 million net income.  The only truly interesting thing about the increase is that there really was not very much interesting about it at all.  All of the product groups showed either flat performance or a marginal increase.  The largest increases came from the mobile sector, where discrete mobile GPU sales in laptops made significant gains.  Consumer desktop stayed pretty even, though NVIDIA has a much stronger mix of cards stretching from the $100 US mark to above $750 with the GeForce GTX 500 series.

Read the rest of the article after the break.

Author:
Subject: Editorial, Mobile
Manufacturer: Qualcomm

Meet Vellamo

With Google reporting Android activations upward of 550,000 devices a day, the rapid growth and ubiquity of the platform cannot be denied. As the platform has grown, we here at PC Perspective have constantly kept an eye out for ways to assess and compare the performance of different devices running the same mobile operating system. In the past we have done performance testing with applications such as Quadrant and Linpack, and GPU testing with NenaMark and Qualcomm's NeoCore product.

Today we are taking a look at a new mobile benchmark from Qualcomm named Vellamo. Qualcomm saw the need for a vendor-agnostic browser benchmark on Android, and Vellamo is the result. A video introduction from Qualcomm's Director of Product Management, Sy Choudhury, is below.

 

With the default configuration, Vellamo performs a battery of 14 tests. These tests are categorized into Rendering, Javascript, User Experience, Networking, and Advanced. 

For more on this benchmark and our results from 10 different Android-powered devices, keep reading!

Author:
Subject: Editorial
Manufacturer: AMD

The Dirty Laggard

 

It may seem odd, but sometimes reviewers are among the last folks to adopt new technology.  This has been the case for me many a time.  Yes, we get some of the latest and greatest components, but often we review them and then keep them on the shelf for comparative purposes, all the while our personal systems run last generation parts that we will never need to re-integrate into a test rig.  In other cases, big money parts, like the one 30” 2560x1600 LCD that I own, are always being utilized on the testbed and never actually used for things like browsing, gaming, or other personal activities.  Don’t get me wrong, this is not a “woe-is-me” rant about the hardships of being a reviewer, but rather an interesting side effect not often attributed to folks who do this type of work.  Yes, we get the latest to play with and review, but we don’t often actually use these new parts in our everyday lives.

One of the technologies that I had only ever seen at trade shows is Eyefinity.  It was released back in the Fall of 2009 and really gained momentum in 2010.  Initially it was incompatible with Crossfire technology, which limited it to a great degree.  A single HD 5970 card could push 3 x 1920x1080 monitors in most games, but usually only with details turned down and no AA enabled.  Once AMD worked a bit more on the drivers, we were able to see Crossfire setups working in Eyefinity, which allowed users to play games at higher fidelity with the other little niceties enabled.  The release of the HD 6900 series of cards also proved to be a boon to Eyefinity, as these new chips had much better Crossfire scaling and were also significantly faster than the earlier HD 5800 series at those price points.

eye_fin.jpg

Continue on to the rest of the story for more on my experiences with AMD Eyefinity.