It wouldn’t be February if we didn’t hear the Q4 FY14 earnings from NVIDIA! NVIDIA does have a slightly odd way of expressing their quarters, but in the end it is all semantics. They are not in fact living in the future, but I bet their product managers wish they could peer into the actual Q4 2014. No, the FY14 label traces back to when NVIDIA held their IPO and how they began reporting. To us mere mortals, Q4 FY14 actually represents Q4 2013. Clear as mud? Lord love the Securities and Exchange Commission and their rules.
The past quarter was a pretty good one for NVIDIA. They came away with $1.144 billion in gross revenue and had a GAAP net income of $147 million. This beat the Street’s estimate by a pretty large margin. In response, NVIDIA’s stock traded up in after-hours. This has certainly been a trying year for NVIDIA and the PC market in general, but they seem to have come out on top.
NVIDIA beat estimates primarily on the strength of the PC graphics division. Many were focusing on the apparent decline of the PC market and assumed that NVIDIA would be dragged down by lower shipments. On the contrary, it seems as though the gaming market and add-in sales on the PC helped to solidify NVIDIA’s quarter. We can look at a number of factors that likely contributed to this uptick for NVIDIA.
Subject: Editorial, General Tech, Systems | February 12, 2014 - 10:45 PM | Scott Michaud
Tagged: xbox, xbone, ps4, Playstation, pc gaming
PCMag, your source for Apple and gaming console coverage (I joke), wrote up an editorial about purchasing a gaming console. Honestly, they should have titled it, "How to Buy a Game Device" since they also cover the NVIDIA SHIELD and other options.
The entire Console vs PC debate bothers me, though. Neither side handles it well.
I will start by highlighting problems with the PC side, before you stop reading. Everyone says you can assemble your own gaming PC to save a little money. Yes, that is true and it is unique to the platform. The problem is that the public vision then becomes, "You must assemble and maintain your own gaming PC".
No. No. No.
Some people prefer the support system provided by the gaming consoles. If it bricks, as some of them frequently do, you can call up the manufacturer for a replacement in a few weeks. The same can absolutely be true for a gaming PC. There is nothing wrong with purchasing a computer from a system builder, ranging from Dell to Puget Systems.
The point of the gaming PC is that you do not need to assemble it yourself. You can also deal with a small business. For Canadians, if you purchase all of your hardware through NCIX, you can add $50 to your order for them to ship your parts as a fully assembled PC, with Windows installed (if purchased). You also get a one-year warranty. The downside is that you lose the ability to pick-and-choose components from other retailers, and you cannot reuse your old parts. Unfortunately, I do not believe NCIX USA offers this. Some local stores may offer similar benefits, though. One around my area assembled systems for free.
The benefit of the PC is always choice. You can assemble it yourself (or with a friend). You can have a console-like experience with a system builder. You can also have something in-between with a small business. It is your choice.
Most importantly, your choice of manufacturer does not restrict your choice in content.
As for the consoles, I cannot find a rock-solid argument for why gaming will always be better on them. If you are thinking about purchasing one, the available content should sway your decision. Microsoft will be the place to get "Halo". Sony will be the place to get "The Last of Us". Nintendo will be the place to get "Mario". Your money should go where the content you want is. That, and wherever your friends play.
But, of course, then you are part of what keeps that content exclusive.
Note: Obviously the PC has issues with proprietary platforms, too. Unlike the consoles, it could also be a temporary issue. The PC business model does not depend upon Windows. If it remains a sufficient platform? Great. If not, we have multiple options which range from Linux/SteamOS to Web Standards for someone to develop a timeless classic on.
Subject: Editorial, General Tech, Graphics Cards | January 22, 2014 - 02:12 AM | Scott Michaud
Tagged: linux, intel hd graphics, haswell
Looking through this post by Phoronix, it would seem that Intel had a significant regression in performance on Ubuntu 14.04 with the Linux 3.13 kernel. In some tests, HD 4600 only achieves about half of the performance recorded on the HD 4000. I have not been following Linux iGPU drivers and it is probably a bit late to do any form of in-depth analysis... but yolo. I think the article actually made a pretty big mistake and came to the exact wrong conclusion.
Let's do this!
According to the article, in Xonotic v0.7, Ivy Bridge's Intel HD 4000 scores 176.23 FPS at 1080p on low quality settings. When you compare this to Haswell's HD 4600 and its 124.45 FPS result, this seems bad. However, even though they claim this is a performance regression, they never actually post the earlier (and supposedly faster) benchmarks.
So I dug one up.
Back in October, the same test was performed with the same hardware. The Intel HD 4600 was not significantly faster back then; it was actually a bit slower, with a score of 123.84 FPS. The Intel HD 4000 managed 102.68 FPS. In other words, Haswell did not regress between then and Ubuntu 14.04 on Linux 3.13; Ivy Bridge received a 71.63% increase.
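For anyone who wants to check my math, here is a minimal sketch of the percent-change arithmetic, using the FPS figures quoted above:

```python
# Percent change between two benchmark runs: (new - old) / old * 100.
def percent_change(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100.0

# Xonotic v0.7, 1080p, low quality (figures quoted above).
hd4000_october, hd4000_current = 102.68, 176.23  # Ivy Bridge, HD 4000
hd4600_october, hd4600_current = 123.84, 124.45  # Haswell, HD 4600

print(f"HD 4000: {percent_change(hd4000_october, hd4000_current):+.2f}%")  # +71.63%
print(f"HD 4600: {percent_change(hd4600_october, hd4600_current):+.2f}%")  # +0.49%
```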
Of course, Haswell's performance could have increased between October and now and only recently regressed... but I could not find those benchmarks. All I can see is that Haswell has been quite steady since October. Either way, that is a significant performance increase for Ivy Bridge since that snapshot in time, even if Haswell had a rise-and-fall that I was unaware of.
Subject: Editorial, General Tech | January 20, 2014 - 11:35 PM | Scott Michaud
Tagged: valve, virtual reality
Steam Dev Days was last week. At it, Valve announced a redesign of their Steam Controller and the removal of Steam Greenlight, among other notables. This was a press-free event, officially. Of course, due to Twitter and other social media platforms, everyone can decide to be a journalist on a whim. Things are going to leak out.
Other things are going to be officially released, too.
Michael Abrash gave a talk at the event discussing his virtual reality initiative within Valve. Both it and the Steam Machine project were in question when the company let go Jeri Ellsworth and several other employees. After SteamOS was announced and castAR, Jeri's project at Valve, had its Kickstarter, it was assumed that Valve gave up on augmented reality. Despite this, they still kept Michael Abrash on their staff.
I would speculate, completely from an outside position, that two virtual reality groups existed at one point (at least to some extent). The project seems to have been sliced into two parts, one leaving with Jeri and one continuing with Michael. I seriously doubt this had anything to do with the "High School Cliques" that Jeri was referring to, however. She said it was "longtime staff" (Michael was newly hired around the end of Portal 2's development) and not within her hardware team.
These are the specs that Valve has developed prototypes to.
1K x 1K per eye is about 100x less than they would like, however.
Ooo... 100 megapixels per eye.
I just believe it all shook out to an unfortunate fork in the project.
Politics aside, Michael Abrash sees virtual reality affecting "the entire entertainment industry" and says it will be well supported by Steam. I hope this means that Valve will finally drop the hammer on music and movie distribution. I have been expecting this ever since the Steam infrastructure was upgraded back in July 2011. Of course, neither servers nor software will solve content availability, but I am still expecting them to take a shot at it. Remember that Valve is creating movies; could they have plans for virtual reality content?
The latest prototype of the Oculus Rift uses camera tracking for low-latency visibility.
This looks like Valve's solution.
The PDF slide deck is publicly available and each page includes the script he heavily followed. Basically, reading this is like being there, just less fun.
Subject: Editorial, General Tech, Graphics Cards, Processors, Memory, Systems | January 20, 2014 - 02:40 AM | Scott Michaud
Tagged: corsair, overclocking
I rarely overclock anything, and that is for three main reasons. The first is that I have had an unreasonably bad time with computer parts failing on their own; I did not want to tempt fate. The second is that I focused on optimizing the operating system and its running services, which was mostly important during the Windows 98, Windows XP, and Windows Vista eras. The third is that I did not find overclocking valuable enough for the performance you gained.
A game that is too hefty to run is probably not an overclock away from working.
Thankfully this never took off...
Today, overclocking is easier and safer than ever with parts that basically do it automatically and back off, on their own, if thermals are too aggressive. Several components are also much less locked down than they have been. (Has anyone, to this day, hacked the locked Barton cores?) It should not be too hard to find a SKU which encourages the enthusiast to tweak some knobs.
But how much of an increase will you see? Corsair has been blogging about using their components (along with an Intel processor, Gigabyte motherboard, and EVGA graphics card, because they obviously do not make those) to overclock. The cool part is that they break down performance gains in terms of raising the frequencies for just the CPU, just the GPU, just the RAM, or all of the above together. This breakdown shows how each of the three categories contributes to the whole. While none of the overclocks are dramatic, Corsair is probably proud of the 5% jump in Cinebench OpenGL performance from overclocking the RAM from 1600 MHz to 1866 MHz alone, without touching the CPU or GPU.
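As a rough way to judge that RAM result, you can compare the relative frequency bump to the relative performance gain. Here is a minimal sketch; the 5% figure is Corsair's, while the "scaling efficiency" framing is my own:

```python
# Compare a component's frequency increase to the benchmark gain it produced.
def relative_increase(base, overclocked):
    return (overclocked - base) / base * 100.0

ram_gain = relative_increase(1600, 1866)  # ~16.6% higher memory frequency
perf_gain = 5.0                           # Corsair's Cinebench OpenGL gain

print(f"Frequency increase: {ram_gain:.1f}%")             # 16.6%
print(f"Performance gain:   {perf_gain:.1f}%")            # 5.0%
print(f"Scaling efficiency: {perf_gain / ram_gain:.2f}")  # 0.30
```

In other words, the benchmark captured roughly a third of the frequency increase, which is respectable for a memory-only tweak.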
It is definitely worth a look.
Subject: Editorial, General Tech | January 19, 2014 - 11:46 PM | Scott Michaud
Tagged: Keyboards, keyboard
Peter Bright down at Ars Technica wrote an editorial about the Lenovo ThinkPad X1 Carbon. His opinion is that keyboard developers should innovate in ways that "doesn't undermine expectations". Replacing a row of physical keys with a software-controlled touch strip is destructive because, even if the change proved valuable on its own, it would ultimately be inferior: it clashes with every other keyboard the user encounters. He then concludes with a statement that really should have directed his thesis.
Lenovo's engineers may be well-meaning in their attempts to improve the keyboard. But they've lost a sale as a result. The quest for the perfect laptop continues.
That is the entire point of innovation! You may dislike how a feature interacts with your personal ecosystem, and that will drive you away from the product. Users who purchased the laptop without considering the keyboard have the option of returning it and writing reviews for others (or simply putting up with it). Users who purchased the laptop because of the keyboard are happy.
I mainly disagree with the article because it claims that it is impossible to innovate on the keyboard in any way that affects the core layout. I disagree for two reasons.
My first issue is with how vague he is. His primary example of good keyboard innovation is the IBM ThinkPad 701c and its "butterfly keyboard", which expands the keyboard beyond the width of the laptop itself to make it more conventional. Conventional for whom? How many people primarily use small laptops with shrunken keyboards, compared to people who touch-type function keys?
The second critique follows from the first. The PC industry became so effective because every manufacturer tries to be a little different with certain SKUs to gain tiny advantages. There could have easily been a rule against touchscreen computers. Eventually someone hit it out of the park and found an implementation that was wildly successful with a gigantic market. The QWERTY design has weathered the storm for more than a century, but there is no rule that it cannot shift in the future.
In fact, at some point, someone decided to add an extra row of function keys. That certainly could have undermined the expectations of users who had to move between computers and electronic typewriters.
It will be tough, though. Keyboards have settled down and learning their layouts is a significant mental investment. There are several factors to consider when it comes to how successful a keyboard modification will become. Mostly, however, it will come down to someone trying and observing what happens. Do not worry about letting random ideas in because the bad ideas will show themselves out.
Basically the point is: never say never (especially not that vaguely).
Subject: Editorial, General Tech | January 11, 2014 - 12:13 AM | Scott Michaud
Tagged: SimCity, ea
Maxis and Electronic Arts recognize that a hefty portion of SimCity's popularity as a franchise is due to its mod community. The current version could use all of the help it can get after its unfortunate first year. They have finally let the community take over... to some extent. EA is imposing certain rules upon the content creators. Most of them are reasonable. One of them could have unforeseen consequences for the LGBTQ community, and the first rule really should apply to EA's own expansion packs.
Starting at the end, the last three rules (#3 through #5) are mostly reasonable. They protect EA against potential malware and breaches of their EULA and Terms of Service. The fifth rule does begin to dip its toe into potential censorship but it does not really concern me.
No one can be "Best Friends" in North America.
The second rule, while mostly condemning illegal activity, does include the requirement that content remain within ESRB E10+ and PEGI 7. The problem with any content certification is that it limits the dialogue between artists and society. In this case, references to same-sex topics in games (e.g., Harvest Moon) may force a minimum T or M rating. A mod which introduces some story element where two Sims of the same gender go on a date or live together (again, like Harvest Moon) might result in interest groups rattling the ESRB's cage until EA draws a firm line on that specific topic.
EA is very good with the LGBTQ community, but this could get unnecessarily messy.
The first rule is a different story. It says that mods which affect the simulation for multiplayer games or features are not allowed (despite multiplayer being the game's only official mode). They do not want a modification to give players an unfair advantage over the rest of the game's community.
You know, like maybe an airship which boosts "your struggling industry or commercial [districts]" and also brings in tourists and commuters without causing traffic on your connecting highway?
Maxis is still, apparently, exploring options for offline SimCity experiences. If they allowed a server preference that does not affect the global economy, mods could be quarantined to those servers. Great, problem solved. Instead, what is allowed is somewhat left up to interpretation. To make matters worse, the current examples of mods that we have are purely cosmetic.
SimCity is nowhere near as bad as Halo 2 Vista for its mod functionality (those mod tools were so hobbled that their own tutorial was impossible to follow). It could actually be good. These are just areas for EA to consider and, hopefully, reconsider.
Subject: Editorial, General Tech | January 7, 2014 - 02:25 AM | Tim Verry
Tagged: valve, SteamOS, steambox, opinion, Gabe Newell, CES 2014, CES
Valve Co-Founder Gabe Newell took the stage at a press conference in Las Vegas last night to introduce SteamOS powered Steam Machines and the company's hardware partners for the initial 2014 launch. And it has been quite the launch thus far, with as many as 13 companies launching at least one Steambox PC.
The majority of Steam Machines are living room friendly Mini-ITX (or smaller) form factors, but that has not stopped other vendors from going all out with full-tower builds. The 13 hardware partners have all put their own spin on a SteamOS-powered PC, and by the second half of 2014, users will be able to choose from $500 SFF cubes, ~$1000 Mini-ITX builds with dedicated graphics, and powerhouse desktop PCs with MSRPs up to $6,000 and multiple GPUs. In fact, aside from SteamOS and support for the Steam Controller, the systems do not share much else, offering up unique options, which is a great thing.
For the curious, the 13 Steam Machine hardware vendors are listed below.
- Digital Storm
- Falcon Northwest
- Origin PC
- Scan Computers
As luck would have it for those eager to compare all of the available options, the crew over at Ars Technica have put together a handy table of the currently-known specifications and pricing of each company's Steam Machines! Some interesting takeaways from the chart include the almost even division between AMD and NVIDIA dedicated graphics, while Intel has a single hardware win with its Iris Pro 5200 (Gigabyte BRIX Pro). On the CPU side of things, however, Intel has the most design wins: AMD has as many as 3 versus Intel's 10 (in AMD's best-case scenario). The pricing is also interesting. While there are outliers at both the very expensive and the very affordable ends, the majority of Steam Machines tend to be closer to the $1000 mark than either the $500 or $2000+ price points. In other words, about the same amount of money as a mid-range DIY PC. This is not necessarily a bad thing, as users are getting decent hardware for their money, a free OS, and OEM warranty/support (and there is nothing stopping the DIYers from making their own Steamboxes).
A SFF Steambox (left) from Zotac and a full-tower SteamOS gaming desktop from Falcon Northwest (right).
So far, I have to say that I'm more impressed than not with the Steam Machine launch, which has gone off better than I had expected. Here's hoping the hardware vendors are able to come through at the announced price points and Valve is able to continue wrangling developer support (and to improve the planned game streaming functionality from a Windows box). If so, I think Valve and its partners will have a hit on their hands that will help bring PC gaming into the living room and (hopefully) put it at least on par, in the mainstream perspective, with the ever-popular game consoles (which are now using x86 PC architectures).
What do you think about the upcoming crop of Steam Machines? Does SteamOS have a future? Let us know your thoughts and predictions in the comments below!
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Editorial, General Tech | December 8, 2013 - 04:11 AM | Scott Michaud
Tagged: TSMC, GLOBALFOUNDRIES, broadcom
Josh Walrath titled the intro of his "Next Gen Graphics and Process Migration: 20nm and Beyond" editorial: "The Really Good Times are Over". Moore's Law predicts that, with each ~2 year generation, we will be able to double the transistor count of our integrated circuits. It does not, however, set a price.
A look into GlobalFoundries.
"Moore's Law is expensive" remarked Tom Kilroy during his Computex 2013 keynote. Intel spends about $12 billion USD in capital, every year, to keep the transistors coming. It shows. They are significantly ahead of their peers in terms of process technology. Intel is a very profitable company who can squirrel away justifications for these research and development expenses across numerous products and services.
The benefits of a process shrink are typically three-fold: increased performance, decreased power consumption, and lower cost per chip (as a single wafer is better utilized). Chairman and CTO of Broadcom, Henry Samueli, told reporters that manufacturing complexity is pushing chip developers into a situation where one of those three benefits must be sacrificed for the other two.
You are suddenly no longer searching for an overall better solution. You are searching for a more optimized solution in many respects but with inherent tradeoffs.
He expects GlobalFoundries and TSMC to catch up to Intel and "the cost curve should come back to normal". Still, he sees another wall coming up when we hit the 5nm point (you can count the width or height of these transistors, in atoms, using two hands) and even more problems beyond that.
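To see why cost per chip normally falls with a shrink, consider a rough dies-per-wafer estimate. This is a minimal sketch with made-up numbers (the wafer price and die areas are illustrative assumptions), and it ignores edge loss, scribe lines, and yield:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Crude estimate: total wafer area divided by die area.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

wafer_cost = 5000.0  # assumed wafer price in USD, not a real quote
die_areas = {"old node": 200.0, "new node": 100.0}  # a full shrink roughly halves die area

for node, area in die_areas.items():
    count = dies_per_wafer(area)
    print(f"{node}: {count} dies, ${wafer_cost / count:.2f} per die")
```

Same wafer cost, double the dies, half the cost per die. Samueli's point is that wafer prices at the newest nodes are rising quickly enough to cancel that halving out, so one of the three benefits has to give.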
Image Credit: IONAS
From my perspective: at some point, we will need to say goodbye to electronic integrated circuits. Theorists are already working on how we can develop integrated circuits using non-electronic materials. For instance, toward the end of my Physics undergraduate degree, my thesis adviser was working on nonlinear optics within photonic crystals: waveguides which transmit optical-frequency light rather than radio-frequency electric waves. Of course, I do not believe his research was on optical integrated circuits, but that is not really the point.
Humanity is great at solving problems when its back is against the wall. But which problem will they tackle?
Power consumption? Cost? Performance?
Subject: Editorial, General Tech | December 5, 2013 - 06:53 PM | Scott Michaud
Tagged: windows, microsoft
Peter Bright at Ars Technica is wondering how many operating systems (OSes) Microsoft actually needs and, for that matter, how many they already have. Three consumer versions of Windows exist (or brands of it do): Windows RT, "full" Windows, and Windows Phone. Then again, it is really difficult to pin down what a unique operating system even is. All of the aforementioned "OSes" run on the same base kernel, and even app compatibility does not align to that Venn diagram.
In my personal opinion, it really does not matter how many (or which) operating systems Microsoft has. That innate desire to categorize things into boxes does nothing useful by itself. At best, it helps you draw relationships between one platform and others, and those comparisons may not even be valid. Sure, from the perspective of Microsoft's marketing team, these categories help convey information about their products to consumers.
... And if recent trends mean anything: very incorrect and confusing information.
So really, and I believe this is what Peter Bright was getting at, who cares how many OSes Microsoft has? The concern should really be what these products mean for consumers. In that sense, I really hope we trend towards the openness of the last couple Internet Explorer versions (and of course Windows 7) and further from the censored nature of Windows RT.
You can have 800 channels or just a single one but that doesn't mean something good is on.