Mozilla Dumps "Metro" Version of Firefox

Subject: Editorial, General Tech | March 16, 2014 - 12:27 AM |
Tagged: windows, mozilla, microsoft, Metro

If you use the Firefox browser on a PC, you are probably using its "Desktop" application. Mozilla also had a version for "Modern" Windows 8.x that could be used from the Start Screen. You probably did not use it, because fewer than 1000 people per day did. That is more than three orders of magnitude below the number of users for Desktop's pre-release builds.

Yup, less than one-thousandth.


Jonathan Nightingale, VP of Firefox, stated that Mozilla would not be willing to release the product without committing to its future development and support. There was not enough interest to justify taking on that burden, and adoption was not forecast to pick up, either.

From what we can see, it's pretty flat.

The code will continue to exist in the organization's Mercurial repository. If "Modern" Windows sees a massive influx of interest, they could return to what they had. It should also be noted that there never was a version of Firefox for Windows RT. Microsoft does not allow third-party rendering engines under its Windows Store certification requirements (everything must be based on Trident, the core of Internet Explorer). That said, the same is true of iOS, and Firefox Junior exists within those limitations. It is not truly Firefox, little more than a re-skinned Safari (as permitted by Apple), but it exists. I have heard talk of a Firefox Junior for Windows RT, Internet Explorer re-skinned by Mozilla, but not in any detail. The organization is very attached to its own technology because, if whoever makes the underlying engine does not support new features or lags in JavaScript performance, a re-skin has no leverage to do anything about it.

Paul Thurrott of WinSupersite does not blame Mozilla for killing "Metro" Firefox. He acknowledges that they gave it a shot and did not see enough pre-release interest to warrant a product. He does place some of the blame on Microsoft for the restrictions it imposes on browsers (especially on Windows RT). In my opinion, this is just a symptom of the larger problems of Windows post-7. Hopefully, Microsoft can correct these problems, and do so in a way that benefits their users (and society as a whole).

Source: Mozilla

Valve's Direct3D to OpenGL Translator (Or Part of It)

Subject: Editorial, General Tech | March 11, 2014 - 07:15 PM |
Tagged: valve, opengl, DirectX

Late last night, Valve released source code from their "ToGL" transition layer. This bundle of code sits between "[a] limited subset of Direct3D 9.0c" and OpenGL, translating engines designed for the former into the latter. It was pulled out of the DOTA 2 source tree and published standalone... mostly. It is completely unsupported and probably will not even build without some other chunks of the Source engine.
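To picture what a layer like this does: the engine keeps making what look like Direct3D 9 calls, and each one is forwarded to its closest OpenGL equivalent. Below is a minimal sketch of that idea in C++, with hypothetical names rather than actual ToGL code; the genuinely hard part ToGL tackles, translating D3D9 shader bytecode to GLSL, sits on top of plumbing like this.

```cpp
// Minimal sketch of a D3D9-to-OpenGL forwarding layer (hypothetical names,
// not ToGL's actual code). A wrapper exposes a D3D9-style interface and maps
// each call onto the nearest OpenGL equivalent.
#include <GL/gl.h>

class FakeDevice9 {
public:
    // Rough analogue of IDirect3DDevice9::Clear.
    void Clear(float r, float g, float b, float a) {
        glClearColor(r, g, b, a);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }

    // Rough analogue of IDirect3DDevice9::SetViewport. Almost 1:1, except
    // D3D and GL disagree on the window-space Y direction, so a real layer
    // has to compensate for the flip.
    void SetViewport(int x, int y, int width, int height) {
        glViewport(x, y, width, height);
    }

    // Rough analogue of IDirect3DDevice9::DrawPrimitive for triangle lists:
    // D3D counts primitives, GL counts vertices, so multiply by three.
    void DrawTriangleList(int startVertex, int primitiveCount) {
        glDrawArrays(GL_TRIANGLES, startVertex, primitiveCount * 3);
    }
};
```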


Still, Valve did not need to release this code, but they did. A lot of open-source projects work like this: someone dumps a starting blob and, if it is sufficient, the community pokes and prods it into a self-sustaining project. The real question is whether the code Valve provided is sufficient. As is often the case, time will tell. Either way, this is a good thing, and one that other companies really should embrace: giving out your old code to further the collective. We are just not sure how good.

ToGL is available now at Valve's GitHub page under the permissive, non-copyleft MIT license.

Source: Valve GitHub

The Bigger They Are: The Titan They Fall? 48GB Install

Subject: Editorial, General Tech | February 25, 2014 - 12:43 PM |
Tagged: titanfall, ssd

UPDATE (Feb 26th): As our readers pointed out in the comments (although I have yet to test it), you can change Origin's install-to directory before installing a game, to keep some games on a separate hard drive from the rest. It is not as easy as Steam's method, but it apparently works for games, like this one, that you want somewhere else. I figured Origin would forget about games left in the old directory, but apparently not.

Well, okay. Titanfall will require a significant amount of hard drive space when it is released in two weeks. Receiving the game digitally will push 21GB of content through your modem, which unpacks to 48GB on disk. Apparently, the next generation has arrived.


Honestly, I am not upset over this. Yes, it basically ignores customers who install their games to SSDs. Origin, at the moment, forces all games to be installed into a single directory (albeit one that can be placed anywhere), unlike Steam, which allows games to be individually sent to multiple folders. It would be a good idea to keep those customers in mind... but not at the expense of the game itself. As always, both "high-end" and "unoptimized" titles have high minimum specifications; we decide which label applies by considering how effectively the performance is used.

That is something that we will need to find out when it launches on March 11th.

Source: PC Gamer

Kaveri loves fast memory

Subject: Editorial, General Tech | February 25, 2014 - 12:34 PM |
Tagged: ddr3, Kaveri, A10 7850K, amd, linux

You don't often see performance scaling as clean as what Phoronix saw when testing the effect of memory speed on AMD's A10-7850K.  Pick any result and you can clearly see a smooth increase in performance from DDR3-800 to DDR3-2400.  The only place that increase tapers off is between DDR3-2133 and DDR3-2400, with some tests showing little to no gain between those two speeds.  Some tests do still show an improvement; for certain workloads on Linux the extra money is worth it, but in other cases you can save a few dollars and settle for the slightly cheaper DDR3-2133.  Check out the full review here.


"Earlier in the week I published benchmarks showing AMD Kaveri's DDR3-800MHz through DDR3-2133MHz system memory performance. Those results showed this latest-generation AMD APU craving -- and being able to take advantage of -- high memory frequencies. Many were curious how DDR3-2400MHz would fair with Kaveri so here's some benchmarks as we test out Kingston's HyperX Beast 8GB DDR3-2400MHz memory kit."


Source: Phoronix

Irrational Games Implodes with Controlled Demolition

Subject: Editorial, General Tech | February 19, 2014 - 03:15 PM |
Tagged: bioshock infinite

The team behind the original BioShock and BioShock: Infinite has decided to call it quits. After seventeen years (depending on where you start counting), the company dissolved to form another, much smaller studio. Only about fifteen employees will transition to the new team. The rest are being provided financial support, given a bit of time to develop their portfolios, and can attend a recruitment day to be interviewed by other studios and publishers. They may also be offered employment elsewhere within Take-Two Interactive.


The studio formed by the handful of remaining employees will look to develop narrative-driven games, which is definitely their strength. Each game will be distributed digitally, and Take-Two will continue to be their parent company.

While any job loss is terrible, I am interested in the future project. BioShock: Infinite sold millions of copies, but I wonder if its size ultimately harmed it. It was pretty and full of detail, at the expense of requiring a large team. The game had a story that respected your intelligence; you might not understand it, and that was okay. Still, I have little confidence that it was anywhere close to the team's original vision. From budget constraints to the religious beliefs of development staff, we already know about several aspects of the game that changed significantly. Even Elizabeth, according to earlier statements from Ken Levine, was on the bubble because of the complexity of her AI. I can imagine how difficult it is to resist those changes when staring at man-hour budgets. I cannot, however, imagine BioShock: Infinite without Elizabeth. A smaller team might help them concentrate their effort where it matters and keep the artistic vision from becoming too diluted.

As for BioShock? The second part of the Burial at Sea DLC is said to wrap up the entire franchise. 2K will retain the license if they want to release sequels or spin-offs. I doubt Ken Levine will have anything more to do with it, however.

NVIDIA's Q4 FY2014 Earnings

Subject: Editorial
Manufacturer: NVIDIA

It wouldn’t be February if we didn’t hear the Q4 FY14 earnings from NVIDIA!  NVIDIA does have a slightly odd way of numbering their quarters but, in the end, it is all semantics.  They are not, in fact, living in the future, though I bet their product managers wish they could peer into the actual Q4 2014.  No, the whole FY14 thing relates back to when they made their IPO and how they started reporting.  To us mere mortals, Q4 FY14 actually represents Q4 2013.  Clear as mud?  Lord love the Securities and Exchange Commission and their rules.


The past quarter was a pretty good one for NVIDIA.  They came away with $1.144 billion in gross revenue and a GAAP net income of $147 million, beating the Street's estimate by a pretty large margin.  In response, NVIDIA's stock rose in after-hours trading.  This has certainly been a trying year for NVIDIA and the PC market in general, but they seem to have come out on top.

NVIDIA beat estimates primarily on the strength of the PC graphics division.  Many were focusing on the apparent decline of the PC market and assumed that NVIDIA would be dragged down by lower shipments.  On the contrary, it seems as though the gaming market and add-in sales on the PC helped to solidify NVIDIA’s quarter.  We can look at a number of factors that likely contributed to this uptick for NVIDIA.

Click here to read the rest of NVIDIA's Q4 FY2014 results!

Oh PCMag, Console vs PC

Subject: Editorial, General Tech, Systems | February 12, 2014 - 07:45 PM |
Tagged: xbox, xbone, ps4, Playstation, pc gaming

PCMag, your source for Apple and gaming console coverage (I joke), wrote an editorial about purchasing a gaming console. Honestly, they should have titled it "How to Buy a Game Device," since they also cover the NVIDIA SHIELD and other options.

The entire Console vs PC debate bothers me, though. Neither side handles it well.


I will start by highlighting problems with the PC side, before you stop reading. Everyone says you can assemble your own gaming PC to save a little money. Yes, that is true, and it is unique to the platform. The problem is that the public perception then becomes: "You must assemble and maintain your own gaming PC".

No.

No. No. No.

Some people prefer the support system provided by the gaming consoles. If one bricks, which some of them do a lot, you can call up the manufacturer for a replacement within a few weeks. The same can be absolutely true for a gaming PC. There is nothing wrong with purchasing a computer from a system builder, ranging from Dell to Puget Systems.

The point of the gaming PC is that you do not need to do it all yourself. You can also deal with a small business. For Canadians: if you purchase all of your hardware through NCIX, you can add $50 to your order for them to ship your parts as a fully assembled PC, with Windows installed (if purchased). You also get a one-year warranty. The downside is that you lose the ability to pick and choose components from other retailers, and you cannot reuse your old parts. Unfortunately, I do not believe NCIX USA offers this. Some local stores may offer similar benefits, though; one around my area assembles for free.

The benefit of the PC is always choice. You can assemble it yourself (or with a friend). You can have a console-like experience through a system builder. You can also have something in between with a small business. It is your choice.

Most importantly, your choice of manufacturer does not restrict your choice in content.


As for the consoles, I cannot find a rock-solid argument that they will always be better. If you are thinking about purchasing one, the available content should sway your decision. Microsoft will be the place to get "Halo". Sony will be the place to get "The Last of Us". Nintendo will be the place to get "Mario". Your money should go where the content you want is. That, and wherever your friends play.

But, of course, then you are part of what makes the content exclusive.

Note: Obviously the PC has issues with proprietary platforms, too. Unlike the consoles, though, it could be a temporary issue. The PC business model does not depend upon Windows. If Windows remains a sufficient platform? Great. If not, we have multiple options, ranging from Linux/SteamOS to Web standards, for someone to develop a timeless classic on.

Source: PCMag

(Phoronix) Intel Haswell iGPU Linux Performance in a Slump?

Subject: Editorial, General Tech, Graphics Cards | January 21, 2014 - 11:12 PM |
Tagged: linux, intel hd graphics, haswell

Looking through this post by Phoronix, it would seem that Intel had a significant performance regression on Ubuntu 14.04 with the Linux 3.13 kernel. In some tests, the HD 4600 only achieves about half of the performance recorded on the HD 4000. I have not been following Linux iGPU drivers, and it is probably a bit late to do any form of in-depth analysis... but yolo. I think the article actually made a pretty big mistake and came to exactly the wrong conclusion.

Let's do this!


According to the article, in Xonotic v0.7, Ivy Bridge's Intel HD 4000 scores 176.23 FPS at 1080p on low quality settings. When you compare this to Haswell's HD 4600 and its 124.45 FPS result, this seems bad. However, even though they claim a performance regression, they never actually post the earlier (and supposedly faster) benchmarks.

So I dug one up.

Back in October, the same test was performed with the same hardware. The Intel HD 4600 was not significantly faster back then; it was actually a bit slower, with a score of 123.84 FPS. The Intel HD 4000 managed 102.68 FPS. In other words, Haswell did not regress between then and Ubuntu 14.04 on Linux 3.13; rather, Ivy Bridge received a 71.63% increase (176.23 FPS, up from 102.68 FPS) over the same period.

Of course, there could have been a performance increase between October and now that recently regressed for Haswell... but I could not find those benchmarks. All I can see is that Haswell has been quite steady since October. Either way, that is a significant performance increase for Ivy Bridge since that snapshot in time, even if Haswell had a rise and fall that I was unaware of.

Source: Phoronix

Valve Virtual Reality Project at Steam Dev Days

Subject: Editorial, General Tech | January 20, 2014 - 08:35 PM |
Tagged: valve, virtual reality

Steam Dev Days was last week. At it, Valve announced a redesign of their Steam Controller and the removal of Steam Greenlight, among other notables. Officially, this was a press-free event; of course, thanks to Twitter and other social media platforms, anyone can decide to be a journalist on a whim. Things are going to leak out.

Other things are going to be officially released, too.


Michael Abrash gave a talk at the event discussing his virtual reality initiative within Valve. Both it and the Steam Machine project were in question when the company let go of Jeri Ellsworth and several other employees. After SteamOS was announced and castAR, Jeri's project at Valve, had its Kickstarter, it was assumed that Valve had given up on augmented reality. Despite this, they still kept Michael Abrash on staff.

I would speculate, completely from an outside position, that two virtual reality groups existed at one point, at least to some extent. The project seems to have been sliced into two parts: one leaving with Jeri and one continuing with Michael. I seriously doubt this had anything to do with the "high school cliques" that Jeri referred to, however. She said the problem was "longtime staff" (Michael was newly hired, around the end of Portal 2's development) and not within her hardware team.

The slide deck lists the specs that Valve has developed its prototypes to. At 1K x 1K per eye (about one megapixel), resolution is roughly 100x less than they would like.

Ooo... 100 megapixels per eye.

I just believe it all shook out as an unfortunate fork in the project.

Politics aside, Michael Abrash sees virtual reality affecting "the entire entertainment industry", and it will be well supported by Steam. I hope this means that Valve will finally drop the hammer on music and movie distribution. I have been expecting this ever since the Steam infrastructure was upgraded back in July 2011. Of course, neither servers nor software will solve content availability, but I am still expecting them to take a shot at it. Remember that Valve is creating movies; could they have plans for virtual reality content?

The latest prototype of the Oculus Rift uses camera tracking to keep tracking latency low. The tracking room shown in Valve's slides looks like their solution to the same problem.

The PDF slide deck is publicly available, and each page includes the script he closely followed. Basically, reading it is like being there, just less fun.

Source: Valve

Corsair Quantifies the Benefits of Overclocking

Subject: Editorial, General Tech, Graphics Cards, Processors, Memory, Systems | January 19, 2014 - 11:40 PM |
Tagged: corsair, overclocking

I rarely overclock anything, for three main reasons. The first is that I have had an unreasonably bad time with computer parts failing on their own; I did not want to tempt fate. The second is that I focused on optimizing the operating system and its running services, which was mostly important during the Windows 98, Windows XP, and Windows Vista eras. The third is that I did not find overclocking valuable enough for the performance you gained back.

A game that is too hefty to run is probably not an overclock away from working.

Remember Intel's pay-to-unlock processor upgrade cards? Thankfully, those never took off...

Today, overclocking is easier and safer than ever, with parts that basically do it automatically and back off, on their own, if thermals get too aggressive. Several components are also much less locked down than they have been. (Has anyone, to this day, hacked the locked Barton cores?) It should not be too hard to find a SKU that encourages the enthusiast to tweak some knobs.

But how much of an increase will you see? Corsair has been blogging about overclocking with their components (along with an Intel processor, a Gigabyte motherboard, and an eVGA graphics card, because they obviously do not make those). The cool part is that they break down performance gains by raising the frequency of just the CPU, just the GPU, just the RAM, or all of the above together. This breakdown shows how each of the three categories contributes to the whole. While none of the overclocks are dramatic, Corsair is probably proud of the 5% jump in Cinebench OpenGL performance from overclocking the RAM alone from 1600 MHz to 1866 MHz, without touching the CPU or GPU.
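The arithmetic behind that kind of breakdown is simple one-factor-at-a-time testing: change a single component's clock, re-run the benchmark, and compare against the stock score. Here is a quick sketch of the bookkeeping; every number in it is a placeholder except the roughly 5% RAM-only gain mentioned above.

```cpp
// One-factor-at-a-time overclocking breakdown, in the style of Corsair's
// blog series. All scores are placeholders except the ~5% RAM-only gain.
#include <cstdio>

// Relative improvement of an overclocked score versus the stock score.
double gainPercent(double stock, double overclocked) {
    return (overclocked / stock - 1.0) * 100.0;
}

int main() {
    const double stock = 100.0;  // baseline benchmark score (placeholder)
    const struct { const char* change; double score; } runs[] = {
        {"RAM only (1600 -> 1866 MHz)", 105.0},  // ~5% gain, as reported
        {"CPU only",                    107.0},  // placeholder
        {"GPU only",                    111.0},  // placeholder
        {"All of the above",            122.0},  // placeholder
    };
    for (const auto& run : runs)
        std::printf("%-30s %+5.1f%%\n", run.change, gainPercent(stock, run.score));
    return 0;
}
```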

It is definitely worth a look.

Source: Corsair