Subject: General Tech | November 1, 2016 - 12:49 PM | Scott Michaud
Tagged: Adobe, linux, mozilla
Apparently I missed this the first time around, but Adobe has decided to continue supporting the NPAPI version of Flash Player on Linux. They have just released their second update, Flash Player 24 Beta, on October 28th for both 32- and 64-bit platforms. Before September, Adobe was maintaining Flash Player 11.2 with security updates. Adobe has also extended NPAPI support beyond 2017, which was supposed to be the original cut-off for that plug-in architecture on Linux, and has pledged to keep “major version numbers in sync”.
This took me by surprise. Browser vendors, even Mozilla, have been deprecating NPAPI for a while. Plug-ins are unruly from a security and performance standpoint, and they would much rather promote the Web standards that they work so hard to implement, rather than being a window frame around someone else's proprietary platform.
So what is Adobe thinking? Well, they claim that this “is primarily a security initiative”. As such, it would make sense that, possibly (and again, I'm an outsider musing here), the gap between the current release and 11.2 was large enough that it was simply easier to maintain two branches.
Whatever the reason, Flash on Linux is continuing to be supported for all browsers. If you find yourself at the intersection of Linux, Firefox, and hobbyist-developed Tower Defense games, you can pick up the latest plug-in at Adobe Labs.
Subject: General Tech | October 30, 2016 - 01:09 AM | Scott Michaud
Tagged: mozilla, servo, gecko, firefox
One of the big announcements at Mozilla Summit 2013, despite Firefox OS being the focus of the event, was their research (with Samsung) into a new rendering engine, Servo. Rendering HTML5 is horrifically complex, so creating a new rendering engine from scratch is a big “nope!” for basically all organizations. Mozilla saw this as a big opportunity, because current engines are very difficult to scale to multiple cores, so they went into this as a no-assumptions experiment.
At the time, they didn't know whether Servo would be built up into a full rendering engine, or whether it would be picked apart and pulled back into their current engine, Gecko. Mozilla has now unveiled Quantum, and the first sentence of its MozillaWiki entry is “Quantum is not a new web browser.” They go on to say that they will be “building on the Gecko engine as a solid foundation”. So it seems pretty clear that Servo will be picked apart, with its components pulled back into Gecko piece by piece, like they've recently done with their media file parser in Firefox 48.
While this will likely not have the major impact that “boom, new engine” would, in terms of performance, this piece-wise method should be quicker than bulking up Servo. Mozilla expects that big changes will begin to land next year.
Subject: Storage | October 5, 2016 - 07:57 PM | Scott Michaud
Tagged: ssd, mozilla, google, firefox, endurance, chrome
A couple of weeks ago, I saw a post pop up on Twitter a few times about Firefox performing excessive writes to SSDs, totaling up to 32 GB in a single day. The author attributes this mostly to a fast-updating session restore feature, although cookies were also resource hogs in their findings. In an update, they also tested Google Chrome, which itself clocked in over 24 GB of writes in a day.
This, of course, seemed weird to me. I would have thought that at least one browser vendor might notice an issue like this. Still, I passed the link to Allyn because he would be much more capable of replicating these results. In our internal chat at the time, he was less skeptical than I was. I've since followed up with him, and he said that his initial result “wasn't nearly as bad as their case”. He'll apparently elaborate on tonight's podcast, and I'll update this post with his findings.
Subject: General Tech, Mobile | September 29, 2016 - 02:15 AM | Scott Michaud
Tagged: mozilla, Firefox OS, firefox
Update: There has been a little confusion. The web browser, Firefox, is still going strong. In fact, they're focusing their engineering efforts more on it, by cutting back on these secondary projects.
Less than a year after their decision to stop developing and selling smartphones through carriers, Mozilla has decided to end all commercial development of Firefox OS. Releases after Firefox OS 2.6 will be handled by third parties, such as Panasonic, should they wish to continue using it for their smart TV platform. Further, source code for the underlying operating system, Boot-to-Gecko (B2G), will be removed from their repository, mozilla-central, so it doesn't hinder development of their other products.
Regardless, Mozilla needs to consider their long-term financial stability, and throwing resources at Firefox OS apparently doesn't return enough value for them, both directly and for its impact on society.
Subject: General Tech | September 20, 2016 - 04:51 PM | Scott Michaud
Tagged: mozilla, firefox
While Firefox 49 was originally scheduled for last week, some last-minute issues prevented the software non-profit from releasing it until today. Also, for some reason, Firefox for Android doesn't want to update from within itself, but triggering an update from the Google Play store works. This might be temporary and/or happen with every Firefox for Android update; I'm new to this platform.
With Firefox 49, VentureBeat is reporting that Mozilla is allowing a “small initial set of compatible add-ons” to be enabled alongside multi-process. If you don't have any non-compatible add-ons installed, then you should see Multiprocess Windows enabled in about:support. Otherwise, it will be disabled and you won't see any difference.
Interestingly, Mozilla is promoting "Refresh Firefox" at their site if you have the latest version. This basically cleans all the add-ons out of your user profile, but maintains browsing history, bookmarks, and the like. It might have been around for a while, but, if it's new, it times nicely with the multi-process rollout. On top of cleaning out old, crufty add-ons, a user should see a bigger jump when Mozilla's enhancements are (I'm guessing) enabled.
Mozilla has changed a few other things here and there, too. While many of our readers will probably have hardware acceleration for video, they have just added SSSE3 enhancements for when GPU support isn't available. I'm not sure of all the use cases for this, but I'd expect it would help in virtualized environments and certain older PCs (ex: Intel Atom and VIA Nano). I'm just speculating, though.
Subject: General Tech | August 20, 2016 - 05:36 PM | Scott Michaud
Tagged: mozilla, webvr, Oculus
Earlier this month, the W3C published an Editor's Draft for WebVR 1.0. The specification has not yet been ratified, but the proposal is backed by engineers from Mozilla and Google. It enables the use of VR headsets in the web browser, including all the security required, such as isolating input to a single tab (in case you need to input a password while the HMD is on your face).
Firefox Nightly, as of August 16th, now supports the draft 1.0 specification.
The browser currently supports the Oculus CV1 and DK2 on Windows. It does not work with the DK1 (although Oculus provided backers of that Kickstarter with a CV1 anyway), and it does not (yet) support the HTC Vive. It also only deals with the headset itself, not any motion controllers. I guess, if your application requires that functionality, you will need to keep working on native applications for a little while longer.
Subject: General Tech | July 16, 2016 - 05:41 PM | Scott Michaud
Tagged: mozilla, Rust, firefox
Mozilla has been working on the Rust language for several years now. It is designed to be extremely fast, memory-safe, and easy to parallelize on multi-core processors, doing so by having a compiler that's not afraid to tell you “Nope.” Mozilla (and others, like Samsung) want a language with those characteristics because it will make for an extremely fast, yet secure, web browser (although there are a lot of single-threaded design choices tangled into the Web specifications).
The first example will arrive next month for Windows, though (64-bit OSX and Linux already have it). Firefox 48 will replace a small portion of the code, originally written in C++, with a Rust-based equivalent. The affected component parses media files, extracting values like track id, duration, resolution, and so forth. Because it's written in Rust, this ingestion should be resilient to memory-based vulnerabilities.
This probably will not be noticeable to end-users, but it's a few thousand fewer lines of code that Mozilla needs to worry about being used to hijack the browser. Mozilla is also planning on bringing URL parsing over to Rust, as it has already done in Servo. You would think that the C++ code has been battle-hardened by now, but 15-year-old open-source bugs do exist, hiding in plain sight.
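To illustrate why a Rust parser is resilient to this class of bug, here is a minimal sketch of reading a length-prefixed field from an untrusted byte buffer. This is my own toy example, not Mozilla's actual parser: in C++, a lying length field can walk off the end of the buffer, while in safe Rust the access is bounds-checked and the failure surfaces as a recoverable `None` instead of memory corruption.

```rust
// Hypothetical sketch (not Mozilla's code): parse a field whose first byte
// claims the length of the payload that follows.
fn read_field(buf: &[u8]) -> Option<&[u8]> {
    // First byte is the claimed payload length; `?` bails out on empty input.
    let len = *buf.first()? as usize;
    // `.get()` returns None if the range is out of bounds -- no overread possible.
    buf.get(1..1 + len)
}

fn main() {
    let good = [3u8, b'a', b'b', b'c'];
    let evil = [200u8, b'a']; // claims 200 bytes, but only 1 is present
    assert_eq!(read_field(&good), Some(&b"abc"[..]));
    assert_eq!(read_field(&evil), None); // rejected, not exploited
    println!("ok");
}
```

The same malformed input handed to an unchecked C++ `memcpy` would read past the allocation; here it is just an ordinary error value.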
Subject: General Tech | July 1, 2016 - 07:12 PM | Scott Michaud
Tagged: web browser, gecko, servo, Rust, mozilla, Samsung
No love for Windows at the moment, but Mozilla is showing previews of their new browser rendering engine, Servo. It is developed in Rust, a language that is both easy to parallelize and very memory-safe, two great features for a web browser, especially on mobile and multi-core desktops. You can currently pick it up for Mac and Linux, although it is not ready to be your primary browser yet. Windows and Android builds “should be available soon”.
Basically, Mozilla has spent the last few years re-thinking how to design a web browser. Most Web standards are based on the assumption that the browser runs a main loop, and that these items will occur in sequence. Back in 2013, most of the research was to see how far a browser could travel into parallelization before compatibility breaks down. Samsung, who is obviously interested in smartphone technology, partnered with them, because it's easier to add more cores to a mobile SoC than it is to make existing ones faster.
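The core idea, splitting independent pieces of work across cores instead of running one main loop, can be sketched in a few lines of Rust. This is a toy illustration of mine, not Servo's architecture: imagine each chunk is an independent subtree of a page, farmed out to its own thread.

```rust
use std::thread;

// Toy example (names and data are mine, not Servo's): process independent
// chunks of work on separate threads, then combine the partial results.
fn process_chunks(chunks: Vec<Vec<u32>>) -> u32 {
    // Spawn one worker per chunk; each owns its data, so no locking is needed.
    let handles: Vec<_> = chunks
        .into_iter()
        .map(|chunk| thread::spawn(move || chunk.iter().sum::<u32>()))
        .collect();
    // Join the workers and reduce their partial sums.
    handles.into_iter().map(|h| h.join().unwrap()).sum()
}

fn main() {
    let work = vec![vec![1, 2, 3], vec![4, 5], vec![6]];
    assert_eq!(process_chunks(work), 21);
    println!("ok");
}
```

Rust's ownership rules are what make this style safe: each thread takes ownership of its chunk, so the compiler guarantees no two threads touch the same data.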
At the time, they weren't sure whether this research would be used to improve Gecko, the current rendering engine that has been around since Netscape 6, or create a suitable replacement for it. As far as I know, that decision has still not been made, but they also haven't bailed on it yet.
Perhaps we'll see a new wave of Web technology coming soon? Maybe it will even break up the WebKit monopoly that seems to be forming, led by iOS and Android devices?
Subject: General Tech | June 9, 2016 - 01:08 AM | Scott Michaud
Tagged: mozilla, firefox
This roll-out won't necessarily be immediate, though. You can install Firefox 48 and, only some weeks later, get Electrolysis turned on retroactively. They are starting with about 1% of eligible users, which will ramp up to all eligible users over time or even be disabled if alarm bells start to ring.
Speaking of eligible users, there are quite a few conditions that will prevent you from getting Electrolysis. Namely, if you use extensions (it's unclear whether they're talking about all extensions, or just ones that use certain APIs), then you will be kept on a single process. They don't specify why, but it could very well be the situation that I mentioned in the first paragraph.
Firefox 48 is scheduled to be released in six weeks (the first week of August).
Subject: General Tech | March 18, 2016 - 09:26 PM | Scott Michaud
Tagged: mozilla, servo, Rust
Mozilla, the open-source creators of Firefox and Thunderbird, have announced that their Servo project will reach public alpha in June. Nightly builds will be available, presumably around that time, for Linux, OSX, Windows, and Android. Servo is a browser engine that is built in Rust, which emphasizes security and high performance (especially in multi-threaded scenarios).
The technology is really interesting, although it is still quite early. Web browsers are massively single-threaded by design, which limits their potential performance as CPUs widen in core count but stagnate in per-thread performance. This is especially true in mobile, which is why Samsung has been collaborating on Servo for almost all of its life.
Rust, being so strict about memory access, also has the advantage of security and memory management. It is designed in such a way that it's easier for the compiler to know, at compile time, whether you will be trying to access data that is no longer available. The trade-off is that it's harder to program, because if your code isn't robust enough, the compiler just won't accept it. This is beneficial for web browsers, though, because basically everything they access is untrusted, third-party data. It's better to fight your compiler than to fight people trying to exploit your users.
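A minimal sketch (my example, not Mozilla's) of what "the compiler just won't accept it" means in practice: ownership is enforced at compile time, so a use-after-free cannot even be written in safe Rust.

```rust
// Once a value is moved, the compiler refuses any further use of the old name.
fn sum_owned(v: Vec<i32>) -> i32 {
    // `v` is owned here; when this function returns, the vector is freed.
    v.iter().sum()
}

fn main() {
    let data = vec![1, 2, 3];
    let total = sum_owned(data); // ownership of `data` moves into the call

    // println!("{:?}", data); // rejected at compile time:
    //                         // error[E0382]: borrow of moved value: `data`

    assert_eq!(total, 6);
    println!("ok");
}
```

The commented-out line is the kind of code the compiler "just won't accept": in C++ the equivalent would compile and crash (or be exploited) at runtime; in Rust it never compiles at all.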
Again, it's still a way off. It might be good for web developers to keep an eye on, though, in case any of its optimizations implement a standard either correctly but differently from other browsers, which highlights a bug in your website, or incorrectly, which exposes a bug in Servo. Making a web browser is immensely difficult.