Mozilla Publishes WebVR 1.0 to Nightly Releases

Subject: General Tech | August 20, 2016 - 05:36 PM |
Tagged: mozilla, webvr, Oculus

Earlier this month, the W3C published an Editor's Draft for WebVR 1.0. The specification has not yet been ratified, but the proposal is backed by engineers from Mozilla and Google. It enables the use of VR headsets in the web browser, including the required security measures, such as isolating input to a single tab (in case you need to enter a password while the HMD is on your face).

[Image: Mozilla Firefox logo]

Firefox Nightly, as of August 16th, now supports the draft 1.0 specification.
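
For the curious, here is a minimal sketch -- mine, not Mozilla's sample code -- of feature-detecting the draft API and asking a headset to present. The types are written by hand, as an assumption, since the draft is not yet part of standard browser typings:

```typescript
// A sketch of WebVR 1.0 feature detection, assuming the Editor's Draft API.
interface VRDisplayLike {
  displayName: string;
  capabilities: { canPresent: boolean };
  requestPresent(layers: { source: HTMLCanvasElement }[]): Promise<void>;
}

async function enterVR(canvas: HTMLCanvasElement): Promise<void> {
  const nav = navigator as Navigator & {
    getVRDisplays?: () => Promise<VRDisplayLike[]>;
  };
  if (!nav.getVRDisplays) {
    console.log("This browser does not expose WebVR 1.0.");
    return;
  }
  const displays = await nav.getVRDisplays();
  const hmd = displays.find((d) => d.capabilities.canPresent);
  if (!hmd) {
    console.log("No presentable headset is connected.");
    return;
  }
  // requestPresent must be called from a user gesture, such as a click handler.
  await hmd.requestPresent([{ source: canvas }]);
  console.log(`Presenting to ${hmd.displayName}`);
}
```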

The browser currently supports the Oculus CV1 and DK2 on Windows. It does not work with the DK1, although Oculus provided backers of that Kickstarter with a CV1 anyway, and it does not (yet) support the HTC Vive. It also only deals with the headset itself, not any motion controllers. If your application requires motion controllers, you will need to keep working on native applications for a little while longer.

Source: Mozilla

Mozilla to Publish First Rust-based Module in Firefox 48

Subject: General Tech | July 16, 2016 - 05:41 PM |
Tagged: mozilla, Rust, firefox

Mozilla has been working on the Rust language for several years now. It is designed to be extremely fast, memory-safe, and easy to parallelize on multi-core processors, doing so by having a compiler that's not afraid to tell you “Nope.” Mozilla (and others, like Samsung) wants a language with those characteristics because it makes for an extremely fast, yet secure, web browser (although there are a lot of single-threaded design choices tangled in the Web specifications).

[Image: Mozilla Rust logo]

The first example will arrive next month for Windows (64-bit OS X and Linux already have it). Firefox 48 will replace a small portion of the code, originally written in C++, with a Rust-based equivalent. The affected component parses media files, extracting values like track ID, duration, resolution, and so forth. Because it's written in Rust, this ingestion should be resilient to memory-based vulnerabilities.

This probably will not be noticeable to end-users, but it's a few thousand fewer lines of code that Mozilla needs to worry about being used to hijack the browser. Mozilla is also planning on bringing URL parsing to Rust, and has already done so in Servo. You would think that the C++ code has been battle-hardened by now, but 15-year-old open-source bugs do exist, hiding in plain sight.

Source: Mozilla

Mozilla Publishes Servo Nightly (for Mac and Linux)

Subject: General Tech | July 1, 2016 - 07:12 PM |
Tagged: web browser, gecko, servo, Rust, mozilla, Samsung

No love for Windows at the moment, but Mozilla is showing previews of their new browser rendering engine, Servo. It is developed in Rust, a language that is highly parallel yet very memory-safe: two great features for a web browser, especially on mobile and multi-core desktops. You can currently pick it up on Mac and Linux, although it is not ready to be your primary browser yet. Windows and Android builds “should be available soon”.

Basically, Mozilla has been spending the last few years re-thinking how to design a web browser. Most Web standards are based on the assumption that the browser runs a single main loop, and that work happens in sequence. Back in 2013, most of the research was to see how far a browser could push into parallelization before compatibility stops following. Samsung, which is obviously interested in smartphone technology, partnered with them, because it's easier to add more cores to a mobile SoC than it is to make existing ones faster.

[Image: Mozilla architecture diagram]

At the time, they weren't sure whether this research would be used to improve Gecko, the current rendering engine that has been around since Netscape 6, or to create a suitable replacement for it. As far as I know, that decision still has not been made, but they haven't bailed on Servo yet, either.

Perhaps we'll see a new wave of Web technology coming soon? Maybe it will even break up the WebKit monopoly that seems to be forming, led by iOS and Android devices?

Source: Mozilla

Mozilla Will Begin Electrolysis with Firefox 48

Subject: General Tech | June 9, 2016 - 01:08 AM |
Tagged: mozilla, firefox

Electrolysis (e10s) is Mozilla's codename for their multi-process initiative in Firefox. The main goal is to separate the content of the website from the user interface, so that, if a site has long-running JavaScript or layout, Firefox will not lock up. This seems like a simple idea, except that it undoes over a decade of assumptions that were made during Firefox's development. Imagine, for instance, that you have an extension which modifies both the browser UI and the page content -- that's a single script that now needs to run across multiple processes. Whoops!
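
To illustrate what that split looks like, here is a rough sketch of the message-manager pattern that multi-process Firefox pushes extensions toward: UI code stays in the chrome process, page-facing code moves into a frame script, and the two halves exchange async messages. The extension and message names are hypothetical, and the declared globals are privileged Firefox APIs rather than web APIs:

```typescript
// A rough sketch, not working extension code: the chrome (UI) process
// loads a frame script into the content process and listens for replies.
// "MyExt" and the chrome:// URL are hypothetical names.
declare const gBrowser: {
  selectedBrowser: { messageManager: MessageManager };
};
interface MessageManager {
  loadFrameScript(url: string, allowDelayedLoad: boolean): void;
  addMessageListener(
    name: string,
    listener: (msg: { name: string; data: any }) => void
  ): void;
}

// --- chrome process (browser UI) ---
const mm = gBrowser.selectedBrowser.messageManager;
mm.loadFrameScript("chrome://myext/content/frame-script.js", false);
mm.addMessageListener("MyExt:PageTitle", (msg) => {
  // Touch only the browser UI here; the page itself is out of reach.
  console.log(`Content process says the title is: ${msg.data.title}`);
});

// --- frame-script.js, running in the content process ---
// declare const content: Window;  // the page's window object
// declare function sendAsyncMessage(name: string, data: any): void;
//
// sendAsyncMessage("MyExt:PageTitle", { title: content.document.title });
```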

[Image: Mozilla Firefox logo]

This roll-out won't necessarily be immediate, though. You can install Firefox 48 and only have Electrolysis turned on some weeks later. They are starting with about 1% of eligible users, which will ramp up to all eligible users over time, or be rolled back entirely if alarm bells start to ring.

Speaking of eligible users, there are quite a few conditions that will prevent you from getting Electrolysis. Namely, if you use extensions (it's unclear whether they mean all extensions, or just ones that use certain APIs), then you will be kept on a single process. They don't specify why, but it could very well be the situation that I mentioned in the first paragraph.

Firefox 48 is scheduled to be released in six weeks (the first week of August).

Mozilla to Preview Servo in June

Subject: General Tech | March 18, 2016 - 09:26 PM |
Tagged: mozilla, servo, Rust

Mozilla, the open-source creators of Firefox and Thunderbird, have announced that their Servo project will reach public alpha in June. Nightly builds will be available, presumably around that time, for Linux, OS X, Windows, and Android. Servo is a browser engine built in Rust, a language which emphasizes security and high performance (especially in multi-threaded scenarios).

[Image: Mozilla architecture diagram]

The technology is really interesting, although it is still quite early. Web browsers are massively single-threaded by design, which limits their potential performance as CPUs widen in core count but stagnate in per-thread performance. This is especially true in mobile, which is why Samsung has been collaborating on Servo for almost all of its life.

Rust, being so strict about memory access, also has advantages in security and memory management. It is designed in such a way that the compiler can determine, at compile time, whether you might access data that is no longer available. The trade-off is that it's harder to program, because if your code isn't provably robust, the compiler just won't accept it. This is beneficial for web browsers, though, because basically everything they access is untrusted, third-party data. It's better to fight your compiler than to fight people trying to exploit your users.

Again, it's still a way off. It might be good for web developers to keep an eye on, though, in case Servo implements a standard correctly but differently from other browsers (highlighting a bug in your website), or incorrectly (exposing a bug in Servo). Making a web browser is immensely difficult.

Source: Mozilla

WebGL2 Is On Its Way

Subject: General Tech | December 31, 2015 - 10:25 PM |
Tagged: webgl2, webgl, mozilla, firefox

The Khronos Group created WebGL to bring a GPU-accelerated platform to web browsers. With a few minor differences, it is basically JavaScript bindings for OpenGL ES 2.0. It also drove a few standards in JavaScript itself, such as typed arrays, which let raw buffers of data be viewed with explicit types in an unmanaged way. Basically every latest-version web browser supports it these days, and we're starting to see it used in interesting ways.
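
As a quick sketch of those typed-array standards: one raw buffer can be viewed as several element types without copying.

```typescript
// One raw ArrayBuffer, viewed as different element types without copying.
const raw = new ArrayBuffer(16);          // 16 bytes of untyped memory
const asFloats = new Float32Array(raw);   // view as 4 x 32-bit floats
const asBytes = new Uint8Array(raw);      // view the same bytes as 16 x u8

asFloats[0] = 1.0;
// Prints the raw IEEE-754 bytes of 1.0 (little-endian on most platforms).
console.log(asBytes.slice(0, 4));
```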

[Image: WebGL2 particle demo]

The next step is WebGL2. OpenGL ES 3.0 adds a bunch of new features that are sorely needed for modern games and applications. For instance, it allows drawing to multiple render targets, which is very useful for virtual cameras in video games (although original WebGL could access this through the optional WEBGL_draw_buffers extension, where supported). The addition of “Uniform Buffer Objects” is a better example. These allow you to store a bunch of data, like view transformation matrices, in a single buffer that can be bound to multiple shader programs, rather than uploading the values individually for every draw that needs them.

It's hard to describe, but demos speak a thousand words.
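
Still, a minimal sketch helps. Here, a single buffer of matrices is uploaded once and shared across several shader programs; the block name "Matrices" and the surrounding setup are placeholders, not code from the demos:

```typescript
// A minimal sketch of a WebGL2 Uniform Buffer Object: upload the view
// matrices once, then share them with every program that declares a
// matching uniform block.
function bindSharedMatrices(
  gl: WebGL2RenderingContext,
  programs: WebGLProgram[],
  viewProj: Float32Array // e.g. a 4x4 matrix, 16 floats
): WebGLBuffer {
  const ubo = gl.createBuffer()!;
  gl.bindBuffer(gl.UNIFORM_BUFFER, ubo);
  gl.bufferData(gl.UNIFORM_BUFFER, viewProj, gl.DYNAMIC_DRAW);

  // Attach the buffer to binding point 0...
  gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, ubo);

  // ...and point each program's "Matrices" block at that binding point.
  for (const program of programs) {
    const blockIndex = gl.getUniformBlockIndex(program, "Matrices");
    gl.uniformBlockBinding(program, blockIndex, 0);
  }
  return ubo;
}
```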

The news today is that Mozilla Nightly now ships with WebGL2 enabled by default. It was previously disabled by default, hidden behind an option in the browser. This doesn't seem like a big deal, but one of the largest hurdles for WebGL2 is how the browsers actually implement it. The shading language in WebGL was simple enough that most browsers translate it to DirectX HLSL on Windows. This is said to have the added advantage of obfuscating the ability to write malicious code, since developers never directly write what's executed. GLSL in OpenGL ES 3.0 is much more difficult to translate. I'm not sure whether the browsers will begin to trust OpenGL ES 3.0 drivers directly, or whether they finally updated the GLSL translator, but a working implementation means that something was fixed.

Unfortunately, OpenGL compute shaders are not supported in WebGL2. That said, the biggest hurdle is, again, to get WebGL2 working at all. From my talks with browser vendors over the last year or so, it sounds like features (including compute shaders) should start flying in easily once this hurdle is cleared. From there, GPGPU in a website should be much more straightforward.

Mozilla Abandons Firefox OS Smartphones

Subject: Editorial, Mobile, Shows and Expos | December 9, 2015 - 07:04 AM |
Tagged: yahoo, mozilla, google, Firefox OS, Android

Author's Disclosure: I volunteer for Mozilla, unpaid. I've been to one of their events in 2013, but otherwise have no financial ties with them. They actually weren't aware that I was a journalist. Still, our readers should know my background when reading my editorial.

Mozilla has announced that, while Firefox OS will still be developed for “many connected devices,” the organization will stop developing and selling smartphones through carriers. Mozilla claims the reason is that they “weren't able to offer the best user experience possible.” While the statement is generic enough to apply in a lot of contexts, I'm not sure how close to the center of that region it is.

This all occurred at the “Mozlando” conference in Florida.

[Image: Firefox OS]

Firefox OS was born when stakeholders asked Mozilla to get involved in the iOS and Android duopoly. Unlike Windows, BlackBerry, and other competitors, Mozilla has a history of leveraging Web standards to topple industry giants. Rather than trying to fight the industry leaders with a better platform, and hoping that developers create enough apps to draw users over, they expanded what the Web could do to dig the ground out from under their competitors.

This makes sense. Mobile apps were still in their infancy, so Firefox OS wouldn't need to overcome decades of lock-in or orders-of-magnitude performance deltas. JavaScript is getting quite fast anyway, especially when transpiled from an unmanaged language like C, so apps could exist to show developers that the phones were just as capable as their competitors.

[Image: "I love the Web"]

The issue is that being able to achieve high performance is different from actually achieving it. The Web, as a platform, is getting panned as slow and “memory hungry” (even though free memory doesn't make a system faster -- it's all about the overhead required to manage it). Likewise, the first few phones landed at the low end, due in part to Mozilla (a non-profit organization, remember) wanting to use Firefox OS to bring computing to new areas of the world. A few hiccups here and there added another coat of paint to the Web's reputation for low performance.

Granted, they couldn't have competed on the high end without a successful app ecosystem even if they tried. Only the most hardcore of fans would purchase a several-hundred-dollar smartphone intending to put up with just Web apps. Likewise, when I've told people that the phones run on the Web, they didn't realize that means “primarily localhost” until it was explicitly stated. People are afraid for their data caps, even though offline experiences are actually offline and stored locally.

The Dinosaur in the Room

Then there's the last question that I have: I am a bit concerned about the organization as a whole. They seem to be trying to shed several products lately and narrow their focus. Granted, all of these announcements occurred because of the event, so there's plenty of room for coincidence. They have announced that they will drop ad tiles, a decision which I've heard praised.

[Image: Mozilla Foundation logo]

The problem is: why would they do that? Was it for goodwill, aligning with their non-profit values? Or was it bringing in much less money than projected? If it's the latter, then how far do they need to shrink their influence, and how? Did they already over-extend, and will they need to compensate for that? Looking at their other decisions, they've downsized Firefox OS, they are thinking about spinning out Thunderbird again, and they have quietly shuttered several internal projects, like their skunkworks division, “Mozilla Labs.” Mozilla also has a division called "Mozilla Research," although that one is going strong: they are continually hiring for projects like "Servo," a potential new browser engine, and "Rust," the programming language used for Servo and other projects.

While Mozilla is definitely stable enough, financially, to thrive in their core products, I'm concerned about how much they can do beyond that. I'm genuinely concerned that Mozilla is trying to restructure while looking like a warrior for both human rights and platforms of free expression. We will not see the books until a few months from now, so we can only speculate until then. The organization is pulling inward, though. I don't know how much of this is refocusing on the problems they can solve, or the problems they can afford. We will see.

Source: TechCrunch

Firefox 38 Launches with (and also without) DRM Support

Subject: General Tech | May 13, 2015 - 05:06 PM |
Tagged: mozilla, firefox, DRM

Mozilla has just released Firefox 38. With it comes the controversial Adobe Primetime DRM implementation through the W3C's Encrypted Media Extensions (EME). Or, maybe not. If you upgrade the browser through one of the default channels, the Adobe Primetime Content Decryption Module will appear in the Plugins tab of your Add-ons manager on Windows Vista or later (but it might take a few minutes after the upgrade).

[Image: Mozilla Firefox logo]

Alternatively, you can use Mozilla's EME-free installer for Firefox and avoid it altogether.
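
If you are unsure which build you ended up with, here is a hedged sketch of probing for the CDM through the EME API; the key-system string and codec are my assumptions based on common EME usage, not something Mozilla documents in the release notes:

```typescript
// A sketch of checking whether a Primetime-style CDM is usable via EME.
// "com.adobe.primetime" and the codec string are assumptions for
// illustration, not values confirmed by Mozilla's announcement.
async function hasPrimetimeCDM(): Promise<boolean> {
  if (!navigator.requestMediaKeySystemAccess) return false;
  try {
    await navigator.requestMediaKeySystemAccess("com.adobe.primetime", [
      {
        initDataTypes: ["cenc"],
        videoCapabilities: [
          { contentType: 'video/mp4; codecs="avc1.42E01E"' },
        ],
      },
    ]);
    return true; // the CDM is installed and usable
  } catch {
    return false; // EME-free build, or the CDM has not downloaded yet
  }
}
```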

I have mentioned my concerns about DRM in the past. EME does not particularly bother me, because it is just a plugin architecture, but the fundamental concept does. Simply put, copy protection does very little good and a whole lot of bad. If your movie is leaked before it is legally available in consumers' hands, as regularly happens, then what do you expect to accomplish after the fact? It only takes one instance for something to be copied infinitely, and that instance often comes from the film company's own supply chain, not its customers. Moreover, copy protection has been found to reduce sales and hurt the customer experience (above and beyond the valid ideological concerns).

Beyond the DRM inclusion, several new features were added. One of the more interesting ones is the BroadcastChannel API. This standard allows a web application to share data between “contexts” that have the same “user agent and origin”. In other words, the contexts must be in the same browser and belong to the same app (even secondary instances of it). This will allow sites to do multi-monitor split-screen, which is useful for games and utilities.
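
As a minimal sketch (the channel name and message shape are invented for illustration):

```typescript
// Every window or tab of the same app (same origin, same browser) can
// join a named channel and hear the others.
const channel = new BroadcastChannel("game-state");

// In the window that owns the simulation:
channel.postMessage({ type: "camera", x: 10, y: 4 });

// In a second window on another monitor:
channel.onmessage = (event: MessageEvent) => {
  if (event.data.type === "camera") {
    // Re-render this monitor's half of the split-screen view.
    console.log(`Camera moved to (${event.data.x}, ${event.data.y})`);
  }
};
```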

WebRTC has also been upgraded with multistream and renegotiation. Even though the general public thinks of WebRTC as a webcam-and-voice-chat standard, it actually allows arbitrary data channels. For example, “BananaBread” is a first-person shooter that used WebRTC to synchronize multiplayer state. Character and projectile positions are very much not webcam or audio data, but WebRTC doesn't care.
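
As a sketch of what a game's channel might look like -- signaling is omitted, and the label and message shape are invented for illustration:

```typescript
// A WebRTC data channel carrying game state rather than audio or video.
const pc = new RTCPeerConnection();
const dc = pc.createDataChannel("positions", {
  ordered: false,      // stale positions may arrive out of order...
  maxRetransmits: 0,   // ...or not at all; the next update supersedes them
});

dc.onopen = () => {
  dc.send(JSON.stringify({ player: 1, x: 12.5, y: 3.0 }));
};
dc.onmessage = (event: MessageEvent) => {
  const state = JSON.parse(event.data);
  console.log(`Player ${state.player} is at (${state.x}, ${state.y})`);
};
```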

Firefox 38 launched on May 12th, with an optional build that omits DRM support.

Source: Mozilla

Forcing HTTPS Is Being Discussed

Subject: General Tech | April 14, 2015 - 08:08 PM |
Tagged: mozilla, http, https, firefox

On the Mozilla Dev-Platform newsgroup, hosted at Google Groups, a proposal to deprecate insecure HTTP is being discussed. The idea is that HTTPS needs to be adopted, and organizations will not do it without being pushed. The plan is to get browser vendors to refuse to activate new features, and eventually to disable old features, unless the site is loaded as a “privileged context”.

[Image: Mozilla]

This has sparked a debate, which was the whole point of course, about how secure we want the Web to be. Which features should we retroactively disable unless they are served over HTTPS? Things that access your webcam and microphone? Things that write to your hard drive? Then there is the question of how to handle self-signed certificates, which provide encryption without verification, and so forth.
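
As a sketch of what that gating looks like from a site's perspective, using the webcam as the example (window.isSecureContext is the boolean browsers expose for exactly this check, though it postdates the start of this discussion):

```typescript
// Only expose a powerful feature (here, the webcam) when the page was
// delivered securely.
async function maybeStartWebcam(video: HTMLVideoElement): Promise<void> {
  if (!window.isSecureContext) {
    console.log("Insecure (HTTP) page: webcam access withheld.");
    return;
  }
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;
}
```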

Note: Websites cannot access or create arbitrary files on your hard drive, but standards like localStorage and IndexedDB allow websites to have their own spaces for persistence. This allows, for instance, a 3D game to cache textures (and so forth) so you don't need to download them every time.
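
As a quick sketch of that site-scoped persistence (the key and store names are invented for illustration):

```typescript
// The page can store its own data, but never touch arbitrary files.
localStorage.setItem("textureCacheVersion", "3");

const request = indexedDB.open("texture-cache", 1);
request.onupgradeneeded = () => {
  request.result.createObjectStore("textures"); // first run: create the store
};
request.onsuccess = () => {
  const tx = request.result.transaction("textures", "readwrite");
  tx.objectStore("textures").put(new ArrayBuffer(1024), "brick-diffuse");
};
```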

Personally, this concerns me greatly. I started helping Mozilla a couple of years ago, a few weeks after I saw Microsoft's Windows 8 developer certification program. I do not like the thought of someone being able to stifle creation and expression, and the web was looking like it might be the last bastion of unrestricted development for the general public.

In the original Windows Store requirements, no browser could exist unless it was a skin of Trident. This meant that, if a site didn't work in Internet Explorer, it didn't exist. If you didn't want to play by their rules? Your app didn't get signed, and your developer certificate could even be revoked by Microsoft, or by someone with authority over them. You can imagine the problems an LGBT-focused developer might have in certain countries, even if Microsoft likes their creations.

This is obviously not as bad as that. In the Windows Store case, there was one authority, whereas HTTPS can be authenticated by numerous providers. Also, if self-signed certificates are deemed “secure enough”, it would likely avoid the problem: you would not need to ask one of a list of authorities for permission to exist; you could secure the connection yourself. Of course, that is a barrier of skill for many, and that is its own concern.

So we'll see, but I hope that Mozilla will take these concerns as a top priority in their decisions.

Source: Mozilla

Just wait, blacklisting dangerous root certificates will lead to a legal battle

Subject: General Tech | February 23, 2015 - 01:35 PM |
Tagged: superfish, mozilla, komodia, security

Firefox can remove any threat that Superfish presents with a simple step and 24 hours; indeed, they could prevent any similar issue that uses a questionable or downright poisonous SSL certificate simply by blacklisting it. In this Register article, they specifically cite OneCRL's ability to block even obfuscated certs before they reach the Network Security Services layer, provided the certs are properly recorded on the blacklist. This would lead to a much more secure web, requiring attackers to invest significantly more effort when attempting to create fake or dangerous SSL certs.

There is a flip side, though. Some may attempt to have valid certs added to the blacklist, so there must be a way of policing the list, and a way to remove certs that should not be on it, whether they were placed there in error or the software associated with a certificate has changed. It is also likely that there will be court cases attempting to have the blacklist removed if it does come into being, as Superfish is not the only business out there whose model requires phishing, or at least a way around proper SSL certification and best practices, which will no longer be viable if we are allowed to block their mutant SSL certs.


"Firefox-maker Mozilla may neuter the likes of Superfish by blacklisting dangerous root certificates revealed less than a week ago to be used in Lenovo laptops."


Source: The Register