Subject: General Tech | July 16, 2016 - 05:41 PM | Scott Michaud
Tagged: mozilla, Rust, firefox
Mozilla has been working on the Rust language for several years now. It is designed to be extremely fast, memory-safe, and easy to parallelize on multi-core processors, and it achieves this by having a compiler that's not afraid to tell you “Nope.” Mozilla (and others, like Samsung) want a language with those characteristics because it will make for an extremely fast, yet secure, web browser (although there are a lot of single-threaded design choices tangled into the Web specifications).
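To illustrate what that buys you, here is a minimal sketch (not Firefox code, just an assumed example) of splitting work across threads. The compiler's ownership rules are what make this safe: the data is shared read-only through an `Arc`, and any attempt to mutate it from a worker thread without a lock simply would not compile.

```rust
use std::sync::Arc;
use std::thread;

// Sum a vector across `workers` threads. The borrow checker guarantees
// no thread can mutate the shared data while another thread reads it.
fn parallel_sum(data: Vec<i32>, workers: usize) -> i32 {
    let data = Arc::new(data);
    let chunk = (data.len() + workers - 1) / workers;
    let mut handles = Vec::new();
    for w in 0..workers {
        let data = Arc::clone(&data);
        handles.push(thread::spawn(move || {
            let start = (w * chunk).min(data.len());
            let end = ((w + 1) * chunk).min(data.len());
            // Read-only access is fine; `data.push(...)` here would be
            // rejected at compile time ("Nope").
            data[start..end].iter().sum::<i32>()
        }));
    }
    handles.into_iter().map(|h| h.join().unwrap()).sum()
}

fn main() {
    assert_eq!(parallel_sum(vec![1, 2, 3, 4], 2), 10);
}
```

The point is that the "Nope" happens before the program ever runs, which is exactly the property you want when the input is the open Web.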
The first example will arrive next month for Windows, though (64-bit OS X and Linux already have it). Firefox 48 will replace a small portion of the code, originally written in C++, with a Rust-based equivalent. The affected component parses media files, extracting values like track id, duration, resolution, and so forth. Because it's written in Rust, this ingestion should be resilient to memory-based vulnerabilities.
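To sketch why that style of parsing is safer, here is a hypothetical example (assumed for illustration; it is not Mozilla's actual media parser): every read is bounds-checked, so a malformed length or offset in an attacker-supplied file produces a `None` instead of a read past the end of the buffer.

```rust
// Read a big-endian u32 at `offset`, or return None if the buffer is
// too short. `get` refuses to index out of bounds.
fn read_u32_be(buf: &[u8], offset: usize) -> Option<u32> {
    let bytes = buf.get(offset..offset + 4)?;
    Some(((bytes[0] as u32) << 24)
        | ((bytes[1] as u32) << 16)
        | ((bytes[2] as u32) << 8)
        | (bytes[3] as u32))
}

// Parse a made-up header: a 4-byte track id followed by a 4-byte duration.
// A truncated file yields None rather than undefined behavior.
fn parse_header(buf: &[u8]) -> Option<(u32, u32)> {
    Some((read_u32_be(buf, 0)?, read_u32_be(buf, 4)?))
}

fn main() {
    assert_eq!(parse_header(&[0, 0, 0, 7, 0, 0, 0, 9]), Some((7, 9)));
    assert_eq!(parse_header(&[1, 2, 3]), None); // truncated input
}
```

In C++, the equivalent mistake (trusting a length field and indexing past the buffer) is exactly the class of bug that becomes a security vulnerability.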
This probably will not be noticeable to end-users, but it's a few thousand fewer lines of code that Mozilla needs to worry about being used to hijack the browser. Mozilla is also planning to bring URL parsing to Rust, and has already done so in Servo. You would think that the C++ code has been battle-hardened by now, but, I mean, 15-year-old open-source bugs do exist, hiding in plain sight.
Subject: General Tech | July 1, 2016 - 07:12 PM | Scott Michaud
Tagged: web browser, gecko, servo, Rust, mozilla, Samsung
No love for Windows at the moment, but Mozilla is showing previews of its new browser rendering engine, Servo. It is developed in Rust, a highly parallel yet very memory-safe language, two great properties for a web browser, especially on mobile and multi-core desktops. You can currently pick it up on Mac and Linux, although it is not ready to be your primary browser yet. Windows and Android builds “should be available soon”.
Basically, Mozilla has spent the last few years re-thinking how to design a web browser. Most Web standards are based on the assumption that the browser runs a main loop and that these items occur in sequence. Back in 2013, most of the research was to see how far a browser could travel into parallelization before compatibility breaks down. Samsung, which is obviously interested in smartphone technology, partnered with Mozilla because it's easier to add more cores to a mobile SoC than it is to make existing ones faster.
At the time, they weren't sure whether this research would be used to improve Gecko, the current rendering engine that has been around since Netscape 6, or create a suitable replacement for it. As far as I know, that decision has still not been made, but they also haven't bailed on it yet.
Perhaps we'll see a new wave of Web technology coming soon? Maybe even break up the WebKit monopoly that seems to be forming, led by iOS and Android devices?
Subject: General Tech | March 18, 2016 - 09:26 PM | Scott Michaud
Tagged: mozilla, servo, Rust
Mozilla, the open-source creators of Firefox and Thunderbird, have announced that their Servo project will reach public alpha in June. Nightly builds will be available, presumably around that time, for Linux, OSX, Windows, and Android. Servo is a browser engine that is built in Rust, which emphasizes security and high performance (especially in multi-threaded scenarios).
The technology is really interesting, although it is still quite early. Web browsers are massively single-threaded by design, which limits their potential performance as CPUs widen in core count but stagnate in per-thread performance. This is especially true in mobile, which is why Samsung has been collaborating on Servo for almost all of its life.
Rust's strictness about memory access also gives it advantages in security and memory management. It is designed so that the compiler can determine, at compile time, whether you will try to access data that is no longer available. The trade-off is that it's harder to program, because if your code isn't provably safe, the compiler just won't accept it. This is beneficial for web browsers, though, because basically everything they touch is untrusted, third-party data. It's better to fight your compiler than to fight people trying to exploit your users.
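A tiny assumed example of what "the compiler just won't accept it" looks like in practice. The commented-out lines are the ones Rust rejects at compile time; everything left uncommented runs fine:

```rust
fn main() {
    let data = vec![1, 2, 3];
    let borrowed = &data;          // an immutable borrow of `data`
    // data.push(4);               // rejected: cannot mutate while borrowed
    assert_eq!(borrowed.len(), 3);

    let moved = data;              // ownership moves to `moved`
    // println!("{:?}", data);     // rejected: `data` is no longer available
    assert_eq!(moved, vec![1, 2, 3]);
}
```

In C++, both commented-out lines would compile, and the second is a potential use-after-free; in Rust, neither program ever exists as a binary.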
Again, it's still a way off. It might be good for web developers to keep an eye on, though, in case any of Servo's optimizations implement a standard either correctly but differently from other browsers, highlighting a bug in your website, or incorrectly, exposing a bug in Servo. Making a web browser is immensely difficult.
Subject: General Tech | November 5, 2015 - 07:01 AM | Scott Michaud
Tagged: valve, steam, Rust
Team Fortress 2 switched from a paid game, first seen in The Orange Box bundle, to a free-to-play title. Financially, you could say that it was supported by tips... ... tips of the hat. Some responded with a wag of their finger, but others with a swipe of their credit card. Where was I going with this? Oh right. This game put Valve on the path of microtransactions, which fuels games like DOTA 2 that aren't supported in any other way.
Each of these item payments is done in-game, however, even in Valve's own titles, with one exception. Rust has been chosen to introduce Item Stores on Steam. If you go to Rust's store page, you will see a category called “Items available for this game”. Clicking on it brings you to the “Rust Item Store”, where you can buy in-game clothing, weapons, and sleeping bags with real money. This feature is not even available for Team Fortress 2 or DOTA 2.
While some parallels have been drawn between this and the backtracked paid-mods initiative, I don't see it. This is not an attempt to take third-party content, some of which was plagiarized from free, existing mods, and sell it. This is an attempt to provide a platform for in-game purchases that already exist. If there's a story here, it's that the initiative launched with a third-party game, and not one of Valve's two popular, free-to-play titles.
Subject: General Tech | December 5, 2013 - 03:59 AM | Scott Michaud
Tagged: GCC, Rust, mozilla
Rust is an interesting language in that it aims to be safe and concurrent. It was discussed frequently at Mozilla Summit back in early October, both on its own and in terms of the experimental HTML5 rendering engine, Servo. From how it was described to me by other attendees, it prides itself on its task-based architecture. Basically, your application is (or, at least, often is) set up as a bunch of tasks that get scheduled concurrently and pass messages to one another if they want to communicate. This concept allows for efficient multithreading because each task is inherently independent.
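That model can be sketched with Rust's standard channels (an assumed minimal example, not Servo's actual architecture): each "task" is an independent thread, and the only way they communicate is by sending owned messages, so there is no shared mutable state to race on.

```rust
use std::sync::mpsc;
use std::thread;

// Spawn one task per input; each task sends its result (the square of
// its input) back over a channel, and the caller sums the messages.
fn run_tasks(inputs: Vec<i32>) -> i32 {
    let (tx, rx) = mpsc::channel();
    for n in inputs {
        let tx = tx.clone();
        thread::spawn(move || {
            tx.send(n * n).unwrap(); // ownership of the message transfers here
        });
    }
    drop(tx); // close our sender so the receiver knows when all tasks are done
    rx.iter().sum()
}

fn main() {
    assert_eq!(run_tasks(vec![1, 2, 3]), 14); // 1 + 4 + 9
}
```

Because each task owns its own data and only hands off messages, the scheduler is free to run them on as many cores as are available.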
This may remind you of the experiments John Carmack did with Wolfenstein and Haskell.
Apparently at least one developer from the GNU Compiler Collection (GCC) is also paying attention. Philip Herron has been working on the "gccrs" branch to create a GCC front-end for Mozilla's language.
We will need languages like Rust in the near future as processors continue to ramp up in thread count. Just look at the Xeon Phi story from last week: a bootable, 288-thread standalone processor based on the Silvermont architecture. If you want a processor like that to be used efficiently, then you had better be light on the main thread; otherwise, your 6 TFLOPS (3 TFLOPS double-precision) will only be quick to behave like an Atom.