Subject: General Tech | April 14, 2015 - 08:08 PM | Scott Michaud
Tagged: mozilla, http, https, firefox
On the Mozilla dev-platform newsgroup, hosted at Google Groups, a proposal to deprecate insecure HTTP is being discussed. The idea is that HTTPS needs to be adopted, and organizations will not do it without being pushed. The plan is to get browser vendors to refuse to enable new features, and eventually to disable old ones, unless the site is loaded as a “privileged context”.
This has sparked a debate, which was the whole point of course, about how secure we want the Web to be. Which features should we retroactively disable unless they are used over HTTPS? Things that access your webcam and microphone? Things that write to your hard drive? Then there is the question of how to handle self-signed certificates, which provide encryption without verification, and so forth.
Note: Websites cannot access or create files on your hard drive, but standards like localStorage and IndexedDB allow websites to have their own spaces for persistence. This is to allow, for instance, a 3D game to cache textures (and so forth) so you don't need to download them every time.
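To make the persistence model concrete, here is a minimal Python sketch (names and values are hypothetical) of the idea behind localStorage: each origin gets its own isolated key-value store, and one origin can never read another's data.

```python
# Hypothetical sketch of per-origin key-value storage, modeled loosely on
# the Web Storage idea. Real browsers persist this to disk and enforce
# quotas; here an in-memory dict stands in for both.

class OriginStorage:
    """Each origin (scheme + host + port) gets its own isolated store."""

    def __init__(self):
        self._stores = {}  # origin -> {key: value}

    def set_item(self, origin, key, value):
        # Web Storage values are always strings.
        self._stores.setdefault(origin, {})[key] = str(value)

    def get_item(self, origin, key):
        # Returns None when the key (or the origin's store) does not exist.
        return self._stores.get(origin, {}).get(key)

storage = OriginStorage()
storage.set_item("https://game.example", "texture-pack", "v2")

# The storing origin can read its own data back; a different origin cannot,
# because the stores are partitioned per origin.
print(storage.get_item("https://game.example", "texture-pack"))   # v2
print(storage.get_item("https://other.example", "texture-pack"))  # None
```

This partitioning is why a 3D game can cache its textures without any other site being able to see, or tamper with, that cache.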
Personally, this concerns me greatly. I started helping Mozilla a couple of years ago, a few weeks after I saw Microsoft's Windows 8 developer certification program. I do not like the thought of someone being able to stifle creation and expression, and the web was looking like it might be the last bastion of unrestricted development for the general public.
In the original Windows Store requirements, no browser could exist unless it was a skin of Trident. This meant that, if a site didn't work in Internet Explorer, it didn't exist. If you didn't want to play by their rules? Your app didn't get signed, and your developer certificate could even be revoked by Microsoft, or by someone with authority over them. You can imagine the problems an LGBT-focused developer might have in certain countries, even if Microsoft likes their creations.
This is obviously not as bad as that. In the Windows Store case, there was one authority, whereas HTTPS can be authenticated by numerous providers. Also, if self-signed certificates are deemed “secure enough”, it would likely avoid the problem. You would not need to ask one of a list of authorities for permission to exist; you could secure the connection yourself. Of course, that is a barrier of skill for many, and that is its own concern.
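The distinction between the two certificate models can be sketched with Python's standard ssl module: the default context demands a chain to a trusted authority, while switching off verification gives you encryption without identity, which is roughly what accepting a self-signed certificate amounts to.

```python
# Sketch: CA-verified vs. unverified TLS, using Python's standard ssl module.
import ssl

# Default: the peer's certificate must chain to a trusted certificate
# authority, and the hostname must match.
verified = ssl.create_default_context()
print(verified.verify_mode == ssl.CERT_REQUIRED)  # True

# Encryption without verification: the traffic is still encrypted, but the
# peer's identity is not checked. check_hostname must be disabled before
# verify_mode can be set to CERT_NONE.
unverified = ssl.create_default_context()
unverified.check_hostname = False
unverified.verify_mode = ssl.CERT_NONE
print(unverified.verify_mode == ssl.CERT_NONE)  # True
```

The second context is the "secure the connection yourself" case: eavesdroppers are shut out, but nothing vouches for who is on the other end.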
So we'll see, but I hope that Mozilla will take these concerns as a top priority in their decisions.
Subject: General Tech | February 23, 2015 - 01:35 PM | Jeremy Hellstrom
Tagged: superfish, mozilla, komodia, security
Mozilla can remove any threat that Superfish presents with a single step and about 24 hours; indeed, Firefox could neutralize any similar abuse of a questionable or downright poisonous SSL certificate simply by blacklisting it. In this Register article, they specifically cite the ability of OneCRL to block even obfuscated certs before they reach the Network Security Services layer, provided the certs are properly recorded on the blacklist. This would lead to a much more secure web, requiring attackers to invest significantly more effort when attempting to create fake or dangerous SSL certs.

There is a flip side: some may attempt to have valid certs added to the blacklist, so there must be a way of policing the list and of removing certs that should not be on it, whether they were placed there in error or because the software associated with the certificate has changed. It is also likely that there will be court cases attempting to have the blacklist removed if it does come into being, as Superfish is not the only business out there whose model requires phishing, or at least a way around proper SSL certification and best practices, and that model will no longer be viable if we are allowed to block their mutant SSL certs.
"Firefox-maker Mozilla may neuter the likes of Superfish by blacklisting dangerous root certificates revealed less than a week ago to be used in Lenovo laptops."
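The mechanism described above can be sketched in a few lines of Python. This is an illustration of the general idea only, not the actual OneCRL data format: the browser ships a revocation list and refuses any certificate whose fingerprint appears on it, no matter what its chain claims (the fingerprint bytes below are hypothetical placeholders).

```python
# Illustrative sketch of fingerprint-based certificate blocking -- NOT the
# real OneCRL format. The browser compares a hash of the certificate's DER
# bytes against a shipped blocklist before trusting it.
import hashlib

BLACKLIST = {
    # SHA-256 fingerprints of known-bad certificates (placeholder values).
    hashlib.sha256(b"superfish-root-der-bytes").hexdigest(),
}

def is_blocked(cert_der: bytes) -> bool:
    """Return True if the certificate's fingerprint is on the blocklist."""
    return hashlib.sha256(cert_der).hexdigest() in BLACKLIST

print(is_blocked(b"superfish-root-der-bytes"))  # True
print(is_blocked(b"some-legitimate-cert"))      # False
```

Because the check happens on the certificate itself rather than on its chain, obfuscating or re-wrapping a bad root does not help the attacker once its fingerprint is recorded.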
Here is some more Tech News from around the web:
- Hackers now popping Cisco VPN portals @ The Register
- Pandora Pays Artists $0.001 Per Stream, Thinks This Is "Very Fair" @ Slashdot
- iOS 8.3 to be made available as public beta as Apple aims for bug-free releases @ The Inquirer
- Portable USB Wall Charger Roundup @ eTeknix
- Tech ARP 2015 Mega Giveaway
Subject: General Tech | November 27, 2014 - 09:29 PM | Scott Michaud
Tagged: apple, safari, google, yahoo, bing, microsoft, mozilla
After Mozilla inked the deal with Yahoo, the eyes turned to Apple and its Safari browser. Currently, the default search engine is Google on both iOS and OSX, although Bing is the primary engine used for other functions, like Siri and Spotlight. Until early 2015, Apple is tied into a contract with Google for those two platforms, but who will get the new contract?
Apparently, Yahoo and Microsoft have both approached the company for the position, and Apple is not ruling any of the three out. Probably the most interesting part is how seriously Yahoo is taking the search business. The deal with Mozilla is fairly long-term, and with Yahoo approaching Apple as well, the Firefox deal probably was not just charity on Yahoo's part because no-one else wanted to be the default. Yahoo would need significant monetary backing for an Apple deal, which suggests the same for their deal with Mozilla.
If both Mozilla and Apple leave Google, it will take a significant chunk out of the search engine's traffic. Power users, like those who read this site, will likely be unaffected if they care, because the barrier to changing the default search engine is so low. On the other hand, even the most experienced user will often accept default settings until there is a reason to change. The winning party will need a good enough product to overcome that initial shock.
But the money will at least give them a chance when the decision comes into effect. That is, unless the barrier to changing default search engines is less than the barrier to changing default web browsers.
Google will always be default on Google Chrome.
Subject: General Tech | November 20, 2014 - 10:10 PM | Scott Michaud
Tagged: yahoo, mozilla, google, firefox
Mozilla, developer of the Firefox web browser, has been mostly funded by Google for the last decade. Between 2005 and 2011, the search giant slowly ramped up its contributions from around $50 million USD per year to just over $100 million in the final year. All of this money was to keep the default search engine set to Google for the location and search bars. At that time, journalists were voicing their concerns that Mozilla would be cut off after the success Google saw with its Chrome browser.
In December 2011, Google and Mozilla surprised the world with a different announcement: $300 million USD per year until November 2014, or almost three times Google's previous annual contribution. I could not help but feel it was like a light bulb that flares before it burns out, although later rumors claimed that Microsoft and Yahoo drove up Google's bid with high counter-offers. That deal ends this month, and Google is no longer the winning bidder, if they even proposed a deal at all.
This time, Yahoo won for the next five years (in the US) with a currently undisclosed sum. Yandex will be the default for Russia, and Baidu has been renewed as the default in China.
Yahoo also committed to supporting the Do Not Track (DNT) header for Firefox browsers. If your settings have DNT enabled, the search engine will adjust its behavior to acknowledge your request for privacy. One thing that has not been mentioned is how they will react to your request. This could be anything from treating you as completely anonymous, to personalizing your search results but not your ads, to personalizing your ads but not your search results, to only looking at the geographic location of your IP address, and so forth.
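The ambiguity described above is worth spelling out: the DNT header only carries the user's request, and the server decides what honoring it means. A hedged Python sketch (the function name and policy labels are hypothetical) of one possible server-side interpretation:

```python
# Hypothetical sketch of how a server might honor the Do Not Track header.
# DNT is "1" when the user has opted out of tracking; what the site does in
# response (fully anonymous results, no ad personalization, etc.) is up to
# the site -- the header itself mandates no particular behavior.

def personalization_policy(headers: dict) -> str:
    """Pick a (made-up) policy label based on the request headers."""
    if headers.get("DNT") == "1":
        return "anonymous"      # one possible interpretation among many
    return "personalized"

print(personalization_policy({"DNT": "1"}))  # anonymous
print(personalization_policy({}))            # personalized
```

The whole open question in the post is exactly which branch of behavior Yahoo will choose for that first case.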
The search experience is not what you will get by going to the Yahoo homepage today; the new site was developed in collaboration with Mozilla and will launch for Firefox users in December. It will go live for every other Yahoo user in 2015.
Subject: General Tech | November 12, 2014 - 04:54 PM | Jeremy Hellstrom
Tagged: mozilla, oculus rift, MozVR
You have been able to browse the web on your Oculus Rift since the first dev kit, but not with a UI designed specifically for the VR device. MozVR is in development, along with a specific build of Firefox or Chromium, to let Oculus users browse the web in a new way. It works on both Mac and Windows, though as of yet there is no mention of Linux support; that should change in the near future. You will need to get your hands on an Oculus to try out the new browser; it simply is not going to translate to the desktop. The software is open source and available on GitHub, so you can contribute to the overall design of this new way to surf the web, as well as optimize your own site for VR. Check out more on MozVR and Oculus over at The Inquirer.
"MOZILLA IS CONTINUING its 10th birthday celebrations with the launch of a virtual reality (VR) website."
Here is some more Tech News from around the web:
- Elon Musk and ex-Google man mull flinging 700 internet satellites into orbit @ The Register
- Samsung slams door on OLED TVs, makes QUANTUM dot LEAP @ The Register
- Intro to Systemd Runlevels and Service Management Commands @ Linux.com
- TSMC 16FinFET Plus process achieves risk production milestone @ DigiTimes
- Iranian contractor named as Stuxnet 'patient zero' @ The Register
- Hardware Asylum Podcast - MOA 2014 Final and Surprise Lightning
Subject: General Tech | October 6, 2014 - 03:45 AM | Scott Michaud
Tagged: windows, mozilla, firefox, 64-bit
If you had a reason to want one, Mozilla has been compiling Firefox Nightly as a 64-bit application for Windows for the last several months. It is not a build designed for the general public; in fact, I believe it is basically only available to make sure that they did not horribly break anything during some arbitrary commit. That might change relatively soon, though.
According to Mozilla's "internal" (albeit completely public) wiki, the non-profit organization is currently planning to release an official 64-bit version of Firefox 37. Of course, all targets in Firefox are flexible and, ultimately, it is done when it is done. If everything goes to schedule, that should be March 31st.
The main advantage is for high-performance applications (although there are some arguments for security, too). For example, open numerous tabs to drive Firefox's memory usage up, then attempt to load a Web application like BananaBread. Last I tried, it simply would not load (unless you cleaned up memory usage somehow, like restarting the browser); it ran out of memory and gave up. You can see how this would be difficult for higher-end games, video editing utilities, and so forth. This will not be the case when 64-bit comes around.
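The arithmetic behind that out-of-memory failure is simple enough to show directly:

```python
# Why 64-bit matters: a 32-bit process can address at most 2**32 bytes
# (4 GiB), and a 32-bit Windows application typically gets only part of
# that for itself. A heavy web app can exhaust this ceiling long before
# the machine runs out of physical RAM.

addressable_32 = 2 ** 32
print(addressable_32 // 2 ** 30)  # 4 (GiB ceiling for a 32-bit process)

# A 64-bit address space is 2**32 times larger -- far beyond any
# installed RAM, so the browser stops being the bottleneck.
addressable_64 = 2 ** 64
print(addressable_64 // addressable_32)  # 4294967296
```

Once the process is 64-bit, running out of address space stops being the failure mode; actual memory pressure on the machine becomes the limit instead.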
If you are looking to develop a web app, be sure to check out the 64-bit Firefox Nightly builds. Unless plans change, it looks like you will have even more customers soon. This is unless, of course, you are targeting Mac OSX and Linux, which already have 64-bit binaries available. Also, why are you targeting specific operating systems with a website?
Subject: General Tech, Mobile | September 13, 2014 - 10:12 PM | Scott Michaud
Tagged: mozilla, intex, Firefox OS, firefox, cloud fx
If you were on a mission to make the cheapest possible mobile phone, you would probably not do much better than the Intex Cloud FX. Running Firefox OS, it costs about $35 to purchase outright. Its goal is to bring the internet to places which would otherwise have nothing.
I believe the largest concession made by this phone is its RAM: 128 MB. Yes, I had a computer with 32 MB of RAM and it browsed the internet just fine (on Netscape Navigator 2 through 4). I also had a computer before that (which was too slow to run Windows 3.1, but hey, it had a turbo button). This is also the amount of RAM in the first- and second-generation iPod Touch. Nowadays, it is very little; Ars Technica reportedly made the phone crash by scrolling too fast and by attempting to run benchmarks on it. This leads into its other major compromise: wireless connectivity. It does not support 3G. EDGE is the best that you will get.
Other than those two points, it has a 1 GHz Spreadtrum SoC, 46 MB of storage, a 2 MP camera, and a 1250 mAh battery. You do get WiFi, Bluetooth, and a microSD card slot. It also supports two SIM cards if necessary.
Again, at $35, this is not designed for America or Western Europe. It is for the areas of the world that will probably not experience the internet at all unless it is through a mobile phone. For people in India and elsewhere in Asia, it is about the lowest barrier to entry to the internet that is possible. You can also check out phones from other partners of Mozilla.
Subject: General Tech | September 11, 2014 - 04:22 PM | Scott Michaud
Tagged: firefox, mozilla, web browser, web development
Remote Debugging for Safari on iOS and Chrome on Android is available in early development on Firefox Nightly with an optional extension.
Subject: Editorial, General Tech | May 14, 2014 - 09:56 PM | Scott Michaud
Tagged: ultraviolet, mozilla, DRM, Adobe Access, Adobe
Needless to say, DRM is a controversial topic, and I am clearly against it. I do not blame Mozilla. The non-profit organization responsible for Firefox knew that it could not hold out against Chrome, IE, and Safari while remaining a consumer software provider. I do not blame Apple, Google, and Microsoft for their decisions, either. This problem is much bigger, and it comes down to a total misunderstanding of basic mathematics (albeit at a ridiculously abstract and applied level).
Simply put, piracy figures are meaningless. They are a measure of how many people use content without paying (assuming they are even accurate). You know what is more useful? Sales figures. But piracy figures are measurements, dependent variables, and so is revenue. Measurements cannot influence other measurements; in fact, measurements cannot influence anything, because they are, themselves, the result of influences. That is what "a measure" is.
Implementing DRM is not a measurement, however. It is a controllable action whose influence can be recorded. If you implement DRM and your sales go down, it hurt you. You may notice piracy figures decline. However, you should be too busy to care because you should be spending your time trying to undo the damage you did to your sales! Why are you looking at piracy figures when you're bleeding money?
I have yet to see a DRM implementation that correlated with an increase in sales. I have, however, seen some which correlate to a massive decrease.
The thing is, Netflix might know that, and I am pretty sure that some of the web browser companies know it too. They do not necessarily want to implement DRM. What they want is content and, surprise, the people in charge of the content are definitely not enlightened to that logic. I am not even sure they realize that content pirated before its release date is, by definition, not leaked by end users.
But whatever. Technical companies, who want that content available on their products, are stuck finding a way to appease those content companies in a way that damages their users and shrinks their potential market the least. For Mozilla, this means keeping as much open as possible.
Since Mozilla does not have existing relationships with Hollywood, Adobe Access will be the actual mechanism for displaying the video. They are clear to note that this only applies to video. They believe their existing relationships in text, images, and games will prevent the disease from spreading. This is basically a plug-in architecture with a sandbox that is open source and as strict as possible.
This sandbox is intended to prevent a security vulnerability from having access to the host system, give a method of controlling the DRM's performance if it hitches, and not allow the DRM to query the machine for authentication. The last part is something they wanted to highlight, because it shows their effort to protect the privacy of their users. They also imply a method for users to opt-out but did not go into specifics.
As an aside, Adobe will support their Access DRM software on Windows, Mac, and Linux. Mozilla is pushing hard for Android and Firefox OS, too. According to Adobe, Access DRM is certified for use with Ultraviolet content.
I accept Mozilla's decision to join everyone else but I am sad that it came to this. I can think of only two reasons for including DRM: for legal (felony) "protection" under the DMCA or to make content companies feel better while they slowly sink their own ships chasing after numbers which have nothing to do with profits or revenue.
Ultimately, though, they made a compromise. That is always how we stumble and fall down slippery slopes. I am disappointed but I cannot suggest a better option.
Subject: Editorial, General Tech | May 5, 2014 - 08:08 PM | Scott Michaud
Tagged: mozilla, net neutrality
Recently, the FCC has been moving to give up Net Neutrality. Mozilla, being dedicated to the free (as in speech) and open internet, has offered a simple compromise. Their proposal is that the FCC classifies internet service providers (ISPs) as common carriers on the server side, forcing restrictions on them to prevent discrimination of traffic to customers, while allowing them to be "information services" to consumers.
In other words, force ISPs to allow services to have unrestricted access to consumers, without flipping unnecessary tables with content distribution (TV, etc.) services. Like all possibilities so far, it could have some consequences, however.
"Net Neutrality" is a hot issue lately. Simply put, the internet gives society an affordable method of sharing information. Just how much of modern business is "just information" is catching numerous industries off guard, including ones that Internet Service Providers (ISPs) participate in (such as TV and movie distribution), and that leads to serious tensions.
On the one hand, these companies want to protect their existing business models. They want consumers to continue to select their cable and satellite TV packages, on-demand videos, and other services at controlled profit margins and without the stress and uncertainty of competing.
On the other hand, if the world changes, they want to be the winner in that new reality. Yikes.
A... bad... photograph of Mozilla's "UP" anti-datamining proposal.
Mozilla's proposal is very typical of them. They tend to propose compromises which divide an issue such that both sides get the majority of their needs. Another good example is "UP", or User Personalization, which tries to cut down on data mining by giving the browser a method to tell websites what they actually want to know (and letting the user tell the browser how much to reveal). The user compromises by giving the amount of information they find acceptable, and the website compromises by taking only what it needs (rather than developing methods to grab anything and everything it can). It feels like a similar thing is happening here. This proposal gives users what they want, the freedom to choose services without restriction, without tossing ISPs into "Title II" common carrier status altogether.
Of course, this probably comes with a few caveats...
The first issue that pops into my mind is: "What is a service?" I see this causing problems for peer-to-peer applications (including BitTorrent Sync and CrashPlan, excluding CrashPlan Central). Neither endpoint would necessarily be classified as "a server", or at least might not convince a non-technical lawmaker that it should be, and thus ISPs would not need to apply common carrier restrictions to that traffic. This could be a serious issue for WebRTC. Even worse, companies like Google and Netflix would have no incentive to help fight those battles; they are already legally protected. It would have to be defined, very clearly, what makes "a server".
Every method will get messy for someone. Still, at least the discussion is happening.