Subject: General Tech | March 27, 2013 - 11:01 PM | Tim Verry
Tagged: Internet, hc-pbgf, fiber, data transmission
Transmitting data over optical fiber is one of the fastest methods available, and researchers at the University of Southampton have managed to dial up the speed even further.
Optical fiber transmits data as pulses of light. The speed of light through a vacuum is 299,792,458 meters per second, but traditional fiber is not nearly that fast: light travels approximately 31% slower (roughly 206,856,796 m/s) through silica glass than through a vacuum.
The new fiber employs a hollow design that allows light to travel through air rather than glass while still allowing the cable to bend and twist around corners. Dubbed Hollow Core Photonic Bandgap Fiber, or HC-PBGF, it allows light to travel at up to roughly 298,893,081 m/s (~99.7% of the speed of light in a vacuum). The HC-PBGF fiber is still in the experimental phase, but it could have big implications for data centers and HPC server clusters that depend on high-bandwidth, low-latency connections between individual nodes.
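To put the latency claim in concrete terms, here is a quick back-of-the-envelope sketch (using only the approximate speeds quoted above; the 100 m run length is an illustrative assumption, not a figure from the research) comparing one-way propagation delay over conventional glass fiber versus HC-PBGF:

```python
# Approximate propagation speeds from the figures quoted above (m/s).
C_VACUUM = 299_792_458
V_GLASS = C_VACUUM * 0.69    # light is ~31% slower in silica glass
V_HCPBGF = C_VACUUM * 0.997  # ~99.7% of c through the hollow core

def one_way_delay_ns(length_m: float, speed_m_s: float) -> float:
    """One-way propagation delay in nanoseconds for a fiber run."""
    return length_m / speed_m_s * 1e9

run = 100.0  # a hypothetical intra-data-center run, in meters
print(f"glass:  {one_way_delay_ns(run, V_GLASS):.1f} ns")
print(f"hollow: {one_way_delay_ns(run, V_HCPBGF):.1f} ns")
```

Over 100 m the delay drops from roughly 483 ns to roughly 335 ns, which is exactly the kind of per-hop saving that matters for tightly coupled HPC nodes.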
Just how fast is the new HC-PBGF? According to ExtremeTech, a researcher told the site that the new fiber has a total cable throughput of 73.7 Tbps. It transmits 3 modes of 96 channels at 256 Gbps each, using a combination of wavelength-division multiplexing and mode-division multiplexing. The fiber offers a usable bandwidth of 160nm and is noticeably faster than traditional fiber. Additionally, the HC-PBGF has a signal loss of 3.5 dB/km, which makes it a useful candidate for short runs between nodes or rows of racks but not yet suitable for longer runs. HC-PBGF will not be blanketing your neighborhood anytime soon, but the research may lead to new optical networking technologies used in the next supercomputer or cloud service, for example.
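The quoted aggregate checks out: multiplying modes by channels by per-channel rate reproduces the 73.7 Tbps figure. A quick sanity check using only the numbers given above:

```python
modes = 3        # spatial modes (mode-division multiplexing)
channels = 96    # wavelength channels (wavelength-division multiplexing)
rate_gbps = 256  # per-channel data rate in Gbps

total_gbps = modes * channels * rate_gbps
total_tbps = total_gbps / 1000
print(f"{total_tbps:.1f} Tbps")  # → 73.7 Tbps
```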
The full paper can be found here, along with more details over at Ars Technica. Unfortunately, the full paper is behind a paywall, but it may be worth seeing if your school or workplace can give you access should you be interested in drilling into the details of the experimental hollow fiber.
Subject: General Tech | December 26, 2012 - 04:34 PM | Tim Verry
Tagged: mozilla, firefox, browser, Internet, 64-bit
A month ago Mozilla announced that it would no longer release 64-bit versions of its popular Firefox web browser due to a lack of resources. While the stable Windows releases were 32-bit, 64-bit nightly builds were available to enthusiasts and could take advantage of more than 4GB of memory.
Mozilla developer Benjamin Smedberg stated that there was significant negative feedback from the community over the decision to axe the 64-bit nightlies. While Mozilla has reaffirmed that it does not have the resources to support 64-bit builds, the developers are proposing a compromise that they hope will assuage users. In short, the Release Engineering team will continue to build 64-bit versions of the Firefox browser, but Mozilla will consider it a tier 3 build, with support left up to the community.
Currently, the plan regarding 64-bit versions of Firefox involves a forced migration of existing 64-bit users to 32-bit versions via the automatic browser updates. Then, after the migration date, users that want the 64-bit version will need to go and download it again. Once installed, users will be informed that it is not officially supported software and they are to use it at their own risk. Click-to-play plugins will be enabled in the 64-bit builds while the crash reporter will be disabled. Win64 tests and on-checkin builds of the browser will be discontinued.
Interestingly, all browser testing by Mozilla will be done on the 64-bit edition of Windows 8, yet they are only testing and supporting 32-bit versions of Firefox. The current situation is less than ideal, as the x64 Firefox builds will not be supported by Mozilla, but at least the software will still be available for those who need it. For now, Waterfox is an option for those who need a 64-bit browser based on Firefox.
Does Mozilla’s decision to stop supporting the 64-bit Firefox browser affect you? What do you think of the offered compromise?
Subject: General Tech | July 2, 2012 - 02:03 PM | Tim Verry
Tagged: sony, ps4, Internet, gaming, gaikai, cloud gaming
Gaikai, the cloud gaming streaming service, was bought today by Sony Computer Entertainment. At this year's Fusion Developer Summit, Gaikai stated its goal of being the gaming service on all of your devices, from your cell phone to your Smart TV. Interestingly, the buyout by Sony raises questions about the future openness of the platform.
Purchased for $380 million, Sony plans to combine its game catalog with Gaikai’s streaming technology to provide cloud entertainment services. Gaikai CEO David Perry was quoted by The Verge as saying:
“We're honored to be able to help SCE rapidly harness the power of the interactive cloud and to continue to grow their ecosystem, to empower developers with new capabilities, to dramatically improve the reach of exciting content and to bring breathtaking new experiences to users worldwide.”
The biggest question I have about the future of Gaikai is whether or not it will now be a Sony-only technology. At AFDS, Gaikai showed off the technology running on Samsung Smart TVs, though it remains to be seen whether Sony will continue to license the technology to other companies. Should it remain Sony-only, the company could use that exclusivity as a feature-add for its consoles, Google TVs, Blu-ray players, and televisions. Sony could further use Gaikai to power its future consoles or to bring its entire library of console games to the PlayStation 3 and PlayStation Vita gaming platforms. The Verge speculates that Sony could use the technology to bring its back-catalog of PS1 and PS2 games to the current-generation console, now that it is no longer backwards compatible with the older hardware. That sounds like a very plausible plan of action for Sony.
Will Sony bring Gaikai-powered cloud gaming to the PS3?
You can find additional quotes and speculation over at The Verge. What do you think will happen to Gaikai's technology? Will Sony put it to good use, or did they only buy it to keep others from using it?
Subject: General Tech | June 3, 2012 - 04:23 AM | Tim Verry
Tagged: verizon, pricing, Internet, fios, fiber, 300mbps
According to sources that talked with The Verge, Verizon is planning on offering faster internet services for its FIOS customers, but the new tiers are going to cost a pretty penny.
Verizon will be upgrading many of its FIOS internet speeds, with the changes set to go into effect on June 17th. The base 15/5Mbps (download/upload) plan will rise $10, from the current $54.99 to $64.99 a month. The current 25/25Mbps plan will be upgraded to 50/25Mbps with no price increase; it will continue to cost $74.99. The current 50/20Mbps plan will see a significant speed bump to 150/65Mbps at $94.99 a month (again, no price increase). A new 75/35Mbps plan will become available at $84.99 a month. Finally, the tier that readers will be drooling over, the 300Mbps plan, will feature 300Mbps downloads and 65Mbps uploads. It will cost a hefty $204.99 a month, a price that The Verge notes is a mere $5 more than the 150/35 speed tier it replaces.
In typical telco fashion, Verizon has managed to tack on up to three fees: a $5 per month fee for those without a contract, a $5 fee for those who do not subscribe to FIOS phone service, and a $100 equipment installation fee for those who want the upper two speed tiers. Fortunately (sort of...), users can avoid the $100 fee if they are new customers or already subscribe to the company's 150Mbps tier. Also on the less-than-stellar news front, Verizon will not be upgrading plans for those on VDSL service (in buildings where Verizon delivers fiber to the building and uses copper from there to individual homes; think older apartment buildings). Even worse, VDSL customers will still be subject to the increased pricing even though they cannot take advantage of the upgraded speeds.
Single Family Home   VDSL 1      VDSL 2       2-Year Contract   Month-to-Month Rate
3/1 Mbps             3/1 Mbps    3/1 Mbps     $54.99            $59.99
15/5 Mbps            10/2 Mbps   15/5 Mbps    $64.99            $69.99
50/25 Mbps           20/5 Mbps   20/10 Mbps   $74.99            $79.99
75/35 Mbps           30/5 Mbps   50/10 Mbps   $84.99            $89.99
(Source: The Verge. The 150/65 plan doesn't seem like a bad deal actually, if only I had FIOS in my area!)
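To see how the add-on fees change the effective price, here is a small sketch (the fee amounts are the ones quoted above; the function name and example tier are illustrative, not anything Verizon publishes):

```python
def monthly_cost(base: float, on_contract: bool, has_fios_phone: bool) -> float:
    """Effective monthly price after Verizon's quoted add-on fees."""
    cost = base
    if not on_contract:
        cost += 5.0  # month-to-month (no contract) fee
    if not has_fios_phone:
        cost += 5.0  # fee for skipping FIOS phone service
    return round(cost, 2)

# The 300Mbps tier with no contract and no FIOS phone service:
print(monthly_cost(204.99, on_contract=False, has_fios_phone=False))
```

A contract-free subscriber without phone service would pay $214.99 a month for the 300Mbps tier, before the potential $100 one-time install fee.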
So while this fiber internet upgrade announcement seems great at first, it does have a dark side. Some customers will be getting a great deal while others get the short end of the stick. Here's hoping you are one of the lucky FTTH customers on the middle tiers who get a free speed upgrade! More information on the specifics of the upgrade should be coming later this month.
Subject: General Tech | February 24, 2012 - 02:48 AM | Tim Verry
Tagged: storage, media, Internet, free, cloud, box, backup
The online storage space is really starting to heat up as companies get competitive to grab their share of the cloud storage user base. Dropbox is a popular file syncing and online storage solution offering 2GB free, and it routinely offers extra free space through promotions and referrals. On the other side of things, Microsoft offers 25GB of online storage with SkyDrive (currently minus computer syncing) for free to those with a Windows Live (or Hotmail) account, and it is in the process of overhauling the service to make it easier to use. Besides those two juggernauts, there are several alternative services that offer extra space or cheaper paid storage in order to remain competitive with the larger players. One service that has not gotten the same amount of public recognition is Box.com. It primarily provides Internet-based (paid) storage for businesses; however, it seems to be making a big push to get deeper into the consumer market.
The company is currently offering 50 GB (yes, you read that right) of free online storage space for life (or at least the life of the company) if you install their recently updated Android application and sign up for an account (or sign into an existing account) within the next 30 days (as of writing, that would mean 3/24/2012).
Further, if you download the Android Box application before March 23, 2012 at 11:59 PM, they will up the individual file size limit from 25 MB per file to 100 MB per file. Although that is still not big enough for movies, the increased per-file limit makes it easy to back up your photos, even in RAW.
Once you download the Box Android application from the Android Market and sign up (or sign into an existing account), a message will pop up indicating that you have been given 50 GB of free storage, and it is immediately accessible. There are a few caveats, however. The Box.com mobile applications are free, but the company does not provide a free desktop application for Windows or Mac; to get the desktop/laptop syncing service, you will need to upgrade to a paid Business or Enterprise account. Also, the Android application itself may concern some users, as one of the permissions it requests during installation is access to your contact list. The company has stated that this is necessary to make the sharing and collaboration process easy for the user. It certainly would not be the first application to ask for (what seem to the user like) strange permissions, however. You could always install the app on an Android VM or another phone if you're that paranoid (heh).
While you do not get a desktop application for free, you can still access your files (and the increased 50 GB of storage) from the website, and they do allow bulk uploads that can include multiple sub-folders. One snag that I ran into was that if the uploader identified any file in a folder as being over 100 MB, it would refuse to upload the entire folder. This may be a bug or an issue on my end; however, I was not able to figure out a way to just skip that one file and upload the rest of the files in the folder.
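One possible workaround (a local pre-filtering sketch of my own, not a Box feature; the 100 MB figure is the promotional per-file cap mentioned above, and the function name is made up) is to stage a copy of the folder that contains only files under the limit, then feed that staging folder to the bulk uploader:

```python
import os
import shutil

LIMIT = 100 * 1024 * 1024  # the promotional 100 MB per-file cap

def stage_uploadable(src_dir: str, dst_dir: str) -> list[str]:
    """Copy files at or under the size limit into a staging folder,
    preserving the subfolder layout; return relative paths of skipped files."""
    skipped = []
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, src_dir)
            if os.path.getsize(path) > LIMIT:
                skipped.append(rel)  # too big; leave it out of the batch
                continue
            dest = os.path.join(dst_dir, rel)
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            shutil.copy2(path, dest)
    return skipped
```

The returned list tells you which files to upload separately (or skip), and the rest of the folder goes up in one drag-and-drop batch.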
The batch uploader allows uploading multiple subfolders via drag and drop.
One thing that I enjoyed about the process (aside from the plentiful storage) was how easy they made it to sign up: all they ask for is an email address (which doesn't need to be verified to get access to the storage) and a password. It's kind of nice to not have to slog through handing out a bunch of personal information just for an online account!
I'm currently uploading my photos to the site to back them up (I learned two years ago that it can never hurt to have too many backups!) and the upload is going smoothly. The website batch uploader is Flash based and does not require IE like SkyDrive does, so that's a positive thing in my book. Let us know in the comments if you've tried Box out before, and how you're going to use the 50 GB of cloud storage. It really seems like the cloud / Internet based storage market is heating up, and this is a good thing for end users as it means more options, more innovation, and cheaper prices! If Box.com isn't for you, Dropbox and SkyDrive are also offering plenty of free storage space.
Yesterday morning the Internet juggernaut that is Google announced that its public DNS service has far surpassed their expectations for the experimental service. In fact, the company has taken the 'beta service' training wheels off of what they believe to be "the largest public DNS service in the world," and their statement that they are now handling 70 billion requests a day means the claim may not be far from the truth.
Interestingly, 70% of the service's users come from outside of the US, and Google has announced that they are beefing up their overseas presence to service them with new access points in Australia, India, Japan, and Nigeria. Further, they are expanding their offerings in Asia in addition to maintaining the current servers in North America, South America, and Europe.
The company is continuing to provide the DNS service for free, and it ended its announcement by stating, "Google Public DNS’s goal is simple: making the web—really, the whole Internet!—faster for our users."
For those curious, DNS is the technology that lets users punch in easy-to-remember text URLs and have their computers connect to the proper servers via numerical IP addresses (which are definitely not as easy to remember). It has been likened to the Internet equivalent of a phone book, and that description is an apt one: DNS servers maintain a mapping between domain names and IP addresses so that humans can type in a text URL (uniform resource locator) and connect to servers by IP address. DNSSEC makes things a bit more complicated as it adds further layers of security, but on a basic level the description fits.
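The phone-book analogy is easy to see from code: resolving a hostname to an IP address is a single call to the system resolver. A minimal sketch (using `localhost` so it works even without network access; a real site's hostname would go in its place):

```python
import socket

# Ask the resolver to map a hostname to an IPv4 address --
# the same lookup a browser performs before connecting.
ip = socket.gethostbyname("localhost")
print(ip)  # typically 127.0.0.1
```

Every web page you load triggers lookups like this (unless cached), which is why a faster DNS server can shave noticeable time off page loads.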
DNS benchmark "namebench" results
There are several free offerings besides the DNS service provided by your ISP, and open source tools like namebench can help you track down which DNS service is the fastest for you. DNS servers can be specified at one of several levels (in software, at the computer level, or at the router level, for example), and for the majority of people the modem and/or router will obtain the default DNS servers automatically from the ISP along with an IP address.
The default DNS is not your only option, however. Further, many routers can support up to three DNS IP addresses, and by connecting to multiple (separate) services you can achieve a bit of redundancy and maybe even a bit of speed. A fast DNS server can result in much faster web page load times, especially for sites that you don't normally go to (and thus are not cached).
In the case of Google Public DNS, the service operates on the following IP addresses: 8.8.8.8, 8.8.4.4, 2001:4860:4860::8888, and 2001:4860:4860::8844.
(The latter two are IPv6 addresses, and were announced on World IPv6 Day.)
If you have not looked into alternative DNS services, I encourage you to do so as they can often be faster and more reliable than the default ISP provided servers (though that is not always the case). It does not take much time to test and is an easy configuration tweak that can save you a bit of time in getting to each web page (like PC Perspective!). Have you tried out Google or other alternative DNS services, and did you see any improvements?
Subject: General Tech | January 20, 2012 - 12:49 PM | Tim Verry
Tagged: SOPA, pipa, Internet, Copyright
After the numerous website protests of SOPA and PIPA on Wednesday, quite a few representatives and senators started to backpedal on their support for their respective bills. Politicians who have retracted their full support include:
- Kelly Ayotte (New Hampshire)
- Roy Blunt (Missouri)
- John Boozman (Arkansas)
- Scott Brown (Massachusetts)
- Orrin Hatch (Utah)
- Tim Holden (Pennsylvania)
- Lee Terry (Nebraska)
- Jeff Merkley (Oregon)
- Ben Quayle (Arizona)
- Marco Rubio (Florida)
This is a fairly nice boost for the SOPA/PIPA opposition. While both SOPA and PIPA are far from dead, votes on both bills have now been delayed in the House of Representatives and the Senate. We reported on the SOPA delay here, and Lamar Smith has since stated that he will be pushing for a SOPA vote around February 24th. Now, Senate Majority Leader Harry Reid, a Democrat from Nevada, has announced that a PIPA vote will be delayed until a compromise can be reached. On one hand, this means that PIPA is far from dead, and the word "compromise" implies a slight wording change here or there so the bill can be passed with less opposition. If, however, I'm being optimistic, the delay gives Americans more time to talk with their representatives about the bill and their concerns. Reid further stated that piracy (allegedly) costs the economy "billions of dollars and thousands of jobs each year," and that:
"We made good progress through the discussions we’ve held in recent days, and I am optimistic that we can reach a compromise in the coming weeks.”
Make of that what you will, but personally I'm of the opinion that it's not time to get comfortable. Keep the pressure on our Representatives and Senators by calling and sending letters. For example, one question that I would like answered is this: using the MegaUpload takedown as an example, why exactly do we need a law that takes away due process when existing law was just used to take down that very website? Obviously we already have methods in place to combat alleged piracy, and a court system to charge and punish, as fairly as possible, those found guilty; so why do we legitimately need SOPA and/or PIPA?
Subject: General Tech | January 15, 2012 - 06:21 AM | Tim Verry
Tagged: SOPA, senate, security, pipa, Internet, house, freedom, dnssec, dns, Copyright, congress, bill
SOPA, the ever controversial bill making its way through the House of Representatives, contained a provision that would force ISPs to block any website accused of copyright infringement from their customers. This technical provision was highly contested by Internet security experts and the standards body behind DNSSEC. The experts have been imploring Congress to reconsider the SOPA DNS provision as they feel it poses a significant threat to the integrity and security of the Internet.
In a somewhat surprising move, on Friday, Representative Lamar Smith of Texas and Senator Patrick Leahy of Vermont both announced that the DNS provisions included in their respective bills (SOPA in the House and companion bill PIPA in the Senate) would be removed until such time that security experts could provide them with more conclusive information on the implications of such DNS interference.
Many sites are preparing protests against SOPA; many would be forced to shut down should SOPA pass.
As a quick primer, DNS (Domain Name System) is the Internet equivalent of a phone book (or a Google/Facebook contact list for the younger generation) for websites, allowing people to reach servers at difficult-to-remember IP (Internet Protocol) addresses by typing in much simpler text-based URLs. Take the PC Perspective website, pcper.com, for example: the site is hosted on a server that other computers then access using the IP address "220.127.116.11." Humans, however, cannot reasonably be expected to remember an IP address for every website they wish to visit, especially IPv6 addresses, which are even longer numerical strings. Instead, people navigate using text-based URLs. When a URL (uniform resource locator) such as "pcper.com" is typed into a browser, the software queries DNS servers on the Internet to match the URL to an IP address, which is then used to connect to the website's server. Further, DNSSEC (the Domain Name System Security Extensions) is a standard and set of protocols backed by the IETF (Internet Engineering Task Force) that seeks to make these look-ups more secure: zones cryptographically sign their DNS records so that a validating resolver can verify that the answer it receives is authentic and has not been tampered with. By validating DNS responses, users are protected from malicious redirects served up by compromised or spoofed servers.
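The verify-before-trust idea behind DNSSEC can be illustrated with a toy sketch. To be clear about the assumptions: real DNSSEC uses public-key signatures (RRSIG and DNSKEY records) with a chain of trust, not a shared secret; the HMAC below, the key, and the documentation-range IP addresses are all stand-ins to show the concept only.

```python
import hashlib
import hmac

ZONE_KEY = b"example-zone-key"  # hypothetical key; real DNSSEC uses key pairs

def sign_record(name: str, ip: str) -> str:
    """Produce a signature over a name->IP record."""
    return hmac.new(ZONE_KEY, f"{name}={ip}".encode(), hashlib.sha256).hexdigest()

def verify_record(name: str, ip: str, sig: str) -> bool:
    """Accept an answer only if its signature checks out."""
    return hmac.compare_digest(sign_record(name, ip), sig)

sig = sign_record("pcper.com", "198.51.100.7")        # legitimate signed answer
assert verify_record("pcper.com", "198.51.100.7", sig)
assert not verify_record("pcper.com", "203.0.113.9", sig)  # tampered redirect fails
```

A SOPA-style redirect is exactly the second case: the ISP hands back a different IP for a signed name, validation fails, and a DNSSEC-aware client must either reject the answer or be configured to ignore failures, which is the security hole the experts are warning about.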
Security experts are opposed to the DNS blocking provisions in SOPA because the required methods contradict the very security measures that standards bodies have spent years implementing. SOPA would require ISPs to filter every person's DNS requests (the URL typed into the browser) and to block and/or redirect any requests for websites accused of infringing the copyrights of US rights holders. This directly undermines DNSSEC and opens the door to a less secure Internet. If ISPs are forced to serve answers that invalidate DNSSEC, browsers will be forced to accept responses from otherwise untrusted servers, and what is to stop hacking groups and others with malicious intent from compromising DNS servers overseas and redirecting legal, valid URLs to compromised websites serving drive-by downloads of malware and trojans? DNSSEC is not perfect; however, it was a big step in the right direction in keeping DNS look-ups reasonably secure. SOPA tears down that wall with reckless abandon for the well-being of citizens. Stewart Baker, former first Assistant Secretary for Policy at DHS and former General Counsel of the NSA, has stated that SOPA would result in "great damage to Internet security" by undermining the DNSSEC standard, and that SOPA was "badly in need of a knockout punch." Various other Internet experts have expressed further concerns that the DNS provisions in SOPA would greatly reduce the effectiveness of the DNS system and affect the integrity of the Internet, including the CEO of antivirus company ESET, the head of OpenDNS, and security experts Steve Crocker and Dan Kaminsky.
While the suspension of the DNS redirecting provisions is a good thing, such actions are too little, too late. And in one respect, by removing the DNS provisions (for now), Congress may have made it that much easier to pass the bill into law. After all, it would be much easier to amend DNS blocking onto SOPA once it's law than to fight to get the foothold passed at all. From the perspective of an Internet user and content creator, I really do not want to see SOPA or PIPA pass (I've already ranted about the additional reasons why, so I'll save you from having to read it again). While I really want to be excited about this DNS provision removal, it's just not anywhere near the same thing as stopping the entire bill. I can't shake the feeling that removing DNS blocking is only going to make it that much easier for Congress to pass SOPA, and for the Internet to become much less free. We hear about the death of PC gaming or any number of other proclamations from content creators exercising their right to free speech every year, yet PC gaming and most other things are still around. Please, call and write your congressmen and implore them to vote against SOPA and PIPA so that the last proclamation I read is not about the death of the Internet!
Subject: General Tech | January 12, 2012 - 12:13 PM | Tim Verry
Tagged: yahoo, search, microsoft, Internet
The Internet search market is a competitive space: the more control over search a company has, the more money it can make from ad networks, analytics and tracking data, and influence over the development of the Internet. Google remains handily in first place with a majority share of the search market, which is not surprising. Where competition heats up, however, is beneath Google, where companies fight over the remaining 30% or so of the market. Microsoft's Bing search engine is the latest major entrant, and for the first time since its launch it has surpassed Yahoo for the number 2 spot in the search market.
Microsoft Bings Yahoo. Also, transparency fail.
According to comScore, Bing and Microsoft's other websites fielded 2.75 billion search requests in the United States during the month of December. This allowed Microsoft to slip past Yahoo, whose search engine fielded 2.65 billion requests. Microsoft now holds a 15.1% share of US search traffic, while Yahoo holds 14.5%. To put those numbers in perspective, Google holds a 65.9% market share. This fight for a slice of the search market has come at a huge cost to Microsoft, whose online division has racked up $7 billion USD in operating losses since June 2008, according to CBS News.
Further, the article suggests that Microsoft and Yahoo will continue to draft and pass each other for the next few years. More information can be found at the article linked above. Have you used Bing, and will it ever have the oomph to take on Google? I personally use Google the majority of the time, but Bing is an okay backup, and its image search is fairly good. I predict that a Bing-powered Windows search box (offering Internet results from Bing in Explorer if no files match your keywords, for example) would be interesting and would help Microsoft maintain a search market presence, but don't let the EU find out. What are your thoughts on Microsoft taking second place? Will they be able to maintain their position?
Subject: General Tech | January 2, 2012 - 06:24 PM | Tim Verry
Tagged: terrible idea, tech, SOPA, Internet, bill
Let me say right off the bat that personally I'm very much against the idea of SOPA due to how easily the system could be abused and the degree to which innovation would be stifled, all in the name of "stopping piracy." Fortunately, I'm not the only one against the Stop Online Piracy Act; its many opponents include Internet giants Google, Facebook, eBay, and Twitter.
While money paid to congressmen may speak louder than a few tech enthusiasts writing to voice their opposition, when no one is able to perform Google searches, update their Twitter, or check their Facebook, you can bet that thousands of Americans are going to go nuts, and that is sure to get the attention of the everyday person. And when those same sites show their users who to blame, people are going to react. (Seriously, have you been around someone when their Internet has gone out for a day and they haven't been able to get on Facebook!?) According to CNET, various top Internet sites have an ace up their sleeve: they are prepared to black out their sites so that visitors will be greeted with censorship logos naming SOPA and the government as the reason for the missing user content and their lost social networking fix.
"When the home pages of Google.com, Amazon.com, Facebook.com, and their Internet allies simultaneously turn black with anti-censorship warnings that ask users to contact politicians about a vote in the U.S. Congress the next day on SOPA, you'll know they're finally serious" says Declan McCullagh.
If SOPA passes, there will effectively be no internet, so maybe it is time to institute some MAD (mutually assured destruction) by encouraging sites to go with, as Mr. McCullagh puts it, their nuclear option and motivate people to let Congress know just how bad of an idea SOPA is. After all, if SOPA passes how would you get your YouTube laughs, or even more importantly your PCPer fix!? Have you called your Congressmen yet (nudge, nudge)?