Subject: Editorial, General Tech | June 20, 2011 - 03:24 AM | Tim Verry
Tagged: simulator, networking, Internet, cyber warfare
Our world hosts numerous physical acts of aggression every day, and until a few years ago those acts remained confined to the (relatively) comprehensible physical world. However, the millions of connected servers and clients spanning nations around the world have rapidly become host to what is known as “cyber warfare”: subversion and attacks against another people or nation through electronic means, by targeting its people or its electronic and Internet-based infrastructure.
While physical acts of aggression are easier to examine, gather evidence on, and attribute to the responsible parties, attacks on the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an ethical debate over whether a physical response, in the form of military action, is appropriate retaliation for an online attack.
It seems the Pentagon is seeking answers to the problems of attack attribution and appropriate retaliation through an Internet simulator dubbed the National Cyber Range. According to Computerworld, two competing designs for the simulator are being constructed: one by Lockheed Martin under a $30.8 million USD grant and one by the Johns Hopkins University Applied Physics Laboratory under a $24.7 million USD grant, both provided by DARPA.
The National Cyber Range is designed to mimic human behavior in response to various DEFCON and INFOCON (Information Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution by simulating offensive and defensive actions at the scale of nation-backed cyber warfare. Once DARPA has chosen the final National Cyber Range design from the two competing projects (by Johns Hopkins and Lockheed Martin), the government will be able to construct a toolkit that allows it to easily transfer and conduct cyber warfare testing from any facility.
Image courtesy Kurtis Scaletta via Flickr Creative Commons.
Subject: General Tech | June 14, 2011 - 12:02 PM | Jeremy Hellstrom
Tagged: http, tcp, spdy, Internet
Google has been working on SPDY, a new protocol intended to speed up HTTP without forcing changes to existing websites or protocols. This application-layer protocol sits between HTTP and TCP, replacing neither; instead it translates between the application layer and the transport layer to optimize certain parts of the transaction. Specifically, Google hopes to multiplex multiple request streams over a single TCP connection (something browsers have until now approximated with a workaround that opens parallel connections) and to let servers push data to clients more effectively. Google is also working to reduce latency by shrinking the headers that are transported, which will be very important in the near future, not only as a way to speed up SSL connections but also to cope with the larger headers of IPv6.
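The header-shrinking idea is easy to demonstrate: HTTP headers are highly repetitive text, so a general-purpose compressor cuts them down dramatically. A minimal Python sketch using zlib (SPDY's draft also uses zlib-based compression for header blocks; the header values below are made up for illustration):

```python
import zlib

# Typical repetitive HTTP request headers (illustrative values only).
headers = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1\r\n"
    "Accept: text/html,application/xhtml+xml,application/xml;q=0.9\r\n"
    "Accept-Encoding: gzip,deflate\r\n"
    "Accept-Language: en-US,en;q=0.8\r\n"
).encode()

compressed = zlib.compress(headers, 9)
print(len(headers), "bytes ->", len(compressed), "bytes")
```

Since headers like these ride along on every single request, shaving bytes here pays off on each round trip, particularly over high-latency links.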
Until now, SPDY has only been available in Chrome, and even then only for certain Google sites that use the new translation protocol. Now Strangeloop is offering an online service, as well as hardware, that will let you implement SPDY without changing your website or host. The Register covers the long overdue change here.
"Strangeloop – a Vancouver-based outfit offering an online service for accelerating website load times – has embraced Google's SPDY project, a new application-layer protocol designed to significantly improve the speed of good ol' HTTP."
Here is some more Tech News from around the web:
- Intel: Chinese microprocessor development inefficient @ SemiAccurate
- Intel new server platform expected to start large replacement trend @ DigiTimes
- Games co Epic resets passwords after hack attack @ The Register
- Research @ Intel: The cloud's future is many-core and GPU accelerated @ Ars Technica
- Planar structure extends lifetime of memristor @ Nanotechweb
- Nikon COOLPIX S570 12MP Digital Camera Review @ ThinkComputers
Subject: Networking | June 12, 2011 - 04:24 AM | Tim Verry
Tagged: networking, Lawsuit, Internet, Cisco
Cisco, the worldwide networking heavyweight, is now facing a lawsuit brought by three Chinese website authors and Harry Wu. Mr. Wu is a Chinese political activist who spent 19 years in Chinese forced-labor prison camps, according to Network World. The charges allege that Cisco optimized its networking equipment for the Chinese government and trained its officials to identify and track individuals on the Internet who speak out against the Chinese government with pro-democratic speech.
In a similar vein, the networking company was presented with an additional lawsuit last month by members of the Falun Gong religious group. That lawsuit claims that Cisco supplied networking technology to the Chinese government with the knowledge that the technology would be used to oppress the religious movement. Falun Gong is a religious and spiritual movement that emphasizes morality and the theoretical nature of life. It was banned in July 1999 by the Communist Party of China as a “heretical organization”. Practitioners have been the victims of numerous human rights violations over the years.
Cisco has stated on the company’s blog that they strongly support free expression on the Internet. Further, they have responded to the allegations by stating that “Our company has been accused in a pair of lawsuits of contributing to the mistreatment of dissidents in China, based on the assertion that we customize our equipment to participate in tracking of dissidents. The lawsuits are inaccurate and entirely without foundation,” as well as “We have never customized our equipment to help the Chinese government—or any government—censor content, track Internet use by individuals or intercept Internet communications.”
It remains to be seen whether the allegations hold any truth; however, Cisco has been here before and is likely to see further lawsuits in the future. How do you feel about the allegations against Cisco?
Subject: Networking | June 8, 2011 - 10:19 PM | Tim Verry
Tagged: networking, ipv6, Internet
As has been said numerous times across the Internet, the pool of available IPv4 (32-bit) addresses is running out, thanks to an ever-increasing number of Internet users all over the world. In response, the Internet Engineering Task Force developed the standards for IPv4’s successor. The new IPv6 standard uses 128-bit addresses, which support 2^128 (approximately 340 undecillion) individual addresses. Compared to IPv4’s 32-bit addresses, which can support just under 4.3 billion addresses (4,294,967,296 to be exact), the new Internet protocol will easily be able to assign everyone a unique IP address, and will support multiple devices per user without requiring network address translation (NAT).
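The address-space arithmetic is simple enough to check directly:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(ipv4_addresses)  # 4294967296 (just under 4.3 billion)
print(ipv6_addresses)  # 340282366920938463463374607431768211456 (~3.4 x 10^38)
```

Even if every person on Earth claimed billions of addresses for their devices, the IPv6 pool would barely be dented.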
Today is World IPv6 Day, the first widespread (live) trial of IPv6. Over a 24-hour period, numerous large and small websites plan to switch on IPv6 to test consumer readiness for the protocol. Big companies such as Google, Facebook, Yahoo, YouTube, and Bing are participating by enabling IPv6 alongside IPv4 to measure how many users are able to connect over IPv6 versus IPv4. Led by the Internet Society, consumers and businesses are encouraged to help test the new protocol by visiting the participants' sites and running readiness tests.
According to Network World, the trial has been going very well. They state that, according to Arbor Networks, overall IPv6 traffic volume “doubled during the first 12 hours.” Further, Akamai experienced a tenfold increase in IPv6 traffic just before the event. Fortunately, this increase did not bring a rise in DDoS attacks, which Akamai states were minimal. Many businesses used the June 8, 2011 event as a deadline of sorts, and as a result many large corporations got their IPv6 support up and running.
While the event only lasts 24 hours, some large websites will likely continue to offer IPv6 alongside IPv4 addressing. Network World quotes Champagne, who hopes more businesses will move to IPv6 after seeing the successes of the World IPv6 Day participants, now that “everybody went into the water today and found out that the water is fine.”
It will certainly be interesting to see whether the success continues and whether consumers still on IPv4 can be made ready before the 32-bit address well runs dry; the move to digital TV broadcasting in the US, after all, saw many deadline push-backs. Are you ready for IPv6?
Subject: Networking | May 24, 2011 - 06:52 PM | Tim Verry
Tagged: networking, Internet, fiber
Using a single laser, scientists were able to encode data and transmit it over 50 km of single-mode fiber using “325 optical frequencies within a narrow spectral band of laser wavelengths.” The single laser was capable of carrying 26 terabits of information per second in an energy-efficient manner, equivalent to the data of 400 million phone calls.
The technique used to encode and decode the optical data is called orthogonal frequency-division multiplexing (OFDM), a modulation technology that can be applied to both optical and electrical transmission. The data is mathematically broken down into numerous parallel streams, which greatly increases the transmission speed and available bandwidth. While electrical/copper-based systems are not able to transmit 26 terabits of information using OFDM, the optical system in these experiments was able to encode that amount of data without speed restrictions and while using “negligible energy.” Dr. Leuthold stated, “we had to come up with a technique that can process data about one million times faster than what is common in the mobile communications world.” Further, he stated that the experiment shows optical technology still has room for transmission speed improvement, and that increases in bit rate do not necessarily mean higher energy usage.
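The core of OFDM, splitting one data stream across many parallel subcarriers and combining them with a Fourier transform, can be sketched in a few lines of Python with NumPy. This is a toy 64-subcarrier QPSK example, not the researchers' actual optical implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers = 64

# Map random bits onto QPSK symbols, one symbol per subcarrier:
# each subcarrier is an independent parallel data stream.
bits = rng.integers(0, 2, size=(2, n_subcarriers))
symbols = (2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)

# OFDM modulation: an inverse FFT merges all subcarriers into a single
# time-domain waveform; demodulation is simply the forward FFT.
waveform = np.fft.ifft(symbols)
recovered = np.fft.fft(waveform)
```

Because the subcarriers are mathematically orthogonal, they can be packed tightly in frequency yet still be separated perfectly at the receiver, which is what makes the technique so bandwidth-efficient.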
The importance of Dr. Leuthold’s research lies in the energy efficiency inherent in reducing the number of lasers and fiber nodes required to transmit 26 terabits of data per second. Using simple optical technologies, the researchers greatly increased the bandwidth of a single fiber line. Japanese researchers have achieved 109 terabits per second; however, they needed multiple lasers to do so. As Dr. Leuthold put it, “it’s the fact that it’s one laser” that makes his result important.
Image courtesy Kainet via Flickr Creative Commons
Subject: General Tech, Mobile | May 13, 2011 - 12:05 PM | Tim Verry
Tagged: Netflix, Internet, Android
It has been a long time coming, but Netflix Instant Streaming is finally arriving on a select number of Android-powered smartphones. Engadget has the scoop, stating that
“Netflix explains that while the app is currently limited to phones with ‘requisite playback support,’ it anticipates that many of the ‘technical challenges will be resolved in the coming months,’ and that it will be able to ‘provide a Netflix application that will work on a large majority of Android phones.’”
The following phones will be able to use the streaming feature of the Netflix application: HTC Incredible, Nexus One, Evo 4G, G2, and Samsung Nexus S.
While Nitdroid users and owners of older Android phones are currently out of luck, this move by Netflix is a good sign that Netflix on the open source operating system is possible, and can work well.
If you own one of the supported Android phones, you can download the application from the Android Market today!
Subject: General Tech | May 6, 2011 - 09:20 AM | Tim Verry
Tagged: sony, Internet, Data Breach, Anonymous
As Sony analyzed the forensic data from the recent PSN/SOE attack, it discovered a text file named "Anonymous" containing the phrase "We are legion," according to Network World. As a result, Sony went so far as to accuse the hacker group of being the responsible party in hacking the PlayStation Network (and stealing customers' information) in a letter to the U.S. Congress.
Anonymous responded to Sony's implication today. Network World reports that Anonymous stated it was not involved in the attack and that "others performed the attack with the intent of making Anonymous look bad." Based on a press release by the hacker group, its prior victims had motive to irreparably defame the group in the public eye. Anonymous stated that it has never been involved in credit card theft. Further, the group claims to be an "ironically transparent movement," and says that had it truly been behind the attack, it would have claimed responsibility for its actions.
The press release goes on to state that "no one who is actually associated with our movement would do something that would prompt a massive law enforcement response." The group further claims that the world's standard fare of Internet thieves would have a vested interest in making Sony and law enforcement agencies believe it was Anonymous, to throw police off their trail.
The hacker group names former victims such as Palantir, HBGary, and the U.S. Chamber of Commerce as organizations that would like to discredit it. "Anonymous will continue its work in support of transparency and individual liberty; our adversaries will continue their work in support of secrecy and control," the group states in its press release, "we are anonymous."
As Anonymous, Sony, and spectators the world over debate, the affected public continues to wait for the true identities of the hackers who stole 77 million Sony customers' private information to come to light.
Subject: Editorial, General Tech | May 5, 2011 - 08:35 AM | Tim Verry
Tagged: Internet, Education, Cyber Security
Microsoft recently posted a press release detailing the results of its sponsored study by the NCSA (National Cyber Security Alliance). The study sought to determine whom people believe bears the responsibility for teaching children how to protect themselves on the Internet, as well as what the current situation is as far as K-12 students’ level of preparedness and education. The executive director of the NCSA, Michael Kaiser, had this to say:
“Just as we would not hand a child a set of car keys with no instruction about how to drive, we should not be sending students out into the world without a solid understanding of how to be safe and secure online."
According to Microsoft, the NCSA advocates a “comprehensive approach” to teaching children from K-12 how to stay safe and secure online. While the consensus seems to be that students do need to be educated in Internet security, people are divided on exactly who bears the primary responsibility for teaching children. Teachers, parents, and even government leaders and law enforcement have all been suggested as possible responsible parties. The majority of teachers (80 percent) and school administrators (60 percent) surveyed believe parents are responsible for teaching their kids about “digital safety, security, and ethics.” On the other hand, more than 50 percent of the IT coordinators surveyed believe that teachers bear the most responsibility for educating kids. One area where all groups do seem to agree is on the question of government responsibility: Microsoft states that less than one percent believe law enforcement and government officials should bear the responsibility.
Cyber security is clearly seen as important for students to learn: 97 percent of school administrators believe schools should have courses and an educational plan for students throughout their K-12 years. Yet only 68 percent of administrators “believe their schools or school districts are doing an adequate job of preparing students...”
The picture looks even bleaker when teachers were surveyed. Asked whether they feel prepared to teach students adequately, only 24 percent believed they were adequately prepared to educate kids on protecting personal information on the Internet, and only 23 percent were comfortable teaching the risks of cyberbullying. Further, only one-third of teachers surveyed believe they are prepared to educate students on basic Internet security skills “such as password protection and backing up data.” The low numbers are attributed to the lack of professional development training teachers receive; Microsoft states that “86 percent received less than six hours of related training.” Microsoft quotes Kaiser as saying that “America’s schools have not caught up with the realities of the modern economy. Teachers are not getting adequate training in online safety topics, and schools have yet to adopt a comprehensive approach to online safety, security and ethics as part of a primary education. In the 21st century, these topics are as important as reading, writing and math.”
In all of this, there is a ray of hope. Comparing the 2010 study to the NCSA’s 2008 study, which you can read here, an increasing number of teachers believe cyber security and professional development training is a priority. More than 60 percent of school officials and teachers are interested in pursuing further security training; among teachers, that interest is up to 69 percent from 55 percent in 2008. IT coordinators and administrators are also becoming more interested in revamping the educational curriculum to better teach their students and workers. Further growth in educators' interest in pursuing security training can be seen between the 2010 and 2011 NCSA studies. Also, slightly higher percentages of teachers have taught aspects of security in their classrooms compared to both the 2010 and 2008 studies.
On the other hand, while teachers' interest in training is increasing, the number of security topics actually taught in classes dropped from 2010 to 2011. This comes in addition to a decrease in teachers' belief that they bear responsibility for educating kids.
A comparison paper between the 2008 and 2010 study can be downloaded here (PDF).
What are your thoughts on this issue; who bears the primary responsibility in educating children on the importance of Internet safety?
Image 1 courtesy 2011 NCSA study. Image 2 courtesy 2008 to 2010 NCSA comparison study. Material is copyright NCSA, and used according to fair usage guidelines for the purpose of commentary and reporting.
Subject: Editorial, General Tech | May 3, 2011 - 09:40 AM | Tim Verry
Tagged: Internet, Information, Filtering
TED talks are very similar to the motivational speeches that kids everywhere have had to endure throughout their junior high and high school years. The only real difference is that the talks are made available online to millions of people instead of a few thousand at a time. That said, if you are at all interested in the technology world, TED talks are usually both enlightening and relevant to present issues in the industry.
If that preface has not already scared you off, I encourage you to watch this particular TED talk (embedded below), in which Eli Pariser demonstrates just what a "filter bubble" is, and what repercussions the once freely interconnected Internet faces as more and more websites put personalization ahead of discovery.
Eli uses a Google search for "Egypt" to show that the results two people get can be drastically different. In an even more "close to home" example, simply by being part of a social network like Facebook, you may already be inside a filter bubble and not even know it: the bubble takes the form of Facebook's "news feed." If you have not talked to, say, your best friends from college or high school in a few months, the absence of their posts from your news feed may make it appear they have dropped off the face of the planet and have not updated their Facebook status since you last spoke. More likely, however, you are part of a filter bubble and simply were not aware of it.
Facebook has somewhat recently modified its news feed to show only the statuses of friends with whom you have a certain number of interactions. This may seem like a good thing at first, as it leaves more room for the people you talk with most often. Consider, however, missing your little brother or only nephew's status and photos from his first winning football game because you haven't talked to him in a few weeks. While that may be something you would consider big news and would want to know about, Facebook's algorithms may decide the exact opposite for you.
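For illustration only, an interaction-threshold filter of the kind described might look something like this toy Python sketch (the names, counts, and threshold are all invented; Facebook's real ranking is far more complex and not public):

```python
# Hypothetical interaction counts between you and each friend.
interactions = {"alice": 42, "bob": 1, "nephew": 0}

posts = [
    ("alice", "lunch pic"),
    ("bob", "started a new job!"),
    ("nephew", "we won the big game!"),
]

THRESHOLD = 2  # invented cutoff: friends below it vanish from the feed

feed = [(who, text) for who, text in posts if interactions[who] >= THRESHOLD]
print(feed)  # only alice's post survives; the nephew's big news is filtered out
```

The point of the sketch is that the filtering is invisible: nothing in the resulting feed tells you that the other posts ever existed.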
In practice, filter bubbles and personalization on the web are likely to be more subtle. Still, Eli Pariser's talk raises the question of whether filter bubbles are right for the Internet and its users in any capacity. Is individual personalization worth giving up the freedom to stumble upon new information and the opportunity to get the same exposure to the world as everyone else? Do you see the personalized web as a positive or a negative for the world? What are your thoughts on users being led into a "web of one," as Eli cautions?