Subject: General Tech | July 21, 2011 - 04:27 PM | Tim Verry
Tagged: networking, voip, google
The Gmail blog recently showed off a new feature that allows you to put one call on hold while accepting another, something standard phones have had for a long time now. Inside Gmail, you can start a call to another computer or to a physical phone and then place it on hold by hitting the “Hold” button. When you wish to return to the call, you simply hit the “Resume” button, just like on a normal phone. When a second person calls you, you will be asked to accept or reject the call, and if you accept, the first call is automatically placed on hold.
According to Google, the call hold feature “works across all call types (voice, video, and phone)”; the only caveat is that no more than two outgoing calls to physical phones can be active at a time. The only thing I see missing from this function is integration with Google Music, which would let me set up custom hold music to the chagrin of telemarketers and customer support reps everywhere. After all, it is almost Friday, and everyone would just love to hear some Rebecca Black, right!?
Subject: Editorial, General Tech | June 20, 2011 - 03:24 AM | Tim Verry
Tagged: simulator, networking, Internet, cyber warfare
Our world hosts numerous physical acts of aggression every day, and until a few years ago those acts remained in the (relatively) easily comprehensible physical world. However, the millions of connected servers and clients spanning nations around the world have rapidly become host to what is known as “cyber warfare”: subversion and attacks against another people or nation through electronic means, targeting its people or its electronic and Internet-based infrastructure.
While physical acts of aggression are comparatively easy to examine, gather evidence on, and attribute to the responsible parties, attacks over the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an ongoing ethical debate over whether physical military action is an appropriate response to online attacks.
It seems the Pentagon is seeking answers to the problems of attack attribution and appropriate retaliation through an Internet simulator dubbed the National Cyber Range. According to Computerworld, two competing designs for the simulator are being constructed: one by Lockheed Martin under a $30.8 million grant and one by the Johns Hopkins University Applied Physics Laboratory under a $24.7 million grant, both provided by DARPA.
The National Cyber Range is designed to mimic human behavior in response to various DEFCON and INFOCON (Information Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution by simulating offensive and defensive actions at the scale of nation-backed cyber warfare. Once DARPA has chosen the final National Cyber Range design from the two competing projects (by Johns Hopkins and Lockheed Martin), the government would be able to construct a toolkit that would let it easily transfer and conduct cyber warfare testing from any facility.
Image courtesy Kurtis Scaletta via Flickr Creative Commons.
Subject: General Tech | June 12, 2011 - 01:12 PM | Tim Verry
Tagged: US, technology, networking, IT
The US has seen a rather rapid rise in unemployment in the last few years as companies cut back on staff and computing costs. According to Computerworld, Tom Silver has been quoted as saying that “several years ago companies cut back pretty far, particularly in infrastructure and technology development.” Silver believes the tech unemployment rate is now half the national unemployment rate, driven by companies needing to replace aging hardware and software and to deal with increased security threats. In a recent biannual hiring survey of 900 hiring managers and headhunters conducted by Dice.com, 65% of respondents said they plan to bring even more new workers into their businesses in the second half of 2011 than in the first half.
Workers with mobile operating system, hardware, and ecosystem expertise, along with Java development skills, are the most desirable technology workers, according to Computerworld, although anyone with an IT background and recent programming skills has a fairly good chance of landing a job in a market hungry for now-rare talent. Employers are growing more confident in the economy and are thus more willing to invest in new workers. In an era where Internet security is more important than ever, skilled enterprise IT workers are becoming a valuable asset to employers, who are increasingly fighting over rare talent and incentivizing new hires with higher salaries.
Even though businesses remain cautious in their new hiring endeavors, it is definitely a good sign for people with tech backgrounds who are looking for work as the market ever so slowly starts to bounce back. For further information on the study, Computerworld has the full scoop here.
Are you in, or studying to enter, the IT profession? Do you feel confident in US employers' valuation of their IT workers?
Subject: General Tech | June 12, 2011 - 10:36 AM | Tim Verry
Tagged: networking, dell, cloud computing
A recent survey conducted during the first two days of the Cloud Expo by Marketing Solutions and sponsored by Dell suggests that IT professionals believe their less technical CEOs see cloud computing as a "fad" that will soon pass, while IT departments see the opportunities and potential of the technology. This gap between the two groups, according to Dell, lies in "the tendency of some enthusiasts to overhype the cloud and its capacity for radical change." Especially with a complex and still-evolving technology like cloud computing, CEOs are less likely to see the potential benefits and more likely to see the obstacles and cost of adoption.
The study surveyed 223 respondents from various industries (excluding technology providers) and found that IT professionals' attitudes toward "the cloud" differed markedly from what they believed their CEOs' attitudes to be. The pie graphs in figure 1 below illustrate the gap between the two groups. While 47% of those in IT see cloud computing as a natural evolution of the trend toward remote networks and virtualization, only 26% believed their CEOs agreed. Also, while 37% of IT professionals said cloud computing is a new way to think about their function in IT, "37 percent deemed their business leaders mostly likely to describe the cloud as having “immense potential,” contrasted with only 22 percent of the IT pros who said that was their own top descriptor."
Further, the survey examined what both IT professionals and CEOs believed to be obstacles to adopting cloud computing. On the IT professionals' side, 57% named data security as the biggest issue, 32% cited industry compliance and governance, and 27% pointed to disaster recovery options, compared with 51%, 30%, and 22% respectively for CEOs. This comparison can be seen in figure 2 below.
While the survey handily indicates that enterprises' IT departments are the most comfortable with the idea of adopting cloud computing, other areas of the business that could greatly benefit from it are far more reluctant. As seen in figure 3, while 66% of IT departments are willing to advocate for cloud computing, only 13% of Research and Development, 13% of Strategy and Business Development, and a mere 5% of Supply Chain Management departments feel they would move to the cloud and benefit from the technology.
Dell stated that IT may be able to help many more functions and departments by advocating for and implementing cloud computing strategies in information-gathering and data-analysis roles. In doing so, IT could likely benefit the entire company and further educate CEOs on cloud computing's usefulness, closing the gap between IT professionals' and CEOs' beliefs.
You can read more about the Dell study here. How do you feel about cloud computing?
Subject: Networking | June 12, 2011 - 04:24 AM | Tim Verry
Tagged: networking, Lawsuit, Internet, Cisco
Cisco, the worldwide networking heavyweight, is now facing a lawsuit from three Chinese website authors, backed by Harry Wu. Mr. Wu is a Chinese political activist who spent 19 years in Chinese forced-labor prison camps, according to Network World. The charges allege that Cisco optimized its networking equipment for the Chinese government and trained officials to identify and track individuals on the Internet who speak out against the Chinese government with pro-democratic speech.
In a similar vein, the networking company was presented with an additional lawsuit last month by members of the Falun Gong religious group. That suit claims Cisco supplied networking technology to the Chinese government with the knowledge that the technology would be used to oppress the religious movement. Falun Gong is a religious and spiritual movement that emphasizes morality and the spiritual nature of life. It was banned in July 1999 by the Communist Party of China as a “heretical organization,” and practitioners have been the victims of numerous human rights violations throughout the years.
Cisco has stated on the company’s blog that they strongly support free expression on the Internet. Further, they have responded to the allegations by stating that “Our company has been accused in a pair of lawsuits of contributing to the mistreatment of dissidents in China, based on the assertion that we customize our equipment to participate in tracking of dissidents. The lawsuits are inaccurate and entirely without foundation,” as well as “We have never customized our equipment to help the Chinese government—or any government—censor content, track Internet use by individuals or intercept Internet communications.”
It remains to be seen whether the allegations hold any truth; however, Cisco has been here before and is likely to see further lawsuits in the future. How do you feel about the Cisco allegations?
Subject: Networking | June 8, 2011 - 10:19 PM | Tim Verry
Tagged: networking, ipv6, Internet
As has been said numerous times around the Internet, the pool of available IPv4 (32-bit) addresses is running out, thanks to an ever increasing number of Internet users all over the world. In response, the Internet Engineering Task Force developed the standards for IPv4’s successor. The new IPv6 standard uses 128-bit addresses, which support 2^128 (approximately 340 undecillion) individual addresses. Compared to IPv4’s 32-bit addresses, which support a little under 4.3 billion addresses (4,294,967,296 to be exact), the new Internet protocol will easily be able to assign everyone a unique IP address and will support multiple devices per user without the need for network address translation (NAT).
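The address-space arithmetic is easy to verify with Python's standard `ipaddress` module; a quick illustration (the example addresses are documentation prefixes, not real hosts):

```python
import ipaddress

# IPv4 uses 32-bit addresses: 2**32 possible values.
ipv4_total = 2 ** 32
print(ipv4_total)  # 4294967296, a little under 4.3 billion

# IPv6 uses 128-bit addresses: 2**128 possible values.
ipv6_total = 2 ** 128
print(ipv6_total)  # 340282366920938463463374607431768211456 (~340 undecillion)

# The ipaddress module handles both families transparently.
print(ipaddress.ip_address("192.0.2.1").version)    # 4
print(ipaddress.ip_address("2001:db8::1").version)  # 6

# The "::" shorthand compresses runs of zero groups in IPv6 notation.
print(ipaddress.ip_address("2001:db8::1").exploded)
# 2001:0db8:0000:0000:0000:0000:0000:0001
```

At roughly 340 undecillion addresses, IPv6 could hand out billions of addresses per square meter of the Earth's surface, which is why per-device global addressing becomes practical again.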
Today is World IPv6 Day, the first widespread live trial of IPv6. Over a 24-hour period, numerous large and small websites plan to switch on IPv6 to test consumer readiness for the protocol. Big companies such as Google, Facebook, Yahoo, YouTube, and Bing are participating by enabling IPv6 alongside IPv4 to see how many users are able to connect over IPv6 versus IPv4. Led by the Internet Society, consumers and businesses are encouraged to help test the new protocol by visiting the participants’ sites and running readiness tests.
According to Network World, the trial has been going very well. They report that, according to Arbor Networks, overall IPv6 traffic volume “doubled during the first 12 hours.” Further, Akamai experienced a tenfold increase in IPv6 traffic just before the event. Fortunately, this increase did not bring a rise in DDoS attacks, which Akamai states were minimal. The June 8, 2011 event served as a deadline of sorts for many businesses and resulted in many large corporations getting their IPv6 support up and running.
While the event only lasts 24 hours, some large websites will likely continue to run IPv6 alongside IPv4 addressing. Network World quotes Champagne as hoping more businesses will move to IPv6 after seeing the successes of the World IPv6 Day participants, now that “everybody went into the water today and found out that the water is fine.”
It will certainly be interesting to see whether the success continues and whether consumers still on IPv4 can be made ready before the 32-bit address well runs dry; the move to digital TV broadcasting in the US, after all, saw many deadline push-backs. Are you ready for IPv6?
Subject: General Tech, Networking | June 8, 2011 - 11:38 AM | Jeremy Hellstrom
Tagged: what could go wrong, networking, ipv6, 404
On February 3 of this year, the last block of IPv4 addresses was allocated, which brought IPv6 to the forefront of the minds of many network heads. NAT and internal LANs can extend the usable life of IPv4 for quite a while, and many of the allocated addresses are not actually in use, which is a good thing, as not many OSes support IPv6 natively, nor do many network appliances.
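The NAT trick works because the RFC 1918 private ranges can be reused behind every gateway, so thousands of machines share one public address. Python's standard `ipaddress` module can show which addresses fall in those ranges (a quick stdlib illustration; the addresses are just common examples):

```python
import ipaddress

# RFC 1918 reserves three blocks for private LANs: 10.0.0.0/8,
# 172.16.0.0/12, and 192.168.0.0/16. Hosts in these ranges never
# consume public IPv4 space; a NAT gateway maps them all onto one
# public address, stretching the remaining pool.
for addr in ["10.0.0.1", "172.16.0.1", "192.168.1.1", "8.8.8.8"]:
    ip = ipaddress.ip_address(addr)
    print(f"{addr:>12}  private={ip.is_private}")
```

The first three print `private=True`; only the public `8.8.8.8` prints `private=False`. Every home router shipped today leans on exactly this reuse.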
That brings us to today, when many major websites are culminating all of the internal testing they have been performing with a 24-hour dry run of IPv6. Companies like Juniper and Cisco have been working to ensure their portion of the Internet's backbone can handle the new addressing scheme so that clients can connect to the sites testing IPv6. Google, Facebook, Yahoo, and Bing have all turned on IPv6, as have several ISPs including Comcast, AT&T, and Verizon; Verizon's LTE mobile network is also testing IPv6. You can see a full list of the participants here.
This will of course involve a little pain, as new technology does tend to have sharp edges. You may well see a few 404s or other problems when surfing the net today, but overall it should not be too bad; Google predicts about a 1% failure rate. The hackers will also be out to play today, likely using the larger packets for DDoS attacks. Since IPv6 addresses are four times the size of IPv4 addresses, and the fixed IPv6 header is twice the size of a minimal IPv4 header, a flood of the new protocol will be super effective at DDoS attacks. As well, most IPv6 packets will bypass companies' current deep packet inspection hardware and software: IPv6 is not backwards compatible with IPv4, so appliances configured for IPv4 scanning cannot parse IPv6 packets as-is. That is not to say these devices could never inspect IPv6 traffic, simply that for a one-day test major providers are reluctant to completely reprogram them. In the case of an attack, most of the participants have a plan in place to revert immediately to IPv4.
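To see why IPv4-tuned inspection gear trips over the new protocol, compare the two fixed headers: only the 4-bit version field sits in the same place, and everything after it has a different layout. A minimal sketch in Python, packing hypothetical (made-up) header bytes with the standard `struct` module:

```python
import struct

def ip_version(packet: bytes) -> int:
    """The top 4 bits of byte 0 hold the IP version in both protocols."""
    return packet[0] >> 4

# Minimal option-free IPv4 header: 20 bytes, 4-byte addresses.
# Fields: version/IHL, TOS, total length, ID, flags/frag, TTL,
# protocol, checksum, source, destination (addresses are made up).
ipv4_header = struct.pack(
    "!BBHHHBBH4s4s",
    (4 << 4) | 5, 0, 20, 0, 0, 64, 6, 0,
    bytes([192, 0, 2, 1]), bytes([198, 51, 100, 7]),
)

# Fixed IPv6 header: always 40 bytes, 16-byte addresses, and no
# header-length nibble at all; version lives in the top 4 bits of
# the first 32-bit word (addresses below use the 2001:db8:: prefix).
ipv6_header = struct.pack(
    "!IHBB16s16s",
    6 << 28, 0, 6, 64,
    b"\x20\x01\x0d\xb8" + b"\x00" * 12,
    b"\x20\x01\x0d\xb8" + b"\x00" * 11 + b"\x01",
)

print(len(ipv4_header), ip_version(ipv4_header))  # 20 4
print(len(ipv6_header), ip_version(ipv6_header))  # 40 6
```

A parser hard-coded to read an IPv4 header-length nibble and 4-byte addresses would misinterpret every field past byte 0 of the IPv6 header, which is roughly the position the un-reprogrammed DPI appliances are in today.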
"Sponsored by the Internet Society, World IPv6 Day runs from 8 p.m. EST Tuesday until 7:59 p.m. EST Wednesday. The IT departments in the participating organizations have spent the last five months preparing their websites for an anticipated rise in IPv6-based traffic, more tech support calls and possible hacking attacks prompted by this largest-ever trial of IPv6."
Here is some more Tech News from around the web:
- Skype hangs up on users yet again @ The Register
- Microsoft reportedly considers launching own-brand tablet @ DigiTimes
- New AMD CEO imminent @ SemiAccurate
- Chrome 12 adds a raft of new features @ The Inquirer
- TomTom GO 2535 M LIVE Review @ TechReviewSource
- The Post PC era begins @ t-break
- Win 2 Logitech diNovo Keyboards for Notebooks @ t-break
Subject: Networking | May 24, 2011 - 06:52 PM | Tim Verry
Tagged: networking, Internet, fiber
Using a single laser, scientists were able to encode data and transmit it over 50 km of single-mode fiber using “325 optical frequencies within a narrow spectral band of laser wavelengths.” The single laser was capable of carrying 26 terabits of information per second in an energy efficient manner, which is equivalent to the amount of data used by 400 million phone calls.
The technique used to encode and decode the optical data is called orthogonal frequency-division multiplexing (OFDM). It is a modulation technology that can be applied to both optical and electrical transmission methods. The data is broken down into numerous parallel streams (using Fourier transform mathematics), which greatly increases the transmission speed and the amount of bandwidth available. While electrical/copper-based systems are not able to transmit 26 terabits of information using OFDM, the optical system was able to encode that amount of data in the experiments without speed restrictions and while using “negligible energy.” Dr. Leuthold stated, “we had to come up with a technique that can process data about one million times faster than what is common in the mobile communications world.” Further, he stated that his experiment shows optical technology still has room for transmission speed improvement, and that increases in bit-rate do not necessarily result in higher energy usage.
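The core OFDM idea, splitting one fast serial stream into many slow parallel subcarrier streams with an inverse FFT and recovering them with a forward FFT, can be sketched in a few lines of NumPy. This is a toy baseband model over an ideal channel, nothing like the all-optical FFT hardware in the actual experiment; the subcarrier count and QPSK mapping are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers = 64

# 1) Map random bits to QPSK symbols, one symbol per subcarrier
#    (bit 0 -> +1, bit 1 -> -1 on each of the I and Q components).
bits = rng.integers(0, 2, size=2 * n_subcarriers)
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

# 2) Modulate: a single IFFT turns 64 parallel symbols into one
#    time-domain block, each symbol riding an orthogonal subcarrier.
tx_block = np.fft.ifft(symbols)

# 3) Prepend a cyclic prefix so dispersion/multipath cannot smear
#    one block into the next.
cp = 16
tx_signal = np.concatenate([tx_block[-cp:], tx_block])

# 4) Demodulate: strip the prefix and FFT back to subcarrier symbols.
rx_symbols = np.fft.fft(tx_signal[cp:])

# 5) Recover the bits from the signs of the I and Q components.
rx_bits = np.empty_like(bits)
rx_bits[0::2] = (rx_symbols.real < 0).astype(int)
rx_bits[1::2] = (rx_symbols.imag < 0).astype(int)
assert np.array_equal(bits, rx_bits)  # lossless over an ideal channel
```

The parallelism is the point: each subcarrier only needs to run at 1/64th of the aggregate rate, which is what lets a single source drive hundreds of slower streams, optically in the Karlsruhe experiment's case.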
The important aspect of Dr. Leuthold’s research lies in the energy efficiency gained by reducing the number of lasers and fiber nodes required to transmit 26 terabits of data per second. Using simple optical technologies, his team was able to greatly increase the bandwidth of a single fiber line. Japanese researchers have achieved 109 terabit-per-second speeds; however, they had to use multiple lasers to do so. Dr. Leuthold emphasized that “it’s the fact that it’s one laser” that is the important result of his research.
Image courtesy Kainet via Flickr Creative Commons