Corsair Acquires Simple Audio: Aggressive Expansion into New Markets

Subject: General Tech | February 11, 2013 - 03:57 PM |
Tagged: streaming, Simple Audio, Roomplayer, networking, corsair, audio

Corsair sure does like to expand upon their product base.  The company was founded in 1994 and produced only memory for quite a few years.  The past five years have seen tremendous growth from the company in terms of SSDs, cases, power supplies, and high-end cooling solutions.  Corsair also dabbled in sound with a line of successful speakers (though these have not been updated in some time).  Now Corsair is making another move, this time with an aim to deliver content around the entire house.

roomplayer2.jpg

The front of the Roomplayer II is rather bland, but it should hide itself well in nearly any decor.

Simple Audio is a Scottish-based company (if it isn't Scottish it's crap!) that designs and sells multimedia streaming solutions.  The hardware consists of the Roomplayer I and Roomplayer II units, high definition media players that are either amplified (for connecting directly to speakers) or non-amplified to connect to existing stereo and home theater systems.  Audio is broadcast to these units from iOS devices or PC and Mac computers via software provided by Simple Audio.

Corsair has acquired Simple Audio in a multi-million dollar transaction, but we do not have exact numbers due to Corsair being a privately owned company.  From my understanding these products will still carry the Simple Audio name, but Corsair will be the parent company and will distribute the products throughout Asia and North America (two areas that Simple Audio currently does not support).

roomplayer1_back.jpg

The back of the Roomplayer I is much more interesting as it has a 50 watt amplifier built-in so it can power speakers independently.

The Roomplayer solutions are apparently quite easy to hook up and their output is very clean (supporting up to 24-bit sound natively).  As the average consumer becomes more and more comfortable with setting up a home network, this is an opportunity for both Corsair and Simple Audio to market these products in new regions where overall market penetration of networked home audio is still quite low.

Corsair is a very, very aggressive company when it comes to entering new markets.  Their power supplies and cases are perfect examples of how they tend to do business.  Corsair actually produces neither of those product lines, instead relying on contract manufacturers to handle production.  What Corsair does very well is specify these components and handle end-product quality control.  There really are few overall complaints about Corsair and their products, and as a consumer I do hope that they have another good one on their hands.

The sales numbers will of course be key, and obviously Corsair feels comfortable enough with Simple Audio and their products to buy them up.  We are not certain when we expect to see the Simple Audio products on store shelves, but Corsair typically does not screw around.

Now we only have to wonder, "Who is next on Corsair's radar?"

Source: Corsair

CES 2013: The Verge Interviews Gabe Newell About the Steam Box. Valve's Director Hints Post-Kepler GPUs Can Be Virtualized!

Subject: Editorial, General Tech, Graphics Cards, Networking, Systems, Shows and Expos | January 8, 2013 - 11:11 PM |
Tagged: valve, gaben, Gabe Newell, ces 2013, CES

So the internet has been in a roar about the Steam Box, and it probably will eclipse Project Shield as the topic of CES 2013. The Verge scored an interview with Gabe Newell to converse about the hardware future of the company and got more than it asked for.

Gaben.jpg

Now if only he would have discussed potential launch titles.

Wow! That *is* a beautiful knife collection.

The point which stuck with me most throughout the entire interview was directed at Valve’s opinion of gaming on connected screens. Gabe Newell responded,

The Steam Box will also be a server. Any PC can serve multiple monitors, so over time, the next-generation (post-Kepler) you can have one GPU that’s serving up eight simulateneous [sic] game calls. So you could have one PC and eight televisions and eight controllers and everybody getting great performance out of it. We’re used to having one monitor, or two monitors -- now we’re saying lets expand that a little bit.

This is pretty much confirmation, assuming no transcription errors on the part of The Verge, that Maxwell will support the virtualization features of GK110 and bring it mainstream. This also makes NVIDIA Grid make much more sense in the long term. Perhaps NVIDIA will provide some flavor of a Grid server for households directly?

The concept gets me particularly excited. One of the biggest wastes of money in the tech industry is purchasing redundant hardware. Consoles are a perfect example: not only is the system redundant to your other computational device, which is usually at worst a $200 GPU away from a completely better experience, but you also pay for software reliant on that redundant platform, which will eventually disappear and take said software with it. In fact, many people have multiple redundant consoles because the list of software they desire is not localized to just one system, so they need redundant redundancies. Oy!

A gaming server should help make the redundancy argument more obvious. If you need extra interfaces, then you should only need to purchase the extra interfaces. Share the number crunching, and keep only that one machine up to date.

Also check out the rest of the interview over at The Verge. I decided just to cover a small point with potentially big ramifications.

Coverage of CES 2013 is brought to you by AMD!


Follow all of our coverage of the show at http://pcper.com/ces!

Source: The Verge

CES 2013: Corsair Voyager Air Offers Wireless Mobile Storage and Home NAS Support

Subject: General Tech, Networking, Storage, Mobile | January 8, 2013 - 09:00 AM |
Tagged: wi-fi, Voyager Air, NAS, mobile, corsair, CES

01-VoyagerAir_K3_ON.png

The newest member of the Corsair Voyager family of devices, the Voyager Air, drives Corsair's entry into the home networking arena with their all-in-one mobile drive and home NAS (network attached storage) solution.

VoyagerAir_R1_ON.png

Courtesy of Corsair

The Voyager Air is as versatile as it is sleek, with support for the following hiding beneath its stylish hood:

 

  • Up to 1TB of drive capacity
  • Rechargeable battery
  • Built-in Wi-Fi (802.11b/g/n), Gigabit Ethernet, and USB 3.0
  • Wireless hub functionality for shared Internet access via passthrough technology

03-va-snuggies-white.png

Courtesy of Corsair

The Voyager Air comes in a variety of colors as well, more than enough to match anyone's sense of style. According to Corsair, the Voyager Air units should be available at an electronics retailer near you in a 500GB model for $179.99 MSRP and a 1TB model for $219.99 MSRP.

Press release after the break.

 

Coverage of CES 2013 is brought to you by AMD!


Follow all of our coverage of the show at http://pcper.com/ces!

Brace Yourself: The PC Perspective CES 2013 Coverage is Coming!

Subject: Graphics Cards, Networking, Motherboards, Cases and Cooling, Processors, Systems, Storage, Mobile, Shows and Expos | January 5, 2013 - 10:47 AM |
Tagged: CES, ces 2013, pcper

It's that time of year - the staff at PC Perspective is loaded up and either already here in Las Vegas, on their way to Las Vegas or studiously sitting at their desk at home - for the 2013 Consumer Electronics Show!  I know you are on our site looking for all the latest computer hardware news from the show and we will have it.  The best place to keep checking is our CES landing page at http://pcper.com/ces.  The home page will work too. 

brace2.jpg

We'll have stories covering companies like Intel, AMD, NVIDIA, ASUS, MSI, Gigabyte, Zotac, Sapphire, Galaxy, EVGA, Lucid, OCZ, Western Digital, Corsair, and many, many more that I don't feel like listing here.  It all starts Sunday with CES Unveiled and then the NVIDIA Press Conference, where they will announce...something.

Also, don't forget to subscribe to the PC Perspective Podcast as we will be bringing you daily podcasts wrapping up each day.  We are also going to try to LIVE stream them on our PC Perspective Live! page but times and bandwidth will vary.

Coverage of CES 2013 is brought to you by AMD!


Follow all of our coverage of the show at http://pcper.com/ces!

Qualcomm announces "Streamboost" to give your Home Network some Brains

Subject: Networking | January 4, 2013 - 07:30 AM |
Tagged: streamboost, qualcomm, qos, D-Link, ces 2013, alienware

01_Qualcomm_Logo.jpg

With CES right around the corner, we’re about to be buried in a deluge of announcements from consumer electronics vendors.  Since I was not able to get out to CES in person this year, Qualcomm offered to give me a sneak peek at their new “Streamboost” technology, which they’ve just announced and will be showing at CES.  I got to spend some time on the phone with Ciera Jammal, their PR rep, and Michael Cubbage, Director of Business Development in their networking unit.  For those of you who may not recognize Michael’s name, he was one of the co-founders of Bigfoot Networks, which brought us the “Killer Gaming” line of Ethernet and wireless products.  Acquired by Qualcomm in the fall of 2011, the merged Bigfoot and Qualcomm teams have now released “Streamboost”.

02_Bigfoot_Networks.jpg

So, what is Streamboost, you ask?  Simply put, it’s an innovative Quality of Service engine that’s much, much more than what’s available for consumer QoS today.  QoS on most current consumer products only looks at what ports traffic is flowing across and prioritizes it based on a simple list of which ports should be given priority at the expense of traffic on lower priority ports.  There’s no analysis of what the actual traffic is, and in cases where different types of traffic flow over the same port (port 80, for example) it doesn’t offer any benefit at all.  The Streamboost engine, on the other hand, will actually inspect the packets in real time, determining not only what port the traffic is using, but what the traffic actually is.  So, for example, Streamboost will be able to tell that one stream is 1080p YouTube video while another is standard definition Netflix traffic, even though they are both on port 80, and give both streams the bandwidth they need.
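To make the port-versus-inspection distinction concrete, here is a minimal sketch in Python of how the two approaches differ.  Streamboost's actual engine is proprietary, so the flow fields, payload signatures, and category names below are all invented purely for illustration:

```python
# Hypothetical sketch: port-based QoS versus inspection-based classification.
# All fields, signatures, and categories here are invented for illustration.

PORT_PRIORITIES = {80: "normal", 3074: "high"}  # classic port-priority list

def classify_by_port(flow):
    # Traditional consumer QoS: the port is all we look at, so 1080p YouTube
    # and SD Netflix (both riding port 80) are indistinguishable.
    return PORT_PRIORITIES.get(flow["dst_port"], "low")

def classify_by_inspection(flow):
    # Streamboost-style: examine the packets themselves to identify the
    # actual application, regardless of which port the traffic uses.
    signature = flow["payload"][:64]
    if b"youtube" in signature:
        return "video_1080p"
    if b"netflix" in signature:
        return "video_sd"
    return "bulk"

flow = {"dst_port": 80, "payload": b"GET /watch?v=abc123 Host: youtube.com"}
print(classify_by_port(flow))        # "normal" -- port 80 reveals nothing
print(classify_by_inspection(flow))  # "video_1080p" -- the stream is identified
```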

03_QCA_StreamBoost_01.jpg

Once the engine determines what type of data is moving through the connection, it will give that connection the bandwidth it needs to run optimally, but no more.  The “no more” piece is important because it frees up bandwidth for other applications and connections.  If there is not enough bandwidth available for the “Optimal” setting, it will then drop back and make every effort to give the connection what’s been determined to be the “Minimum acceptable” bandwidth needed for that type of traffic.
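Here is a small sketch of how that Optimal/Minimum fallback could work in practice.  The real Detection and Policy Table and its values come from Qualcomm's traffic studies; the numbers, flow names, and two-pass scheme below are my own assumptions for illustration:

```python
# A hypothetical sketch of the Optimal/Minimum allocation described above.
# The real policy values are Qualcomm's; these numbers are made up.
POLICY_TABLE = {
    "video_1080p": {"optimal": 5.0, "minimum": 3.0},  # Mbps, invented values
    "video_sd":    {"optimal": 1.5, "minimum": 0.8},
    "game":        {"optimal": 0.5, "minimum": 0.3},
}

def allocate(flows, link_mbps):
    """Guarantee each flow its Minimum first, then top flows up toward
    Optimal while bandwidth remains; anything left over goes to bulk."""
    grants = {f: POLICY_TABLE[f]["minimum"] for f in flows}
    remaining = link_mbps - sum(grants.values())
    for f in flows:
        top_up = min(POLICY_TABLE[f]["optimal"] - grants[f], max(remaining, 0.0))
        grants[f] += top_up
        remaining -= top_up
    grants["bulk"] = max(remaining, 0.0)  # best-effort traffic gets the rest
    return grants

print(allocate(["video_1080p", "video_sd", "game"], link_mbps=6.0))
# roughly {'video_1080p': 4.9, 'video_sd': 0.8, 'game': 0.3, 'bulk': 0.0}
```

On a congested link, every flow keeps at least its Minimum and the highest-priority stream soaks up whatever headroom is left, which matches the behavior Qualcomm describes.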

04_QCA_StreamBoost_02.jpg

How does Streamboost know what bandwidth is Optimal and what bandwidth is Minimum?  Well, Qualcomm has studied various types of traffic ranging from YouTube to Netflix to Call of Duty to torrents, and they’ve come up with Optimal and Minimum bandwidth values for all types of traffic.  This data will be included in a “Detection and Policy Table” on the router that the Streamboost engine will reference.  My first thought when I heard this was that it sounded great, but what happens when that table gets out of date?  Qualcomm has thought of that as well, and Streamboost includes an opt-in, cloud based service that will keep your router’s table up to date.  Not only that, but if the router encounters a new type of traffic not in its table, it will capture a few packets and send them up to the cloud (anonymized, of course) to be analyzed and added to future table updates.  Your router should actually perform better as its table is updated and will be better after a year than it was on Day 1.  However, if you’re not interested in being part of the “Opt In crowd”, the engine can also be manually updated at any time.

The UI looks great and will let you drill down into your bandwidth use either by application or device.  Speaking of devices, Streamboost can detect the various types of devices on your network and lets you prioritize based on those criteria as well.

05_QCA_StreamBoost_03.jpg

D-Link and Alienware are the first two partners onboard with Streamboost; they will be showing routers with the technology at CES and releasing them this spring.  All in all, after speaking with Qualcomm, I think I’m going to hold off on my planned router upgrade until I can get my hands on a new router with Streamboost built in.

Coverage of CES 2013 is brought to you by AMD!


Follow all of our coverage of the show at http://pcper.com/ces!

Click here for the full press release!

Source: Qualcomm

A light in the quantum cryptography tunnel

Subject: General Tech, Networking | November 21, 2012 - 02:15 PM |
Tagged: quantum encryption, security

One of the biggest hurdles to implementing quantum cryptography has been vaulted, with researchers finding a way to transmit the key over a non-dedicated connection.  Previously, the inherent noise on a fibre carrying general data would drown out the key, so a separate fibre was needed that carried only the keys; thanks to researchers at Toshiba’s Cambridge Research Laboratory, however, it is now possible to send the keys over existing fibre which also carries other data.  They have created a detector which can open for a mere 100 millionths of a microsecond to receive the key; with the detection window being so brief, there is no time for noise to interfere and the wrong photon to be detected as the key.  The Register reports they can transmit keys over a line running at 500kbps for 50km and still have the key properly detected.
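Taken literally, that quoted gate time works out to a tenth of a nanosecond:

```python
# "100 millionths of a microsecond", converted to seconds
window = 100e-6 * 1e-6
print(window)  # 1e-10 s, i.e. 0.1 nanoseconds per detection window
```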

qcnet_keydist.jpg

"Traditionally it has been necessary to use dedicated fibre to send the single photons (particles of light) that are required for Quantum Key Distribution (QKD). This has restricted any applications of quantum cryptography technology to specialist and small-scale systems in banks and high-level government, essentially because of the extra inconvenience and cost required in allocating a dedicated fibre strand for quantum key distribution."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Your wireless signal getting tangled? WiFox might be the cure

Subject: General Tech, Networking | November 15, 2012 - 12:49 PM |
Tagged: WiFox, wireless

The short blurb at Slashdot makes WiFox sound more impressive than the actual implementation will be, but anyone who has attempted to connect to WiFi at a tech show would love to see this new software appear on wireless access points.   Once a channel becomes crowded by multiple users, especially if they use devices which like to keep a constant connection, the network will hit a point of saturation and the performance of every device on the network will suffer, not just the most recently connected devices.  WiFox is a piece of software which monitors channel saturation; when the saturation point is reached, it immediately assigns all available bandwidth to the WAP and allows it to transmit all backlogged data before allocating bandwidth back to the devices ... clearing the tubes, as it were.  It sounds like this will be easily added to existing WAPs instead of only being available on the next generation of devices, so while your WAN will not suddenly become 700 times faster, the WiFi at the next conference you go to might just be usable for a change.
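Based purely on that public description, the WiFox scheme can be sketched roughly like this; the threshold and structure are invented, and the real NCSU implementation lives in the access point's firmware at the MAC layer rather than in Python:

```python
# A rough sketch of the WiFox idea as described above; threshold and class
# layout are invented for illustration only.
from collections import deque

SATURATION_THRESHOLD = 100  # queued frames before we call the channel saturated

class AccessPoint:
    def __init__(self):
        self.backlog = deque()     # downstream frames waiting to be sent
        self.high_priority = False

    def monitor(self):
        # WiFox's observation: past the saturation point every client suffers,
        # so the AP temporarily claims the channel to flush its backlog.
        if len(self.backlog) > SATURATION_THRESHOLD:
            self.high_priority = True
        elif not self.backlog:
            self.high_priority = False  # backlog cleared; hand airtime back

    def transmit_slot(self):
        # Called each transmit opportunity: in high-priority mode the AP sends
        # backlogged frames back-to-back instead of contending with clients.
        if self.high_priority and self.backlog:
            return self.backlog.popleft()
        return None  # normal contention; clients get their share of airtime
```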

SeriesOfTubes.png

"Engineers at NC State University (NCSU) have discovered a way of boosting the throughput of busy WiFi networks by up to 700%. Perhaps most importantly, the breakthrough is purely software-based, meaning it could be rolled out to existing WiFi networks relatively easily — instantly improving the throughput and latency of the network. As wireless networking becomes ever more prevalent, you may have noticed that your home network is much faster than the WiFi network at the airport or a busy conference center. The primary reason for this is that a WiFi access point, along with every device connected to it, operates on the same wireless channel."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

Intel's Crystal Forest, coming to a network appliance near you

Subject: General Tech | November 1, 2012 - 01:50 PM |
Tagged: Intel, crystal forest, networking

Intel's new Crystal Forest platform includes an undisclosed embedded CPU, an 89XX-series chipset, and their Data Plane Development Kit, the SDK they've created for designing fast path network processing.  This is not a competitor to AMD's Freedom Fabric, which is designed for communication within a large series of processing nodes; instead you will see Crystal Forest powering high end routers and web appliances.  Intel has designed this new chipset to increase the performance of cryptography and compression on network packets and claims it will increase speed as well as security, along with the benefits of support coming directly from Intel.  The Inquirer reports a long list of vendors who have signed on, including Dell, Wind River Systems and Emerson Network Power.

Intel_CrystalForest_1.jpg

"CHIPMAKER Intel has launched its Crystal Forest chipset for network infrastructure.

Intel might be best known for its X86 desktop, laptop and server chips but the firm does a pretty good trade in embedded chips and has announced that a number of large vendors have pledged their support for its new Crystal Forest chipset. The firm said its Crystal Forest chipset will enable networking equipment vendors to shift and control more data using its chips."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Netgear Announces New 802.11ac Gear, Launches New Router

Subject: Networking | May 16, 2012 - 09:57 PM |
Tagged: wifi, router, networking, netgear, 802.11ac

Following up on the announcement by Buffalo Technology, Netgear has released their own 802.11ac wireless router, the R6300. (PC Perspective recently ran a giveaway for the R6300 which you can read about here). In addition to the flagship 802.11ac router, Netgear announced a slimmed down version–the R6200–and the A6200 WiFi USB dongle.

R6300-Product-Image18-51162.png

The Netgear R6300 is their highest end wireless router supporting the 802.11ac WiFi standard. It supports 802.11ac speeds up to 1300 Mbps (450 Mbps over wireless N) and is backwards compatible with the 802.11 a/b/g/n standards. It also has two USB 2.0 ports that can be used to share hard drives and printers across the network. Further, the “5G WiFi” router is powered by a Broadcom chipset, which should open the door to third-party firmware.

r6200-pg-img18-52080.jpg

In addition to the above router, Netgear has announced the R6200 wireless router. It is compatible with the upcoming 802.11ac standard, but at reduced speeds. It features approximately 900 Mbps transfer rates over the “ac” standard and up to 300 Mbps over the 802.11n standard. The router is backwards compatible with all the older consumer standards (a/b/g/n), and it features a single USB 2.0 port to share a printer or hard drive to computers on the LAN.

A6200.jpg

Last up in the announcement is the Netgear A6200. This device is a USB WiFi dongle that supports the 802.11ac standard as well as existing a/b/g/n networks. It claims to deliver enough speed for HD streaming of videos, though Netgear has not stated if it will be able to take advantage of the full 1300 Mbps theoretical maximum connection. The WiFi adapter features a swiveling antenna and a docking station for use with desktop systems.

The other neat feature that the new routers support is the Netgear Genie application, which allows users to monitor and control the network using an application on their computer or smartphone (iOS and Android). They also feature Netgear MyMedia, printer sharing, guest network access, a DLNA server, parental controls, and automatic WiFi security.

The Netgear R6300 router is available for purchase now with an MSRP of $199.99. The R6200 router and A6200 WiFi dongle will be available for purchase in Q3 2012 with suggested retail prices of $179.99 and $69.99 respectively.

Source: Netgear

Buffalo First To Market With 802.11ac Gigabit Wi-Fi Router

Subject: Networking | May 15, 2012 - 05:38 PM |
Tagged: wireless, router, networking, ethernet bridge, buffalo, 802.11ac

Netgear and Buffalo have been working hard to build and bring to market new wireless routers based on the 802.11ac (pending ratification) standard. PC Perspective recently ran a giveaway for the Netgear 802.11ac router, but it seems that Buffalo has managed to beat them to market with their new gear. In fact, Buffalo yesterday released two 802.11ac devices: the AirStation™ WZR-D1800H wireless router and the WLI-H4-D1300 wireless Ethernet bridge. Both devices are powered by Broadcom’s 5G WiFi chips (Broadcom’s name for 802.11ac, the fifth generation of consumer WiFi) and are based around the IEEE standard that is set to be officially ratified early next year.

Buffalo_Router.jpg

The Buffalo 802.11ac Router (left: front, right: rear view)

The router and Ethernet bridge both support the upcoming 802.11ac standard as well as the current 802.11 b, g, and n standards, so they are backwards compatible with all your devices. They also support all the normal functions of any other router or bridge device–the draft support for 802.11ac is what differentiates these products. The router stands vertically and has router reset and USB eject buttons, one USB 2.0 port, four Gigabit Ethernet LAN ports, and one Gigabit Ethernet WAN port. Below the WAN port is a power button and DC in jack. The Buffalo Ethernet bridge allows users to connect Ethernet devices to a network over WiFi. It looks very similar to the router but does not have a WAN port or USB port on the back. It also does not act as a router, only as a bridge to a larger network. The largest downside to the Ethernet bridge is pricing: Newegg has the bridge listed (although it is out of stock now) for the same price as the full-fledged router. At that price, it does not have much value–users would be better off buying two routers and disabling the router features on one (and because the Broadcom chipset should enable custom firmware, this should be possible soon).

Ethernet_bridge.jpg

The Buffalo 802.11ac Ethernet Bridge (left: front, right: rear view)

What makes these two devices interesting, though, is the support for the “5G WiFi” 802.11ac wireless technology. This is the first time that the wireless connections have a (granted, theoretical) higher transfer speed than the wired connections, which is quite the feat. 802.11ac is essentially 802.11n with several improvements, operating only on channels in the 5GHz spectrum. The pending standard also uses wider 80 MHz or 160 MHz channels, 256 QAM modulation, and up to eight antennas (much like 802.11n’s MIMO technology) to deliver much faster wireless transfer rates than consumers have had available previously. The other big technology in the upcoming WiFi standard is beamforming. This allows wireless devices to communicate with their access point(s) to determine relative spatial position. That data is then used to adjust the transmitted signals so that they are sent in the direction of the access point at the optimum power levels. This approach differs from traditional WiFi devices that broadcast omnidirectionally (think big circular waves coming out of your router) because the signals are more focused. By focusing the signals, users get better range and can avoid WiFi deadspots.
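That 1300 Mbps headline figure falls straight out of those parameters. Here is the back-of-the-envelope math using the draft standard's published PHY values; the three-stream configuration is an assumption matching the flagship routers discussed here:

```python
# Back-of-the-envelope 802.11ac PHY rate from the draft's published values:
# an 80 MHz channel carries 234 data subcarriers, 256-QAM packs 8 bits onto
# each one, the best coding rate is 5/6, and a symbol lasts 3.6 microseconds
# with the short guard interval.
subcarriers = 234
bits_per_subcarrier = 8   # log2(256) for 256-QAM
coding_rate = 5 / 6
symbol_time = 3.6e-6      # seconds
streams = 3               # assumed: the R6300/WZR-D1800H class of hardware

per_stream = subcarriers * bits_per_subcarrier * coding_rate / symbol_time
print(per_stream / 1e6)            # ~433.3 Mbps per spatial stream
print(streams * per_stream / 1e6)  # ~1300 Mbps -- the advertised figure
```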

Hajime Nakai, Chief Executive Officer at Buffalo Technology stated that “along with Broadcom, we continue to demonstrate our commitment to innovation by providing a no-compromise, future proofed wireless infrastructure for consumers’ digital worlds.”

The Buffalo AirStation™ WZR-D1800H router and WLI-H4-D1300 Ethernet bridge are available for purchase now for around $179.99 USD. The Ethernet bridge is listed as out of stock on Newegg; however, the router is still available (and the better value).

Qualcomm Atheros Launches Two New Killer Networking Cards

Subject: Networking | April 18, 2012 - 10:33 PM |
Tagged: wi-fi, qualcomm, networking, killer, Ethernet

Qualcomm Atheros today launched two new networking cards for the desktop and laptop markets. Coming from a subsidiary of Qualcomm (the former Killer Networks), the Wireless-N 1202 and E2200 provide Wi-Fi and Ethernet connectivity based on Killer Networks’ technology.

Qualcomm_Atheros_blue_big.jpg

The Wireless-N 1202 is an 802.11n Wi-Fi and Bluetooth module with 2x2 MIMO antennas, which should provide plenty of Wireless N range. On the wired side of things, the E2200 is a Gigabit Ethernet network card for desktop computers. Both modules are powered by Killer Networks’ chip and the Killer Network Manager software. The software allows users to prioritize gaming, audio, and video packets over other network traffic to deliver the best performance. Director of Business Development Mike Cubbage had the following to say.

“These products create an unprecedented entertainment and real-time communications experience for the end user by ensuring that critical online applications get the bandwidth and priority they need, when they need it.”

The E2200 Gigabit Ethernet NIC is available for purchase now, and the Wireless-N 1202 module will go on sale in May. More specific information on the products will be available after the official launch date (today) so stay tuned to PC Perspective.

ZTE Shows Off 1.7 Tbps Fiber Optic Network

Subject: Networking | March 16, 2012 - 05:58 AM |
Tagged: zte, wdm, networking, fiber optics, 1.7tbps

Chinese telecommunications equipment maker ZTE showed off a new fiber optic network capable of 1.7 Tbps over a single fiber cable. Computer World reports that the ZTE network trial utilizes Wavelength Division Multiplexing technology to pack more information through a single cable by employing multiple wavelengths that comprise different channels.

fiber optic.jpg

The ZTE fiber network runs 1,750 kilometers (just over 1,087 miles) and uses eight channels, each capable of 216.4 Gbps, to send data at speeds up to 1.7312 Tbps. The company has no immediate plans to implement such a network. Rather, they wanted to prove that an upgrade to 200 Gbps per channel speeds is possible. To put that achievement in perspective, Comcast currently has fiber networks running at 10 Gbps, 40 Gbps, and 100 Gbps channel speeds, according to an article on Viodi.
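The headline number is just the eight channels multiplied out:

```python
# Sanity-checking ZTE's reported figures
channels = 8
per_channel_gbps = 216.4
print(channels * per_channel_gbps)  # 1731.2 Gbps, i.e. the reported 1.7312 Tbps
```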

And to think that I only recently upgraded to a Gigabit router! I can't wait to see this technology trickle down towards a time when home networks are running through fiber optic cables and doing so at terabit per second speeds!

Image courtesy kainet via Flickr Creative Commons.

Counter-terrorism Expert States Cyberthreats Should Never Be Taken Lightly

Subject: Networking | August 4, 2011 - 02:01 AM |
Tagged: security, networking, cyber warfare

Computer World posted a short news piece quoting the former director of the CIA’s Counter-terrorism Center, Cofer Black, as he explained why cyberthreats need to be taken more seriously by the nation. Cofer Black played a key role during the first term of the George W. Bush administration and was one of the counter-terrorism experts made aware of a likely attack on American soil prior to the September 11th attacks.

Black noted that the people in a position with the power to act on these warnings were unwilling to act without some measure of validation. He goes on to say that while the general public was blindsided by the September 11th attacks, “I can tell that neither myself nor my people in counter-terrorism were surprised at all.”

binary numbers

With cyber warfare increasingly utilized as an attack vector against foreign adversaries, the need for quick responses to threats will only increase. Further, the demand on security professionals to search for and validate threats for those in power to enact a response will be a major issue in the coming years. “The escalatory nature of such threats is often not understood or appreciated until they are validated,” Black offered in regards to the challenges decision makers face. He believes that the decision makers do listen to the threats; however, they do not believe them. This behavior, he believes, will hinder the US’ ability to properly respond to likely threats.

With the recent announcement by the Department of Defense that physical retaliation to Internet based attacks (in addition to counter attacks) may be necessary, the need to quickly and proactively respond to likely threats is all the more imperative.  Do you believe tomorrow's battles will encompass the digital plane as much as real life?

18,592 Academic Papers Released To Public Via Torrent

Subject: General Tech | July 21, 2011 - 07:29 PM |
Tagged: torrent, tech, networking, jstor

In light of Aaron Swartz’s recent legal trouble, with charges brought against him for downloading academic papers from the pay-walled online database JSTOR over MIT’s computer network, a BitTorrent user named Greg Maxwell has decided to fight back against publishers who charge for access to academic papers by releasing 18,592 of them to the public in a 32.48 gigabyte torrent uploaded to The Pirate Bay.

library.png

Maxwell claims that the torrent consists of documents from the Philosophical Transactions of the Royal Society journal. According to GigaOm, the copyrights on these academic papers have been expired for some time; however, the only way to access the documents has been through the pay-walled JSTOR database, where individual articles can cost as much as $19. While Maxwell claims to have gained access to the papers many years prior through legal means (likely through a college or library’s database access), he had been fearful of releasing the documents due to legal repercussions from the journal’s publishers. He says that the legal troubles Swartz is facing for (allegedly) downloading the JSTOR library have fueled his passion and changed his mind about holding them back.

Maxwell justifies the release by stating that the authors and universities do not benefit from their work, and that the move to digital distribution has yet to coincide with a reduction in prices. In the past, the high cost (sometimes paid by the authors) served to cover the mechanical process of binding and printing the journals. Maxwell further states that, to his knowledge, the money paid by those wishing to verify their facts and learn more from these academic works “serves little significant purpose except to perpetuate dead business models.” The pressure and expectation that authors must publish or face irrelevancy further entrenches the publishers’ business models.

Further, GigaOm quoted Maxwell in stating:

“If I can remove even one dollar of ill-gained income from a poisonous industry which acts to suppress scientific and historic understanding, then whatever personal cost I suffer will be justified . . . it will be one less dollar spent in the war against knowledge. One less dollar spent lobbying for laws that make downloading too many scientific papers a crime.”

Personally, I’m torn on the ethics of the issue. On one hand, these academic papers should be made available for free (or at least at cost of production) to anyone that wants them as they are written for the betterment of humanity and pursuit of knowledge (or at least as a thought provoking final paper). On the other hand, releasing the database via a torrent has it’s own issues. As far as non-violent protests go, this is certainly interesting and likely to get the attention of the publishers and academics. Whether it will cause them to reevaluate their business models; however, is rather doubtful (and unfortunate).

Image courtesy Isabelle Palatin.

Source: GigaOm

Gmail Now Supports Multiple Calls and Placing Calls On Hold

Subject: General Tech | July 21, 2011 - 04:27 PM |
Tagged: networking, voip, google

The Gmail blog recently showed off a new feature that allows you to put one call on hold while accepting another, a feature that standard phones have had for a long time now. Inside Gmail, you are able to start a call to another computer or a physical phone and then you are free to place this call on hold by hitting the “hold” button. When you wish to return to the call, you simply hit the “Resume” button- just like a normal phone. When a second person calls you, you will be asked to accept or reject it, and if you accept the call the first call will automatically be placed on hold.

multiplecalls.png

According to Google, the call hold feature “works across all call types (voice, video, and phone)” and the only caveat is that only two outgoing calls to physical phones can be active at a time. The only feature I see missing from this function is integration with Google Music, which would allow me to set up custom hold music to the chagrin of telemarketers and customer support everywhere. After all, it is almost Friday, and everyone would just love to hear some Rebecca Black, right!?

Source: Gmail Blog

US Pentagon To Test Cyber Warfare Tactics Using Internet Simulator

Subject: Editorial, General Tech | June 20, 2011 - 03:24 AM |
Tagged: simulator, networking, Internet, cyber warfare

Our world is host to numerous physical acts of aggression every day, and until a few years ago those acts remained in the (relatively) easily comprehensible physical world. However, the millions of connected servers and clients that overlay the nations of the world have rapidly become host to what is known as “cyber warfare,” which amounts to subversion and attacks against another people or nation through electronic means-- by attacking its people or its electronic and Internet-based infrastructure.

While physical acts of aggression are easier to examine (and gather evidence on) and attribute to the responsible parties, attacks on the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an ethical debate over whether physical military action is an appropriate response to online attacks.

It seems as though the Pentagon is seeking the answers to the issues of attack attribution and appropriate retaliation methods through the usage of an Internet simulator dubbed the National Cyber Range. According to Computer World, two designs for the simulator are being constructed by Lockheed Martin with a $30.8 million USD grant and Johns Hopkins University Applied Physics Laboratory with a $24.7 million USD grant provided by DARPA.

The National Cyber Range is designed to mimic human behavior in response to various DefCon and InfoCon (Information Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution as it simulates offensive and defensive actions at the scale of nation-backed cyber warfare. Once the final National Cyber Range design has been chosen by DARPA from the two competing projects (by Johns Hopkins and Lockheed Martin), the government will be able to construct a toolkit that allows it to easily transfer and conduct cyber warfare testing from any facility.

Image courtesy Kurtis Scaletta via Flickr Creative Commons.

Demand For IT Workers Remains High In US Despite Economy

Subject: General Tech | June 12, 2011 - 01:12 PM |
Tagged: US, technology, networking, IT

The US has seen a rather rapid rise in unemployment in the last few years as companies cut back on staff and computing costs. According to Computer World, Tom Silver has been quoted as saying “several years ago companies cut back pretty far, particularly in infrastructure and technology development.” Silver further believes that the tech unemployment rate is half that of the national unemployment rate due to companies needing to replace aging hardware and software and deal with increased security threats. In a recent biannual hiring survey conducted by Dice.com, 65% of the 900 hiring managers and headhunters who responded plan on bringing even more new workers into their businesses in the second half of 2011 versus the first half.

Workers with mobile operating system, hardware, and ecosystem expertise and Java development skills are the most desirable technology workers, according to Computer World, although anyone with an IT background and recent programming skills has a fairly good chance of finding a job in a market that is demanding now-rare talent. Employers are starting to be more confident in the economy and thus are more willing to invest in new workers. In an era where Internet security is more important than ever, skilled enterprise IT workers are becoming a valuable asset to employers, who are increasingly fighting for rare talent and incentivizing new workers with increased salaries.

Even though businesses are still remaining cautious in their new hiring endeavors, it is definitely a good sign for people with tech backgrounds who are looking for work as the market is ever so slowly starting to bounce back. For further information on the study, Computer World has the full scoop here.

Are you in or studying to enter into the IT profession? Do you feel confident in the US employers' valuation of their IT workers?

Dell Survey Suggests CEOs Believe Cloud Computing Is A Fad

Subject: General Tech | June 12, 2011 - 10:36 AM |
Tagged: networking, dell, cloud computing

A recent survey conducted during the first two days of the Cloud Expo by Marketing Solutions and sponsored by Dell suggests that IT professionals believe their less technical CEOs regard cloud computing as a "fad" that will soon pass. On the other hand, IT departments see the opportunities and potential of the technology. This gap between the two groups, according to Dell, lies in "the tendency of some enthusiasts to overhype the cloud and its capacity for radical change." Especially with a complex and still evolving technology like cloud computing, CEOs are less likely to see the potential benefits and more likely to see the obstacles and the cost of adopting the methods.

The study surveyed 223 respondents from various industries (excluding technology providers) and found that the attitudes of IT professionals and what they felt their respective CEOs' attitudes were regarding "the cloud" were rather different. The pie graphs in figure 1 below illustrate the gap between the two groups mentioned earlier. Where 47% of those in IT see cloud computing as a natural evolution of the trend towards remote networks and virtualization, only 26% believed that their CEOs agreed. Also, while 37% of IT professionals stated that cloud computing is a new way to think about their function in IT, "37 percent deemed their business leaders mostly likely to describe the cloud as having “immense potential,” contrasted with only 22 percent of the IT pros who said that was their own top descriptor."

Further, the survey examined what both IT professionals and CEOs believed to be obstacles in the way of adopting cloud computing. On the IT professionals' front, 57% believed data security to be the biggest issue, 32% stated industry compliance and governance as the largest obstacle, and 27% thought disaster recovery options to be the most important barrier, contrasted with 51%, 30%, and 22% of CEOs. This comparison can be seen in figure 2 below.

While the survey has handily indicated that enterprises' IT departments are the most comfortable with the idea of adopting cloud computing, other areas of the business that could greatly benefit from the technology are much more opposed to it. As seen in figure 3, while 66% of IT departments are willing to advocate for cloud computing, only 13% of Research and Development, 13% of Strategy and Business Development, and a mere 5% of Supply Chain Management departments feel that they would move to cloud computing and benefit from the technology.

Dell stated that IT may be able to help many more functions and departments by advocating for and implementing cloud computing strategies in information-gathering and data-analysis departments. In doing so, IT could likely benefit the entire company and further educate their CEOs on cloud computing's usefulness, closing the gap between IT professionals' and CEOs' beliefs.

You can read more about the Dell study here. How do you feel about cloud computing?

Source: Dell

Cisco Faces Second Lawsuit For Allegedly Supplying Technology To Track Chinese Dissidents

Subject: Networking | June 12, 2011 - 04:24 AM |
Tagged: networking, Lawsuit, Internet, Cisco

Cisco, the worldwide networking heavyweight, is now facing a lawsuit from three Chinese website authors who wrote for Harry Wu. Mr. Wu is a Chinese political activist who spent 19 years in Chinese forced-labor prison camps, according to Network World. The charges allege that Cisco optimized its networking equipment for the Chinese government and trained its officials to identify and track individuals on the Internet who speak out against the government with pro-democratic speech.

In a similar vein, the networking company was hit with an additional lawsuit last month by members of the Falun Gong religious group. That lawsuit claims that Cisco supplied networking technology to the Chinese government with the knowledge that the technology would be used to oppress the religious movement. Falun Gong is a religious and spiritual movement that emphasizes morality and the theoretical nature of life. It was banned in July 1999 by the Communist Party of China for being a “heretical organization”. Practitioners have been the victims of numerous human rights violations throughout the years.

Cisco has stated on the company’s blog that they strongly support free expression on the Internet. Further, they have responded to the allegations by stating that “Our company has been accused in a pair of lawsuits of contributing to the mistreatment of dissidents in China, based on the assertion that we customize our equipment to participate in tracking of dissidents. The lawsuits are inaccurate and entirely without foundation,” as well as “We have never customized our equipment to help the Chinese government—or any government—censor content, track Internet use by individuals or intercept Internet communications.”

It remains to be seen whether the allegations hold any truth; however, Cisco has been here before and is likely to see further lawsuits in the future. How do you feel about the Cisco allegations?

World IPv6 Day Goes Off Without A Hitch

Subject: Networking | June 8, 2011 - 10:19 PM |
Tagged: networking, ipv6, Internet

As has been said numerous times throughout the Internet, the pool of available IPv4 (32-bit) addresses is running out. This is due to an ever increasing number of Internet users all over the world. In response, the standards for IPv4’s successor were developed by the Internet Engineering Task Force. The new IPv6 standard uses 128-bit addresses, which support 2^128 (approximately 340 undecillion) individual addresses. Compared to IPv4’s 32-bit addresses, which can support a little under 4.3 billion addresses (4,294,967,296 to be exact), the new Internet protocol will easily be able to assign everyone a unique IP address and will enable support for multiple devices per user without the specific need for network address translation (NAT).
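The scale of that difference is easy to check:

```python
# Address-space arithmetic from the paragraph above
ipv4 = 2 ** 32
ipv6 = 2 ** 128
print(ipv4)          # 4,294,967,296 addresses -- just under 4.3 billion
print(ipv6)          # ~3.4e38, roughly 340 undecillion
print(ipv6 // ipv4)  # 2**96: ~7.9e28 IPv6 addresses per IPv4 address
```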

Today is World IPv6 Day, the first widespread (live) trial of IPv6. Over a 24 hour period, numerous large and small websites plan to switch on IPv6 to test consumer readiness for the protocol. Big companies such as Google, Facebook, Yahoo, YouTube, and Bing are participating by enabling IPv6 alongside IPv4 to test how many users are able to connect over IPv6 versus IPv4. Led by the Internet Society, consumers and businesses are encouraged to help test the new IP protocol by visiting the participants’ sites and running readiness tests.

According to Network World, the trial has been going very well. They state that, according to Arbor Networks, overall IPv6 traffic volume has “doubled during the first 12 hours.” Further, Akamai experienced a tenfold increase in IPv6 traffic just before the event. Fortunately, this increase did not bring an increase in DDoS attacks, which Akamai states were minimal. The June 8, 2011 event was used as a deadline of sorts by many businesses and resulted in many large corporations getting their IPv6 protocol up and running.

While the event only lasts 24 hours, some large websites will likely continue to run IPv6 alongside IPv4 addressing. Network World quotes Champagne in hoping more businesses will move to IPv6 after seeing the successes of the World IPv6 Day participants, now that “everybody went into the water today and found out that the water is fine.”

It will certainly be interesting to see whether the success continues and whether consumers still on IPv4 can be made ready before the 32-bit address well runs dry, or whether, much like the move to digital TV broadcasting in the US, the transition will see many deadline push-backs.  Are you ready for IPv6?