Subject: General Tech, Chipsets, Mobile | May 9, 2011 - 10:56 AM | Jeremy Hellstrom
NVIDIA is really making moves towards providing the mobile industry with the computing power to bring on better and faster phones. They took a big hit when they lost the DMI/QPI license from Intel; though the $1.5 billion court settlement took some of the sting out of that loss, the battle essentially spelled the end of NVIDIA's motherboard chipset line. Only being able to make motherboard chipsets for their main GPU competitor, AMD, might be amusing in an ironic sense, but it was not an economically sound business.
Tegra marked a change in NVIDIA's target market: suddenly they were offering a mobile chip that provided very impressive computing power without drawing a huge amount of it. With the acquisition of Icera they now have a team designing the chips a phone most needs: RF and baseband transmission. Perhaps they now have a big enough foot in the door of the mobile market that they won't be going anywhere soon.
Icera’s baseband and RF technologies span 2G, 3G and 4G networks. Joining them with our Tegra mobile super chip will result in a powerful combination. Tegra has secured a number of design wins across the mobile device spectrum, and we have extensive relationships across the industry, from device manufacturers to carriers. In short, we can scale Icera’s great innovation. For additional context on Icera’s industry-leading technology, check out this report from Strategy Analytics.
Our OEM partners will reap the benefits of faster time-to-market, better integration and enhanced performance. The deal will also open up a new market to NVIDIA. The $15 billion global market for baseband processors is one of the fastest-growing areas in the industry.
Looking ahead, Icera’s programmable baseband processor architecture will allow NVIDIA and its OEM customers to innovate and adapt signaling algorithms in the rapidly evolving mobile telecommunications market — network responsiveness is critical to delivering on the promise of untethered wireless visual computing. Icera’s highly efficient architecture makes it possible to cleanly integrate their baseband processor into system and software platforms rapidly and, ultimately, into the super chip itself, if that’s the best product approach.
Subject: General Tech, Motherboards | May 7, 2011 - 10:51 PM | Scott Michaud
Tagged: coreboot, amd
When you boot your computer, you probably see a splash screen from whatever motherboard manufacturer or system builder you purchased from. Under that splash screen your computer is busily preparing itself to accept your operating system of choice with a lot of proprietary code. coreboot, formerly LinuxBIOS, is a Free Software project, first released over a decade ago, designed to replace that proprietary BIOS with its own lightweight code. They claim boot times to a Linux console of just 3 seconds.
AMD Embedded: In this article.
Thursday, AMD announced on their blog that they have committed to supporting coreboot for all future products, starting with the Llano APU. They claim that support will continue for the foreseeable future, for both features and products.
We are not expecting our readers to replace their BIOSes with coreboot except for a small segment of hardcore enthusiasts with a decent understanding of C. That said, the motivation of coreboot is not currently in the consumer market: the embedded market is the focus and AMD’s pledge of continued support should mean that cash registers, kiosks, and set-top boxes will have a little more AMD inside driving them.
Subject: General Tech, Graphics Cards | May 6, 2011 - 05:25 PM | Scott Michaud
Tagged: linux, kgpu, gpgpu
PC Per has discussed using the GPU as a massively parallel complement to the CPU for a very long time, allowing the latter to focus on the branching logic (“if/then/else”) and other processes it is good at that GPUs are not. AMD and Intel both have their attempts to bundle the benefits of a GPU onto their CPU parts with their respective technologies. Currently most of the applications outside of the scientific community are gaming and multimedia; however, as stronger GPUs saturate the market, we are seeing more and more functions delegated to the GPU.
So happy together!
KGPU is an attempt to bring the horsepower of the GPU to the fingertips of the Linux kernel. While the kernel itself will remain a CPU function, the project allows the kernel to offload embarrassingly parallel work to the GPU for large speedups while keeping the CPU free for other tasks. Their current version shows multiple-fold speedups for eCryptfs, an encrypted filesystem, in maximum read and write bandwidth by letting the GPU handle the AES cipher.
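The reason AES-heavy workloads like eCryptfs map so well onto a GPU is that, in parallelizable modes, the cipher treats each 16-byte block independently, so a GPU can assign one thread per block. The toy Python sketch below illustrates only that per-block independence; the XOR "cipher" is a stand-in, not real AES, and this is not KGPU's actual CUDA code.

```python
# Conceptual sketch of why block ciphers offload well: each block
# depends only on (block, key), so all blocks can run in parallel.
BLOCK = 16  # AES block size in bytes

def toy_encrypt_block(block: bytes, key: bytes) -> bytes:
    # Stand-in for one AES block operation; XOR is NOT real encryption.
    return bytes(b ^ k for b, k in zip(block, key))

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    assert len(key) == BLOCK
    # Pad to a whole number of blocks with zero bytes (toy scheme only).
    if len(data) % BLOCK:
        data += b"\x00" * (BLOCK - len(data) % BLOCK)
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    # On a GPU each block would be one thread; here we map sequentially.
    return b"".join(toy_encrypt_block(b, key) for b in blocks)

key = bytes(range(BLOCK))
ct = toy_encrypt(b"hello kgpu", key)
# XOR is its own inverse, so a second pass recovers the padded plaintext.
assert toy_encrypt(ct, key)[:10] == b"hello kgpu"
```

Because no block's output feeds into another block's input, the work divides cleanly across thousands of GPU threads, which is the property KGPU exploits.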
We should continue to see speedups as tasks that would be perfect for the GPU are finally allowed to be with their true love. Furthermore, as the number of tasks delegated to the GPU increases, we should see more and stronger GPUs embedded in PCs, which should ease the fears of PC game developers worried about how many PCs are capable of running their applications. I am sure that is great news to many of our frequent readers.
Subject: General Tech | May 6, 2011 - 12:35 PM | Jeremy Hellstrom
Tagged: fab, TSMC, 12, inch
With 10 million 8-inch-equivalent wafers produced in 2010 and an expected 20 million by 2015, it is a good thing that TSMC is not only past its major production issues but also ahead of schedule with the setup of Fab 15, which will produce 28nm chips on 12-inch wafers. Moving from 8 to 12 inches should also mean a lower cost per chip, though whether the savings will be absorbed by the costs of the new fab or passed straight on to the consumer is a question that cannot be answered until next summer, when they expect production capacity to reach full speed. DigiTimes has the scoop here.
"Taiwan Semiconductor Manufacturing Company (TSMC) has begun equipment move-in for the phase 1 facility of a new 12-inch fab (Fab 15) with volume production of 28nm technology products slated for the fourth quarter of 2011, according to the foundry.
TSMC previously said it would begin moving equipment into the facility in June, and expected volume production to kick off in the first quarter of 2012.
Pilot runs at the phase 1 facility of Fab 15 are expected to start in the third quarter of 2011, following by volume production in the fourth quarter, said Jason Chen, senior VP of worldwide sales and marketing for TSMC, at a company event held on May 5. With new capacity coming online, TSMC will see its combined 12-inch capacity top 300,000 units a month."
Here is some more Tech News from around the web:
- USB 3.0 to Debut as Chip Interconnect @ SemiAccurate
- Meet DOCSIS, Part 1: the unsung hero of high-speed cable Internet access @ Ars Technica
- Transistors go 3D as Intel re-invents the microchip @ Ars Technica
- Apple dumps Intel from laptop lines @ SemiAccurate
- What MDS Should Mean to a CIO @ CoD
Subject: General Tech | May 6, 2011 - 09:20 AM | Tim Verry
Tagged: sony, Internet, Data Breach, Anonymous
As Sony analyzed the forensic data of the recent PSN/SOE attack, they discovered a text file named "Anonymous" containing the phrase "We are legion," according to Network World. As a result, Sony went so far as to accuse the hacker group of being the party responsible for hacking the PlayStation Network (and stealing customers' information) in a letter to the U.S. Congress.
Anonymous responded to the implications brought by Sony today. Network World reports that Anonymous has stated they were not involved in the attack and that "others performed the attack with the intent of making Anonymous look bad." Based on a press release by the hacker group, its prior victims had motive to irreparably defame the group in the public eye. Anonymous stated that they have never been involved in credit card theft. Further, they claim to be an "ironically transparent movement," and had they truly been behind the attack they would have claimed responsibility for their actions.
The press release goes on to state that "no one who is actually associated with our movement would do something that would prompt a massive law enforcement response." They further claim that the world's standard fare of Internet thieves would have a vested interest in making Sony and law enforcement agencies believe it was Anonymous, to throw police off their trail.
The hacker group names former victims such as Palantir, HBGary, and the U.S. Chamber of Commerce as organizations that would like to discredit Anonymous. "Anonymous will continue its work in support of transparency and individual liberty; our adversaries will continue their work in support of secrecy and control," they state in their press release, "we are anonymous."
As Anonymous, Sony, and spectators the world over debate, the affected public continues to wait for the true identities of the hackers who stole 77 million Sony customers' private information to come to light.
Subject: Editorial, General Tech | May 5, 2011 - 11:31 PM | Tim Verry
Tagged: Netflix, Customer Data, Corporate theft
It seems this spring is just a bad time for customers' personally identifiable information. Especially in the wake of the Sony PSN and SOE attack fiasco, seeing yet another large corporation involved in compromised customer data is rather disheartening for customers who trust companies with their private information.
Update: LastPass has also reported a data breach, resulting in customers' emails being compromised. Luckily, however, users' passwords were salted and hashed, so users' accounts on other sites should not be compromised, in contrast to the Sony case, where the passwords themselves were exposed.
Fortunately, in the case of Netflix, they have determined who the responsible party was and have moved swiftly to address the issue. Maximum Tech reports that an unnamed call center employee for Netflix was terminated for accessing customers' information without permission. On April 4, 2011, Netflix discovered that one of their call center employees had been accessing confidential information of a number of customers he had spoken with over the phone. He was found to have accessed the name and credit card information of two customers in New Hampshire.
According to the article, Netflix is now in the process of notifying the two customers in question.
The amount of private data that customers entrust to the companies they do business with every day, expecting it to be kept private, is rather daunting. When large corporations like Sony and Netflix run into problems keeping information secure, one has to wonder how much compromised information goes under the radar of the majority of people. While there is not much one can do to stop others from accessing their data once information has been lost in a data breach or stolen by an insider, people do have control over what information is given to companies in the first place.
It may seem rather paradoxical for me to quote Sony of all people; however, they have definitely seen the consequences and thus can assuredly recommend that customers stay vigilant and protect themselves from fraud. Using one-time credit card numbers (if your bank or card provider offers them) or reloadable Visa debit cards carrying just enough money for the desired transactions can help protect you from data breaches such as this. Further, only provide the minimum amount of information necessary for a transaction, especially to a company you're unsure about. While various forms of fraud protection can help, never needing to use them in the first place is the best thing you can do for yourself and your private data. "Remain vigilant."
Subject: Editorial, General Tech | May 5, 2011 - 10:58 PM | Ken Addison
Tagged: ultrasharp, u3011, podcast, Phenom II X4 980, Intel, dell, amd
PC Perspective Podcast #153 - 5/05/2011
This week we talk about the Dell UltraSharp U3011 monitor, AMD Phenom II X4 980, 3D transitors and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Jeremy Hellstrom, Josh Walrath and Allyn Malventano
This Podcast is brought to you by MSI
Program length: 1:23:01
- 0:00:35 Introduction
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 0:02:13 That's not a monitor, this is a monitor! Meet the Dell UltraSharp U3011 Review
- 0:23:10 This Podcast is brought to you by MSI, and their all-new Sandy Bridge motherboards!
- 0:23:55 AMD Phenom II X4 980 Black Edition Review: Last of the Breed
- 0:34:05 Graphics shipments rise 10% despite falling PC sales; NVIDIA share drops
- 0:37:20 AMD's new DX11 compatible embedded E6760 GPU can handle 6 displays
- 0:40:00 Intel and 3D Transistors: A love story
- 0:56:55 Seagate at 1TB per platter
- 1:08:33 Hardware / Software Pick of the Week
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 1:21:33 Closing
Subject: General Tech, Systems, Mobile | May 5, 2011 - 10:37 PM | Scott Michaud
Tagged: usb computer, Education
In case you did not get enough solder for one day: you are in luck! David Braben, previously known for his work developing such games as RollerCoaster Tycoon 3, Thrillville, and Kinectimals, has created an extremely low-cost PC for educational use. His goal is ultimately for computers like the one he created to be accessible enough that there are functionally zero barriers to entry for students to pursue studying computing. A charity, the Raspberry Pi Foundation, was created around these goals to distribute the device, hopefully sometime within the next 12 months.
Am I the only one who finds it weird that an affordable PC uses HDMI?
Subject: General Tech | May 5, 2011 - 06:05 PM | Scott Michaud
Tagged: security, lastpass
One of the most important parts of security is authentication, and a lot of our methods of authentication online revolve around passwords. There is an expectation these days that you remember long passwords composed of completely random characters, including numbers and symbols, each unique to a single site so that one compromised service does not expose the rest. This necessity runs up against our human nature of having terrible memory. Many programs attempt a solution by generating and storing secure passwords for you.
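The reason a leaked password database is not automatically a catastrophe is salted hashing: each stored verifier is derived from the password plus a unique random salt, so identical passwords hash differently and precomputed lookup tables are useless. A minimal Python sketch of the general technique, using PBKDF2 purely for illustration (the parameters here are assumptions, not any specific product's scheme):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # A unique random salt per user means two identical passwords
    # produce different digests, defeating precomputed rainbow tables.
    salt = salt or os.urandom(16)
    # Many iterations make brute-forcing each guess expensive.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking how many bytes matched.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

The server only ever stores `(salt, digest)`; recovering the original password from them requires guessing it, which is why salted-and-hashed storage limits the damage of a breach.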
Subject: General Tech, Storage | May 5, 2011 - 06:05 PM | Scott Michaud
Tagged: mod, microSD, atari 810
It is common knowledge that technology gets smaller as time advances. There is, however, a point where a certain level of advancement trots along the border to absurdity and makes you think about exactly what is possible with modern technology and occasionally an innovative spirit. Leave it to the hackers to consistently push that boundary and entertain the rest of us less talented individuals.
Subject: General Tech | May 5, 2011 - 06:04 PM | Jeremy Hellstrom
Tagged: mouse, wireless, gaming, gigabyte
Gigabyte has joined in the attempts of many companies to convince gamers that wireless mice are cool. With 50 hours of battery life and a 6500 DPI sensor, the Aivia M8600 sounds good on paper, but until you get it on the mat you will never know how well it performs. Hardware Secrets were certainly impressed by its ambidexterity: they were just as uncomfortable using it with the left hand as with the right. No complaints about input lag, though.
"Gamers usually shun wireless peripherals, always wary of a possible energy loss. No one wants to rummage around for a cable and lose an online match. With that in mind, Gigabyte has released a wireless gaming-grade mouse with a long lasting 50 hour battery that comes with an extra battery that you can rapidly switch. Besides those characteristics, the Aivia M8600 reaches 6,500 DPI and features a design for both right- and left-handed users, plus ten reprogrammable buttons. Let's talk first about its physical aspects and then test its wireless operation."
Here is some more Tech News from around the web:
- Microsoft Express Mouse @ Maximum CPU
- Roccat Kone [+] Review @ t-break
- ROCCAT Alumic Gaming Mousepad Review @ Madshrimps
- Razer Onza Tournament Edition Controller Review @ t-break
- Razer Onza Tournament Edition XBOX 360 Controller Review @ HardwareHeaven
- Razer BlackWidow Ultimate Mechanical Gaming Keyboard Review @ HardwareHeaven
Subject: Editorial, General Tech | May 5, 2011 - 08:35 AM | Tim Verry
Tagged: Internet, Education, Cyber Security
Microsoft recently posted a press release detailing the results of its sponsored study by the NCSA (National Cyber Security Alliance). The study sought to determine whom people believe bears the responsibility for teaching children how to protect themselves on the Internet, as well as what the current situation is as far as K-12 students’ level of preparedness and education. The executive director of the NCSA, Michael Kaiser, had this to say:
“Just as we would not hand a child a set of car keys with no instruction about how to drive, we should not be sending students out into the world without a solid understanding of how to be safe and secure online."
According to Microsoft, the NCSA advocates a “comprehensive approach” to teaching children from K-12 how to stay safe and secure online. While the consensus seems to be that students do need to be educated in Internet security, people are divided on exactly who bears the primary responsibility for teaching children. Children’s teachers, parents, and even government leaders and law enforcement have all been raised as possible responsible parties. The majority of teachers (80 percent) and school administrators (60 percent) surveyed are proponents of parents being responsible for teaching their kids about “digital safety, security, and ethics.” On the other hand, more than 50 percent of the IT coordinators surveyed believe that teachers bear the most responsibility for educating kids. One area where all groups do seem to agree is on the question of government responsibility: Microsoft states that less than one percent believe law enforcement and government officials should bear the responsibility.
While cyber security is important for students to learn, and 97 percent of school administrators believe schools should have courses and an educational plan for students throughout their K-12 grades, only 68 percent of administrators “believe their schools or school districts are doing an adequate job of preparing students...”
The situation looks even bleaker when teachers were surveyed. When asked whether they feel prepared to teach students adequately, only 24 percent believed they were adequately prepared to talk about and educate kids on protecting personal information on the Internet, and 23 percent are comfortable teaching the risks of cyberbullying. Further, only one-third of teachers surveyed believe they are prepared to educate students on basic Internet security skills “such as password protection and backing up data.” The low numbers are attributed to the lack of professional development training that teachers are receiving. Microsoft states that “86 percent received less than six hours of related training.” Microsoft quotes Kaiser in saying that “America’s schools have not caught up with the realities of the modern economy. Teachers are not getting adequate training in online safety topics, and schools have yet to adopt a comprehensive approach to online safety, security and ethics as part of a primary education. In the 21st century, these topics are as important as reading, writing and math.”
In all of this, there is a ray of hope. Comparing the 2010 study to the NCSA’s 2008 study, which you can read here, an increasing number of teachers believe cyber security and professional development training is a priority. More than 60 percent of school officials and teachers are interested in pursuing further security training. This interest among teachers is up to 69 percent from 55 percent in 2008. IT coordinators and administrators are also becoming more interested in revamping the educational curriculum to better teach their students and workers. Further improvements in interest among educators pursuing security training can be seen between the 2010 and 2011 NCSA studies. Also, slightly higher percentages exist across the board for teachers who have taught aspects of security in their classrooms compared to both the 2010 and 2008 studies.
On the other hand, while teachers' interest in training increased from 2010 to 2011, the number of security topics actually taught in classes dropped, in addition to a decrease in teachers' belief that they bear responsibility for educating kids.
A comparison paper between the 2008 and 2010 study can be downloaded here (PDF).
What are your thoughts on this issue; who bears the primary responsibility in educating children on the importance of Internet safety?
Image 1 courtesy 2011 NCSA study. Image 2 courtesy 2008 to 2010 NCSA comparison study. Material is copyright NCSA, and used according to fair usage guidelines for the purpose of commentary and reporting.
Subject: General Tech, Storage | May 5, 2011 - 08:28 AM | Tim Verry
Tagged: Hard Drive, Areal Density, 1TB Platter
In an amazing feat of data density, Seagate has once again leapt to the next level of storage technology, unveiling 1 Terabyte-per-platter drives. With an areal density of 625 Gigabits per square inch, Seagate claims the new drives are capable of storing “virtually countless hours of digital music” and “1,500 video games.”
The move to 1TB per platter is an especially important step for high capacity drives. Current 1TB+ drives use two 500 GB platters, while current 3TB drives use either four 750 GB platters, as in the WD Caviar Green 3 TB that PC Perspective has reviewed here, or five 600 GB platters. With Seagate’s new technology, they will be able to cut the number of platters in their highest-capacity 3 TB drives almost in half. By moving from five platters to three, their drives will run cooler, faster, and with less power draw. Improved areal density also reduces the number of moving parts, and thus the points of failure, even with the inclusion of newer, more sensitive read heads.
The place in the market where this new technology will make the most noticeable difference is in the mobile segment. With just a single platter, mobile users will have close to 1.5 terabytes of internal storage in a two platter drive, or 750 GB in a one platter drive while using less power and being capable of faster reads. This means that road warriors will be able to keep more of their files with them without reducing battery life compared to the current crop of mobile hard drives.
Unfortunately, mobile users will have to wait, as Seagate has so far announced only 3.5” desktop and external drives. These will be branded under the Seagate Barracuda XT and GoFlex lines, respectively.
Desktop users can currently expect capacities ranging from 1TB to 3TB. In a RAID array, these lower-power and potentially faster drives would make a great addition to an HD video editing rig. Call me crazy, but I’m going to hold onto my old-school 320 GB Seagate drives until I can jump straight to 4 TB. So, where’s my four-platter 4TB drive, Seagate?
Are you excited about this new platter technology? What would you do with 3 terabytes of storage?
Subject: General Tech, Graphics Cards, Systems, Storage | May 4, 2011 - 06:43 PM | Jeremy Hellstrom
Tagged: ssd, everest, benchmarking, benchmark, aida64, aida
BUDAPEST, Hungary - May 04, 2011 - FinalWire Ltd. today announced the immediate availability of AIDA64 Extreme Edition 1.70 software, a streamlined diagnostic and benchmarking tool for home users; and the immediate availability of AIDA64 Business Edition 1.70 software, an essential network management solution for small and medium scale enterprises.
The new AIDA64 release further strengthens its solid-state drive health and temperature monitoring capabilities, and implements support for the latest graphics processors from both AMD and nVIDIA.
New features & improvements
- LGA1155 B3 stepping motherboards support
- Preliminary support for AMD “Bulldozer” and “Llano” processors
- Intel 320, Intel 510, OCZ Vertex 3, Samsung PM810 SSD support
- GPU details for AMD Radeon HD 6770M, Radeon HD 6790
- GPU details for nVIDIA GeForce GT 520, GT 520M, GT 550M, GT 555M, GTX 550 Ti, GTX 590
Pricing and Availability
AIDA64 Extreme Edition and AIDA64 Business Edition are available now at www.aida64.com/online-store. Additional information on product features, system requirements, and language versions is available at www.aida64.com/products. Join our Discussion Forum at forums.aida64.com.
Subject: General Tech | May 4, 2011 - 05:28 PM | Scott Michaud
Tagged: mse, Malware, antivirus
One of the major drawbacks of having general purpose computation devices is malware. Your computers are designed to manipulate and store instructions and information, and they do that amazingly well. Your computers, however, cannot tell who gave what instruction; they follow a set of instructions until it links to another, which they follow, ad infinitum. When someone who wants to use your computer can get their own series of instructions run by it: that is when you have a problem.
Subject: General Tech, Mobile | May 4, 2011 - 02:32 PM | Scott Michaud
Tagged: tablet, kindle, amazon
Amazon certainly has a knack for causing a ruckus in just about any industry they step into. Their inception placed them in stiff competition with bookstores and mail-order catalogs; since then they have branched out even as far as rental computing and storage, content production and publishing, and consumer electronics.
A recently rumored OEM order to Quanta Computer, already an OEM partner of RIM and Sony, proposes that Amazon is looking to beef up their portfolio to include Tablet PCs.
Could Amazon be Kindling for a much bigger fire?
Subject: General Tech, Processors | May 4, 2011 - 01:55 PM | Tim Verry
Tagged: transistor, Intel
"After a decade of research, Intel has unveiled the world's first three dimensional transistor" states Mark Bohr, a Senior Fellow for Intel. Silicon based transistors in computers, mobile devices, vehicles, and embedded equipment have only existed in a planar, or two dimensional, form until today.
The new three dimensional transistor, dubbed "Tri-Gate," is now ready for high volume production and will be included in Intel's new Ivy Bridge 22nm processors. This new Tri-Gate transistor is a huge deal for Intel, as it will enable them to maintain the pace of chip evolution outlined by Moore's Law. If you are not familiar with Moore's Law, it states that approximately every 18 months transistor density will double, bringing with it increases in performance and yield while decreasing the cost of production. Intel states that "It has become the basic business model for the semiconductor industry for more than 40 years."
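The compounding implied by that doubling cadence is easy to underestimate, and it is what makes falling off the curve so costly. A quick back-of-the-envelope sketch (the numbers are illustrative, not Intel's):

```python
# Toy illustration of Moore's Law scaling as described above: if
# transistor density doubles every ~18 months, then n doubling
# periods out you can pack 2**n times the transistors into the
# same die area.
def density_after(months, start_density=1.0, doubling_period=18):
    return start_density * 2 ** (months / doubling_period)

# Over six years (72 months) that's four doublings: 16x the density.
assert density_after(72) == 16.0
```

Leakage at small geometries threatens exactly this cadence, which is why Intel treats Tri-Gate as essential rather than optional.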
As processors become smaller and smaller, the electric current becomes more and more difficult to contain. There are hundreds of millions of minute connections and switches inside today's processors, and as manufacturing processes shrink, current leakage increases. With its 45nm Core 2 processors, Intel created a new "high-k" (high dielectric constant, a property describing how much charge a material can hold) metal gate transistor using a material called hafnium. The new material replaced the transistor's silicon dioxide gate dielectric to combat the current leakage problem. This allowed the chip process to shrink while producing less current leakage and heat. To be more specific, Intel states that "because high-k gate dielectrics can be several times thicker, they reduce gate leakage by over 100 times. As a result, these devices run cooler."
Unfortunately, at the much smaller 22nm process, Intel was not achieving results congruent with Moore's Law using even their high-k gate transistors. In order to maintain the scaling predicted in Moore's Law, Intel had to once again re-invent their transistors. In order to create a smaller manufacturing process while overcoming current leakage, Intel had to develop a way to use more of what little space they had available to them. It is here that they entered the third dimension. By designing a transistor that is able to control the electrical current on three sides instead of a single plane, they are able to shrink the transistor while ending up with more surface area to "control the stream" as Mark Bohr puts it.
The proposed benefits of Tri-Gate lie in its ability to operate at lower voltages with higher energy efficiency, all while running cooler and faster than ever before. More specifically, Intel claims up to a 37 percent performance increase at low voltage versus its current line of 32nm processors. Intel further states that "the new transistors consume less than half the power when at the same performance as 2-D planar transistors on 32nm chips." This means that at the same performance level as the current crop of Intel CPUs, Ivy Bridge will be able to do the same calculations either while using half the power of Sandy Bridge or nearly twice as fast (it is unlikely to scale perfectly, as there is overhead and other elements of the chip will not be as radically revamped) at the same level of power consumption. If this sort of scaling holds for the majority of Ivy Bridge chips, the overclocking headroom and resulting performance should be unprecedented.
The use of Tri-Gate transistors is also mentioned as being beneficial for mobile and handheld devices as the power efficiency should allow increases in battery life. This is due to the chip running at decreased voltages while maintaining (at least) the same level of performance as current mobile chips. While Intel did not demo any mobile CPUs, they did state that Tri-Gate transistors may be integrated into future Atom chips.
Subject: Editorial, General Tech | May 4, 2011 - 01:36 PM | Tim Verry
Tagged: Law, Copyright, Bit Torrent
This past year has seen a surge of copyright infringement cases in which copyright holders bring suit against not one but hundreds or even thousands of defendants. These wide-sweeping cases are highly controversial and, according to TorrentFreak, opponents have gone so far as to call them "extortion".
The main reason for the controversy is that rights-holders are acquiring lists of IP addresses that connect to, download, and/or share files they hold the copyright for. They then bring lawsuits against the so-called John Does behind those IP addresses, using legal subpoenas to force ISPs to release the personal information of the account holder(s) tied to each IP at the times it was logged downloading and/or sharing their files. While many may not realize the flaw in this logic, a District Court judge by the name of Harold Baker has questioned the legality and implications of treating an IP address as sufficient grounds to obtain further personal information.
The issue with connecting an IP address alone to a person is that while a log entry with an IP address, date, and time can be tied to an ISP's subscriber and their Internet connection, there is no way to know that it was that particular person behind the IP address at that moment. It could just as easily have been another person living in the household, a friend or visitor who used the wireless connection, or a malicious individual piggybacking on that subscriber's Internet connection (and thus the IP address).
TorrentFreak reports that “Judge Baker cited a recent child porn case where the U.S. authorities raided the wrong people, because the real offenders were piggybacking on their Wi-Fi connections.” They also state that Judge Baker believes that in these types of cases, assuming an IP address is enough evidence to subpoena further personally identifiable information could obstruct a “‘fair’ legal process.” This is because bringing a suit against someone based solely on an IP address, especially when it involves adult entertainment, could irreparably defame an innocent person’s character.
Judge Baker goes on to say that rights-holders could potentially use the delicate nature of an accusation of sharing adult material to encourage even innocent people to settle out of court. TorrentFreak reports that “Baker conlcudes [sic] by saying that his Court is not supporting a “fishing expedition” for subscribers’ details if there is no evidence that it has jurisdiction over the defendants.”
There is no question that Judge Baker’s ruling could potentially change the landscape of bit torrent related lawsuits throughout the United States. Rights-holders are no doubt going to aggressively combat this ruling; however, civil rights groups and countless innocent people are rejoicing at the knowledge that it may very well be the beginning of the end for John Doe bit torrent lawsuits in the United States.
Image courtesy MikeBlogs via Flickr (creative commons 2.0 w/attribution).
Subject: General Tech | May 4, 2011 - 11:56 AM | Jeremy Hellstrom
Tagged: lag, buffer, bloat, input lag, gaming, online
Packet loss, network latency and input lag are often blamed when your character is suddenly a corpse and your opponent is doing a happy dance on your naughty bits, but there is another culprit behind your lousy online gaming skills: buffer bloat. It turns out that more storage space is not always a good thing, as TCP/IP relies on dropped packets to tell it to slow down; when a network device sports a buffer that can hold 10 seconds or so of data before dropping a packet, the connection isn't informed that there is a problem until long after it has occurred. If you've ever played a game which slows down and then does a quick speed up for a few seconds, you have probably met buffer bloat. Slashdot doesn't have a solution but they do have more information for you.
"Gamers often find 'input lag' annoying, but over the years, delay has crept into many other gadgets with equally painful results. Something as simple as mobile communication or changing TV channels can suffer. Software too is far from innocent (Java or Visual Studio 2010 anyone?), and even the desktop itself is riddled with 'invisible' latencies which can frustrate users (take the new Launcher bar in Ubuntu 11 for example). More worryingly, Bufferbloat is a problem that plagues the internet, but has only recently hit the news. Half of the problem is that it's often difficult to pin down unless you look out for it. As Mick West pointed out: 'Players, and sometimes even designers, cannot always put into words what they feel is wrong with a particular game's controls ... Or they might not be able to tell you anything, and simply say the game sucked, without really understanding why it sucked.'"
Here is some more Tech News from around the web:
- Taking Up Space: Mass Effect 3 Screens @ Rock, Paper, SHOTGUN
- Portal 2 DLC coming soon @ HEXUS
- Total War: Shogun 2 Performance w/ AMD Radeon HD 6850 @ Legit Reviews
- Portal 2 Review @ Techgage
- Artistic trickery: Ars looks at indie mech game Hawken
- Thinking on rails: why Portal 2 isn't as good as the original @ Ars Technica
- Can you really learn to race by playing racing games? Ars takes to the track
- Section 8 Prejudice Launch, Unlockable Mode @ Rock, Paper, SHOTGUN
- Portal 2 Trickshots @ Rock, Paper, SHOTGUN
- Pilotwings Resort Nintendo 3DS @ Tweaktown
- Gears of War 3 beta: senior gameplay designer offers tips @ Ars Technica
- Rocksmith Preview @ Computing on Demand
- Operation Flashpoint Red River (XBOX 360) Review @ GamingHeaven
Subject: General Tech | May 4, 2011 - 11:19 AM | Jeremy Hellstrom
Tagged: peddie, nvidia, market share, Intel, gpu, amd
SemiAccurate got hold of Jon Peddie's most recent look at the GPU market and how it is divvied up between the major competitors; which no longer includes SIS, who hit 0% this year. The two current discrete GPU makers swapped positions last quarter, with AMD taking the lead, and that remains true this quarter as AMD has grown to 24.8% while NVIDIA fell to 20%. Last year at this time NVIDIA had a comfortable 8% lead over AMD, but with a Fermi launch that just didn't go as well as hoped and AMD coming out strong and generally less expensive, that lead has evaporated thanks not only to discrete GPUs but also to Brazos.
Speaking of APUs, the more mathematically inclined readers may notice that a large chunk of the graphics market is missing in those figures. Most of that missing market belongs to Intel, whose 54.4% share has jumped by almost 10% since Q1 2010. The vast majority of their market share belongs to the integrated GPU present in many Intel systems, but at least some of that growth is thanks to the new Sandy Bridge platform, which many enthusiasts are purchasing and which counts towards market share even if it is only being used for transcoding in a system with a discrete GPU.
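For the mathematically inclined, a quick check of the shares quoted above (the small remainder presumably covers minor vendors like VIA and Matrox, though that attribution is my assumption, not Peddie's) shows how little of the market is left once Intel is counted:

```python
# Sanity check on the market share figures quoted above.
shares = {"AMD": 24.8, "NVIDIA": 20.0, "Intel": 54.4}
others = 100.0 - sum(shares.values())
print(f"Share left for all other vendors: {others:.1f}%")  # 0.8%
```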
"The latest GPU marketshare numbers from Jon Peddie are out, and it looks like we have a new leader in GPUs, AMD. According to the numbers released today, Q1 saw AMD overtake Nvidia in year over year GPU marketshare, and the turn-around promised last February fizzle."
Here is some more Tech News from around the web:
- The HTML5 future of the web starts to take shape @ The Inquirer
- HP engineering veep spills cloud plans onto LinkedIn @ The Register
- How-To: Portal Sentry Turret Egg Cup @ Make:Blog
- Seagate to control 40% of HDD market with Samsung acquisition, says IHS iSuppli @ DigiTimes
- Canon PowerShot SX230 HS Review @ TechReviewSource
- Level One Wireless 300Mbps N_Max Ceiling PoE Access and 4 GE PoE + 1 GE Switch Review @ OverclockersHQ
- DemoCamp April 2011 Coverage @ t-break
- Win a MSI N550GTX-Ti graphics card @ t-break
- Win a Linksys E3000 wireless router @ t-break
- Win a ECS Black GTX460 graphics card @ t-break