Subject: General Tech | April 12, 2012 - 01:00 PM | Jeremy Hellstrom
Tagged: corsair, Vengeance K60, Vengeance K90, mechanical keyboard, cherry mx red, input
If you haven't mastered the ability to tell the difference between mechanical keyboard switches, you should check out Scott's primer on the four main flavours of Cherry. Then you can check out The Tech Report's review of Corsair's Vengeance K60 and K90 keyboards, both of which use the Cherry MX Red variety and are considered a great choice for gamers. The big differences between the two models are the array of programmable macro keys on the left-hand side of the K90 and the rubber dampers that have been added. The Tech Report was not impressed with the dampers, feeling they muddied the keystroke and made the board feel more like a membrane-type keyboard. Check them both out in the full review.
"Join us as we rattle away on the lovely mechanical keyswitches of Corsair's aluminum-clad Vengeance K60 and K90 keyboards."
Here is some more Tech News from around the web:
- Corsair Vengeance K90 Keyboard @ Bjorn3D
- Enermax KW001 Briskie Keyboard mouse combo @ Guru of 3D
- Corsair Vengeance K90 & M90 MMO/RTS Keyboard and Mouse Review @ Legit Reviews
- Corsair Vengeance K60 Performance FPS Mechanical Gaming Keyboard @ Tweaktown
- CM Storm Trigger Mechanical Gaming Keyboard Review @ HardwareHeaven
- Cooler Master QuickFire Pro Mechanical Gaming Keyboard @ Pro-Clockers
- ROCCAT Isku Illuminated Gaming Keyboard @ Tweaktown
- SteelSeries Kinzu V2 Pro Edition Gaming Mouse @ Kitguru
- Corsair Vengeance K60 Gaming Keyboard Review @ Hardware Secrets
- Corsair Vengeance M90 and K90 Review @ OCC
- Corsair Vengeance M60 Mouse Review @ Hardware Secrets
- Corsair Vengeance M60 Performance FPS Laser Gaming Mouse @ Tweaktown
- Corsair Vengeance M90 Gaming Mouse Review @ TechwareLabs
- ROCCAT Kone Plus Max Customization Laser Gaming Mouse @ Tweaktown
- Corsair Vengeance M60 Laser Gaming Mouse @ Benchmark Reviews
Subject: General Tech | April 12, 2012 - 12:32 PM | Jeremy Hellstrom
Tagged: microserver, Centerton, seamicro, atom, low power
Microservers are the newest old idea to hit the PR flacks; anyone who remembers the original blade servers already has a good idea what a microserver is. Intel has once again tried to take ownership of a form factor, in this case defining what it feels the market should consider a microserver. In some ways the single-socket design seems to run counter to current low-power servers, which tend towards large arrays of low-powered APUs; on the other hand, when you no longer have to worry about the interconnects between those APUs, you can drop the price significantly.
AMD has made several forays into this market, and while Intel has never put much effort into the segment, vendors like Dell and HP have been building microservers around Intel processors for some time. The 6W Atom Centerton chip announced at IDF heralds a change in Intel's strategy for taking on ARM and AMD in the server room. The Inquirer was also told of more powerful 10W and 15W parts, although they may require a bit more room than the 6W part can survive in. It seems that those looking for inexpensive servers which require very little infrastructure will have plenty of choices to spend their money on by the end of this year.
"CHIPMAKER Intel dropped an Atom bomb on the second day of IDF in Beijing, announcing its 'Centerton' microserver chip that will draw just a miserly 6W of thermal design power (TDP).
It defines a microserver as a computer with one socket, error correction, 64-bit processing, and minimal memory and I/O. The Atom Centerton platform will have two cores, Hyperthreading and support for ECC DDR3 as well as VT-x virtualisation technology. Intel said the Atom Centerton chip will be available in the second half of this year."
Here is some more Tech News from around the web:
- Intel introduces 910 Series PCIe SSD @ The Tech Report
- Intel's SSD 910: Finally a PCIe SSD from Intel @ AnandTech
- Intel expected to bring forward the launch of Ivy Bridge @ DigiTimes
- Malware-infected flash cards shipped out with HP switches @ The Register
- US sues Apple and book publishers for ebook price fixing @ The Inquirer
- WD pushes out super-slim shock-resistant Ultrabook drive @ The Register
- Samsung WB150F Review @ TechReviewSource
- Intel predicts 15 billion connected devices by 2015 @ Kitguru
- Ninjalane Podcast - Server Update, Choosing a Video Card, HWBot Team Update
- Microsoft Visual Studio 11 Preview @ Techgage
Subject: General Tech | April 11, 2012 - 04:19 PM | Jeremy Hellstrom
Tagged: gaming, dragon age 3, BioWare
If you can believe the hints that BioWare is dropping, Dragon Age 3 should feel a lot less like a very expensive, poorly designed add-on and more like an actual new game. Of course, this comes from a very quick talk given at PAX about an unannounced game which might or might not be in development, so your sodium intake for the day should be adjusted to compensate. They also implied that your choices would Effect the ending of the game ... now where have we heard that before?
"Rumours are flying that Dragon Age 3 might be something more like the sequel to Dragon Age we’ve been hoping for. After Dragon Age 2 came out feeling more like a side-project, BioWare have dropped some hefty hints that they’re looking to redress much of that in an unannounced third game for the series. At a PAX East panel, as spotted by Eurogamer and recorded by Gamespot, Dragon Age developers discussed what a hypothetical game might contain, were it to exist, which it currently doesn’t, but obviously does. It’s to be a far more varied game, with new locales, and decisions that carry over from previous games."
Here is some more Tech News from around the web:
- Need For Speed: World PC (Free-to-Play) Review @ eTeknix
- Harassing Ford: Han Solo Adventures @ Rock, Paper, SHOTGUN
- Game Of Thrones RPG Out In June, Kissed Its Sister @ Rock, Paper, SHOTGUN
- Fan Makes FTC Complaint Over Mass Effect 3 Ending, But it Won't Hold Water @ Forbes
- Kinect Star Wars (XBOX 360) Game Review @ HardwareHeaven
Subject: General Tech | April 11, 2012 - 02:33 PM | Jeremy Hellstrom
Tagged: toshiba, NPEngine, video streaming
The NPEngine from Toshiba has a big future with content providers interested in streaming thousands of concurrent videos while saving on space and power. The article at The Register quotes a 70% reduction in size and a 77% reduction in power consumption for a provider serving 100,000 separate streams. Those figures demonstrate the benefit of purpose-built hardware: what it lacks in versatility it more than makes up for in efficiency. The server version is due out this year, with the possibility of a single-chip version for laptops and SFF machines which would take the CPU completely out of the picture when playing HD video.
"Toshiba's NPEngine hardware directly streams video from SSDs to IP networks without using host server CPU cycles or memory.
Tosh claims the dedicated hardware delivers up to 64,000 x 40Gbit/sec video streams – way more than the 20,000 or so an average 2U server is said to be able to stream. The Toshiba hardware, a server card, can replace at least two video-streaming servers and enable its host server to do other work."
Here is some more Tech News from around the web:
- Mosh: Modernizing SSH With IP Roaming, Instant Local Echo @ Slashdot
- Gadget Show 2012 Roundup @ XSReviews
- OnLive goes legit with licensing downshift for virtual Windows @ The Register
- Gadget Show Live 2012 Coverage @ Kitguru
- Final Day to enter ASUS Masters of Overclocking Event with HardwareHeaven (UK) - Win an Ivy-Bridge System at i45
Subject: Editorial, General Tech | April 10, 2012 - 10:45 PM | Scott Michaud
Tagged: piracy, epic games, bulletstorm
Mike Capps of Epic Games, among many other developers and publishers, completely misses the point about piracy. No one can control piracy; they can only control the factors which influence it -- but controlling those factors is meaningless if sales are sacrificed in the process. No one gets paid by not being pirated; people get paid by making sales.
Frequent readers of my editorials are probably well aware that I am quite vocal about many topics, including piracy, the consumables model of art, censorship, and used content sales. I take a very mathematical approach to a lot of complicated topics. Unfortunately, much of what is considered common truth is based on fundamentally invalid statistics. It gives me a lot to write about.
Mike Capps of Epic Games was interviewed by GameSpot during PAX East, and at some point the conversation drifted to Bulletstorm. On the topic of its lower-than-expected sales, Capps said that the PC version was adversely affected by piracy.
Piracy gnashing its teeth?
Similar statements have been made for countless other games at countless other times. Each of those statements makes a subtle but gigantic mistake in formulating the problem: piracy is not something which does; piracy is something which is. Piracy does not affect your sales, but whatever affected piracy might also affect sales in one way or another.
The intuition is that sales decrease as piracy increases and vice versa. That assumption is nullified by a counter-example: do not release the product at all. If you never release a game, piracy and sales will trend in the same direction: to zero. It is now obvious that sales and piracy do not always inversely correlate.
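To make the statistical point concrete, here is a toy simulation in Python. Every number in it is invented purely for illustration: a single hidden factor -- call it audience interest -- drives both measurements, so sales and piracy rise and fall together even though neither causes the other.

```python
import random

# Toy model: a hidden common cause ("interest" in a title) drives both
# sales and piracy. All figures are invented purely for illustration.
random.seed(42)

interest = [random.uniform(0, 1) for _ in range(1000)]  # hypothetical titles
sales = [i * 100000 * random.uniform(0.5, 1.5) for i in interest]
pirated = [i * 40000 * random.uniform(0.5, 1.5) for i in interest]

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# The common cause pushes both measurements in the same direction, so the
# correlation comes out strongly positive -- the opposite of the
# "every pirated copy is a lost sale" intuition.
print("correlation(sales, piracy) = %+.2f" % corr(sales, pirated))
```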
As Mike Capps also stated in the interview, Bulletstorm had a very rough launch and lifespan on the PC. Bulletstorm required Games for Windows Live, encrypted its settings, and did other things that earned it a reputation as a bad console port. Customers complained about the experience on the PC, which fueled an inferno of uncertainty and doubt for potential buyers.
Being pirated is not losing a sale, but losing a customer before their purchase is.
I was personally on the fence about Bulletstorm, and this negative word-of-mouth led me to ignore the title. I did not purchase the game; I did not pirate the game; I ignored the game. Perhaps those who pirated your title did so because they were interested, became discouraged, but were not discouraged enough to avoid giving it a chance through piracy?
What I am saying is -- piracy cannot reduce your sales (it cannot do anything, it is a measurement), but perhaps whatever combination of factors reduced your sales may also have increased your piracy?
Piracy is an important measurement to consider -- but it, like sales, is just that: a measurement, nothing more. Strive to increase your sales -- keep an eye on your piracy figures to learn valuable information -- but always, exclusively, strive to increase your sales. Sales are the measurement that will pay your bills.
Subject: General Tech, Graphics Cards | April 10, 2012 - 07:18 PM | Scott Michaud
Tagged: nvidia, leak, GTX 690
More information has surfaced about NVIDIA’s GeForce GTX 690 video card. While other tidbits came to light, perhaps most interesting is the expected May release.
NVIDIA has suffered quite a few leaks around the launch of the GeForce GTX 680 GPU and its associated cards. Benchmarks were accidentally published early and product pages were mistakenly posted prematurely. The hot streak continues.
It may be time to just reset fate and skip to the GTX 700-series. I mean they will eventually be rebranded 700-something anyway.
I kid, I kid.
Not many specifications were leaked, although there is not much left that cannot already be assumed about the card due to the similarities with its sister part.
The reference model GTX 690 will require two 8-pin power connectors and output via three DVI ports as well as a mini DisplayPort. The already released GTX 680, by contrast, requires two 6-pin connectors and outputs by two DVI, an HDMI, and a full size DisplayPort.
The new card will require more power for its dual GK104 GPUs, as the larger power connectors suggest. While the GTX 680 is happy with 550W of total system power, the GTX 690 would like a system power supply of at least 650W. Since the 680 is expected to draw a maximum of 195W, an extra 100W would put estimates for the 690's power draw at somewhere around 295W.
Unfortunately, estimates based on rated total system power are very imprecise, as power supply recommendations tend to be rounded up to the nearest 50W. Really, the 690 could be anywhere between 245W and 295W, and even those figures are just estimates.
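For what it is worth, here is that back-of-envelope arithmetic as a small Python sketch. The wattages come straight from the leak and the GTX 680's published specifications; the 50W rounding window is the caveat noted above, not an official figure.

```python
# Back-of-envelope estimate of the GTX 690's board power from the leaked
# PSU recommendations.
GTX680_SYSTEM_W = 550  # recommended total system power, GTX 680
GTX690_SYSTEM_W = 650  # leaked recommendation, GTX 690
GTX680_BOARD_W = 195   # GTX 680 maximum board power

delta = GTX690_SYSTEM_W - GTX680_SYSTEM_W  # 100 W difference
high_estimate = GTX680_BOARD_W + delta     # 295 W

# PSU recommendations tend to be rounded up to the nearest 50 W, so the
# true increase could be as little as ~50 W over the GTX 680.
low_estimate = high_estimate - 50          # 245 W

print("GTX 690 board power: roughly %d-%d W" % (low_estimate, high_estimate))
```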
Still, it looks as though my 750W power supply will survive past May when the leak claims that the GTX 690 is expected to arrive. Yay! May!
Subject: General Tech, Cases and Cooling | April 10, 2012 - 06:03 PM | Scott Michaud
Tagged: graphene, cooling
Researchers at NC State University have tested the heat dissipation properties of copper-graphene. Their findings suggest that the material could be cheaper and more effective than pure copper.
Some people have gone to ridiculous lengths to cool their components. Some people flush their coolant regularly. Some people will never live down mineral oil jokes. No two computers are not on fire. Awwww.
Copper is regularly used as a method of dissipating heat as it is highly efficient when sufficiently pure. While copper is expensive, it is not expensive enough to be prohibitive for current use. Alternatives are still being explored and a researcher at NC State University believes graphene might be part of the answer.
Some people stick a bathroom suction fan out a window and run a 3” drier hose into their case.
As always, I become immediately skeptical when a team of researchers makes a claim such as this. Whether or not my concerns are valid has yet to be seen, but they come to mind nonetheless. The paper states that the material is intended for power amplifiers and laser diodes.
My first concern is with geometry. Effective cooling is achieved by exposing as much surface area between two materials as is possible for the situation. Higher heat conductance allows heat to get away from the source much more efficiently, but that heat still needs to be dumped into a reservoir of some sort, such as your room. There has not been much talk about how the heat would then be removed after the copper-graphene so efficiently pulls it from the heat source.
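To illustrate the distinction, here is a minimal sketch of one-dimensional conduction through a flat spreader plate. The copper conductivity is the textbook figure, but the copper-graphene value, the plate dimensions, and the temperature drop are all placeholders I invented -- the post does not quote the paper's actual numbers -- and note that nothing in this calculation addresses the downstream surface-area problem.

```python
# One-dimensional Fourier's law, Q = k * A * dT / d: a higher thermal
# conductivity k moves more heat *through* a spreader for the same
# temperature drop, but it says nothing about the surface area needed to
# dump that heat into the room afterwards.
def heat_flow_watts(k, area_m2, delta_t_k, thickness_m):
    """Steady-state conductive heat flow through a flat slab."""
    return k * area_m2 * delta_t_k / thickness_m

AREA = 0.0016     # 40 mm x 40 mm contact patch (m^2), illustrative
THICKNESS = 0.003 # 3 mm base plate (m), illustrative
DELTA_T = 5.0     # 5 K drop across the plate, illustrative

K_COPPER = 400.0       # W/(m*K), textbook value for pure copper
K_CU_GRAPHENE = 460.0  # hypothetical 15% improvement, purely a placeholder

for name, k in (("copper", K_COPPER), ("copper-graphene (assumed)", K_CU_GRAPHENE)):
    print("%-26s %.0f W" % (name, heat_flow_watts(k, AREA, DELTA_T, THICKNESS)))
```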
My second concern is with the second layer of indium-graphene. While it seems as though the amount of indium required is quite small -- just a single layer between the heat source and the copper-graphene -- we do not really know for certain how that relates to real-world applications. Indium is still a very rare element which is heavily mined for touch-screen devices. It might prove to be cheap, but there is only so much of it. Would we also be able to reclaim the indium later, or will it end up in a landfill?
These concerns are probably quite minor but it is generally good practice to not get too excited when you see a research paper. Two points if you see any of the following: Nano, Graphene or Carbon Nanotubes, Lasers, and anything related to High-Frequency.
Subject: General Tech | April 10, 2012 - 11:48 AM | Jeremy Hellstrom
Tagged: kingston, fab, tour, ssd
TweakTown was invited to put on a bunny suit and take a tour of Kingston's SSD manufacturing facility in Taiwan. Starting from a pile of surface-mount components, which are automatically placed and inspected before the fully populated PCB is baked at up to 270C, they snapped pictures of as much of the process as they could. From there it is off to the testing facility, where Kingston ensures that all the drives from a particular run are up to the expected standards. TweakTown does mention a burn-in machine, but unfortunately they were told not to post pictures of it, as Kingston wanted to keep at least a few trade secrets from getting out. It could also be that they don't want the world to know that they cloned Al several times and use his SSD-killing expertise as the final test before releasing a drive to the channel to be sold.
"We were exclusively invited into the Kingston factory where few media have been and got shown the process of making an SSD from start to finish. Due to media restrictions, we were not allowed to produce a video of the tour, but we were allowed to take photos. Obviously Kingston is a market leader in memory and SSD products and there is plenty of sensitive machinery and such - and we needed to respect that and their rules."
Here is some more Tech News from around the web:
- The TR Podcast 109: Dude, where's my Apple tax?
- Former Intel employee pleads guilty to stealing secrets before joining AMD @ The Inquirer
- GCC 4.7 Compiler Performance On AMD FX-8150 Bulldozer @ Phoronix
- ClearOS, the Missing Link LAN Server @ Linux.com
- Ubuntu 12.04 LTS KVM Virtualization Battles 8.04.4, 10.04.4 LTS @ Phoronix
- Tegra T37/AP37 pops up on the Nvidia roadmaps @ SemiAccurate
- Sony Cyber-shot DSC-HX10V Review @ TechReviewSource
- Weekly Giveaway #26: Gigabyte Z77X-UD5H (Intel Z77) Motherboard @ eTeknix
Subject: General Tech, Systems | April 8, 2012 - 08:38 PM | Tim Verry
Tagged: Raspberry Pi, pcb, emc test, computer, compliance testing, arm
The highly anticipated Raspberry Pi ARM computer has run into several launch hiccups, the most recent being that the distributors -- RS and Farnell -- refused to sell and ship the devices until the Raspberry Pi passed the proper electromagnetic interference testing. While such certification is not required for Arduino or Beagle Boards, the companies stated that because the Raspberry Pi is (more) likely to be used as a final consumer product (and not a development board), it needed to undergo and pass EMC testing to ensure that it would not interfere with (or be interfered with by) other electronic devices.
According to a recent blog post by the charity behind the ARM-powered Linux computer, the Raspberry Pi has passed the EMC compliance testing with flying colors -- after a few hiccups were sorted out with a network hub that was used to test the Raspberry Pi while it was being hit with an EM field.
The team has been working out of Panasonic's facility in South Wales to test the Raspberry Pi. Because they had the lab area for a whole week, they managed to knock out consumer product interference testing for several other countries as well. Namely, the Raspberry Pi is now compliant with Europe's CE requirements, the United States' FCC, Australia's C-Tick, and Canada's Technical Acceptance Certificate (TAC).
Assuming the paperwork is properly filed and RS and Farnell accept the certifications, the Raspberry Pi units should begin winging their way to customers shortly. Are you still waiting on your Raspberry Pi, and if so, have you decided what you intend to use it for yet?
If you are interested in the Raspberry Pi, be sure to check out some of our other coverage of the little ARM computer!
Subject: General Tech, Mobile | April 7, 2012 - 07:11 PM | Tim Verry
Tagged: tegra 4, tegra, SoC, nvidia, mobile
The Chinese language VR-Zone website has allegedly managed to get their hands on a leaked specifications sheet for NVIDIA’s upcoming Tegra 4 System-on-a-chip (SoC) aimed at mobile tablets. Codenamed “Wayne,” the new SoC will come in several flavors and will arrive next year.
The upcoming chips will reportedly have 10x the performance of NVIDIA's original Tegra and five times the performance of the current generation Kal-El Tegra 3 chip. NVIDIA has run into several hurdles in integrating an LTE cell radio into its SoCs, but if the leaked document is genuine, the company will finally release a Tegra chip with built-in LTE 100 and HSPA42 cell radio capabilities as early as the third quarter of 2013.
Further, the Tegra 4 SoCs will come in four flavors: T40, T43, AP40, and SP3X. The T40 will be the first Tegra 4 chip that manufacturers and consumers will be able to get their hands on -- as early as Q1 2013. It is a quad core part with one companion core and will run at 1.8 GHz. The T43 is an evolution of the T40 and will bump the clockspeed up to 2.0 GHz. The AP40 chip will be the first budget Tegra 4 processor and will run anywhere between 1.2 GHz and 1.8 GHz. The T43 and AP40 SoCs are reportedly coming out in Q3 2013. All three chips -- the T40, T43, and AP40 -- are based on the ARM Cortex A15 architecture.
| | T40 | T43 | AP40 | SP3X |
|---|---|---|---|---|
| Release Date | Q1 2013 | Q3 2013 | Q3 2013 | Q3 2013 |
| Markets Aimed At | Flagship | Flagship | Mainstream | Mainstream |
| Tablet Device Screen Size | 10" | 10" | 10" | 7" |
| Processor Clockspeed | 1.8 GHz | 2.0 GHz | 1.2-1.8 GHz | 1.2-2.0 GHz |
The final Tegra 4 chip is called SP3X, and it will arrive in Q3 2013. Aimed at mainstream tablets with 7” or smaller screens, the upcoming SoC will feature LTE support and will have a clockspeed of 1.2 GHz to 2.0 GHz. It is a quad core (plus one companion core) part but is reportedly based on the ARM Cortex A9 architecture. The leaked release dates do seem to be in line with earlier reports, though they should still be taken with your daily dose of salt.
Right now Tegra delivers on performance, and many high-end mobile devices have incorporated the NVIDIA chip. Even so, NVIDIA still has very little market share, and the two mainstream Tegra 4 chips -- especially the SP3X with its LTE radio -- should help the company make inroads against Qualcomm and Samsung, which hold a great deal of the market.