All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | February 22, 2012 - 12:24 PM | Jeremy Hellstrom
Tagged: contest, nvidia, mass effect 3
Here’s your chance to show off your knowledge on all things Mass Effect. All you have to do is complete the five Mass Effect 3-based trivia questions for your chance to win. Just for participating, you’ll get access to an exclusive Mass Effect 3 digital asset. There’s a new one each week, so keep playing! Plus, you can win awesome prizes like graphics cards—or even a complete gaming rig. Make the right choice. Enter today and gear up for victory!
Week 1 is now over, but Week 2 just started!
Week 2 prizing: three GTX 570 graphics cards (one per winner)
Week 2 Grand Prize: a MAINGEAR gaming PC valued at over $3,000!
The landing page is for the global contest; choose your country and see if you can answer the questions. You will not only get a chance to win great prizes, but also a mysterious in-game bonus just for playing.
Subject: General Tech | February 21, 2012 - 04:00 AM | Scott Michaud
(Update: Yep -- it has been confirmed. Amnesia: A Machine for Pigs. Check it out at The Escapist)
Frictional Games has been releasing PC games since 2006 for Windows, Mac, and Linux. Each game up to this point has been survival-horror themed and within the puzzle adventure genre. The most popular, and definitely the scariest, is their most recent installment, Amnesia: The Dark Descent, which is often considered the most terrifying game of all time. What could be next?
He definitely was not having a Funday
For a few days, Frictional Games ran an alternate reality game which slowly doled out clues to their next project. What we know is that the title is planned for release in fall of this year; that it will very likely have A Machine for Pigs as either its title or subtitle; and that it will also very likely be related to Amnesia. Clues suggest that the full title will be Amnesia: A Machine for Pigs.
Why do I have a feeling this game will make Minecraft players cry?
Also interesting: Frictional Games has discussed new features of the engine their upcoming game is expected to use. The updated engine will support terrain as well as global sunlight with shadows. These are not terribly impressive features by any stretch of the imagination, but they were developed specifically for the new game. Why the sudden need for outdoor lighting and terrain?
Subject: Editorial, General Tech | February 21, 2012 - 01:21 AM | Scott Michaud
Tagged: antivirus, windows 8
Imagine if it were illegal for a dominant homebuilder to sell a house with locks on the doors, just to be fair to the market of locksmiths.
The legality of Microsoft’s planned upgrades to its Windows Defender security suite has been questioned in an article up at ZDNet Asia. While the article’s analysis of the situation is sound, it implicitly asks at what point a market should become obsolete.
Does it really protect consumers to intentionally unbundle security from a core application? Is it better to unbundle security to prop up an industry’s worth of companies whose products are designed to do little more than alert you when a breach has occurred?
Industry status - Not Protected
Despite the wording of the above paragraphs, the answer actually is not simple. There is a lot of merit to disallowing the bundling of internal security applications and protecting the antivirus industry.
Ponder this: what if Microsoft’s system were really bad? Would promoting competition ultimately drive a stronger and more secure product in the end? Or, alternatively, would pressure from the attackers themselves be sufficient competition, with no need to protect antivirus companies?
It really is an interesting problem when you look into it. What do you think? The comments await, and registration is not required to voice your opinion.
According to an article in New Scientist, a UK firm called Oxford Nanopore has managed to build a DNA sequencer into what looks to be an overweight USB stick. They have named the device the MinION, and it will sell for $900 later this year. It can sequence DNA with as many as 10,000 base pairs in one continuous read. While it can sequence a human genome in about 6 hours, they intend the device to be used to sequence shorter genomes for tasks like identifying pathogens and screening for genetic mutations that can lead to disease.
The company demonstrated the MinION in action recently at the Advances in Genome Biology and Technology conference, where it sequenced Phi X, a virus with 5,000 genetic base pairs. A bioinformatician at Pallen Research Group stated that "Phi X was the first DNA genome to be sequenced ever" and that if it can be sequenced, then much larger genomes can be as well.
In addition to the portable MinION, the company is developing a larger scale GridION for lab work that requires more processing horsepower. The sequencing technology in the MinION and GridION operates "like a tickertape reader," unzipping the DNA using enzymes and electricity.
The article's author states that the MinION, and portable DNA sequencers like it, will greatly enhance public health and medicine. Once doctors can carry around portable DNA sequencers, they will be able to quickly diagnose genetic issues and identify viruses and other pathogens. Sounds pretty cool (if a bit scary) to me!
Subject: Editorial, General Tech | February 20, 2012 - 08:08 PM | Scott Michaud
Tagged: valve, piracy, Gabe Newell
Ben Kuchera of Penny Arcade caught an interview with Valve Software’s managing director and co-founder, Gabe Newell. The topics were quite typical for a Gabe Newell interview: working at Valve, the future of gaming, and DRM. Gabe has also joined the beard club; welcome, Gabe!
Photo Credit: Giant Bomb
A little over halfway through the interview, Penny Arcade asked Gabe whether he believes Valve sidestepped the problems of used games and piracy with Steam. Gabe responded to the premise of the question rather than the question itself:
You know, I get fairly frustrated when I hear how the issue is framed in a lot of cases. To us it seems pretty obvious that people always want to treat it as a pricing issue, that people are doing this because they can get it for free and so we just need to create these draconian DRM systems or ani-piracy(sic) systems, and that just really doesn’t match up with the data.
This quote echoes a problem I have had with the piracy discussion for quite some time. The main problem with the concept of piracy is that people wish to frame it in a context that seems intuitive to them rather than experiment to discover what actually occurs. Piracy is seen as a problem which must be controlled. This logic is fundamentally flawed because piracy is not itself a problem but rather a measurement of potential problems.
Gabe continues with an anecdote of a discussion between a company who used third-party DRM for their title and himself:
Recently I was in a meeting and there’s a company that had a third party DRM solution and we showed them look, this is what happens, at this point in your life cycle your DRM got hacked, right? Now let’s look at the data, did your sales change at all? No, your sales didn’t change one bit. Right? So here’s before and after, here’s where you have DRM that annoys your customers and causing huge numbers of support calls and in theory you would think that you would see a huge drop off in sales after that got hacked, and instead there was absolutely no difference in sales before or after. You know, and then we tell them you actually probably lost a whole bunch of sales as near as we can tell, here’s how much money you lost by bundling that with your product.
Gabe highlights what a business should actually be concerned with: increasing your measurement of revenue and profits, rather than decreasing your measurement of piracy. You as a company could simply not develop products and completely kill piracy, but that would also entirely kill your revenue as you would have nothing to gain revenue from.
Before we begin to discuss piracy, the very first step is that we need to frame it as what it really is: a measurement. While violating terms of a license agreement is in fact wrong, if you focus your business on what is right or wrong you will go broke.
If you believe that there is value in preventing non-paying users from using your product, then you will only hurt yourself (and, if SOPA/PIPA taught us anything, innocent adjacent companies). It is possible that the factors which contribute to piracy also contribute to your revenue, positively as well as potentially negatively. It is also entirely possible that increased piracy is a measurement of a much bigger problem: your business practices.
You know, it’s a really bad idea to start off on the assumption that your customers are on the other side of some sort of battle with you. I really don’t think that is either accurate or a really good business strategy ((…)) we’ve run all of these experiments, you know, this has been going on for many years now and we all can look at what the outcomes are and there really isn’t – there are lots of compelling instances where making customers – you know, giving customers a great experience and thinking of ways to create value for them is way more important than making it incredibly hard for the customers to move their products from one machine to another.
Subject: General Tech, Storage | February 20, 2012 - 05:53 PM | Scott Michaud
Tagged: ssd, PS3
There is an interesting article down at Eurogamer which covers the possible benefits of upgrading a PS3 with a solid state drive. Those who know me can guess that I am snickering while crossing another perceived advantage off of my console-versus-PC list. Still, if for some reason you want to play exclusives on a disposable platform (exclusives only because you let them be) and desire to upgrade your experience, check out the article.
Isn’t “not needing to do this” the whole reason for having a console?
Console titles are naturally becoming as hard drive-intensive as they are allowed to be due to their abysmally small quantity of RAM. Developers have been using tricks to increase the usefulness of the available RAM, such as disallowing split screen, streaming content as needed, and rendering at low resolutions.
The first Halo, for instance, was famous for its quick load times. The load speed is due in part to game assets being copied multiple times on the disc, which lets the game load whichever copy requires the least seek time to access. Having a hard drive helped Halo, too.
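As a toy sketch of that duplicate-asset trick (all names and offsets here are illustrative, not from any actual engine): duplicate an asset at several locations on the disc, then load whichever copy is closest to the drive head.

```python
def nearest_copy(head_pos, copy_positions):
    """Return the on-disc location of the asset copy closest to the
    drive head, i.e. the one with the smallest seek distance."""
    return min(copy_positions, key=lambda pos: abs(pos - head_pos))

# One asset duplicated at three (illustrative) byte offsets on the disc:
copies = [120_000, 480_000, 910_000]
print(nearest_copy(500_000, copies))  # 480000: the shortest seek from 500_000
```

The disc space cost is obvious, but on an optical drive with slow seeks, trading capacity for locality pays off.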
The article itself focuses mostly on RAGE and Skyrim due to their harsh issues with lag and pop-in. Skyrim had a known issue where performance grew progressively worse over time. This was mostly corrected in version 2.03, as Eurogamer's article also demonstrates, making an SSD almost unnecessary; prior to 2.03, however, an SSD surprisingly helped substantially with the problem. It should also be no surprise that throwing faster storage at RAGE helped immensely, just as it does on the PC.
Don't assume the price dictates the audio quality; try real studio quality headsets from Audio-Technica
Subject: General Tech | February 20, 2012 - 03:02 PM | Jeremy Hellstrom
Tagged: audiophile, headset, audio, audio-technica, ATH-A900
At an MSRP of $250, the Audio-Technica ATH-A900 headphones are not intended for the casual gamer, and as you can tell by the 1/4" connector, they are designed for someone who owns a high-end headphone amp. On the other hand, if you need studio-quality audio and will be wearing the headphones for hours at a time, then the high-end features built into these headphones are worth the investment. The 53mm drivers sit in enclosed earcups, which helps bring the bass up close and personal, and the build uses much sturdier materials than other popular headsets. For contrast, [H]ard|OCP tried Beats by Dre Studio headphones, which cost more than the ATH-A900s, and in every case they felt the ATH-A900s were vastly superior. As far as [H] is concerned, the two aren't even in the same class.
"Audio-Technica's open headphones are known to gamers for the wearing comfort and huge soundstage that these provide, but the open back models simply lack bass and isolation. Today, we will see if a pricey pair of the company's closed back audiophile headphones can offer the compromise many of you are looking for in PC audio."
Here is some more Tech News from around the web:
- Monster Gratitude In-Ear Headphones @ Techware Labs
- Jabra Drive Bluetooth Speakerphone Review @ Tech-Reviews
- RHA SA-850 Headphones and MA-350 Earphones Review @ HardwareHeaven
- Phonak Aud PFE232 +mic In-ear Headset @ techPowerUp
- ROCCAT Kave 5.1 Headset Review @ Hi Tech Legion
- Antec Soundscience Rockus 3D 2.1 Review @ HardwareLOOK
- ARCTIC Audio Relay - DLNA Audio Renderer Review @ MissingRemote
- ASUS Essence One @ Guru of 3D
- ASUS Xonar U3 USB Sound Card @ Pro-Clockers
Subject: General Tech | February 20, 2012 - 02:15 PM | Jeremy Hellstrom
Tagged: DIY, model m, input, frank zappa, blue alps sliders
Sometimes hacks and mods are done to save you time or money, or possibly both, but other times you find yourself in the position of Frank Zappa: unable to find a giraffe filled with whipped cream, you have to make it yourself. Such is the case with the completely custom keyboard described at Hack a Day, in which every part was either custom ordered or made by the designer themselves. None of the keys seem to be in their accustomed places, and your thumbs will get a workout from all of those keys mounted in the centre of the board, but for a programmer this could be the perfect design. It took over a year to build and likely cost more than a mass-produced keyboard, but if you want something done right ...
"[dmw] posted a pseudo-build log over at the geekhack keyboard forums. Every single part of this keyboard is custom-made. The key caps were made by Signature Plastics, the case was made by Shapeways, and the custom PCB for the key switches came directly from Express PCB. The key switches are blue Alps sliders (one of the best key switches available) with a few white Alps switches taken from an old Apple keyboard."
Here is some more Tech News from around the web:
- Roccat ISKU Gaming Keyboard Review @ Hi Tech Legion
- Corsair Vengeance M90 @ XSReviews
- Enermax Briskie Wireless Keyboard and Mouse Bundle @ Kitguru
- Corsair's Vengeance K60 and K90 Keyboards @ AnandTech
- Corsair vengeance K60 @ Guru3D
- Corsair Vengeance K90 Performance MMO Mechanical Gaming Keyboard Review @ Madshrimps
- SteelSeries SRW-S1 @ OC3D
- Steelseries Simraceway SRW-S1 Controller Review @ XtremeComputing
- Corsair Vengeance M90 MMO Gaming Mouse @ Kitguru
- ROCCAT Kone[+] Laser Gaming Mouse @ techPowerUp
- HP Wi-Fi Touch Mouse X7000 Review @ TechReviewSource
- Cyborg R.A.T.7 Albino Edition Gaming Mouse Review @ eTeknix
- Steelseries Kinzu V2 Pro Edition Gaming Mouse @ Funky Kit
Subject: General Tech, Systems, Storage | February 20, 2012 - 01:53 PM | Scott Michaud
Tagged: Thecus, NAS
Home users are starting to look at Network Attached Storage (NAS) devices to serve their home media needs. Also popular are products which allow you to browse the internet and play media on your TV. Thecus has just announced two NAS devices which fit both roles and many others. The N2800 contains a built-in media card reader, while the N4800 has a built-in mini Uninterruptible Power Supply (UPS), an OLED status screen, and a second USB 3.0 port.
I hear they're a NASty bunch...
The obvious selling features of the two devices are the inclusion of HDMI output to enable the above roles, along with an updated third-generation Intel Atom CPU, the D2700. The D2700 is a 2.13 GHz dual-core, Hyper-Threaded Intel Atom processor manufactured at 32nm.
Check out the highlights of their press release below.
02/20/2012- As part of the Intel Embedded Alliance, Thecus has precedence and access to a multitude of Intel prototypes and the latest technologies. Working on those products for months now, Thecus is delighted to finally release its Vision Series.
The new N2800 and N4800 are going to be some of the first Intel® Atom™ D2700 based NAS! They will set the standard for what's best in the market to help you build a true multimedia center: USB 3.0, Dual Gigabit Ports, SD Card reader (N2800), Mini-UPS (N4800), etc.
And the most important feature is the HDMI output. With Thecus Local Display module, it's now possible to connect the NAS directly to a monitor and control it through USB mouse/keyboard. Playing HD movies, browsing the web, controlling the NAS... everything is now possible directly from your TV! Thanks to this feature, Thecus is now creating a new standard among the NAS industry.
Thecus® Technology Corp. specializes in IP Storage Server and Network Video Recorder solutions. The company was established in 2004 with the mission to make technology that is as transparent as it is easy-to-use and products that are not only the best on the market, but are accessible to experts and novices alike. Combining a world-class R&D team highly experienced in storage hardware and software development with a keen customer focus, Thecus® stays close to the market to develop high-quality products to fulfill the storage and surveillance needs of today's world.
Subject: General Tech | February 20, 2012 - 01:53 PM | Jeremy Hellstrom
Tagged: NTV, near threshold voltage, transistor, pentium, qubit
The eyes of the world are on the 22nm Ivy Bridge chip that Intel has slowly been giving us details of, but there is also something interesting happening at 32nm with the world's most repurposed Intel CPU. Once again the old Pentium core has been dusted off and modified to showcase new Intel technology, in this case Near Threshold Voltage (NTV) operation. The threshold here refers to the voltage needed to flip a bit on a processor: what you would be used to seeing as VCC, and the reason those dang chips get so toasty. Much in the way that SpeedStep and other energy-saving technologies reduce the operating frequency of an underloaded processor, Intel has tied the voltage to the frequency, lowering the power requirements along with the chip's speed. The demonstration model shown to The Register varied from a high end of 1.2 volts at 915MHz down to a mere 280 millivolts at 3MHz, and 2 millivolts in sleep. By scaling power consumption this way, Intel may have found a nice middle ground between performance and TDP to keep ARM from making major inroads into the server room, if they can pull it off with more modern processors. They also showed off a solar-powered CPU, which might be handy on a cell phone but seems of limited commercial value in the short term.
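The reason lowering voltage pays off so dramatically is the classic CMOS dynamic-power relation, P ≈ C·V²·f: power scales with the square of voltage. A back-of-envelope sketch using The Register's demo numbers (the capacitance value is arbitrary and cancels out of the ratio):

```python
def dynamic_power(c, v, f):
    """Classic CMOS dynamic-power relation: P = C * V^2 * f."""
    return c * v ** 2 * f

C = 1e-9  # arbitrary switched capacitance; it cancels out of the ratio
full = dynamic_power(C, 1.20, 915e6)  # 1.2 V at 915 MHz
ntv = dynamic_power(C, 0.28, 3e6)     # 280 mV at 3 MHz
print(round(full / ntv))  # roughly 5,600x less dynamic power at the NTV point
```

This ignores leakage, which dominates at such low voltages, so take the ratio as an upper bound on the savings rather than a real-world figure.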
Keeping with the theme of small, The Register also has news of research that has created a working transistor out of a single phosphorus atom, an atomic radius of 0.110nm for those who like some scale with their transistors. The trick was the temperature: seeing as temperature is a measure of energy expressed as movement (to put it ridiculously simply), you need low temperatures to keep the atoms from moving more than 10nm. At -196°C the atom was stable enough for its position to be accurately predicted, which is absolutely necessary if you plan to use the atom as a qubit. Overclocking is going to be difficult.
"The threshold voltage is the point at which transistors turn on and conduct electricity. If you can flip bits near this threshold, instead of using much larger swing that is typically many times this threshold voltage to turn zeros into ones and vice versa, then you can save a lot of power."
Here is some more Tech News from around the web:
- The TR Podcast 106: New Radeons and a new AMD
- HP ARM-based servers expected to be available for testing in 2Q12 @ DigiTimes
- Windows 8 and the disappearance of the Start button @ Ars Technica
- ARM On Ubuntu 12.04 LTS Battling Intel x86? @ Phoronix
- DNS flaw reanimates slain evil sites as ghost domains @ The Register
- Canon PowerShot Elph 310 HS Review @ TechReviewSource
- Arctic Land Rider 309, 307 and Transmitter T-01 @ Rbmods
- Megacon 2012 Orlando Event Coverage @ TechwareLabs
- Win an OCZ Agility 3 120GB SSD @ Kitguru
- 2012 Competition MADNESS @ OC3D
Microsoft Allegedly Overhauling SkyDrive With Increased Paid Storage, Applications, and Other Goodies
Subject: General Tech | February 20, 2012 - 11:12 AM | Tim Verry
Tagged: storage, skydrive, paid storage, free, cloud backup, bitlocker, app integration
Update: Some of the rumors have been confirmed by Microsoft in a blog post, though the individual file size increase was a bit off. Microsoft will be allowing files up to 2 GB in size as compared to the rumored 300 MB file sizes.
Every so often, I run across a rumor that sounds almost too good to be true. On the other hand, it sounds so good that I just can't stop myself from being excited about it. Over the weekend, I saw an article that talked about Windows Live SkyDrive offering paid storage tiers, and I now really want this to come to fruition.
For those curious, SkyDrive is Microsoft's "cloud storage" service that gives users 25 GB of free storage space to hold files. There are some restrictions on individual file size (which can be worked around if you really want to back up a home movie, for example), but otherwise it is a boatload of space for free, and it saved my butt during the, um, "formatting catastrophe" of 2010 by having most of my digital photos backed up!
SkyDrive as it is now, funny old photos and all!
The service is connected to your Microsoft Live or Hotmail account and can be accessed by navigating to skydrive.live.com. There are some usability issues with the service, however, including the fact that it's a pain in the rear to upload more than one or two files. The website doesn't make it easy to batch upload, say, a folder or two; you can only add a file at a time. Further, it is not nearly as easy as it should be to manage those files once they are in SkyDrive. If you use IE, the SkyDrive website will let you upload multiple files more easily; other browsers, however, are left without a way to do it. There is also the aforementioned individual file size limit of 100 MB per file.
The exciting bit about the rumors and (allegedly) leaked screen shots is that if they stay true the service is about to get a whole lot better by offering cheap storage and fixing many of the issues people have had with the service.
The leaked image
On the storage front, Microsoft is allegedly adding new paid storage tiers and increasing the individual file size limit to 300 MB (from 100 MB). Among the new plans are 20 GB, 50 GB, and 100 GB offerings (in addition to the free 25 GB of space) for $10, $25, and $50 a year respectively. Not a bad price at all, in my opinion! Assuming the pricing is accurate, they are vastly undercutting the competition. Dropbox, for example, currently offers 50 GB for $99 a year and 100 GB for $199 per year. Granted, Dropbox has syncing functionality, no individual file size limit, and is a much easier-to-use service with an established user base, but at these prices the Microsoft offering is likely to win over many people who just want some cheap off-site backup space!
| Paid Storage Space | SkyDrive (Price Per Year) | Dropbox (Price Per Year) |
|---|---|---|
| 20 GB | $10 | N/A |
| 50 GB | $25 | $99 |
| 100 GB | $50 | $199 |
Dropbox pricing is included just for comparison.
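The per-gigabyte gap is easy to check with a few lines (prices taken from the figures above; Dropbox has no 20 GB tier, so it is omitted):

```python
# Rumored SkyDrive tiers vs. Dropbox's current plans, in USD per year.
skydrive = {20: 10, 50: 25, 100: 50}
dropbox = {50: 99, 100: 199}

def per_gb(plans):
    """Annual price per gigabyte for each plan (GB -> USD/GB)."""
    return {gb: price / gb for gb, price in plans.items()}

print(per_gb(skydrive))  # $0.50/GB at every tier
print(per_gb(dropbox))   # $1.98/GB and $1.99/GB
```

In other words, the rumored SkyDrive pricing works out to roughly a quarter of what Dropbox charges per gigabyte.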
While there are currently mobile applications for Windows Phone and Apple iOS smartphones, users must turn to third-party Explorer extensions (like SDExplorer) for Windows desktop integration. More leaked images suggest that Microsoft will be launching applications for Windows and Mac operating systems to better integrate SkyDrive into the OS (and hopefully enable easier cloud file management). SDExplorer is the third-party extension I used to upload all my photos to SkyDrive; it allows mounting the SkyDrive account as a "hard drive" under Windows Explorer. Unfortunately, it costs money to get the full feature set, so hopefully Microsoft can provide similar (or more) features for free with their OS.
In addition, Microsoft will allegedly be adding URL shortening for public and shared SkyDrive file links as well as the ability to share files to Twitter and Facebook from within the SkyDrive website. For the latter, there are already APIs and Microsoft is likely just leveraging them to make sharing files a bit more convenient. On the other hand, Microsoft will be using their own URL shortening service via the sdrv.ms domain instead of integrating with an existing service.
As a user of LibreOffice (the fork of what was once OpenOffice), I deal a lot with .odt files, part of the OpenDocument standard. Users of Microsoft's web version of Office have been forced to save files in Microsoft's formats; however, rumors suggest that the service will soon support creating and saving .odt, .odp, and .ods documents. If you are using Office Web Apps, then you are likely already fairly invested in the Office universe, and this feature won't mean much. On the other hand, it will help out others who need to edit a LibreOffice-created document backed up to their SkyDrive while on the go. Better compatibility is always a step in the right direction for MS, after all.
Last up on the rumor pile for SkyDrive is the ability to store BitLocker recovery keys directly to SkyDrive so that you have a backup should you ever forget your encryption password. The flip side of that convenience feature is that it provides another attack vector should someone attempt to get their hands on your encryption keys, and it is a location that you must depend on someone else to keep secure. As weird as it may sound, you might want to encrypt your encryption key before uploading it to any "cloud" service (heh), just in case. Still, it's good to have options.
Needless to say, there was quite the leak this weekend regarding Microsoft SkyDrive features! It is a lot to take in, but in my opinion it sounds like they are really giving the service the special attention it needs to get into fighting form. If the rumors hold true, it will be much more competitive with other cloud storage and backup options as a result of the overhaul. I'm excited about this, obviously, but what about you? Do you use SkyDrive?
Subject: General Tech, Processors, Systems, Mobile, Shows and Expos | February 20, 2012 - 01:50 AM | Scott Michaud
Tagged: Rosepoint, ISSCC 2012, ISSCC, Intel
If there is one thing that Intel is good at, it is writing a really big check to go in a new direction right when absolutely needed. Intel has released press information on what should be expected from their presence at the International Solid-State Circuits Conference which is currently in progress until the 23rd. The headliner for Intel at this event is their Rosepoint System on a Chip (SoC) which looks to lower power consumption by rethinking the RF transceiver and including it on the die itself. While the research has been underway for over a decade at this point, pressure from ARM has pushed Intel to, once again, throw money at R&D until their problems go away.
Intel could have easily trolled us all and have named this SoC "Centrino".
Almost ten years ago, AMD had Intel in a very difficult position. Intel fought to keep clock-rates high until AMD changed their numbering scheme to give proper credit to their higher performance-per-clock components. Intel dominated, legally or otherwise, the lower end market with their Celeron line of processors.
AMD responded with a series of well-timed attacks against Intel: a jab to the face with the release of the Sempron processor line, and a punch to the gut with an anti-trust filing against Intel to let AMD more easily sell its processors in mainstream PCs.
At around this time, Intel decided to pivot their product direction entirely and made plans to take the Netburst architecture behind the shed. AMD has yet to recover from the tidal wave that the Core architectures crashed upon them.
Intel wishes to stop assaulting your battery indicator.
With the surge of ARM processors that have been fundamentally designed for lower power consumption than Intel’s x86-based competition, things look bleak for the expanding mobile market. Leave it to Intel to, once again, simply cut a gigantic check.
Intel is in the process of cutting power wherever possible in their mobile offerings. To remain competitive with ARM, Intel is not above outside-the-box solutions including the integration of more power-hungry components directly into the main processor. Similar to NVIDIA’s recent integration of touchscreen hardware into their Tegra 3 SoC, Intel will push the traditionally very power-hungry Wi-Fi transceivers into the SoC and supposedly eliminate all analog portions of the component in the process.
I am not too knowledgeable about Wi-Fi transceivers so I am not entirely sure how big of a jump Intel has made in their development, but it appears to be very significant. Intel is said to discuss this technology more closely during their talk on Tuesday morning titled, “A 20dBm 2.4GHz Digital Outphasing Transmitter for WLAN Application in 32nm CMOS.”
This paper is about a WiFi-compliant (802.11g/n) transmitter using Intel’s 32nm process and techniques leveraging Intel transistors to achieve record performance (power consumption per transmitted data better than state-of-the art). These techniques are expected to yield even better results when moved to Intel’s 22nm process and beyond.
What we do know is that the Rosepoint SoC will be manufactured at 32nm and is allegedly quite easy to scale down to smaller processes when necessary. Intel has also stated that while only Wi-Fi is currently supported, other frequencies including cellular bands could be developed in the future.
We will need to wait until later to see how this will affect the real world products, but either way -- this certainly is a testament to how much change a dollar can be broken into.
Subject: General Tech | February 18, 2012 - 11:11 PM | Scott Michaud
So what do you do if you blew all of your money on a professional workstation and have nothing left for software?
Well, you get a better business strategy.
Occasionally there are open source products which rival or exceed the usability of packaged products. I first started learning 3D modeling on a perpetual educational license of Maya, and 2D art on a combination of Photoshop and GIMP. While I could not manage to eliminate Photoshop from my workflow, the switch from Maya to pre-release Blender + Bmesh builds felt like an upgrade, not just a manageable change. Blender is rapidly getting even better with each new bi-monthly version, such as the just-released 2.62 update.
(Photo Credit: Blender Project / Alexey Lugovoy)
Blender decided to introduce many new features throughout the 2.6 series of releases by developing them in parallel and merging branches into the release trunk as they became worthy. This release yielded a new renderer known as “Cycles”, new UV unwrapping tools, reprogrammed Boolean tools, and motion tracking features.
Personally, I am looking forward most to the official 2.63 release scheduled for April. It appears that the secrecy surrounding the development status of Bmesh has lifted, and its introduction into the mainline application is pinned to that release. Prior to the pre-release Bmesh builds, Blender just felt too distant from the style of modeling I developed over my years of using Maya. Since the addition of Bmesh, Blender fits all of the essential roles that Maya satisfied, and avoids some of my long-standing gripes with Autodesk’s $3,000 package in the process. And no, I was not referring to its cost.
By the end of the 2.6 line, I expect that Blender will be an upgrade for users of many 3D applications. Check it out, for free, at their website.
Subject: General Tech, Processors, Mobile | February 18, 2012 - 09:06 PM | Scott Michaud
Tagged: Intel, mobile, developer
Clay Breshears has posted about lazy software optimization over on the Intel Software Blog. His post is a spiritual resurrection of Herb Sutter’s seven-year-old article, “The Free Lunch is Over: A Fundamental Turn Toward Concurrency in Software.” The content is very similar, but the problem is quite different.
The original 2004 article urged developers to hop aboard the multi-core choo choo express rather than hang around on the single-core platform (train or computing) waiting for performance to get better. The current article takes that same mentality and applies it to power efficiency: rather than waiting for hardware with the appropriate power efficiency for your application, learn techniques to bring your application within your desired power envelope.
"I believe your program is a little... processor heavy."
The meat of the article focuses on the development of mobile applications and the concerns that developers should have with battery conservation. Of course, there is something to be said about Intel promoting mobile power efficiency: while developers could definitely increase the efficiency of their code, there is still a whole buffet of potential on the hardware side.
If you are a developer, particularly of mobile or laptop applications, Intel has an education portal for best power efficiency practices on their website. Be sure to check it out and pick up the tab once in a while, okay?
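To give a flavor of the sort of techniques involved (this sketch is our own illustration, not code from Intel's portal): one of the most commonly cited power-saving practices is batching periodic work, so the CPU and radio can stay in their low-power sleep states longer between bursts.

```python
# Illustrative sketch (our own, not from Intel's portal): batching work
# to reduce wake-ups. Each wake-up forces the CPU (or the radio) out of
# a low-power sleep state, so fewer, larger bursts of work are cheaper
# on the battery than many small ones.

def wakeups_unbatched(n_items):
    """One wake-up per item -- e.g. uploading each telemetry event as it occurs."""
    return n_items

def wakeups_batched(n_items, batch_size):
    """One wake-up per batch -- e.g. queueing events and flushing periodically."""
    return -(-n_items // batch_size)  # ceiling division

if __name__ == "__main__":
    events = 120
    print(wakeups_unbatched(events))    # 120 wake-ups
    print(wakeups_batched(events, 30))  # 4 wake-ups
```

The trade-off, of course, is latency: a batched flush means your data arrives up to one batch interval late, which is fine for telemetry but not for, say, a chat message.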
Subject: General Tech | February 18, 2012 - 01:22 AM | Scott Michaud
Tagged: Minecraft, Lego
Much of the success of Minecraft can be attributed to that little part of your brain which desires to be creative while at play. Prior to Minecraft, other toys such as LEGO tapped into that same yearning for their success. A mash-up between the old and the new is a natural stage in the evolution of Minecraft, and LEGO and Mojang have announced a LEGO-themed Minecraft set.
Who knows, maybe we will eventually get LEGO Minecraft the videogame and complete the circle.
Recent events would lead you to believe that Mojang has enough money -- and you are correct. The Minecraft LEGO set was created through LEGO’s CUUSOO program. If a CUUSOO product is selected to become a part of LEGO’s portfolio, its creator can collect 1% of the product's total net sales. According to the official LEGO press release, Mojang will donate the royalties they collect from this LEGO product to charity.
If you desire to explore blocky caves in real life, then head on down to the Jynx website and submit your order. The set is expected to be available sometime in the summer. Also, check out the LEGO CUUSOO project to submit or vote on other potential products.
Subject: General Tech, Systems | February 17, 2012 - 10:39 PM | Scott Michaud
Tagged: workstation, nvidia, hp
Here is a story for the professional computer users out there.
Professionals have standards: be polite, be efficient, and have a multi-year plan to cram as much hardware into as small a case as you can. NVIDIA and HP have obviously played too much Team Fortress -- or maybe I have -- let us just say all three of us have. The engineers have dispensed with the desktop tower and crammed everything into the monitor with their Z1 product series. While not original, it does hold a number of nice features.
… But honestly, what the user really wants is for it to dispense Bonk!
As soon as I read the announcement I immediately jumped over to HP’s product page and confirmed the existence of external display connections. Sure enough, HP did not entirely botch this product and allows the connection of one extra monitor via DisplayPort. While being limited to just two monitors is a bit disappointing -- I currently have a three-monitor setup -- introducing a workstation computer limited to a single monitor would have been product suicide. Thankfully they had enough sense.
The real flaunted feature of the Z1 workstation is its ease of upgrade. The included power supply is rated at 400W, which to my knowledge is decent for a single-card workstation-class computer. HP claims support for up to two internal 2.5-inch drives or a single 3.5-inch drive; unfortunately, they do not clarify whether you can install all three drives at once or must choose between the one larger and the two smaller drives.
HP and NVIDIA go on a date -- they dress workstation classual.
The workstation is expected to start at $1899 when it ships sometime around April. Unfortunately, HP’s technical specifications list an Intel Core i3 and integrated HD 2000 GPU -- most likely to hide the price of configurations with the components that you actually want. I guess you will need to wait a couple of months to find out what you will actually be paying.
Subject: General Tech, Mobile | February 17, 2012 - 08:58 PM | Scott Michaud
Tagged: kindle fire, amazon, foxconn, Quanta
Amazon had quite the successful launch of their Kindle Fire tablet. The original Kindle Fire is based on the Blackberry Playbook design and manufactured by the same company, Quanta. The tablet has been out for just three months, yet we may be only three or four months away from its successor.
Foxconn is expected to do the work as OEM... a Quanta of solace.
The news was first reported by The Commercial Times, a Chinese-language Taiwanese publication, and put online by their sister publication, China Times (Microsoft Translation). According to the article, the original Kindle Fire may not be dying an early death. As is almost expected from Amazon, the original Kindle Fire will persist as Amazon’s 7-inch model. The new Kindle Fire is rumored to complement that product, not replace it.
The new Kindle Fire is expected to be a 10-inch model and, unlike the Blackberry Playbook design which Quanta sold Amazon last year, be more heavily designed by Amazon themselves. It is expected that while Quanta will continue to manufacture the 7-inch Kindle Fire, the 10-inch will be assembled at Hon Hai (Foxconn). Commercial Times does not suggest what other changes Amazon will introduce with the new product.
Subject: General Tech, Mobile, Shows and Expos | February 17, 2012 - 06:33 PM | Scott Michaud
Tagged: tegra 3, MWC, htc
Mobile World Congress (MWC) is approaching and you should expect our coverage of the event from a hardware perspective. The actual event occurs from February 27th through March 1st although, like most events, we expect that the news coverage will begin a couple of days before that time. Rumors about what will appear at the show are already surfacing and include a few leaks about upcoming HTC releases.
Probably there's a very simple answer to it... still curious though.
(Update: As pointed out in the comments, one of the phones actually IS Tegra 3 powered. I read it as including some other processor... and somehow I only found the LG X3 when looking for Tegra 3 powered phones.)
TechCrunch rounded up details from a few sources about several phones from HTC that are expected at MWC. Ice Cream Sandwich appears to be the common thread amongst each of the leaks. Of particular note, HTC appears to be demonstrating a 10.1” tablet running an NVIDIA Tegra 3 processor. Their phones, on the other hand, do not. (Update: Yeah they do, my mistake.)
Unlike (Update: Actually, like) HTC, LG is expected to reveal a Tegra-3 powered phone, the LG X3, at Mobile World Congress -- so Tegra 3 phones are not nonexistent -- just seemingly a scarce commodity. It would be interesting to know why NVIDIA’s Tegra 3 technology appears, at least from my standpoint, so common on tablets yet so scarce on phones.
Be sure to keep checking back for our coverage of the event and all of your other hardware needs.
Subject: General Tech | February 17, 2012 - 03:41 PM | Ryan Shrout
Tagged: asus, office tour
Sometimes things don't work out the way you want them and you have to improvise. In our case this week, as we prepare to move into a new office space, we had a couple of dilemmas crop up.
- The "cabinet" the previous owner made for networking gear was an actual cabinet and didn't have enough depth for a switch, let alone a server.
- Fiber internet is coming but not for another 6 weeks - what do we do until then?
The first issue, the networking cabinet, was resolved by mounting a 24-port Gigabit switch on a hinge inside the not-quite-deep-enough space we had for it. And sure, once you find out we used Gorilla Glue, a block of wood and standard door hinges from Home Depot, it might sound a little bit on the ghetto side, but the fact is...it worked!
Our problem of getting an actual internet connection to the switch in question was a bit harder. Since our fiber wasn't going to be installed until late March, and I didn't want to see the office space simply sit there and be wasted until then, we had to find another solution. We asked our neighbors about using their connections temporarily and while several were open to it, speed tests showed a consistent 1.4 Mbps downstream and 0.45 Mbps upstream. Not good for the amount of video we do here.
So, another option presented itself: our current office, with its 20 Mbps down and 2 Mbps up service (mediocre, but still better), was only a short 100-110 meters away. Could a Cat5e cable simply be run between them? Turns out it could, and we were even able to run a length as long as 500 ft without a problem, connecting at 10/100 rather than Gigabit speeds.
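For the curious, the 10/100 fallback lines up with the numbers. A back-of-the-envelope check (assuming the standard IEEE 802.3 100-meter twisted-pair segment limit):

```python
# Back-of-the-envelope check on our cable run. IEEE 802.3 specifies a
# 100 m maximum segment length for twisted-pair Ethernet (10/100 and
# Gigabit alike); runs beyond that are out of spec, and the slower
# 10/100 signaling simply tends to be more forgiving of the extra
# attenuation than Gigabit.

FEET_PER_METER = 3.28084
SEGMENT_LIMIT_M = 100  # IEEE 802.3 twisted-pair segment limit

def feet_to_meters(feet):
    return feet / FEET_PER_METER

run_m = feet_to_meters(500)
print(round(run_m, 1))          # 152.4 -- the length of our spool in meters
print(run_m > SEGMENT_LIMIT_M)  # True -- well past spec, so no Gigabit link
```

In other words, a 500 ft run is roughly one and a half times the rated limit; that it linked at all, even at 10/100, was a pleasant surprise.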
In total, our hinge modification cost us about $4.50 and the 500 ft spool of cable just around $50, but the hassle we saved has been worth thousands. The cable connection is obviously not permanent, but barring any rogue squirrel retaliation in the next 4-6 weeks, it should more than serve our purposes.
Enjoy the weekend!
Subject: General Tech | February 16, 2012 - 12:45 PM | Jeremy Hellstrom
Tagged: nvidia, 28nm, TSMC, kepler, tegra, tegra 3
If you caught the podcast last night you would have heard Josh discussing NVIDIA's year-end financial conference call; otherwise you will have to wait until the 'cast is posted later this week. Until then you can read SemiAccurate's take on the call here. There is a lot of news about NVIDIA and none of it is good; from wafer yields to chip demand, nothing seems to have gone right for them. In attempting to move off of their cursed 40nm line to 28nm, NVIDIA has run into big yield problems -- as in entire wafers having issues, as opposed to just some dies being bad.
Tegra is not doing so well either, with sales of Tegra 2 dropping as we approach the release of Tegra 3, which is getting a lot of bad press. SemiAccurate refers to the chip as bloated in size as well as downright expensive to make. Combine that with NVIDIA lagging on A15 adoption, and with Samsung and Apple turning their backs on Tegra, and it doesn't look good for NVIDIA's mobile plans. The one ray of sunshine is that even combined, Samsung and Apple do not account for half of the smartphones on the market, so there is still room for NVIDIA and Tegra to grow.
"Nvidia seems to be so far ahead of the curve that they are experiencing problems that are unique in the industry. In their recent year end financial conference call, there was enough said to draw some very grim conclusions.
Today’s conference call was a near complete validation of all the things SemiAccurate has been saying about Nvidia. Remember when we asked if Nvidia could supply Apple? Anyone notice the part about dumping early 28nm capacity, and the disappearance of 28nm Fermi shrinks? Remember how 28nm was not an issue for Nvidia, even if their product roadmap slips said otherwise. How well does this mesh with the quotes from Jen-Hsun himself on the topic?"
Here is some more Tech News from around the web:
- Danish researchers invent 2nm components @ SemiAccurate
- Space; it’s a junkyard until the Swiss get their way @ Hack a Day
- From encryption to darknets: As governments snoop, activists fight back @ Ars Technica
- Intel to postpone mass shipments of Ivy Bridge processors @ DigiTimes
- Acer expects double ultrabook shipments in 2Q12, improving gross margin each quarter @ DigiTimes
- High Orbits and Slowlorises: understanding the Anonymous attack tools @ Ars Technica
- Samsung SCX-4728FD Multifunctional Printer @ Overclockers Online
- Norton 360 Version 6.0 Review @ TechReviewSource
- CoolerMaster MASSIVE Giveaway @ eTeknix