Subject: General Tech | February 20, 2012 - 01:53 PM | Jeremy Hellstrom
Tagged: NTV, near threshold voltage, transistor, pentium, qubit
The eyes of the world are on the 22nm Ivy Bridge chip that Intel has slowly been giving us details of, but there is also something interesting happening at 32nm with the world's most repurposed Intel CPU. Once again the old Pentium core has been dusted off and modified to showcase new Intel technology, in this case Near Threshold Voltage operation. Here the threshold refers to the voltage needed to flip a bit on a processor, what you would be used to seeing as VCC, and it is the reason those dang chips get so toasty. Much in the way that SpeedStep and other energy saving technologies reduce the operating frequency of an underloaded processor, Intel has tied the voltage to the frequency and lowers the power requirements along with the chip's speed. The demonstration model that they showed The Register varied from a high end of 1.2 volts at 915MHz down to a mere 280 millivolts at 3MHz, and to 2 millivolts in sleep. By scaling the power consumption, Intel may have found a nice middle ground between performance and TDP to keep ARM from making major inroads into the server room, if they can pull it off with more modern processors. They also showed off a solar powered CPU, which might be handy on a cell phone but seems of limited commercial value in the short term.
Keeping with the theme of small, The Register also has news of research which has created a working transistor out of a single phosphorus atom, which has an atomic radius of 0.110nm for those who like some scale with their transistors. The trick was the temperature; since temperature is a measure of energy expressed as movement (to put it ridiculously simply), you need low temperatures to keep the atom from straying more than 10nm. At -196°C the atom was stable enough for its position to be accurately predicted, which is absolutely necessary if you plan to use the atom as a qubit. Overclocking is going to be difficult.
"The threshold voltage is the point at which transistors turn on and conduct electricity. If you can flip bits near this threshold, instead of using much larger swing that is typically many times this threshold voltage to turn zeros into ones and vice versa, then you can save a lot of power."
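The savings implied by that quote follow from the standard dynamic-power approximation for CMOS, P ≈ C·V²·f. Here is a minimal sketch using the voltage/frequency points from the demo chip above; the capacitance value is an arbitrary placeholder (it cancels out of the ratio), and this ignores leakage, so treat it as an illustration rather than Intel's actual figures:

```python
# Dynamic CMOS switching power scales roughly as P = C * V^2 * f.
# C here is a made-up placeholder; only the ratio between the two
# operating points matters, and leakage power is ignored.

def dynamic_power(voltage_v, freq_hz, capacitance_f=1e-9):
    """Rough dynamic switching power in watts."""
    return capacitance_f * voltage_v ** 2 * freq_hz

full_speed = dynamic_power(1.2, 915e6)      # the demo chip at full tilt
near_threshold = dynamic_power(0.28, 3e6)   # the same chip near threshold

ratio = full_speed / near_threshold
print(f"Dynamic power drops by roughly {ratio:,.0f}x")
```

The quadratic dependence on voltage is why dropping from 1.2 V to 280 mV buys far more than the frequency drop alone would suggest: the model works out to a dynamic-power reduction on the order of several thousand times.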
Here is some more Tech News from around the web:
- The TR Podcast 106: New Radeons and a new AMD
- HP ARM-based servers expected to be available for testing in 2Q12 @ DigiTimes
- Windows 8 and the disappearance of the Start button @ Ars Technica
- ARM On Ubuntu 12.04 LTS Battling Intel x86? @ Phoronix
- DNS flaw reanimates slain evil sites as ghost domains @ The Register
- Canon PowerShot Elph 310 HS Review @ TechReviewSource
- Arctic Land Rider 309, 307 and Transmitter T-01 @ Rbmods
- Megacon 2012 Orlando Event Coverage @ TechwareLabs
- Win an OCZ Agility 3 120GB SSD @ Kitguru
- 2012 Competition MADNESS @ OC3D
Microsoft Allegedly Overhauling SkyDrive With Increased Paid Storage, Applications, and Other Goodies
Subject: General Tech | February 20, 2012 - 11:12 AM | Tim Verry
Tagged: storage, skydrive, paid storage, free, cloud backup, bitlocker, app integration
Update: Some of the rumors have been confirmed by Microsoft in a blog post, though the rumored individual file size was a bit off. Microsoft will be allowing files up to 2 GB in size, compared to the rumored 300 MB limit.
Every so often, I run across a rumor that sounds almost too good to be true. On the other hand, it sounds so good that I just can't stop myself from being excited about it. Over the weekend, I saw an article that talked about Windows Live SkyDrive offering paid storage tiers, and I now really want this to come to fruition.
For those curious, SkyDrive is Microsoft's "cloud storage" service that gives users 25 GB of free storage space to hold files. There are some restrictions with the individual file size (that can be worked around if you really want to backup a home movie for example), but otherwise it is a boatload of space for free and saved my butt when the, um, "formatting catastrophe" of 2010 happened by having most of my digital photos backed up!
SkyDrive as it is now, funny old photos and all!
The service is connected to your Microsoft Live or Hotmail account and can be accessed by navigating to skydrive.live.com. There are some usability issues with the service, however, including the fact that it's a pain in the rear to upload more than one or two files. The website doesn't make it easy to batch upload, say, a folder or folders; you can only add a file at a time. Further, it is not nearly as easy to manage those files once they are in the SkyDrive as it should be. Now, if you use IE, the SkyDrive website will let you upload multiple files more easily; however, the other browsers are left without a way to do it. There is also the aforementioned individual file size limit of 100 MB per file.
The exciting bit about the rumors and (allegedly) leaked screen shots is that if they stay true the service is about to get a whole lot better by offering cheap storage and fixing many of the issues people have had with the service.
The leaked image
On the storage front, Microsoft is allegedly adding new paid storage tiers and increasing the individual file size limit to 300 MB (from 100 MB). Among the new plans are 20 GB, 50 GB, and 100 GB offerings (all in addition to the free 25 GB of space) for $10, $25, and $50 a year respectively. Not a bad price at all in my opinion! Assuming the pricing is accurate, they are vastly undercutting the competition. Dropbox, for example, is currently offering 50 GB for $99 a year and 100 GB for $199 per year. Granted, Dropbox has syncing functionality, no individual file size limit, and is a much easier-to-use service with an established user base, but at these prices the Microsoft offering is likely to win over many people who just want some cheap off-site backup space!
| Paid Storage Space | SkyDrive (Price Per Year) | Dropbox (Price Per Year) |
|---|---|---|
| 20 GB | $10 | N/A |
| 50 GB | $25 | $99 |
| 100 GB | $50 | $199 |
Dropbox pricing just for comparison.
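Putting those prices on a per-gigabyte basis makes the gap obvious. A quick back-of-the-envelope comparison, using the figures quoted above (remember the SkyDrive tiers are still just rumors):

```python
# Rumored SkyDrive paid tiers vs. Dropbox's current plans,
# expressed as dollars per GB per year.
skydrive = {20: 10, 50: 25, 100: 50}   # GB -> $/year (rumored)
dropbox = {50: 99, 100: 199}           # GB -> $/year

def per_gb(plans):
    """Convert {GB: yearly price} into {GB: price per GB per year}."""
    return {gb: price / gb for gb, price in plans.items()}

print(per_gb(skydrive))  # every SkyDrive tier works out to $0.50/GB/year
print(per_gb(dropbox))   # roughly $1.98-$1.99/GB/year
```

Every rumored SkyDrive tier lands at $0.50 per gigabyte per year, versus nearly $2.00 for Dropbox, which is the roughly 4x undercut described above.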
While there are currently mobile applications for Windows Phone and Apple iOS smart phones, users must turn to third party explorer extensions (like SDExplorer) for Windows OS integration on the desktop. More leaked images seem to suggest that Microsoft will be launching applications for Windows and Mac operating systems to better integrate SkyDrive into the OS (and hopefully enable easier cloud file management). SDExplorer is a third party extension that I used to upload all my photos to SkyDrive and it allows mounting the SkyDrive account as a "hard drive" under Windows Explorer. Unfortunately, it costs money to get the full feature set, so hopefully Microsoft can provide similar (or more) features for free with their OS.
In addition, Microsoft will allegedly be adding URL shortening for public and shared SkyDrive file links as well as the ability to share files to Twitter and Facebook from within the SkyDrive website. For the latter, there are already APIs and Microsoft is likely just leveraging them to make sharing files a bit more convenient. On the other hand, Microsoft will be using their own URL shortening service via the sdrv.ms domain instead of integrating with an existing service.
As a user of Libre Office (the fork off of what was once Open Office), I deal a lot with .odt files, the OpenDocument standard. Users of Microsoft's Office Web Apps have been forced to save files to the Microsoft formats; however, rumors suggest that the service will soon support creating and saving to the .odt, .odp, and .ods document formats. If you are using Office Web Apps, then you are already likely fairly integrated into the Office universe, and this feature won't mean much. On the other hand, this will help out others who may need to edit one of their Libre Office created documents backed up to SkyDrive on the go. Better compatibility is always a step in the right direction for MS after all.
Last up on the rumor pile for SkyDrive is the ability to store BitLocker recovery keys directly to SkyDrive so that you have a backup should you ever forget your encryption password. The flip side of that convenience feature is that it provides another attack vector should someone attempt to get their hands on your encryption keys, and it is a location that you must depend on someone else to keep secure. As weird as it may sound, you might want to encrypt your encryption key before uploading it to any "cloud" service (heh), just in case. Still, it's good to have options.
Needless to say, there was quite the leak this weekend over Microsoft SkyDrive features! It is a lot to take in, but in my opinion it sounds like they are really giving the service the special attention it needs to get it into fighting form. And if the rumors hold true, it will be much more competitive with other cloud storage backup options as a result of the overhaul. I'm excited about this, obviously, but what about you? Do you use SkyDrive?
Subject: General Tech, Processors, Systems, Mobile, Shows and Expos | February 20, 2012 - 01:50 AM | Scott Michaud
Tagged: Rosepoint, ISSCC 2012, ISSCC, Intel
If there is one thing that Intel is good at, it is writing a really big check to go in a new direction right when absolutely needed. Intel has released press information on what should be expected from their presence at the International Solid-State Circuits Conference, which runs through the 23rd. The headliner for Intel at this event is their Rosepoint System on a Chip (SoC), which looks to lower power consumption by rethinking the RF transceiver and including it on the die itself. While the research has been underway for over a decade at this point, pressure from ARM has pushed Intel to, once again, throw money at R&D until their problems go away.
Intel could have easily trolled us all and named this SoC "Centrino".
Almost ten years ago, AMD had Intel in a very difficult position. Intel fought to keep clock-rates high until AMD changed their numbering scheme to give proper credit to their higher performance-per-clock components. Intel dominated, legally or otherwise, the lower end market with their Celeron line of processors.
AMD responded with a series of well-timed attacks against Intel. AMD jabbed Intel in the face with the release of the Sempron processor line and punched them in the gut by filing an anti-trust suit against Intel, seeking to more easily sell their processors in mainstream PCs.
At around this time, Intel decided to entirely pivot their product direction and made plans to take their NetBurst architecture behind the shed. AMD has yet to recover from the tidal wave that the Core architectures crashed upon them.
Intel wishes to stop assaulting your battery indicator.
With the surge of ARM processors that have been fundamentally designed for lower power consumption than Intel’s x86-based competition, things look bleak for the expanding mobile market. Leave it to Intel to, once again, simply cut a gigantic check.
Intel is in the process of cutting power wherever possible in their mobile offerings. To remain competitive with ARM, Intel is not above outside-the-box solutions including the integration of more power-hungry components directly into the main processor. Similar to NVIDIA’s recent integration of touchscreen hardware into their Tegra 3 SoC, Intel will push the traditionally very power-hungry Wi-Fi transceivers into the SoC and supposedly eliminate all analog portions of the component in the process.
I am not too knowledgeable about Wi-Fi transceivers so I am not entirely sure how big of a jump Intel has made in their development, but it appears to be very significant. Intel is said to discuss this technology more closely during their talk on Tuesday morning titled, “A 20dBm 2.4GHz Digital Outphasing Transmitter for WLAN Application in 32nm CMOS.”
This paper is about a WiFi-compliant (802.11g/n) transmitter using Intel’s 32nm process and techniques leveraging Intel transistors to achieve record performance (power consumption per transmitted data better than state-of-the art). These techniques are expected to yield even better results when moved to Intel’s 22nm process and beyond.
What we do know is that the Rosepoint SoC will be manufactured at 32nm and is allegedly quite easy to scale down to smaller processes when necessary. Intel has also stated that while only Wi-Fi is currently supported, other frequencies including cellular bands could be developed in the future.
We will need to wait until later to see how this will affect the real world products, but either way -- this certainly is a testament to how much change a dollar can be broken into.
Subject: General Tech | February 18, 2012 - 11:11 PM | Scott Michaud
So what do you do if you blew all of your money on a professional workstation and have nothing left for software?
Well, you get a better business strategy.
Occasionally there are open source products which rival or exceed the usability of the packaged products. I first started learning 3D modeling on a perpetual educational license of Maya and 2D art on a combination of Photoshop and GIMP. While I could not manage to eliminate Photoshop from my workflow I found the switch from Maya to pre-release Blender + Bmesh builds felt like an upgrade -- not just a manageable change. Blender is rapidly getting even better with each new bi-monthly version such as their just-released 2.62 update.
(Photo Credit: Blender Project / Alexey Lugovoy)
Blender decided to introduce many new features throughout the 2.6 series of releases by developing them in parallel and merging branches into the release trunk as they became worthy. This release yielded a new renderer known as “Cycles”, new UV unwrapping tools, reprogrammed Boolean tools, and motion tracking features.
Personally, I am most looking forward to the official 2.63 release scheduled for April. It appears the secrecy surrounding the development status of Bmesh has been lifted, and its introduction into the official application is pinned to that release. Prior to the pre-release Bmesh builds, Blender just felt too distant from the style of modeling which I developed in my years of using Maya. Since the addition of Bmesh, Blender has been able to fill all of the essential roles which Maya satisfied, and avoided some of my long-standing gripes with Autodesk's $3000 package in the process. And I do not just mean its cost.
By the end of the 2.6 line, I expect that Blender will be an upgrade for users of many 3D applications. Check it out, for free, at their website.
Subject: General Tech, Processors, Mobile | February 18, 2012 - 09:06 PM | Scott Michaud
Tagged: Intel, mobile, developer
Clay Breshears over at Intel posted about lazy software optimization on the Intel Software Blog. His post is a spiritual resurrection of Herb Sutter's seven-year-old article, "The Free Lunch is Over: A Fundamental Turn Toward Concurrency in Software." The content is very similar, but the problem is quite different.
The original 2004 article urged developers to heed the calls for the multi-core choo choo express and not hang around on the single core platform (train or computing) waiting for performance to get better. The current article takes that same mentality and applies it to power efficiency. Rather than waiting for hardware that has appropriate power efficiency for your application, learn techniques to bring your application into your desired power requirements.
"I believe your program is a little... processor heavy."
The meat of the article focuses on the development of mobile applications and the concerns that developers should have with battery conservation. Of course there is something to be said about Intel promoting mobile power efficiency. While developers could definitely increase the efficiency of their code, there is still a whole buffet of potential on the hardware side.
If you are a developer, particularly of mobile or laptop applications, Intel has an education portal for best power efficiency practices on their website. Be sure to check it out and pick up the tab once in a while, okay?
Subject: General Tech | February 18, 2012 - 01:22 AM | Scott Michaud
Tagged: Minecraft, Lego
Most of the success of Minecraft can be attributed to that little part of your brain which desires to be creative while at play. Prior to Minecraft, other toys such as LEGO catered to that yearning for their successes. A mash-up between the old and the new is a natural stage in the evolution of Minecraft. LEGO and Mojang announced a LEGO-themed Minecraft set.
Who knows, maybe we will eventually get LEGO Minecraft the videogame and complete the circle.
Recent events would lead you to believe that Mojang has enough money -- and you are correct. The Minecraft LEGO set came out of LEGO's CUUSOO program: if a CUUSOO submission is selected to become a part of LEGO's portfolio, its creator can collect 1% of the product's total net sales. According to the official LEGO press release, Mojang will donate the royalties they collect from this LEGO product to charity.
If you desire to explore blocky caves in real life then head on down to the Jynx website and submit your order. The set is expected to be available sometime in the summer. Also, check out the LEGO CUUSOO project to submit or vote upon other potential products.
Subject: General Tech, Systems | February 17, 2012 - 10:39 PM | Scott Michaud
Tagged: workstation, nvidia, hp
Here is a story for the professional computer users out there.
Professionals have standards: be polite, be efficient, and have a multi-year plan to cram as much hardware into as small a case as you can. NVIDIA and HP have obviously played too much Team Fortress -- or I have -- let us just say all three of us have. The engineers have dispensed with the desktop tower and crammed everything into the monitor with their Z1 product series. While not original, it does hold a number of nice features.
… But honestly, what the user really wants is for it to dispense Bonk!
As soon as I read the announcement I immediately jumped over to HP's product page and confirmed the existence of external display connections. Sure enough, HP did not entirely botch this product and allows the connection of one extra monitor via DisplayPort. While being limited to just two monitors is a bit disappointing -- I currently have a three monitor setup -- if they were to introduce a workstation computer limited to a single monitor it would have been product suicide. Thankfully they had enough sense.
The real flaunted feature of the Z1 workstation is its ease of upgrade. The included power supply is rated at 400W, which to my knowledge is decent for a single-card workstation class computer. HP claims support for up to 2 internal 2.5-inch drives or a single 3.5-inch drive; unfortunately they do not clarify whether you can combine the larger drive with the smaller ones, or if you must choose between the single 3.5-inch drive and the pair of 2.5-inch drives.
HP and NVIDIA go on a date -- they dress workstation classual.
The workstation is expected to start at $1899 when it ships sometime around April. Unfortunately HP’s technical specifications list an Intel Core i3 and Integrated HD 2000 GPU -- most likely to hide the price of the products with the components that you actually want. I guess you will need to wait a couple of months to find out what you will actually be paying.
Subject: General Tech, Mobile | February 17, 2012 - 08:58 PM | Scott Michaud
Tagged: kindle fire, amazon, foxconn, Quanta
Amazon had quite the successful launch of their Kindle Fire tablet. The original Kindle Fire is based on the Blackberry Playbook design and manufactured by the same company, Quanta. Despite the Fire being out for just three months, its successor may be just three or four months away.
Foxconn is expected to do the work as OEM... a Quanta of solace.
The news was first reported by The Commercial Times, a Chinese-language Taiwanese publication, and put online by their sister publication, China Times (Microsoft Translation). According to the article, the original Kindle Fire may not be dying an early death. As is almost expected from Amazon, the original Kindle Fire will persist as Amazon's 7-inch Kindle Fire model. The new Kindle Fire is rumored to complement that product, not replace it.
The new Kindle Fire is expected to be a 10-inch model and, unlike the Blackberry Playbook design which Quanta sold Amazon last year, be more heavily designed by Amazon themselves. It is expected that while Quanta will continue to manufacture the 7-inch Kindle Fire, the 10-inch will be assembled at Hon Hai (Foxconn). Commercial Times does not suggest what other changes Amazon will introduce with the new product.
Subject: General Tech, Mobile, Shows and Expos | February 17, 2012 - 06:33 PM | Scott Michaud
Tagged: tegra 3, MWC, htc
Mobile World Congress (MWC) is approaching and you should expect our coverage of the event from a hardware perspective. The actual event occurs from February 27th through March 1st although, like most events, we expect that the news coverage will begin a couple of days before that time. Rumors about what will appear at the show are already surfacing and include a few leaks about upcoming HTC releases.
Probably there's a very simple answer to it... still curious though.
(Update: As pointed out in the comments, one of the phones actually IS Tegra 3 powered. I read it as including some other processor... and somehow I only found the LG X3 when looking for Tegra 3 powered phones.)
TechCrunch rounded up details from a few sources about several phones from HTC that are expected at MWC. Ice Cream Sandwich appears to be the common thread amongst each of the leaks. Of particular note, HTC appears to be demonstrating a 10.1” tablet running an NVIDIA Tegra 3 processor. Their phones, on the other hand, do not. (Update: Yeah they do, my mistake.)
Unlike (Update: Actually, like) HTC, LG is expected to reveal a Tegra-3 powered phone, the LG X3, at Mobile World Congress -- so Tegra 3 phones are not nonexistent -- just seemingly a scarce commodity. It would be interesting to know why NVIDIA’s Tegra 3 technology appears, at least from my standpoint, so common on tablets yet so scarce on phones.
Be sure to keep checking back for our coverage of the event and all of your other hardware needs.
Subject: General Tech | February 17, 2012 - 03:41 PM | Ryan Shrout
Tagged: asus, office tour
Sometimes things don't work out the way you want them and you have to improvise. In our case this week, as we prepare to move into a new office space, we had a couple of dilemmas crop up.
- The "cabinet" the previous owner made for the network gear was an actual cabinet and didn't have enough depth for a switch, let alone a server.
- Fiber internet is coming but not for another 6 weeks - what do we do until then?
The first issue of our networking cabinet was resolved by putting a 24-port Gigabit switch on a hinge inside the not-quite-deep-enough space we had for it. And sure, once you find out we used Gorilla glue, a block of wood and standard door hinges from Home Depot, it might sound a little bit on the ghetto side, but the fact is...it worked!
Our problem with an actual internet connection to apply to the switch in question was a bit harder. Since our fiber wasn't going to be installed until late March, and I didn't want to see the office space simply sit there and be wasted until then, we had to find another solution. We asked our neighbors about using their connections temporarily and while several were open to it, speed tests showed consistent 1.4 mbps downstream and 0.45 mbps upstream connections. Not good for the amount of video we do here.
So, another option presented itself: our current office, which has 20 mbps down and 2 mbps up service (mediocre, but still better), was only a short 100-110 meters away. Could a Cat5e cable simply be run between them? Turns out it could, and we were even able to run a length as long as 500 ft without a problem, connecting at 10/100 rather than Gigabit speeds. That is no great surprise: 500 ft is roughly 152 meters, well past the 100 meters the Ethernet spec guarantees, which is likely why Gigabit wouldn't negotiate reliably.
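For anyone curious about the margins involved, the arithmetic is simple. A quick sketch; the 100-meter figure is the standard twisted-pair channel limit from the Ethernet cabling spec:

```python
# How far past the Ethernet spec's 100 m channel limit was that cable run?
FEET_TO_METERS = 0.3048
SPEC_LIMIT_M = 100  # standard twisted-pair Ethernet channel limit

run_m = 500 * FEET_TO_METERS
print(f"500 ft = {run_m:.1f} m")                     # 152.4 m
print(f"Over spec by {run_m - SPEC_LIMIT_M:.1f} m")  # 52.4 m
```

Running more than 50% past the rated channel length and still linking at 10/100 is getting off lightly, which is probably why the gamble paid off for a temporary fix.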
In total, our hinge modification cost us about $4.50 and the 500 ft spool of cable just around $50 but the hassle we saved has been worth thousands. The cable connection issue is obviously not permanent but barring any rogue squirrel retaliation in the next 4-6 weeks, it should more than serve our purposes.
Enjoy the weekend!