Upgrade your PS3 with an SSD... or just play on the PC.

Subject: General Tech, Storage | February 20, 2012 - 05:53 PM |
Tagged: ssd, PS3

There is an interesting article down at Eurogamer which covers the possible benefits of upgrading a PS3 with a solid state drive. Those who know me can guess that I am snickering while crossing another perceived advantage off of my console-versus-PC list. Still, if for some reason you want to play exclusives on a disposable platform -- titles that are only exclusive because you let them be -- and you desire to upgrade your experience, the article is worth a read.

PS3SSDSkyrim.png

Isn’t “not needing to do this” the whole reason for having a console?

Console titles are naturally becoming as hard drive-intensive as they are allowed to be due to the abysmally small quantity of RAM in those machines. Developers have long used tricks to stretch their available memory, such as disallowing split screen, streaming content as needed, and rendering at low resolutions.

The first Halo, for instance, was famous for its quick load times. That speed was due in part to game assets being copied multiple times across the disc, letting the engine load whichever copy required the least seek time to reach. Having a hard drive to cache to did not hurt Halo, either.
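To illustrate the idea (a hypothetical sketch of the technique, not Bungie's actual code), the trick boils down to tracking each asset's duplicate locations and reading whichever copy is closest to where the drive head already sits:

```python
# Hypothetical illustration of the duplicate-asset trick: the same asset is
# stored at several positions on the disc, and the loader reads whichever
# copy minimizes head travel from the current position.

def nearest_copy(current_pos, copy_positions):
    """Return the disc offset of the copy with the smallest seek distance."""
    return min(copy_positions, key=lambda pos: abs(pos - current_pos))

# Toy layout: one texture duplicated at three disc offsets (in sectors).
asset_copies = {"rock_texture": [1_200, 480_000, 910_500]}

head = 475_000  # wherever the last read left the drive head
target = nearest_copy(head, asset_copies["rock_texture"])
print(f"Reading copy at sector {target}")  # -> 480000, the shortest seek
```

The trade is disc space for seek time, which is exactly why a drive with near-zero seek time (an SSD) makes the trick unnecessary.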

The article itself focuses mostly on RAGE and Skyrim due to their harsh issues with lag and pop-in. Skyrim on the PS3 was notorious for performance that got progressively worse the longer you played. That issue was mostly corrected in patch 2.03, as Eurogamer's article also demonstrates, making an SSD almost unnecessary; prior to 2.03, however, an SSD surprisingly helped substantially with the problem. It should also be no surprise that throwing faster storage at RAGE helped immensely, just as it does on the PC.

If you were considering upgrading to a faster drive for your Sony console be sure to check out Eurogamer -- or the new Hardware Leaderboard and just play on the PC.

Source: Eurogamer

Don't assume the price dictates the audio quality; try real studio quality headsets from Audio-Technica

Subject: General Tech | February 20, 2012 - 03:02 PM |
Tagged: audiophile, headset, audio, audio-technica, ATH-A900

At an MSRP of $250, the Audio-Technica ATH-A900 headphones are not intended for the casual gamer; as you can tell from the 1/4" connector, they are designed for someone who owns a high end headphone amp.  On the other hand, if you need studio quality audio and will be wearing headphones for hours at a time, the high end features built into these are worth the investment.  The 53mm drivers sit in enclosed earcups, which brings the bass up close and personal, and the build uses much sturdier materials than other popular headsets.  For contrast, [H]ard|OCP tried Beats by Dre Studio headphones, which cost more than the ATH-A900s, and in every case they felt the ATH-A900s were vastly superior.  As far as [H] is concerned, the two aren't even in the same class.

H_AT_ATH.jpg

"Audio-Technica's open headphones are known to gamers for the wearing comfort and huge soundstage that these provide, but the open back models simply lack bass and isolation. Today, we will see if a pricey pair of the company's closed back audiophile headphones can offer the compromise many of you are looking for in PC audio."

Here is some more Tech News from around the web:

Audio Corner

 

Source: [H]ard|OCP

You can't always write a chord ugly enough to say what you want to say ...

Subject: General Tech | February 20, 2012 - 02:15 PM |
Tagged: DIY, model m, input, frank zappa, blue alps sliders

Sometimes hacks and mods are done to save you time or money, or possibly both, but other times you find yourself stuck in the position of Frank Zappa: you cannot find a giraffe filled with whipped cream, so you have to make it yourself.  Such is the case with the completely custom keyboard described at Hack a Day, in which every part was either custom ordered or made by the designer himself.  None of the keys seem to be in their accustomed places and your thumbs will get a workout from all of those keys mounted in the centre of the board, but for a programmer this could be the perfect design.  It has taken over a year to build and likely cost more than a mass-produced keyboard, but if you want something done right ...

HH1_oblique.jpg

"[dmw] posted a pseudo-build log over at the geekhack keyboard forums. Every single part of this keyboard is custom-made. The key caps were made by Signature Plastics, the case was made by Shapeways, and the custom PCB for the key switches came directly from Express PCB. The key switches are blue Alps sliders (one of the best key switches available) with a few white Alps switches taken from an old Apple keyboard."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Hack a Day

Intel Inside. Thecus' Next-Gen NAS introduced: N4800, N2800

Subject: General Tech, Systems, Storage | February 20, 2012 - 01:53 PM |
Tagged: Thecus, NAS

Home users are starting to look at Network Attached Storage (NAS) devices to serve their home media needs. Also popular are products which allow you to browse the internet and play media on your TV. Thecus has just announced two NAS devices which fit both roles and many others. The N2800 contains a built-in media card reader, while the N4800 adds a built-in mini Uninterruptible Power Supply (UPS), an OLED status screen, and a second USB 3.0 port.

thecus4800-2800.jpg

I hear they're a NASty bunch...

The obvious selling features of the two devices are the inclusion of HDMI output to enable the above roles as well as an updated third-generation Intel Atom CPU, the D2700. The D2700 is a dual-core, Hyper-Threaded Intel Atom processor running at 2.13GHz and manufactured at 32nm.

Check out the highlights of their press release below.

02/20/2012- As part of the Intel Embedded Alliance, Thecus has precedence and access to a multitude of Intel prototypes and the latest technologies. Working on those products for months now, Thecus is delighted to finally release its Vision Series.

The new N2800 and N4800 are going to be some of the first Intel® Atom™ D2700 based NAS! They will set the standard for what's best in the market to help you build a true multimedia center: USB 3.0, Dual Gigabit Ports, SD Card reader (N2800), Mini-UPS (N4800), etc.

And the most important feature is the HDMI output. With Thecus Local Display module, it's now possible to connect the NAS directly to a monitor and control it through USB mouse/keyboard. Playing HD movies, browsing the web, controlling the NAS... everything is now possible directly from your TV! Thanks to this feature, Thecus is now creating a new standard among the NAS industry.

About Thecus®

Thecus® Technology Corp. specializes in IP Storage Server and Network Video Recorder solutions. The company was established in 2004 with the mission to make technology that is as transparent as it is easy-to-use and products that are not only the best on the market, but are accessible to experts and novices alike. Combining a world-class R&D team highly experienced in storage hardware and software development with a keen customer focus, Thecus® stays close to the market to develop high-quality products to fulfill the storage and surveillance needs of today's world.

Source: Thecus

Of Near Threshold Voltage and Atomic Transistors

Subject: General Tech | February 20, 2012 - 01:53 PM |
Tagged: NTV, near threshold voltage, transistor, pentium, qubit

The eyes of the world are on the 22nm Ivy Bridge chip that Intel has slowly been giving us details of, but there is also something interesting happening at 32nm with the world's most repurposed Intel CPU.  Once again the old Pentium core has been dusted off and modified to showcase new Intel technology, in this case Near Threshold Voltage (NTV) operation.  The threshold here is the minimum voltage at which a transistor will switch, far below the supply voltage (VCC) that chips normally run at and the reason those dang chips get so toasty.  Much in the way that SpeedStep and other energy saving technologies reduce the operating frequency of an underloaded processor, Intel ties the supply voltage to the frequency and lowers the power requirements along with the chip's speed.

The demonstration model that they showed The Register varied from a high end of 1.2 volts at 915MHz down to a mere 280 millivolts at 3MHz, and 2 millivolts in sleep.  By scaling the power consumption Intel may have found a nice middle ground between performance and TDP to keep ARM from making major inroads into the server room, if they can pull it off with more modern processors.  They also showed off a solar powered CPU, which might be handy on a cell phone but seems of limited commercial value in the short term.
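To get a rough feel for why running near threshold saves so much power: dynamic (switching) power in CMOS scales roughly with CV²f. Plugging in The Register's two operating points gives a back-of-the-envelope estimate that ignores leakage and assumes the switched capacitance stays constant:

```python
# Back-of-the-envelope dynamic power scaling: P is proportional to C * V^2 * f.
# Capacitance C is assumed constant and leakage is ignored, so only the
# ratio between the two operating points is meaningful.

def relative_dynamic_power(volts, hertz):
    return volts ** 2 * hertz

full_speed = relative_dynamic_power(1.2, 915e6)     # 1.2 V at 915 MHz
near_threshold = relative_dynamic_power(0.28, 3e6)  # 280 mV at 3 MHz

print(f"Frequency ratio: {915e6 / 3e6:.0f}x")                       # ~305x slower
print(f"Dynamic power ratio: {full_speed / near_threshold:.0f}x")   # ~5600x less
```

Running roughly 300 times slower for several thousand times less power is the whole pitch: when there is little work to do, the chip can idle down to almost nothing instead of burning full voltage.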

Keeping with the theme of small, The Register also has news of research which has created a working transistor out of a single phosphorus atom -- an atomic radius of 0.110nm, for those who like some scale with their transistors.  The trick was the temperature: since temperature is a measure of energy expressed as movement (to put it ridiculously simply), you need low temperatures to keep the atom from wandering more than 10nm.  At -196°C the atom was stable enough for its position to be accurately predicted, which is absolutely necessary if you plan to use the atom as a qubit.  Overclocking is going to be difficult.

intel_isscc_ntv_concept.jpg

"The threshold voltage is the point at which transistors turn on and conduct electricity. If you can flip bits near this threshold, instead of using much larger swing that is typically many times this threshold voltage to turn zeros into ones and vice versa, then you can save a lot of power."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

Microsoft Allegedly Overhauling SkyDrive With Increased Paid Storage, Applications, and Other Goodies

Subject: General Tech | February 20, 2012 - 11:12 AM |
Tagged: storage, skydrive, paid storage, free, cloud backup, bitlocker, app integration

Update: Some of the rumors have been confirmed by Microsoft in a blog post, though the individual file size increase was a bit off: Microsoft will be allowing files up to 2 GB, not the rumored 300 MB.

Every so often, I run across a rumor that sounds almost too good to be true. On the other hand, it sounds so good that I just can't stop myself from being excited about it. Over the weekend, I saw an article that talked about Windows Live SkyDrive offering paid storage tiers, and now I really want this to come to fruition.

For those curious, SkyDrive is Microsoft's "cloud storage" service that gives users 25 GB of free storage space to hold files. There are some restrictions on individual file size (which can be worked around if you really want to back up a home movie, for example), but otherwise it is a boatload of space for free -- and it saved my butt during the, um, "formatting catastrophe" of 2010 by having most of my digital photos backed up!

whatwouldidoifilostthesefunnyphotos.png

SkyDrive as it is now, funny old photos and all!

The service is connected to your Microsoft Live or Hotmail account and can be accessed by navigating to skydrive.live.com. There are some usability issues with the service, however, including the fact that it's a pain in the rear to upload more than one or two files. The website doesn't make it easy to batch upload, say, a folder or several files at once; it only accepts one file at a time. Further, it is not nearly as easy to manage those files once they are in the SkyDrive as it should be. If you use IE, the SkyDrive website will let you upload multiple files more easily; the other browsers are left without a way to do it. There is also the aforementioned individual file size limit of 100 MB per file.

The exciting bit about the rumors and (allegedly) leaked screen shots is that if they stay true the service is about to get a whole lot better by offering cheap storage and fixing many of the issues people have had with the service.

exclusivo-skydrive-apps-windows-os-x-e-opcoes-pagas.jpeg

The leaked image

On the storage front, Microsoft is allegedly adding new paid storage tiers and increasing the individual file size limit to 300 MB (from 100 MB). Among the new plans are 20 GB, 50 GB, and 100 GB offerings (in addition to the free 25 GB of space) for $10, $25, and $50 a year respectively. Not a bad price at all in my opinion! Assuming the pricing is accurate, they are vastly undercutting the competition. Dropbox, for example, is currently offering 50 GB for $99 a year and 100 GB for $199 per year. Granted, Dropbox has syncing functionality, no individual file size limit, and is a much easier-to-use service with an established user base, but at these prices the Microsoft offering is likely to win over many people who just want some cheap off-site backup space!

 

Paid Storage Space   SkyDrive (Price Per Year)   Dropbox (Price Per Year)
20 GB                $10                         n/a
50 GB                $25                         $99
100 GB               $50                         $199

Dropbox pricing shown just for comparison.
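Working out the per-gigabyte cost makes the gap plain; this is simple arithmetic on the rumored SkyDrive tiers and Dropbox's published pricing from the table above:

```python
# Price per gigabyte per year, using the rumored SkyDrive tiers and
# Dropbox's published 2012 pricing from the table above.
plans = {
    "SkyDrive 50 GB": (25, 50),    # ($/year, GB)
    "Dropbox 50 GB": (99, 50),
    "SkyDrive 100 GB": (50, 100),
    "Dropbox 100 GB": (199, 100),
}

for name, (price, gigs) in plans.items():
    print(f"{name}: ${price / gigs:.2f} per GB per year")
# SkyDrive works out to $0.50/GB/year versus Dropbox's ~$1.99/GB/year.
```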

While there are currently mobile applications for Windows Phone and Apple iOS smart phones, users must turn to third party explorer extensions (like SDExplorer) for Windows OS integration on the desktop. More leaked images seem to suggest that Microsoft will be launching applications for Windows and Mac operating systems to better integrate SkyDrive into the OS (and hopefully enable easier cloud file management). SDExplorer is a third party extension that I used to upload all my photos to SkyDrive and it allows mounting the SkyDrive account as a "hard drive" under Windows Explorer. Unfortunately, it costs money to get the full feature set, so hopefully Microsoft can provide similar (or more) features for free with their OS.

SkyDrive-App-for-Windows_thumb.png

In addition, Microsoft will allegedly be adding URL shortening for public and shared SkyDrive file links as well as the ability to share files to Twitter and Facebook from within the SkyDrive website. For the latter, there are already APIs and Microsoft is likely just leveraging them to make sharing files a bit more convenient. On the other hand, Microsoft will be using their own URL shortening service via the sdrv.ms domain instead of integrating with an existing service.

As a user of LibreOffice (the fork of what was once OpenOffice), I deal a lot with .odt files, the OpenDocument standard. Users of Microsoft's Office Web Apps have so far been forced to save files in Microsoft's own formats; however, rumors suggest that the service will soon support creating and saving .odt, .odp, and .ods documents. If you are using Office Web Apps, you are likely already fairly invested in the Office universe, and this feature won't mean much. On the other hand, it will help anyone who needs to edit a LibreOffice-created document backed up to their SkyDrive while on the go. Better compatibility is always a step in the right direction for MS, after all.

Last up on the rumor pile for SkyDrive is the ability to store BitLocker recovery keys directly to SkyDrive so that you have a backup should you ever forget your encryption password. The flip side of that convenience feature is that it provides another attack vector should someone attempt to get their hands on your encryption keys, and it is a location that you must depend on someone else to keep secure. As weird as it may sound, you might want to encrypt your encryption key before uploading it to any "cloud" service (heh), just in case. Still, it's good to have options.
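If you do go that route, even a simple passphrase-based wrap adds a meaningful layer. Here is a minimal sketch of the idea using the third-party Python `cryptography` package -- my own illustration, not anything SkyDrive or BitLocker actually provides:

```python
# Minimal sketch: wrap a BitLocker recovery key in passphrase-based
# encryption before uploading it to any cloud service. Uses the third-party
# "cryptography" package (pip install cryptography).
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def wrap_recovery_key(recovery_key: bytes, passphrase: bytes) -> bytes:
    salt = os.urandom(16)  # random salt, stored alongside the ciphertext
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=200_000)
    fernet_key = base64.urlsafe_b64encode(kdf.derive(passphrase))
    return salt + Fernet(fernet_key).encrypt(recovery_key)

blob = wrap_recovery_key(b"123456-654321-112233", b"a long passphrase")
# Upload `blob` instead of the raw key, and keep the passphrase offline.
```

The point is simply that whoever gets at your cloud locker then needs the passphrase too, which never leaves your head.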

More photos and information on the alleged leaks can be found here and here.

Needless to say, there was quite the leak this weekend over Microsoft SkyDrive features!  It is a lot to take in, but in my opinion it sounds like they are really giving the service the special attention it needs to get it into fighting form.  And if the rumors hold true it will be much more competitive with other cloud storage backup options as a result of the overhaul.  I'm excited about this, obviously, but what about you?  Do you use SkyDrive?

Source: Gemind.com

Wi-Fi on Rosepoint SoC die. Intel flexes before ARM wrestle.

Subject: General Tech, Processors, Systems, Mobile, Shows and Expos | February 20, 2012 - 01:50 AM |
Tagged: Rosepoint, ISSCC 2012, ISSCC, Intel

If there is one thing that Intel is good at, it is writing a really big check to go in a new direction right when absolutely needed. Intel has released press information on what should be expected from their presence at the International Solid-State Circuits Conference which is currently in progress until the 23rd. The headliner for Intel at this event is their Rosepoint System on a Chip (SoC) which looks to lower power consumption by rethinking the RF transceiver and including it on the die itself. While the research has been underway for over a decade at this point, pressure from ARM has pushed Intel to, once again, throw money at R&D until their problems go away.

Intel could have easily trolled us all and named this SoC "Centrino".

Almost ten years ago, AMD had Intel in a very difficult position. Intel fought to keep clock-rates high until AMD changed their numbering scheme to give proper credit to their higher performance-per-clock components. Intel dominated, legally or otherwise, the lower end market with their Celeron line of processors.

AMD responded with a series of well-timed attacks against Intel: it jabbed Intel in the face with the release of the Sempron processor line and punched them in the gut by filing an antitrust suit to allow its processors to be sold more easily in mainstream PCs.

At around this time, Intel decided to pivot their product direction entirely and made plans to take the NetBurst architecture behind the shed. AMD has yet to recover from the tidal wave that the Core architectures sent crashing over them.

applebattery.jpg

Intel wishes to stop assaulting your battery indicator.

With the surge of ARM processors fundamentally designed for lower power consumption than Intel's x86-based competition, things look bleak for Intel in the expanding mobile market. Leave it to Intel to, once again, simply cut a gigantic check.

Intel is in the process of cutting power wherever possible in their mobile offerings. To remain competitive with ARM, Intel is not above outside-the-box solutions, including integrating traditionally power-hungry components directly into the main processor. Similar to NVIDIA's recent integration of touchscreen processing into their Tegra 3 SoC, Intel will push the typically very power-hungry Wi-Fi transceiver into the SoC, supposedly eliminating all analog portions of the component in the process.

I am not too knowledgeable about Wi-Fi transceivers so I am not entirely sure how big of a jump Intel has made in their development, but it appears to be very significant. Intel is said to discuss this technology more closely during their talk on Tuesday morning titled, “A 20dBm 2.4GHz Digital Outphasing Transmitter for WLAN Application in 32nm CMOS.”

This paper is about a WiFi-compliant (802.11g/n) transmitter using Intel’s 32nm process and techniques leveraging Intel transistors to achieve record performance (power consumption per transmitted data better than state-of-the art). These techniques are expected to yield even better results when moved to Intel’s 22nm process and beyond.

What we do know is that the Rosepoint SoC will be manufactured at 32nm and is allegedly quite easy to scale down to smaller processes when necessary. Intel has also stated that while only Wi-Fi is currently supported, other frequencies including cellular bands could be developed in the future.

We will need to wait until later to see how this will affect the real world products, but either way -- this certainly is a testament to how much change a dollar can be broken into.

Source: Intel

Blender 2.62 released -- getting better all the time.

Subject: General Tech | February 18, 2012 - 11:11 PM |
Tagged: Blender

So what do you do if you blew all of your money on a professional workstation and have nothing left for software?

Well, you get a better business strategy.

Occasionally an open source product rivals or exceeds the usability of its packaged competition. I first started learning 3D modeling on a perpetual educational license of Maya, and 2D art on a combination of Photoshop and GIMP. While I could not manage to eliminate Photoshop from my workflow, the switch from Maya to pre-release Blender + Bmesh builds felt like an upgrade -- not just a manageable change. Blender is rapidly getting even better with each new bi-monthly version, such as the just-released 2.62 update.

Blender 2-62.png

Flower power?

(Photo Credit: Blender Project / Alexey Lugovoy)

Blender decided to introduce many new features throughout the 2.6 series of releases by developing them in parallel and merging branches into the release trunk as they became worthy. This release yielded a new renderer known as “Cycles”, new UV unwrapping tools, reprogrammed Boolean tools, and motion tracking features.

Personally, I look forward most to the official 2.63 release scheduled for April. It appears the secrecy surrounding the development status of Bmesh has been lifted, and its introduction into the official application is pinned to that release. Prior to the pre-release Bmesh builds, Blender just felt too distant from the style of modeling I developed over my years of using Maya. Since the addition of Bmesh, Blender has been able to fill all of the essential roles which Maya satisfied, and it avoided some of my long-standing gripes with Autodesk's $3000 package in the process. And I was not even referring to its cost.

By the end of the 2.6 line, I expect that Blender will be an upgrade for users of many 3D applications. Check it out, for free, at their website.

Source: Blender

Intel urges you to program better now, not the same -- later.

Subject: General Tech, Processors, Mobile | February 18, 2012 - 09:06 PM |
Tagged: Intel, mobile, developer

Clay Breshears over at Intel posted about lazy software optimization on the Intel Software Blog. His post is a spiritual resurrection of Herb Sutter's more-than-seven-year-old article, "The Free Lunch is Over: A Fundamental Turn Toward Concurrency in Software." The message is very similar, but the problem is quite different.

The original 2004 article urged developers to board the multi-core choo choo express rather than hang around on the single-core platform (train or computing, take your pick) waiting for performance to get better. The current article takes that same mentality and applies it to power efficiency: rather than waiting for hardware with appropriate power efficiency for your application, learn techniques to bring your application within your desired power envelope.

inteltf2.jpg

"I believe your program is a little... processor heavy."

The meat of the article focuses on the development of mobile applications and the concerns that developers should have with battery conservation. Of course there is something to be said about Intel promoting mobile power efficiency. While developers could definitely increase the efficiency of their code, there is still a whole buffet of potential on the hardware side.
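As a concrete taste of what "bringing your application within your power envelope" can mean (my own illustration, not an example from Breshears' post): replacing a polling loop with a blocking wait keeps the CPU asleep between events, one of the most common battery wins available in application code.

```python
# Illustration: two ways for a worker thread to consume events.
# Polling wakes the CPU constantly; blocking lets it sleep until needed.
import queue
import time

events: queue.Queue = queue.Queue()

def handle(item) -> None:
    print("handled", item)

def polling_worker() -> None:
    # Power-hungry: wakes the CPU ~1000 times per second even when idle.
    while True:
        try:
            handle(events.get_nowait())
        except queue.Empty:
            time.sleep(0.001)

def blocking_worker() -> None:
    # Power-friendly: the thread sleeps inside get() until work arrives.
    while True:
        handle(events.get())
```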

If you are a developer, particularly of mobile or laptop applications, Intel has an education portal for best power efficiency practices on their website. Be sure to check it out and pick up the tab once in a while, okay?

Source: Intel Blog

"Lego spelunking!" LEGO Minecraft available for pre-order

Subject: General Tech | February 18, 2012 - 01:22 AM |
Tagged: Minecraft, Lego

Most of the success of Minecraft can be attributed to that little part of your brain which desires to be creative while at play. Long before Minecraft, toys such as LEGO latched onto that same yearning for their own successes. A mash-up between the old and the new is a natural stage in the evolution of Minecraft, and LEGO and Mojang have announced a LEGO-themed Minecraft set.

Who knows, maybe we will eventually get LEGO Minecraft the videogame and complete the circle.

Recent events would lead you to believe that Mojang has enough money -- and you are correct. The Minecraft LEGO set came out of LEGO's CUUSOO program: if a CUUSOO proposal is selected to become part of LEGO's portfolio, its creator collects 1% of the product's total net sales. According to the official LEGO press release, Mojang will donate the royalties it collects from this set to charity.

If you desire to explore blocky caves in real life then head on down to the Jynx website and submit your order. The set is expected to be available sometime in the summer. Also, check out the LEGO CUUSOO project to submit or vote upon other potential products.

Source: LEGO

HP dates NVIDIA on Valentine's Day. We get Z1 workstation.

Subject: General Tech, Systems | February 17, 2012 - 10:39 PM |
Tagged: workstation, nvidia, hp

Here is a story for the professional computer users out there.

Professionals have standards: be polite, be efficient, and have a multi-year plan to cram as much hardware into a small case as you can seat. NVIDIA and HP have obviously played too much Team Fortress -- or I did -- let us just say all three of us have. The engineers have dispensed with the desktop tower and crammed everything into the monitor with their Z1 product series. While not an original idea, it does hold a number of nice features.

HP-Z1.jpg

… But honestly, what the user really wants is for it to dispense Bonk!

As soon as I read the announcement I immediately jumped over to HP's product page and confirmed the existence of external display connections. Sure enough, HP did not entirely botch this product: it allows the connection of one extra monitor via DisplayPort. Being limited to just two monitors is a bit disappointing -- I currently have a three-monitor setup -- but introducing a workstation limited to a single display would have been product suicide. Thankfully they had enough sense.

The real flaunted feature of the Z1 workstation is its ease of upgrade. The included power supply is rated at 400W, which to my knowledge is decent for a single-card workstation-class computer. HP claims support for up to two internal 2.5-inch drives or a single 3.5-inch drive; unfortunately they do not clarify whether you can install all three drives or must choose between the one larger and the two smaller drives.

HP-z1-open.jpg

HP and NVIDIA go on a date -- they dress workstation classual.

The workstation is expected to start at $1899 when it ships sometime around April. Unfortunately HP's technical specifications list an Intel Core i3 and integrated HD 2000 graphics for that price -- most likely to hide the cost of the components that you actually want. I guess you will need to wait a couple of months to find out what you will actually be paying.

Source: NVIDIA

Rumor: Amazon didn't start the Fire. Kindle Fire 2 in May?

Subject: General Tech, Mobile | February 17, 2012 - 08:58 PM |
Tagged: kindle fire, amazon, foxconn, Quanta

Amazon had quite the successful launch of their Kindle Fire tablet. The original Kindle Fire is based on the BlackBerry PlayBook design and manufactured by the same company, Quanta. Though it has been out for just three months, we may be only three or four months away from its successor.

18-Mmm-amazon.png

Foxconn is expected to do the work as OEM... a Quanta of solace.

The news was first reported by The Commercial Times, a Chinese-language Taiwanese publication, and put online by their sister publication, China Times (Microsoft Translation). According to the article, the original Kindle Fire is not headed for an early death: as is almost expected from Amazon, it will persist as Amazon's 7-inch model, with the new Kindle Fire rumored to complement that product, not replace it.

The new Kindle Fire is expected to be a 10-inch model and, unlike the BlackBerry PlayBook design which Quanta sold Amazon last year, to be far more of Amazon's own design. It is expected that while Quanta will continue to manufacture the 7-inch Kindle Fire, the 10-inch model will be assembled by Hon Hai (Foxconn). The Commercial Times does not suggest what other changes Amazon will introduce with the new product.

Source: China Times

HTC at MWC R-U-M-O-R-S

Subject: General Tech, Mobile, Shows and Expos | February 17, 2012 - 06:33 PM |
Tagged: tegra 3, MWC, htc

Mobile World Congress (MWC) is approaching and you should expect our coverage of the event from a hardware perspective. The actual event occurs from February 27th through March 1st although, like most events, we expect that the news coverage will begin a couple of days before that time. Rumors about what will appear at the show are already surfacing and include a few leaks about upcoming HTC releases.

HTC-nottegra.png

Probably there's a very simple answer to it... still curious though.

(Update: As pointed out in the comments, one of the phones actually IS Tegra 3 powered. I read it as including some other processor... and somehow I only found the LG X3 when looking for Tegra 3 powered phones.)

TechCrunch rounded up details from a few sources about several phones from HTC that are expected at MWC. Ice Cream Sandwich appears to be the common thread amongst each of the leaks. Of particular note, HTC appears to be demonstrating a 10.1” tablet running an NVIDIA Tegra 3 processor. Their phones, on the other hand, do not. (Update: Yeah they do, my mistake.)

Unlike (Update: Actually, like) HTC, LG is expected to reveal a Tegra-3 powered phone, the LG X3, at Mobile World Congress -- so Tegra 3 phones are not nonexistent -- just seemingly a scarce commodity. It would be interesting to know why NVIDIA’s Tegra 3 technology appears, at least from my standpoint, so common on tablets yet so scarce on phones.

Be sure to keep checking back for our coverage of the event and all of your other hardware needs.

Source: TechCrunch

Switch on a hinge and a 350 foot network cable - how to rig up your new office

Subject: General Tech | February 17, 2012 - 03:41 PM |
Tagged: asus, office tour

Sometimes things don't work out the way you want them to, and you have to improvise.  In our case this week, as we prepare to move into a new office space, we had a couple of dilemmas crop up.

  1. The "cabinet" the previous owner made for networking gear was an actual cabinet and didn't have enough depth for a switch, let alone a server.
  2. Fiber internet is coming but not for another 6 weeks - what do we do until then?

The first issue, the networking cabinet, was resolved by putting a 24-port Gigabit switch on a hinge inside the not-quite-deep-enough space we had for it.  And sure, once you find out we used Gorilla Glue, a block of wood, and standard door hinges from Home Depot, it might sound a little bit on the ghetto side, but the fact is... it worked!

Our problem with getting an actual internet connection to that switch was a bit harder.  Since our fiber wasn't going to be installed until late March, and I didn't want to see the office space simply sit there and be wasted until then, we had to find another solution.  We asked our neighbors about using their connections temporarily, and while several were open to it, speed tests showed a consistent 1.4 Mbps downstream and 0.45 Mbps upstream.  Not good for the amount of video we do here.

So, another option presented itself: our current office, which has 20 Mbps down and 2 Mbps up service (mediocre, but still better), was only a short 100-110 meters away.  Could a Cat5e cable simply be run between them?  Turns out it could, and we were even able to run a length as long as 500 ft without a problem by connecting at 10/100 rather than Gigabit speeds.  (The Ethernet spec only guarantees 100 meters per run at any speed; in practice, the slower 10/100 link rates tolerate out-of-spec lengths far better than Gigabit does.)

In total, our hinge modification cost us about $4.50 and the 500 ft spool of cable just around $50, but the hassle we saved has been worth thousands.  The cable connection is obviously not permanent, but barring any rogue squirrel retaliation in the next 4-6 weeks, it should more than serve our purposes.

Enjoy the weekend!

NVIDIA is up for a rough year

Subject: General Tech | February 16, 2012 - 12:45 PM |
Tagged: nvidia, 28nm, TSMC, kepler, tegra, tegra 3

If you caught the podcast last night you would have heard Josh discussing NVIDIA's year-end financial conference call; otherwise you will have to wait until the 'cast is posted later this week.  Until then you can read SemiAccurate's take on the call here.  There is a lot of news about NVIDIA and none of it is good; from wafer yields to chip demand, nothing seems to have gone right for them.  Attempting to move off of their cursed 40nm process to 28nm, NVIDIA has run into big yield problems -- as in entire wafers having issues, as opposed to just some dies being bad.

Tegra is not doing so well either, with sales of Tegra 2 dropping as we approach the release of Tegra 3, which is getting a lot of bad press.  SemiAccurate refers to the chip as bloated in size as well as downright expensive to make.  Combine that with NVIDIA lagging on A15 adoption, and with Samsung and Apple turning their backs on Tegra, and it doesn't look good for NVIDIA's mobile plans. The one ray of sunshine is that even combined, Samsung and Apple do not account for half of the smartphones on the market, so there is still room for NVIDIA and Tegra to grow.

Tegra3_Chip.jpg

"Nvidia seems to be so far ahead of the curve that they are experiencing problems that are unique in the industry. In their recent year end financial conference call, there was enough said to draw some very grim conclusions.

Today’s conference call was a near complete validation of all the things SemiAccurate has been saying about Nvidia. Remember when we asked if Nvidia could supply Apple? Anyone notice the part about dumping early 28nm capacity, and the disappearance of 28nm Fermi shrinks? Remember how 28nm was not an issue for Nvidia, even if their product roadmap slips said otherwise. How well does this mesh with the quotes from Jen-Hsun himself on the topic?"

Here is some more Tech News from around the web:

Tech Talk

 

Source: SemiAccurate

Mass Effect 3 is Coming, Pre-Order Now and Get Battlefield 3 (PC) Free

Subject: General Tech | February 16, 2012 - 02:08 AM |
Tagged: PC, mass effect 3, gaming, game, ea, bf3, battlefield 3

Update: Apparently EA has decided to pull the deal because it was too good of an idea :(. 

The final installment in the Mass Effect trilogy is almost upon us, and those itching to get a taste of Mass Effect 3 can now download the demo for the PC via EA's Origin service. The demo delivers about an hour (they claim two hours, but I finished it in about an hour while purposefully taking it slow to take in the scenery and such) of Shepard battling against a (spoilers ahead) Reaper invasion.

Personally, from playing the demo I'm not convinced that it is going to live up to the hype, and it seems rather "dumbed down" compared to the first game. With that said, it was not terrible and I will likely pick it up if only to finish out the story.  The story itself hits hard in the demo, and that aspect of the sequel I am genuinely excited for.  If you have not already done so, check out the demo that's out now.

masseffect3.PNG

Anyway, if you do enjoy the demo and are getting pumped for the release this March, EA is currently running a rather good deal on Origin for those willing to pre-order Mass Effect 3. According to EA, users who place a pre-order for Mass Effect 3 through the Origin store for any platform (digital download, boxed PC copy, Xbox 360, or PlayStation 3) before March 5, 2012 will receive a free digital PC edition of Battlefield 3.  The codes for BF3 will be emailed to customers when they become available.

As always, there are some caveats:

  • The offer is only valid for those in the US and Canada.
  • You must pre-order through Origin, and the offer cannot be combined with any other discounts.
  • You are not eligible for the free copy if you already own Battlefield 3 on Origin.
  • The Battlefield 3 codes will be emailed no later than March 8, 2012.

That last one is a big one (for me anyway).  Considering Battlefield 3 is already released, why can't those that pre-order ME3 get instant access to it?  I was all for the deal at first: I have not yet purchased BF3, and getting it free by pre-ordering a game I was likely to buy anyway sounded like a sweet deal.  Unfortunately, not being able to jump into BF3 to hold me over until Mass Effect 3 launches makes it less awesome.  After all, once Mass Effect 3 releases, I'm not going to want to play Battlefield 3 anymore!  Considering Battlefield 3 will likely still be approximately $60 on Origin in a few months, getting it free is still a good deal, but it's less of an impulse purchase knowing I might not get the Battlefield 3 code until after I have Mass Effect 3 downloaded.

It's there if you want it though, so go download the Mass Effect 3 demo and let us know what you think of it!

Source: EA

Just how much money is down in that mine anyway, Mojang? Psychonauts 2 could need at least $13M. Notch a problem!

Subject: General Tech | February 15, 2012 - 04:29 PM |
Tagged: Psychonauts, Mojang

Tim Schafer and Markus ("Notch") Persson are currently discussing a sequel to Psychonauts, the wonderful game that no one bought. Since February 7th, the two have been discussing the sequel both in private and on Twitter. During the discussion it was revealed that Psychonauts 2 would cost at least $13 million to make -- to which Notch allegedly responded, ever so casually, "Yeah, I can do that."

Psychonauts2-Mojang.png

I could totally see a crafting system in Psychonauts.

The original Psychonauts sold abysmally, with just 130,000 Xbox units purchased globally. It has since been pushed onto Good Old Games as well as Steam, and sales have substantially increased. It draws a smirk to my lips that the art was ultimately purchased predominantly through long-tail PC sales. I love PC gaming.

Even still, it is possible that Notch is taking this on more as a passion project: rather than blowing his money on a big house or a private jet, he may feel he could better spend it reviving a franchise he adores. Whether or not he gets a return on his investment might not even be a concern of his.

Of course, over the last week or so this whole exchange became a bit of a running joke in the industry. Rock, Paper, Shotgun even suggested a list of games for Notch to fund sequels of. I personally would take anyone with the word Shotgun in their name quite seriously when they hand you a list of demands.

Passion project or otherwise, I seriously hope the Psychonauts sequel does get made and does become successful. A game as focused and as well thought out as Psychonauts definitely deserves all the money it could shut up and receive.

Source: PCGamer

Far Cry 3 gets a trailer and release date, first unofficially and then officially

Subject: General Tech | February 15, 2012 - 03:22 PM |
Tagged: gaming, far cry, far cry 3, ubisoft

If you go by the just-released official trailer for Far Cry 3, the date you await is the 7th of September.  The leaked trailer which arrived 5 hours earlier stated a date of the 6th, which was almost the only difference between the two releases.  Rock, Paper, SHOTGUN has both trailers if you want to take a peek.

RPS_far-cry-33333.jpg

"This is the ‘official’ version of the Far Cry 3 trailer, which is almost exactly the same as the one that leaked earlier on today. I actually felt bad about linking to that one, but we’re all about disclosure here at RPS: whether you’re Ubisoft or John Walker, we’ll get to the heart of the story, no matter what. So, tell us Ubisoft: why did the leaked trailer have a Sept 6 release date when this one says Sept the 7th?"

Here is some more Tech News from around the web:

Gaming

 

Take that Moore! Electron beam etching set to take us to the 10nm process

Subject: General Tech | February 15, 2012 - 01:45 PM |
Tagged: photolithography, moores law, MAPPER, etching, electron lithography

Josh has covered the lithography process in depth in several of his processor reviews, watching the shrink from triple-digit to double-digit processes as manufacturers refine existing techniques and invent new ways of etching smaller transistors and circuits.  We've also mentioned Moore's Law several times: a written observation by Gordon E. Moore that "the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965," which has proven accurate far beyond his initial 10-year estimate for the continuation of the trend.  It is a measure of density, not processing power as many intarweb denizens interpret it.

With UV light being the solution most companies currently implement and expect to use for the near future, single-digit processes seem out of reach: the wavelength of UV light can only be pushed so small without very expensive workarounds.  That is why the news from MAPPER Lithography of Delft, The Netherlands is so exciting.  They've found a way to use directed electron beams to etch circuitry, and are testing 14nm and 10nm processes to the standards expected by industry.  This may be the process that takes us below 9nm and extends the doubling of transistor density for a few years to come.  Check out more about the process at The Register and check out the video below.
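For a sense of why each of those node names matters: to a first approximation, transistor area scales with the square of the feature size, so a full node shrink roughly doubles density. This is idealized scaling that real processes never hit exactly, but it shows the shape of the curve:

```python
# Idealized area scaling between process nodes: density goes up with the
# inverse square of the feature size, so a ~0.7x linear shrink ~= 2x density.
for old_nm, new_nm in [(20, 14), (14, 10)]:
    density_gain = (old_nm / new_nm) ** 2
    print(f"{old_nm}nm -> {new_nm}nm: ~{density_gain:.1f}x transistor density")
# 20nm -> 14nm: ~2.0x    14nm -> 10nm: ~2.0x
```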

"An international consortium of chip boffins has demonstrated a maskless wafer-baking technology that they say "meets the industry requirement" for next-generation 14- and 10-nanometer process nodes.

Current chip-manufacturing lithography uses masks to guide light onto chip wafers in order to etch a chip's features. However, as process sizes dip down to 20nm and below, doubling up on masks begins to become necessary – an expensive proposition."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

Google Public DNS Handles 70 Billion Requests A Day

Subject: General Tech | February 15, 2012 - 03:56 AM |
Tagged: web, Internet, google, dns

Yesterday morning the Internet juggernaut that is Google announced that its public DNS service has far surpassed their expectations for the experimental service. In fact, the company has taken the 'beta service' training wheels off of what they believe to be "the largest public DNS service in the world," and at 70 billion requests handled a day, the claim may not be far from the truth.

Interestingly, 70% of the service's users come from outside the US, and Google has announced that it is beefing up its overseas presence to serve them, with new access points in Australia, India, Japan, and Nigeria. Further, they are expanding their offerings in Asia in addition to maintaining the current servers in North America, South America, and Europe.

The company is continuing to provide its DNS service for free, and it ended the announcement by stating "Google Public DNS’s goal is simple: making the web—really, the whole Internet!—faster for our users."

For those curious, DNS is the technology that lets users punch in easy-to-remember text addresses and have their computers connect to the proper servers via numerical IP addresses (which are definitely not as easy to remember). It has been likened to the Internet equivalent of a phone book, and that description is an apt one: DNS servers maintain a mapping between domain names and IP addresses, so that a human can type a URL (uniform resource locator) and still reach the server by its IP address. DNSSEC makes things a bit more complicated as it adds further layers of security, but on a basic level the description fits.
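In code, that phone-book lookup is essentially a one-liner; here is a quick sketch using Python's standard library, with pcper.com standing in as the example name:

```python
# The "phone book" lookup DNS performs: human-readable name in,
# numerical IP address out, via whatever DNS server the OS is configured with.
import socket

hostname = "pcper.com"
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} -> {ip_address}")
```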

012.PNG

DNS benchmark "namebench" results

There are several free offerings besides the DNS service provided by your ISP, and open source tools like namebench can help you track down which DNS service is the fastest for you. Users point at DNS servers by IP address at one of several levels (in software, at the computer level, at the router level, and so on), and for the majority of people the modem and/or router will obtain the default DNS servers automatically from the ISP along with an IP address.

The default DNS is not your only option, however. Further, many routers can support up to three DNS IP addresses, and by connecting to multiple (separate) services you can achieve a bit of redundancy and maybe even a bit of speed. A fast DNS server can result in much faster web page load times, especially for sites that you don't normally go to (and thus are not cached).

In the case of the Google Public DNS, they operate on the following IP addresses.

  • 8.8.8.8
  • 8.8.4.4
  • 2001:4860:4860::8888
  • 2001:4860:4860::8844

(The latter two are IPv6 addresses, and were announced on World IPv6 Day.)
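If you want a rough namebench-style comparison of your own, timing a few queries against each candidate server is enough for a ballpark. A minimal sketch, assuming the third-party dnspython package (pip install dnspython):

```python
# Rough-and-ready DNS latency check using the third-party dnspython package.
# Each resolver is asked for the same A record and timed; lower is better.
import time

import dns.resolver  # pip install dnspython

servers = {"Google": "8.8.8.8", "Google alt": "8.8.4.4"}

for name, ip in servers.items():
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [ip]
    start = time.perf_counter()
    resolver.resolve("pcper.com", "A")  # dnspython >= 2.0; older versions use .query()
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{name} ({ip}): {elapsed_ms:.1f} ms")
```

Run it a few times, since a single query can be skewed by caching along the way; add your ISP's servers to the dictionary to see whether switching is worth it for you.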

If you have not looked into alternative DNS services, I encourage you to do so, as they can often be faster and more reliable than the default ISP-provided servers (though that is not always the case). It does not take much time to test, and it is an easy configuration tweak that can save you a bit of time getting to each web page (like PC Perspective!). Have you tried Google or other alternative DNS services, and did you see any improvements?

Source: Google