Subject: Editorial, General Tech, Systems | July 11, 2011 - 09:57 PM | Scott Michaud
Tagged: xbox, pc gaming
Last week we reported on Microsoft rolling their Games for Windows initiative into Xbox.com, and I essentially said that unless Microsoft is trying to roll their now-established Xbox brand into Windows, they are missing the point of PC gaming. This week we hear rumors that Microsoft may be trying to do exactly that. According to Insideris, Windows 8 will allow you to play Xbox 360 games on your PC. Despite the speculation this news has generated, the report does not state whether the complete catalog or only a subset of 360 games will be compatible with the PC.
Which came first? The console or the Newegg?
What does this mean for PC gaming? I am unsure at this point. A reasonable outcome would be that Xbox becomes a user-friendly brand for Microsoft’s home theatre PC initiatives, which would make a whole lot more sense of the Windows 8 interface outside of the tablet PC space. This would be a very positive outcome for the videogame industry as a whole, since it offers the best of Xbox for those who desire it along with the choice the PC platform provides.
This, however, contradicts Microsoft’s excessively strict stance on closed and proprietary video gaming platforms. Could Microsoft have been pushing their proprietary platform to gut the industry norms, knowing that at some point they would roll back into the long-standing, more-open nature of Windows? Could Microsoft be attempting to lock down PCs, meeting somewhere in the middle? We will see, but my hope is that the industry will finally move its art away from proprietary platforms. After all, why create a timeless classic if your platform will be end-of-lifed in a half-dozen years at best?
Subject: General Tech | July 11, 2011 - 08:46 PM | Jeremy Hellstrom
Tagged: input, mouse, cooler master, CM Storm
Cooler Master is really going all out in the peripheral market, as you can see from their latest gaming mouse, the Sentinel Z3RO-G. The 5600 DPI Storm Tactical Twin-Laser Sensor is standard issue in the Storm series, 128KB of onboard memory gives you multiple profiles for the 8 buttons, and it even features something called Rapid Fire Tactical Mode, which will probably be handy when Diablo 3 comes out. The unique feature on this mouse is an LED screen that displays your current sensitivity settings, which eTeknix really fell in love with.
"Today I will be taking a look at CM Storm’s latest offering- the Sentinel Z3RO-G. Just like CM Storm’s other products the Z3RO-G is aimed at the gaming market, and showcases many of the company’s famous features. The Z3RO-G is kitted out with a 5600DPI dual laser sensor which is easily changeable on-the-fly for a quick switch between precision sniping to rushing within an instant. It also has a unique LED display on the top to give you information about your current settings and is highly customisable using the advanced software included. So are these features useful, or just a marketing gimmick?"
Here is some more Tech News from around the web:
- NZXT Avatar S Mouse Review @ Hardware Secrets
- ZOWIE MiCO Gaming Mouse Review @ eTeknix
- XFX Warpad Gaming Mousepad Review @ Tweaknews
- Soyntech Inpput R490 @ XSReviews
- XFX WarPad Review @ OCC
- Synology USB Station 2 @ XSReviews
- Scan 2-Port HPU-300 NC SuperSpeed USB 3.0 PCI Express Card Review @ eTeknix
Subject: General Tech | July 11, 2011 - 03:38 PM | Jeremy Hellstrom
Tagged: pewpew, laser, DIY
There are a lot of instructions on the net covering the steps to build yourself a laser, from the large-scale models at Power Labs, which are not portable, to smaller-scale ones using DVD/Blu-ray lasers, which can't be used for much more than driving the family pet insane. Over at Hack a Day is a detailed project on how to build your own handheld pulse laser, which can certainly burn holes through thin metals and other unsuspecting inanimate objects. This particular build is powered by capacitors scrounged from disposable cameras, and as long as you add them in even numbers and ensure they are all the same rating, you can make it even more powerful.
"Self-declared Mad Scientist and Instructables user [Trevor Nestor] recently built a pulse laser pistol and decided to share his build process, so that you too can build a ray gun at home. The gun is made up of mostly scavenged components, save for the Neodymium:YAG laser head, which he purchased on eBay for about $100. He does say however, that you can score an SSY-1 laser from an old rangefinder, providing you hang out near a stockpile of decommissioned Abrams tanks."
Here is some more Tech News from around the web:
- How digital detectives deciphered Stuxnet, the most menacing malware in history @ Ars Technica
- Google+ privacy features are exposed @ The Inquirer
- Google Blocks co.cc From Search Results @ Slashdot
- Report: Microsoft Wants $15 Per Samsung Android Handset @ Linux.com
- Fujifilm Finepix F550EXR Digital Camera @ Maximum CPU
- The TR Podcast 91: Llano is Audi
- Summertime Challenge Giveaway @Hi Tech Legion
Subject: General Tech, Mobile | July 11, 2011 - 07:53 AM | Scott Michaud
Tagged: nook color, kindle
Amazon did not create the eBook reader market, but they created by far the most popular product in the category, the Kindle. Amazon overtook their main competitor, Sony, thanks to their content and the ubiquity of their service across multiple platforms beyond the Kindle device itself. Rumors have flown for quite some time now, from various sources, that Amazon would jump into the Android tablet space, likely to complement their Kindle line. In a humorously ironic twist, an eBook reader based on an Android tablet just unseated the Kindle as the most popular e-reader.
A little hot under the collardron?
Barnes and Noble entered the eBook reader market in late 2009, fighting an uphill battle against Amazon and a juvenile pun on their name (hehehe, “Nook eBook”). A year later they launched the Nook Color, an Android 2.1 tablet locked into a certain subset of applications available either pre-loaded or through their application store. This tablet, brainwashed into being an eBook reader, recently overtook the Kindle, finally shushing the naysayers of Barnes and Noble's entry into the tablet market. Heh – “Nook eBook”. It will be interesting to see how Amazon’s business evolves in the coming year or two as a result of competitive pressures and an evolving marketplace.
Subject: General Tech, Processors, Systems | July 10, 2011 - 06:45 AM | Scott Michaud
Tagged: Intel, ultrabook
Intel has been trying to push for a new classification of high-end, thin, and portable notebooks to offset the netbook flare-up of recent memory. Intel hopes that by the end of 2012 these “Ultrabooks” will comprise 40% of consumer notebook sales. What is the issue? They are expected to retail in the $1000 range, which is enough for consumers to buy both a dual-core laptop with 4 GB of RAM and a tablet. Intel is not fazed by this and has even gone to the effort of offering money to companies wishing to develop these Ultrabooks; the OEMs are fazed, however, and even with Intel’s pressing there is only one, the ASUS UX21, slated for release in September.
Asus sticking its neck out. (Video by Engadget)
For the launch, Intel created three processors based on the Sandy Bridge architecture: the i5-2557M, the i7-2637M, and the i7-2677M. At just 17 watts, these processors should do a lot on Intel’s end to support the Ultrabook branding of long battery life and an ultra-thin case, given the lessened need for heat dissipation. Intel also has two upcoming Celeron processors, which are likely the same ones we reported on two months ago. Intel has a lot to worry about when it comes to competition for their Ultrabook platform, though; AMD will have products that appeal to a similar demographic for half the price, and tablets might just eat up much of the rest of the market.
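Some loose napkin math shows why a 17 watt part matters for runtime. The battery capacity and draw figures below are guesses for a thin 11-inch machine, not published UX21 specs, and real mixed-use draw sits well below a CPU's TDP:

```python
# Rough battery-life ceiling from battery capacity (Wh) divided by total
# system draw (W). All numbers are illustrative guesses, not UX21 specs.
BATTERY_WH = 35.0

for label, system_draw_w in (("full CPU load + screen", 22.0),
                             ("light web browsing", 8.0)):
    hours = BATTERY_WH / system_draw_w
    print(f"{label}: ~{hours:.1f} hours")
```

Shave a few watts off the CPU's share and the light-use number climbs meaningfully, which is the whole pitch behind these low-TDP parts.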
Do you have a need for a thousand dollar ultraportable laptop? Will a tablet not satisfy that need?
(Registration not required for commenting)
122 years ago the Wall Street Journal put out its first issue; conversely, News of the World may have put out its last
Subject: General Tech | July 8, 2011 - 10:46 PM | Jeremy Hellstrom
If there is one piece of advice you can glean from our Forums this week, it is that letting an 8-year-old relative play with any piece of technology you value is a very bad idea; at best you will end up with a Miley Cyrus infection. If you are looking at setting up a new system and going for a nice overclock with your AMD or Intel CPU, maybe you should investigate some of the air coolers that Forum members have used successfully. That's not all: don't you hate whining PSUs, naughty SSDs, and overly picky RAM?
As well, you can catch the 161st iteration of the PC Perspective Podcast, get into an argument in the Lightning Round, trade kit in the Trading Post, or just go off the wall in the Off Topic Forum; the choice is yours.
Subject: General Tech | July 8, 2011 - 04:37 PM | Jeremy Hellstrom
Tagged: nvidia, amd, 28nm, kepler, maxwell
TSMC's 28nm wafer yields are having a negative effect on NVIDIA's scheduled release of their next generation of GPUs, no matter what the PR coming out of NVIDIA might suggest. That news comes from graphics card manufacturers who were hoping to release cards but have since seen NVIDIA's scheduled releases delayed by a year. While it may be true that TSMC is partly to blame for the delay, there is also talk of the chips' performance being lower than was expected and than is needed to challenge AMD. The news for NVIDIA gets even worse, as DigiTimes confirms that AMD is still on schedule with its 28nm chips. This may seem like a bit of deja vu, as we saw similar production problems with TSMC's initial 40nm chips, though those affected both major GPU makers more or less equally.
"Despite Nvidia CEO Huang Jen-hsun previously saying that the company is set to announce its new 28nm GPU architecture at the end of 2011 and 22/20nm in 2013, sources from graphics card makers have pointed out that Nvidia has already adjusted its roadmap and delayed 28nm Kepler and 22/20nm Maxwell to 2012 and 2014.
The sources believe that the delay is due to unsatisfactory yield rates of Taiwan Semiconductor Manufacturing Company's (TSMC) 28nm process as well as lower-than-expected performance of Kepler.
TSMC originally expected its 28nm capacity at Fab15 to be available in the fourth quarter of 2011 and was set to start pilot production for its 20nm process technology in the third quarter of 2012.
However, TSMC's other major client Qualcomm, currently, still has not yet adjusted its 28nm process schedule and is set to launch three new products, 8960, 8270 and 8260A using dual-core Krait architecture in the fourth quarter of 2011.
Meanwhile, AMD will follow its original schedule and enter the 28nm era in the first half of 2012. The company's next-generation graphics chips Southern Island as well as Krishna and Wichita processors, which will replace the existing Ontario and Zacate processors, will all adopt a 28nm process from TSMC."
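If you are wondering why yield troubles would hammer a big GPU like Kepler while Qualcomm's phone chips sail on unaffected, the simple Poisson die-yield model tells the story. The defect densities and die areas below are hypothetical round numbers for illustration, not TSMC figures:

```python
import math

# Poisson die-yield model: yield = exp(-die_area * defect_density).
# Big dies are exponentially more likely to catch a killer defect.
def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    return math.exp(-die_area_cm2 * defects_per_cm2)

# D0 = defects per cm^2, which falls as a new process matures.
for d0 in (0.5, 1.0, 2.0):
    big_gpu = poisson_yield(5.0, d0)    # ~500 mm^2 flagship GPU (illustrative)
    small_soc = poisson_yield(0.6, d0)  # ~60 mm^2 phone SoC (illustrative)
    print(f"D0={d0}: big GPU {big_gpu:.1%}, small SoC {small_soc:.1%}")
```

At an immature-process defect density, the small die still yields a usable majority while the big die yields almost nothing per wafer, which is why GPU makers feel a new node's growing pains first.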
Here is some more Tech News from around the web:
- VIA releases a dual-core 1.6GHz EPIA board @ The Inquirer
- Ubuntu ushers me out of the Windows XP era @ The Tech Report
- Your Friday must-see video: 14 minute Bioshock Infinite demo @ Ars Technica
- Last flight of the Space Shuttle: a 30-year retrospective @ Ars Technica
- Google: Go public on Profiles or we'll delete you @ The Register
- AMD's Brazos E-450 detailed @ Fudzilla
- Only jailbroken iPhones, iPads can be safe from latest vuln @ The Register
- TRENDnet 450Mbps Wireless N USB Adapter @ Maximum CPU
- ASUS USB-N13 802.11n Network Adapter Review @ ThinkComputers
- The Summer of Honeycomb, Part 1: Win an ASUS Eee Pad Transformer @ AnandTech
- Modders-Inc June's FRotM Winner - The Ultimate Computer Desk
Subject: General Tech | July 8, 2011 - 10:02 AM | Tim Verry
Tagged: memory leak, firefox, bug fix, aurora
With the recent change in Firefox's browser release schedule, Mozilla has been able to accelerate the release of bug fixes and new features. One bug that has plagued a number of Firefox users for a long time is a memory leak that could see Firefox eating up far more memory than it is supposed to use.
In addition to mitigating the memory issues, the new build promises faster start-up times on Mac OS X, Windows, and Linux, as well as Firefox Sync and enhanced font rendering.
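For those unfamiliar with what a leak actually looks like, here is a generic sketch of the pattern. This is purely illustrative and is not Firefox's actual bug: a long-lived structure keeps references alive, so the memory can never be reclaimed no matter how diligent the garbage collector is:

```python
# Generic leak pattern (illustrative only, not Firefox's actual bug):
# a structure that lives for the life of the program retains a reference
# to every payload it ever sees, so none of them can ever be freed.
_history = []  # never cleared, grows for the life of the process

def load_page(url: str, content: bytes) -> None:
    _history.append((url, content))  # nothing ever evicts old entries,
                                     # so memory use only climbs over time
```

Fixing this class of bug usually means finding the one long-lived container or callback that is silently pinning everything else in memory.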
Subject: General Tech | July 8, 2011 - 08:35 AM | Tim Verry
Tagged: thunderbolt, sony, pci-e, optical
See that blue port that looks like USB 3.0? It actually has some optical prowess up its sleeve
Sony is well known among technology enthusiasts as a company that loves to take the proprietary route; however, in a rather paradoxical twist, the new optical port on Sony's VAIO Z did not start out proprietary. In fact, it only became proprietary after Intel and Apple changed the design of the connection that came to be named Thunderbolt.
Both Thunderbolt and the new Sony connection are based on Light Peak, the optical standard championed by Intel that promised up to 100Gbps connections over 100-meter cables (though only in lab conditions). OEMs influenced Intel into postponing the optical variant of Light Peak in favor of a cheaper electrical variant, which is what today's Thunderbolt implementation is. Thunderbolt runs over copper using active cables and promises 10Gbps (20Gbps bidirectional) transfers. The original Light Peak connector design looked like a USB connector and would have supported USB devices as well as accommodated the Light Peak cables. However, a few months before what would become Thunderbolt launched, Apple and Intel decided to change the connector to a Mini DisplayPort connection.
The Sony connection, on the other hand, employs the USB-like connector and is capable of handling USB 2.0 and USB 3.0 devices as well as the Sony VAIO Z's Power Media Dock, which uses the optical connection that is "based on Light Peak," according to This Is My Next. Thunderbolt devices will not be able to plug into the VAIO Z's new optical connector, and Sony has not released any specifications on what it is capable of; however, given the dock's inclusion of a Blu-ray drive, lots of I/O options in the form of VGA, DVI, HDMI, one USB 2.0 port, one USB 3.0 port, and Gigabit Ethernet, plus a discrete 1GB AMD HD 6650M graphics card, the connection (whatever its specific transfer capabilities) seems to be no slouch in the transfer speed department.
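To put those link speeds in perspective, here is a quick comparison. Since Sony has not published the port's rate, this only compares the known standards, at raw line rate with protocol overhead ignored:

```python
# Seconds to move a single-layer Blu-ray image (25 GB) at each interface's
# raw line rate. Real-world throughput is lower due to protocol overhead.
PAYLOAD_GB = 25.0

for name, gbps in (("USB 2.0", 0.48), ("USB 3.0", 5.0), ("Thunderbolt", 10.0)):
    seconds = PAYLOAD_GB * 8 / gbps   # GB -> gigabits, then divide by rate
    print(f"{name} ({gbps} Gb/s): ~{seconds:.0f} s")
```

Anything in the Light Peak family should land toward the fast end of that table, which is what makes driving a whole dock full of I/O over one cable plausible.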
This Is My Next has the full story on how Sony's (now) proprietary connection joined the company's lineup of proprietary technology despite Sony's (surprising) efforts to use a non-proprietary standard, which you can read here. It is certainly an interesting tale of karma and surprise. What are your thoughts on the new connection?
Subject: Editorial, General Tech | July 8, 2011 - 04:29 AM | Scott Michaud
Tagged: mod, battlefield 3
The Battlefield franchise has had a somewhat indecisive history with the mod community. Battlefield 2 was developed in part by a mod team for the first game, Battlefield 1942, and mod tools were provided for several of the franchise's releases. Recently DICE shifted their focus onto the console spinoff, Bad Company; while the second game in that series was also released for the PC, neither featured mod tools. Now that DICE has returned to the original canon with Battlefield 3, there were hopes that mod tools would return with the franchise, but according to DICE that is not the case.
These tools are hard, just look at the destructibility, you wouldn’t like it…
German gaming site GameStar met up with DICE CEO Patrick Soderlund to discuss Battlefield 3. Soderlund answered an array of questions from the community about the Bad Company 2 friends list, alternatives to the commander mode, and the potential future of Mirror’s Edge. When questioned about mod tools, Soderlund did not rule out the possibility of their arriving in the future, but he might as well have done so. He contends that Frostbite 2 is too difficult for modders to deal with (which historically means: “the tools barely work for us, and we are not going through the effort of polishing them for public use”).
Surprisingly, to those who know me, I can agree with DICE’s stance on the issue. If your mod tools do not meet the level of polish you require for a release, then do not release them; provided, of course, you do not actively harm the creation of mods. With that in mind, the mod community is what will keep your game flowing with new content for a little upfront cost. If your game's sales tail is shorter than you anticipated, this should be the first place to look.