Subject: General Tech, Mobile | August 3, 2011 - 05:21 PM | Tim Verry
Tagged: SoC, qualcomm, PC, mobile, gaming, console
Mobile gaming has seen a sharp rise in popularity in recent years thanks to powerful smartphones and personal media players like the iPod Touch and its accompanying App Store. Fast mobile networks, powerful systems-on-a-chip (SoCs) capable of 3D graphics, lighting, and physics, and a large catalog of easy-to-download games have created an environment where people actually want to play games on their mobile devices. Many people now indulge in quick Angry Birds sessions while in long lines, on work breaks, or wherever they have time when out and about.
One area where mobile devices have not caught on, however, is at home, where they face stiff competition from game consoles and the PC. That competition has not stopped numerous manufacturers from trying to build an all-in-one mobile console that is portable yet easy to plug into a larger display at home. Everything from cheap controllers with built-in logic for playing old arcade games to smartphones with HDMI outputs costing hundreds of dollars has passed through the hands of consumers; however, the mobile console has yet to overcome the sheer mind share of dedicated game consoles and PCs.
According to AnandTech, Qualcomm, a popular manufacturer of ARM SoCs for smartphones, has announced plans to pursue that vision of an integrated, mobile console. The company claims that the increased power of next-generation SoCs will allow tablets and smartphones to deliver graphics better than those of current dedicated game consoles like the PS3 and Xbox 360. Because Sony and Microsoft want to extend the lives of their consoles well into the future, mobile technology may well surpass them. Qualcomm "is committed to delivering both the hardware and the software support needed to bring developers to these mobile platforms," according to AnandTech.
Qualcomm wants to bring portable consoles to the masses, powered by its SoCs and backed by its software. The tablets and smartphones would connect to displays over HDMI or wireless technology and would support controllers (or act as controllers themselves). Further, the games library would combine software from all platforms and rival the graphical prowess of the current consoles. Qualcomm hopes that a large library and capable hardware will be enough to sell consumers on a portable console as their all-in-one gaming device.
Portable consoles are similar to tablets and 3D television in that there is a major push every few years, a few devices come out, and then the idea dies off only to be reborn a few years later. Whether Qualcomm can pull off its plans for a portable console remains to be seen; however, the device is bound to catch on at some point. At the very least, this is certainly not the last time we will hear about the portable console. You can see more of Qualcomm's plans here.
What do you believe is holding back the portable console from catching on with consumers? Is it a good idea in the first place?
As Superman fans well know, Kal-El is faster than a speeding bullet, and NVIDIA’s new Tegra 3 Kal-El chip is no different. We reported on a demonstration of the Kal-El chip running games with dynamic lighting and realistic cloth physics earlier this year, and it is certainly an impressive mobile chip.
Speaking of "impressive," Asus chairman Jonney Shih was recently quoted by Forbes as saying that the upcoming Transformer 2 would be "impressive." While Shih could not share any details about the device in question, he did mention that Asus will be unveiling new tablets before the end of this year. With NVIDIA's Kal-El chip set to launch this month, the timing is certainly favorable for a quad-core Transformer 2.
The original Transformer; will the second iteration have even more oomph?
Of all the Android tablets, the Transformer has been one of the best received; it therefore seems likely that Asus would pursue another iteration of the device. Whether that device will be powered by the Tegra 3 chip is still uncertain, however. Do you think the rumor of a quad-core Transformer is likely, or is this something that is "too good to be true?"
Subject: Editorial, General Tech, Processors | August 3, 2011 - 06:11 AM | Scott Michaud
Tagged: Netburst, architecture
It is common knowledge that computing power consistently improves over time as dies shrink to smaller processes, clock rates increase, and processors do more and more in parallel. One thing people might not consider: how fast is the architecture itself? Think of computing in terms of a factory: you can increase the speed of the conveyor belt and you can add more assembly lines, but just how fast are the workers? There are many ways to increase the efficiency of a CPU, from tweaking the most common instructions or adding new instruction sets that simplify the task itself, to tuning the pipeline size for the proper balance between constantly feeding the CPU upcoming instructions and needing to dump and reload the pipe when you go the wrong way down an IF/ELSE statement. Tom's Hardware wondered about exactly this and tested a variety of processors released since 2005, with their settings modified so they could only use one core clocked at 3 GHz. Can you guess which architecture failed the most miserably?
Pfft, who says you ONLY need a calculator?
(Image from Intel)
The Netburst architecture was designed to reach very high clock rates at the expense of heat -- and performance. At the time, the race between Intel and its competitors was clock rate: the higher the clock, the better it was for marketers, despite a 1.3 GHz Athlon wrecking a 3.2 GHz Celeron in actual performance. If you are in the mood for a little chuckle, that marketing strategy fell apart when AMD decided to name its processors by performance ratings such as "Athlon XP 3200+" rather than by their actual clock rates. One of the major reasons Netburst was so terrible was branch prediction. Pipelining is a method of loading multiple commands into a processor to keep it constantly working; branch prediction is a strategy for speeding this up: when you reach a conditional jump from one chunk of code to another, such as "if this is true do that, otherwise do this," you do not know for sure what will come next, so the predictor says "I think I'll go down this branch" and loads the pipeline assuming that is true. If it is wrong, you need to dump the pipeline and correct the mistake. One way the Pentium 4's Netburst kept clock rates high was a ridiculously long pipeline, 2-4x longer than that of the first-generation Core 2 parts which replaced it; unfortunately, the Pentium 4's branch prediction was terrible, leaving the processor perpetually stuck dumping its pipeline.
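The pipeline-flush penalty described above can be put into rough numbers with a simple textbook cost model. The figures below (pipeline depths of roughly 31 and 14 stages, and assumed branch and misprediction rates) are illustrative assumptions, not measured data:

```python
def effective_cpi(base_cpi, branch_frac, mispredict_rate, flush_penalty):
    """Average cycles per instruction once branch-miss flushes are counted.

    Each mispredicted branch forces the pipeline to be dumped and refilled,
    costing roughly its depth in wasted cycles.
    """
    return base_cpi + branch_frac * mispredict_rate * flush_penalty

# Illustrative numbers only: assume ~20% of instructions are branches
# and both designs mispredict 8% of them.
netburst = effective_cpi(1.0, 0.20, 0.08, 31)  # ~31-stage Prescott-era pipe
core2    = effective_cpi(1.0, 0.20, 0.08, 14)  # ~14-stage Core 2 pipe

print(f"Netburst effective CPI: {netburst:.2f}")  # 1.50
print(f"Core 2 effective CPI:   {core2:.2f}")     # 1.22
```

Even with identical prediction accuracy, the deeper pipe pays more per miss; with the Pentium 4's worse predictor on top of that, the gap only widens.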
The sum of all tests... at least time-based ones.
(Image from Tom's Hardware)
Now that we have excavated Intel's skeletons to air them out, it is time to bury them again and look at the more recent results. On the AMD side, there has not been much innovation in efficiency: AMD is only now getting within range of the architectural efficiency Intel had back in 2007 with the first Core 2. Obviously, efficiency per core per clock means little in the real world, as it tells you neither the raw performance of a part nor how power efficient it is. Still, it is interesting to see how big a leap Intel made when it abandoned its turkey of an architecture, Netburst, and modeled the future on the Pentium 3 and Pentium M architectures. Lastly, despite the lead, it is interesting to note exactly how much work went into the Sandy Bridge architecture. Intel, despite an already large lead and a focus outside the x86 mindset, still tightened up its x86 architecture by a very visible margin. It might not be as dramatic as the abandonment of the Pentium 4, but it is laudable in its own right.
Subject: General Tech | August 2, 2011 - 07:43 PM | Jeremy Hellstrom
Tagged: asus, ROG, Vulcan ANC Pro, headset, audio
If you find yourself gaming in a noisy environment and are trying to keep your own contribution to the noise down by using a headset, it can be frustrating if you cannot hear the game you are playing. ASUS has a way to solve that, thanks to the active noise cancellation in its Republic of Gamers Vulcan ANC Pro Gaming Headset. Red & Blackness Mods tried out a pair for review and were impressed by the light weight of the headset as well as the detachable mic for when you don't need to communicate with teammates. They were not overly impressed with the sound quality, but as these are specifically designed for gaming that is not a major concern, and not attempting high-end audio helped keep the price down.
"Asus mostly known for their high end laptops and motherboards have recently started pumping out various accessories and even touchpads. Today we are taking a look at the Asus Vulcan ANC Pro Gaming headphones that you can pick up for around 50$. What type of quality and sound quality can we expect from these?"
Here is some more Tech News from around the web:
- ASUS Vulcan ANC (Active-Noise-Cancelling) Pro Gaming Headset Review @ HardwareHeaven
- Cooler Master Sirus Gaming Headset @ XSReviews
- Tt eSPORTS Shock One Gaming Headset Review @ eTeknix
- SonoCore Cindy COA-805 IEM @ reviewstash
- CM Storm Sirus Headset Review @ Hardware Secrets
- CM Storm Sirus @ OC3D
- TTesports Shock Spin Headphones @ Rbmods
- Grace Digital GDI-IRMS300 Internet Micro Hi-Fi Stereo System Review @MissingRemote
- Thermaltake Shock One Headset @ Bjorn3D
- Roccat KULO 7.1 USB Virtual Surround Headset Review @ Real World Labs
- Scythe Kama Bay AMP 2000 Rev. B Amplifier Review @ Madshrimps
- SteelSeries Siberia V2 for PS3 @ OC3D
Subject: General Tech | August 2, 2011 - 07:07 PM | Jeremy Hellstrom
Tagged: idf, intel developers forum, Ivy Bridge, haswell
At last year's Intel Developer Forum, the star of the show was Sandy Bridge, as we had not yet seen the chip in action. That seems longer than a year ago, but it was only last September, which means we are drawing close to the 2011 IDF. According to DigiTimes, this year Intel will focus on mobility products; as we know, the company is working on a new(ish) form factor it calls the Ultrabook, which will replace the CULV form factor we have known previously. Not many Ultrabook-branded Sandy Bridge products were released this year, and with the upcoming release of Ivy Bridge it seems there won't be many in the future, as Ivy Bridge is intended to do everything Sandy could while using less power. The other focus DigiTimes expects to see, as do we at PC Perspective, is more information on Haswell, the next generation of Intel chip architecture. We don't expect to see working silicon until 2013, but you can expect a lot more information about the instruction sets it will use, which is only to be expected at a developers conference.
"CPU maker Intel is set to host its Intel Developer Forum (IDF) show in San Francisco from September 13-15 and its plans for ultrabook, upcoming Ivy Bridge- and Haswell-based processors as well, as its strategy for next-generation tablet PC processor, are all expected to become focuses at the show, according to sources from PC players.
Intel, was originally set to enter the Ivy Bridge-based CPU generation in the fourth quarter of 2011, but after considering the yield rate of the 22nm process and the market status worldwide, the company, in the end, decided to postpone the launch of the new-generation CPU to March of 2012, allowing a smooth transition between the two generation of CPU structures.
In addition to the Ivy Bridge structure, Intel is also set to reveal the detail specifications of its Haswell processor, which will appear in 2013, at IDF in September, the sources noted.
As for ultrabook, Intel will display several completed models from the first-tier notebook players including Asustek Computer, Hewlett-Packard (HP) and Lenovo, at the show with Intel is also expected to provide the detail of its ultrabook design concept as well as its three stages of execution plan for the device.
For tablet PCs, Intel has been cooperating with Google to pair up its Atom Z670 processor with Android 3.0, while will showcase its latest progress in MeeGo and AppUP Center at the show."
Here is some more Tech News from around the web:
- Intel and Mobile Computing @ ThinkComputers
- In-Depth with Mac OS X Lion Server @ AnandTech
- HTML5 poses security risks @ The Inquirer
- Skype releases its Ipad app @ The Inquirer
- TechwareLabs Zotac Factory Tour in Dong Guan China
- Weekly Giveaway #9: Dirt 3 @ eTeknix
Subject: General Tech | August 2, 2011 - 03:43 PM | Tim Verry
Tagged: firefox, chrome, browser
The Firefox UX development team recently posted a presentation showing off some of the latest design and UI (user interface) improvements for Mozilla's popular Firefox web browser. While not all of the design choices shown in the presentation will make it into Aurora or other beta builds, they do indicate that Mozilla is at least considering shaking up its traditional interface for upcoming releases. The image below is one of the screenshots included in the presentation, and at first glance it may be mistaken for Google's Chrome browser. Upon closer inspection, however, it becomes clear that Mozilla has not simply copied Chrome's minimalist design: it has gone with a similar tab design, continued the transparency already present in certain builds, and sprinkled some Mozilla flair on top to create one possible look for a future Firefox browser.
Other proposed changes include a new icon-based menu (rather than word lists) located on the right side of the window, as well as an improved full-screen experience that seeks to give web apps the screen real estate they need. A new home tab and a revamped add-on manager interface are also proposed. As shown in the screenshot above, tabs that are not in focus have their backgrounds become fully transparent so that only the text is visible. This definitely helps the active tab stand out and may reduce the distraction users face when multiple tabs are open.
While these are only proposed changes, it is apparent that Mozilla is planning some kind of major UI overhaul if it can get users to accept it, and the next major release may well sport a slightly more Chrome-esque appearance with that special Firefox flair. What are your thoughts on the proposed designs; do they seem likely? If you are still using Firefox, what features of other browsers would you like to see it emulate?
Subject: General Tech | August 2, 2011 - 11:54 AM | Tim Verry
Tagged: windows, operating system, microsoft
Windows XP is almost old enough for revisionist historians to have a crack at it without anyone speaking out against them. That is, it would be if not for the large number of users still running the operating system at home and at work. The decade-old operating system has only now fallen below a 50% share of the market. More specifically, the slip occurred between June and July, when it fell 0.63% to a total of 49.94%.
The numbers are percentages of MS's total 87.66% market share.
In comparison, Windows Vista holds a much smaller 9.24% market share after dropping 0.28%. Microsoft's most recent operating system, Windows 7, meanwhile saw a gain of 0.74% to a total of 27.87% market share, which puts the newer operating system well on its way to overtaking the XP juggernaut. TechSpot has the full scoop on the market share situation, which you can read about here.
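To keep the two baselines straight (share of all machines versus share of Windows machines only), here is a small sketch using the figures above; the conversion arithmetic is mine, not TechSpot's:

```python
windows_total = 87.66  # Microsoft's share of the overall OS market (%)

# Overall-market share of each Windows version from the article (%).
overall = {"Windows XP": 49.94, "Windows Vista": 9.24, "Windows 7": 27.87}

# Convert each to its share of Windows installs alone.
within_windows = {name: share / windows_total * 100
                  for name, share in overall.items()}

for name in overall:
    print(f"{name}: {overall[name]:.2f}% of all machines, "
          f"{within_windows[name]:.1f}% of Windows machines")
# XP is still roughly 57% of Windows installs even at 49.94% overall;
# the listed versions don't quite sum to 87.66% because older releases
# (Windows 2000 and earlier) make up the remainder.
```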
Are you still using Windows XP?
Subject: General Tech | August 1, 2011 - 05:53 PM | Jeremy Hellstrom
Tagged: Ivy Bridge, x79, Waimea Bay, sandy bridge-e
It almost seems as if AMD is the only company managing to keep to its schedules, though you could argue that it doesn't have a CEO cracking the whip and pushing forward release dates. First NVIDIA's GPU woes, and now, thanks to VR-Zone, we know that the X79 Waimea Bay chipset won't be available until November, which significantly reduces the chances of it being under your Christmas tree. That should still get the chipset to major manufacturers in time for them to consider the platform for their 2012 lineups, but as system builders we can only hope that someone pushes a product out as quickly as possible, so we can pick it up and spend January dealing with the inevitable bugs that come from rushing something to market.
But wait, that's not all: how does an Ivy Bridge processor locked to a 100MHz base clock strike you? That seems to be what Intel is planning for both Ivy Bridge and Sandy Bridge-E, with a less-than-useful exception. Both chip architectures will be theoretically overclockable via the base clock, but not in the single-MHz steps we have become used to. Instead, Ivy Bridge will only let you jump from 100MHz straight to 133MHz, with no stops in between. Sandy Bridge-E will offer higher stops, but it will again limit you to those coarse frequency jumps, something the overclocking community is not going to appreciate.
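A quick sketch shows why coarse base-clock stops frustrate overclockers: the final CPU frequency is base clock times multiplier, so with only two bclk stops the reachable frequencies jump in large increments. The multiplier range here is purely hypothetical, chosen only to illustrate the gaps:

```python
def achievable_frequencies(base_clocks_mhz, multipliers):
    """All CPU frequencies (MHz) reachable from the allowed bclk stops."""
    return sorted({bclk * mult for bclk in base_clocks_mhz
                               for mult in multipliers})

# Hypothetical example: two bclk stops (per the Ivy Bridge rumor) and
# integer multipliers from 30 to 35.
steps = achievable_frequencies([100, 133], multipliers=range(30, 36))
print(steps)
# Note the dead zone: nothing between 3500 MHz (100 x 35)
# and 3990 MHz (133 x 30).
```

Compare that with fine-grained bclk adjustment, where an overclocker can dial in nearly any frequency between those two points.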
"We don't like to be the bringers of bad news, but it's come to our attention that Intel has decided to change its high-end consumer Waimea Bay platform one more time before it launches. The only good news is that we've managed to pin-point which month the platform is expected to launch and that is November and there are several reasons behind this choice."
Here is some more Tech News from around the web:
- AMD to launch high-end FX series processors in September @ DigiTimes
- Next Generation GPUs: AMD's Cray On a Chip? @ VR-Zone
- Q & A With Western Digital @ Tech ARP
- Q & A With Western Digital Part 2 @ Tech ARP
- TechwareLabs 2011 Video Interview with Intel Mobility Brand Manager
- Chrome Extension Helps Find Noisy Tabs @ Slashdot
- Sneaky online tracking used by major websites is exposed @ The Inquirer
- Bandai Namco Gaming Event July 2011 @ HardwareHeaven
- The Summer of Honeycomb, Part 3: Win a Samsung Galaxy Tab 10.1 @ AnandTech
- Win a Nokia E6 Smartphone @ t-break
Subject: General Tech | July 31, 2011 - 05:17 PM | Scott Michaud
Tagged: battlefield 3
A non-disclosure agreement exists for those times when certain people need to receive information, see demonstrations, and so forth before an organization wishes to release them to the public. We respect the wishes of the individuals and organizations that provide us with information that assists us in our jobs. Currently Battlefield 3 is undergoing private alpha testing for a select number of invitees. I am not one of those invitees (at least, as of this writing) and thus have no obligation to keep silent on information that was leaked to the public. That is convenient, of course, as some benchmark numbers have leaked pitting a wide variety of video cards and processors against DICE's upcoming shooter.
The first thing to realize, and one of the main reasons for the non-disclosure, is that development builds are development builds: optimizations will be made that will speed things up; enhancements will be made that may make things possibly slower but better for some reason or another. Do not assume that these numbers will even closely reflect the finished product, just the state of the game as it exists right now.
With that said and in mind: to expect a smooth experience at 1080p you should anticipate having a GTX 460 or Radeon HD 5830 as your minimum, and a GTX 560 Ti or Radeon HD 6950 if you want to hover around 60 FPS. The GTX 590 actually fell well below the GTX 580, and even below the GTX 570, which means SLI is not yet supported... no surprise for an unreleased game! On the CPU side, while no Sandy Bridge processors were tested, performance looks to plateau around the six-core Phenom IIs, upper Core i5, and lower Core i7 parts. The Battlefield 3 alpha really looks as if it appreciates four or more cores, thumbing its nose at all the dual-core offerings tested. If you would like to see more, check out GameGPU's page.
Subject: General Tech, Systems | July 30, 2011 - 07:53 AM | Scott Michaud
Tagged: usb computer, Raspberry Pi
I must say that, unlike cake, pie is the foundation of everlasting relationships -- like circumference and diameter! That, and cake always seems to end up in lies (yes, that horse is still twitchin'). While my personal favorite flavor is blueberry, I might just become fond of Raspberry Pi in the near future. We originally reported on the organization dedicated to providing computing technology to the masses a few months ago when it showed off its prototype computer-in-a-USB-stick. Since then there has been more progress on logistics as well as a firm specification for the PCB, and it aligns nearly perfectly with the original predictions.
That… doesn’t really look edible…
(Picture from Raspberry Pi)
The original prediction was a $25 device with a 700 MHz processor backed by 128MB of RAM and an OpenGL ES 2.0, 1080p-capable GPU. That still holds, but a second model will also be released for $35 with double the RAM and an extra USB port for peripherals, thanks to the addition of the SMSC LAN9512 two-device USB hub. The alpha board is slightly larger than the final design because of the ports required for debugging, and it contains an extra couple of PCB layers that will not be present in the final version. The device is still expected to ship within the next nine months (twelve from the original post), with the target narrowed slightly to likely sometime in 2011.