Subject: General Tech, Mobile | June 22, 2011 - 12:05 PM | Jeremy Hellstrom
Tagged: arm, amd, texas instruments, snapdragon, amazon, tegra
AMD is not the only company forging a new relationship with ARM, as we saw during the AMD Fusion Developer Summit. Several other manufacturers are making good on statements made while waiting for AMD and are going to be selling ARM-based notebooks. These companies are not on the fringe of the market either; major vendors like ASUS are releasing quad-core ARM-based notebooks built around Snapdragon, Tegra or TI silicon. DigiTimes has the scoop here, as well as news on a tablet from Amazon running an unspecified TI processor, which we should see by August.
"Several vendors, including Samsung Electronics, Toshiba, Acer and Asustek Computer, plan to develop ARM architecture notebooks, with products possibly to be launched as early as the end of 2011, according to industry sources.
The sources pointed out that ARM-based systems using Android were already launched under the smartbook name two years ago with Toshiba and Lenovo both launching products in the retail channel. However, due to weaker than expected demand, the related products were soon phased out of the market.
Since ARM's CPU has already been upgraded from single-core two years ago to quad-core with a significant increase in performance, while the platform's storage capacity has also seen significant improvements, and an enhanced user interface, ARM is already capable of launching notebook products that are able to run for a long period of time, and if the price is attractive, there is a great chance for the products to create a brand new market segment in the IT industry.
Asustek has already made plans to launch a 13-inch ARM-based notebook adopting Nvidia's processor with Android.
The sources pointed out that there are already several brand vendors reportedly set to launch ARM-based notebooks with prices lower than US$299 to compete for market share and the vendors' processor choices include Nvidia's Tegra, Qualcomm's Snapdragon and processors from Texas Instruments."
Here is some more Tech News from around the web:
- Google Chrome extension detects dangerous websites @ The Register
- Programmers urged to code with their tootsies @ The Register
- The Linux Kernel Power Problems On Older Desktop Hardware @ Phoronix
- Making Airsoft guns far more potent @ Hack a Day
- AMD Rejects BAPCo's SYSmark 2012 - Should We? @ Techgage
Subject: General Tech, Storage | June 21, 2011 - 10:26 PM | Scott Michaud
Tagged: skydrive, microsoft
There are a number of reasons someone would want their data accessible from the internet, and there are a number of services that provide that capability in many different ways. If you are looking to collaborate on a small project with automatic syncing then you will probably find your way to Dropbox. If you are looking to access your music collection from your variety of devices then you will probably like Amazon or Google’s music lockers. Should you be looking to migrate a business to a really large online storage system then Amazon S3 might be worth hooking into. If you are a home user who wishes to store and share your photos, documents, and videos with friends and family then Microsoft recently updated their SkyDrive service to help users like you; it should be available to you right now.
Why start the video from the desktop, Microsoft? A guy has feelings you know.
One feature I wish Microsoft had implemented in SkyDrive at some point over the last few years is an easy method to map your SkyDrive account to a drive letter on your computer. Sadly that feature is still not present. What are present are features that make Microsoft’s service look much more user friendly and much more like a native application. The new photo browser looks quite a bit like the Windows Phone 7 tile interface, with photos shown in their original aspect ratio fitting together like a puzzle. There is also a nice-looking content browser that slides both pictures and videos across a viewing screen with thumbnails below for selection. With features like these and a focus on cross-browser support, it is obvious that Microsoft is looking to be your family’s content hub and prevent Facebook from getting that much more powerful in this space.
What do you use, if anything, to share content with friends and family?
Subject: General Tech, Mobile, Shows and Expos | June 21, 2011 - 06:58 PM | Scott Michaud
Tagged: Huawei, CommunicAsia, Android 3.2
There seems to always be a trade show going on at some corner of the ellipsoid world particularly at this time of the year. Down in Singapore the CommunicAsia 2011 exhibition is on until the 24th and news is starting to trickle out about advancements in communication technology. If you were holding your breath until Android reached version 3.2 on devices you can almost finally exhale, if you are still conscious because you can at best hold your breath for like 8 minutes and Android products are not that quick to ship. Yet.
Seventh floor… going up… ... WHAMMY BAR!!!
Huawei announced on the 21st that they are releasing a 7-inch tablet based on Android’s 3.2 release. The tablet will feature a dual-core 1.2 GHz processor from Qualcomm but no mention of how much system RAM it will contain as it still allegedly depends on partners. The capacitive touchscreen will be IPS-based at a 217 PPI pixel density. After a little trigonometry: a 7-inch screen will have a resolution somewhere between 1280x720 and 1366x768 if its pixel density is 217 pixels per inch. The unit itself is capable of outputting 1080p to an external display through HDMI. There are currently no details towards a price, but Huawei stated that there are no plans for a Wifi-only version. The unit is expected to ship in the third quarter of this year.
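The back-of-the-envelope math works like this: multiply the diagonal size by the pixel density to get the number of pixels along the diagonal, then split that between width and height according to the aspect ratio. A quick sketch in Python; note that Huawei did not state an aspect ratio or resolution, so the 16:9 default below is our assumption:

```python
import math

def resolution_from_ppi(diagonal_inches, ppi, aspect=(16, 9)):
    """Estimate a screen's pixel resolution from its diagonal size and
    pixel density, assuming square pixels and the given aspect ratio."""
    w, h = aspect
    diagonal_px = diagonal_inches * ppi      # pixels along the diagonal
    scale = diagonal_px / math.hypot(w, h)   # pixels per aspect-ratio unit
    return round(w * scale), round(h * scale)

# 7-inch screen at 217 PPI, assuming 16:9
print(resolution_from_ppi(7, 217))  # (1324, 745) -- between 1280x720 and 1366x768
```

A 16:10 panel at the same size and density would instead land around 1260x787, so the exact resolution depends on which aspect ratio Huawei actually chose.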
Subject: Editorial, General Tech | June 21, 2011 - 02:36 PM | Ryan Shrout
Tagged: VIA, sysmark, nvidia, Intel, benchmark, bapco, amd
It seems that all the tech community is talking about today is BAPCo and its benchmarking suite called Sysmark. A new version, 2012, was released just recently and yesterday we found out that AMD, NVIDIA and VIA have all dropped their support of the "Business Applications Performance Corporation". Obviously those companies have a beef with the benchmark as it is, yet somehow one company stands behind the test: Intel.
Everyone you know of is posting about it. My twitter feed "asplode" with comments like this:
AMD quits BAPCo, says SYSmark is nutso. Nvidia and VIA, they say, also. http://bit.ly/kHvKux
AMD: Voting For Openness: In order to get a better understanding of AMD's press release earlier concerning BAPCO... http://bit.ly/kNtKkj
Ooh, BapCo drama.
Even PC Perspective posted on this drama yesterday afternoon saying: "The disputes centered mostly over the release of SYSmark 2012. For years various members have been complaining about various aspects of the product which they allege Intel strikes down and ignores while designing each version. One major complaint is the lack of reporting on the computer’s GPU performance which is quickly becoming beyond relevant to an actual system’s overall performance. With NVIDIA, AMD, and VIA gone from the consortium, Intel is pretty much left alone in the company: now officially."
Obviously while cutting the grass this morning this is the topic swirling through my head; so thanks for that everyone. My question is this: does it really matter and how is this any different than it has been for YEARS? The cynical side of me says that AMD, NVIDIA and VIA all dropped out because each company's particular products aren't stacking up as well as Intel's when it comes to the total resulting score. Intel makes the world's fastest CPUs, I don't think anyone with a brain will dispute that, and as such on benchmarks that test the CPU, they are going to have the edge.
We recently reviewed the AMD Llano-based Sabine platform and in CPU-centric tests like SiSoft Sandra, TrueCrypt and 7zip the AMD APU is noticeably slower. But AMD isn't sending out press releases and posting blogs about how these benchmarks don't show the true performance of a system as the end user will see. And Intel isn't pondering why we used games like Far Cry 2 and Just Cause 2 to show the AMD APU dominating there. Why? Because these tests are part of a suite of benchmarks we use to show the overall performance of a system. They are tools which competent reviewers wield in order to explain to readers why certain hardware acts in a certain way in certain circumstances.
Continue reading for more on this topic...
Subject: General Tech | June 21, 2011 - 11:59 AM | Jeremy Hellstrom
Tagged: audio, wireless audio, headset, razer
RAZER's new headphones are wireless, powered by three rechargeable AAA batteries which are easily accessible, so that when they do finally stop holding a charge you can easily get at them to swap them for new ones. The charging dock is quite well designed, giving you something more stylish than a nail to hang your headphones on when they are not in use, as well as charging them. When you are using them you will enjoy interference-free quality audio, both sending and receiving, if the review at Mad Shrimps is anything to go by. They liked this headset more than other wireless sets they have used in the past and compared it favourably to wired sets.
"The RAZER Chimaera Gaming Headset offers wireless connectivity for maximum liberty of movement, features large cups for comfortable wear during long gaming sessions, a uni-directional microphone with a flexible mic boom and an easy to use charging dock."
Here is some more Tech News from around the web:
- SteelSeries Spectrum 7XB Gaming Headset Review @ HardwareHeaven
- Roccat Kulo Stereo Headset Review @ eTeknix
- SteelSeries 7H For i Devices @ OC3D
- Arctic Sound P351 Gaming Headset Review @ HardwareLOOK
Subject: General Tech | June 21, 2011 - 11:38 AM | Jeremy Hellstrom
Tagged: opteron, interlagos, bulldozer, amd
It might just be that ISC is the perfect place to show off a new chip, or it may have been Intel's display of the 50-core Knights Corner silicon yesterday; whatever triggered it, we finally get a look at AMD's Bulldozer. A 1U server by Supermicro contained two 16-core Bulldozer chips, though other vendors are claiming to be able to fit a 4-socket system in the same size case. Those sweet-talking wonks over at The Inquirer not only talked their way into a few photos of the system, they were even allowed to fondle it, which revealed heatsinks cool enough to touch even when running POVRay; that lends credence to the idea of 4 CPUs, or 64 cores, in a 1U box. We are still looking at Q3 for a release of the new Opteron architecture, with no news at all as to AMD's plans to turn that architecture into an APU in a later generation of chips.
What a little chutzpah gets you
"CHIP DESIGNER AMD chose the International Supercomputing Conference (ISC) to finally demonstrate a working Bulldozer system.
At AMD's ISC stand one could find several 2U and 4U servers built with older Opteron chips, but it was a 1U pizza box server made by Supermicro that housed two 16-core Bulldozer chips running live demonstrations of POVRay. This is the first time that AMD has publicly displayed its next generation Opteron processor, codenamed Bulldozer."
Here is some more Tech News from around the web:
- 13-Year-Old Password Security Bug Fixed @ Slashdot
- McAfee announces Wavesecure security software @ The Inquirer
- Microsoft could buy RIM @ The Inquirer
- Intel to launch Ivy Bridge in March 2012 @ DigiTimes
- Google revives TV buzz with SageTV buy @ The Register
- Google bypasses admin controls with latest Chrome IE @ The Register
- Resistive memory: how small can you go? @ NanotechWeb
Subject: General Tech | June 21, 2011 - 03:33 AM | Scott Michaud
Tagged: vsis, sudo, PlanetLab, linux
The Unix-based computer has always been ahead of the curve when it comes to enforcing permission levels upon its users. Back during the infancy of operating systems the idea of permissions did not compute with many of the platform developers. Today it is next to impossible to imagine a modern operating system without some sort of hierarchy of trust. Historically there were various methods of controlling access: first there was “su”, which temporarily logged you in as another user; then there was “sudo”, which let you execute individual commands as another user rather than log in as them; now PlanetLab claims to have a better offering, called “Vsys”.
Vsys was created to allow finer control over which user is allowed which action. One feature is the ability to create extensions, scripts built from executable files, to define what is permissible and what is not. It is apparently possible to permit certain combinations of commands while forbidding other similar combinations. PlanetLab, the creator of Vsys, is a research network headquartered at Princeton University. From the wording of the article it appears as if Vsys was an internal tool developed to give their researchers more specific access to what was necessary, which has now been released publicly. While it has been stated that Vsys will not replace sudo for the common user, it should be useful for administrators of larger groups of users.
Subject: General Tech, Storage | June 20, 2011 - 09:06 PM | Ryan Shrout
Tagged: storage, raid, network attached storage, NAS, drobo
Just Delivered is a new section of PC Perspective where we share some of the goodies that pass through our labs that may or may not see a review, but are pretty cool nonetheless.
When the time is right for dedicated network storage and you don't want to go through the hassle or complication of building your own FreeNAS or other type of device, one of the best options on the market according to our own Allyn Malventano is a Drobo.
For an upcoming review we just received a new Drobo FS, the network attached version of the Drobo lineup. Available in both a standard and a "Pro" model, the former with 5 bays and the latter with 8, they are about as idiot-proof and easy to set up as a NAS can be.
The Drobo FS only has a single connectivity option: the Gigabit Ethernet port for connection to your primed-and-ready router. Adding or swapping hard drives for larger models is super easy and the "BeyondRAID" technology makes it reliable as well as simple to use.
We are looking forward to putting the Drobo FS to the test in the coming days and reporting back to you on the performance, features and reliability of it.
Subject: General Tech, Processors | June 20, 2011 - 04:46 PM | Scott Michaud
Tagged: VIA, sysmark, nvidia, bapco, amd
People like benchmarks. Benchmarks tell you which component to purchase while your mouse flutters between browser tabs of various Newegg or Amazon pages. Benchmarks let you see how awesome your PC is because often videogames will not for a couple of years. One benchmark you probably have not seen here in a very long time is Sysmark from the Business Applications Performance Corporation, known as BAPCo to its friends and well-wishers. There has been dispute over the political design of BAPCo and it eventually boiled over with AMD, NVIDIA, and VIA rolling off the sides of the pot.
Fixed that for you
The disputes centered mostly over the release of SYSmark 2012. For years various members have been complaining about various aspects of the product which they allege Intel strikes down and ignores while designing each version. One major complaint is the lack of reporting on the computer’s GPU performance which is quickly becoming beyond relevant to an actual system’s overall performance. With NVIDIA, AMD, and VIA gone from the consortium, Intel is pretty much left alone in the company: now officially.
Subject: General Tech | June 20, 2011 - 12:11 PM | Jeremy Hellstrom
Tagged: Intel, mic, larrabee, knights corner, 50 GPGPU
Knights Corner is not exactly Larrabee, but the ideas behind the two are very similar: a large number of GPGPU cores integrated alongside a CPU, with Intel now using a Xeon core as opposed to a Pentium, and the GPGPU cores hooked together in a ring much like Larrabee's ring of Pentium cores. The design is proven, as Intel has sold units of the previous generation Knights Ferry, and it offers a feature that a lot of programmers are going to appreciate: instead of needing to learn a new language like CUDA or OpenCL, standard x86 scalar code is used to program these chips. This architecture is also expected to scale very well, for as ARM recently pointed out, only specific multithreaded applications continue to scale well as more cores are added. Drop by The Inquirer for more information.
They will likely be sold as PCIe card like the Knights Ferry card pictured above.
"CHIPMAKER Intel has announced its second generation hybrid core technology codenamed 'Knights Corner'.
Knights Corner is Intel's second chip in its Many Integrated Core (MIC) chip line and will feature Xeon X86 cores and more than 50 GPGPU cores loosely based on what was previously known as Larrabee. Knights Corner will be fabricated using Intel's 22nm tri-gate process node beginning in 2012, though the firm would not be drawn on the exact core count at this time."
Here is some more Tech News from around the web:
- Japan's 8-petaflop K Computer Is Fastest On Earth @ Slashdot
- Intel admits that Moore's Law is not enough @ The Inquirer
- When WiFi doesn't work: a guide to home networking alternatives @ Ars Technica
- Western Digital Livewire PowerLine AV Kit @ TechwareLabs
- US reveals Stuxnet-style vuln in Chinese SCADA 'ware @ The Register
- Adobe offloads unwanted Linux AIR onto OEMs @ The Register
- Designcord 5 Metre Autorewind Cable Reel Extension Lead Review @ eTeknix
- x264 HD Benchmark 4.0 @ TechARP
- ArtRage: quality digital painting on the cheap @ Ars Technica
- Ultra Simple 360-degree Photo Hack @ Make
- AMD Developer Summit lacks Bulldozer details @ The Inquirer
- Nokia Connections 2011 - Our Expectations @ t-break
- DreamHack Summer Festival kicks off! Day One! @ eTeknix
- Interview with Ziad Matar of Qualcomm @ t-break
- Patriot Xporter XT Rage 32GB Flash Drive Giveaway! @ ThinkComputers
- Weekly Giveaway #2: Foxconn Flaming Blade GTI, Innergie mCube Lite, SteelSeries Spectrum AudioMixer, StarTech ExpressCard eSATA Controller Adapter Card @ eTeknix
Subject: General Tech, Displays | June 20, 2011 - 04:03 AM | Scott Michaud
Tagged: widi, D-Link
There are a lot of benefits of having a home theatre PC but still one major drawback: having the PC by the TV. Intel has worked hard to find a solution and released the specification under the name “WiDi”, a wireless display specification that lets you share your monitor with an HDTV attached to a wireless receiver box. D-Link has just recently launched their WiDi receiver in the US with Canada coming next month; will WiDi start picking up market share with more capable devices?
Why do network appliances these days look like pillows?
(Image from D-Link)
The D-Link MainStage (known as DHD-131 to its friends) has only a power cable to its name apart from your choice of video and audio connection to your TV or sound system. For choice of connection you have two video options and three audio options: on the video side you have HDMI for your high-resolution viewing and standard RCA for your standard definition devices; on the audio side you have optical audio or HDMI for surround and white and red RCA for stereo. Apart from a power button and a reset button that is the whole of this unit.
Plastic case with on/off button. Baby got back shots.
(Image from D-Link)
One thing that typically holds back other implementations of WiDi that I have seen, and I assume this is no exception, is latency. The slight lag when controlling a media program or browsing a website is acceptable; however, it would really hold back the use of a PC as a console replacement unless the video card is directly connected to the TV, which is a definite shame but to be expected given the WiFi bandwidth we are talking about. If you happen to be interested in this solution, it retails for just under $130.
Subject: Editorial, General Tech | June 20, 2011 - 03:24 AM | Tim Verry
Tagged: simulator, networking, Internet, cyber warfare
Our world plays host to numerous physical acts of aggression every day, and until a few years ago those acts remained in the (relatively) easily comprehensible physical world. However, the millions of connected servers and clients that overlay the nations of the world have rapidly become host to what is known as “cyber warfare”: subversion and attacks against another people or nation through electronic means, by targeting its people or its electronic and Internet-based infrastructure.
While physical acts of aggression are easier to examine (and gather evidence on) and attribute to the responsible parties, attacks on the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an ethical debate over whether a physical response in the form of military action is an appropriate answer to online attacks.
It seems as though the Pentagon is seeking the answers to the issues of attack attribution and appropriate retaliation methods through the usage of an Internet simulator dubbed the National Cyber Range. According to Computer World, two designs for the simulator are being constructed by Lockheed Martin with a $30.8 million USD grant and Johns Hopkins University Applied Physics Laboratory with a $24.7 million USD grant provided by DARPA.
The National Cyber Range is to be designed to mimic human behavior in response to various DefCon and InfoCon (Informational Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution as it simulates offensive and defensive actions on the scale of nation-backed levels of cyber warfare. Once the final National Cyber Range design has been chosen by DARPA from the two competing projects (by Johns Hopkins and Lockheed Martin), the government would be able to construct a toolkit that would allow them to easily transfer and conduct cyber warfare testing from any facility.
Image courtesy Kurtis Scaletta via Flickr Creative Commons.
Subject: General Tech, Mobile | June 19, 2011 - 12:22 AM | Scott Michaud
Tagged: tablet, sony, S2, S1
We are going to see quite a few Android-based tablets come out in the next few months as the flood gates open for tablet creators. Strong rumors have been pointing to Amazon stepping into the tablet space this fall to extend its Kindle portfolio, and Amazon is generally very successful when it decides to enter a market; yet that did not deter Sony from preparing to dive into the tablet space as well. Sony is preparing to launch a 9-inch tablet and a dual-screen 5.5-inch tablet in the autumn, and has released a video ad campaign to build hype for that event.
This “Two Will” Pass
As you can tell from watching the video, it says little about the product except that they slide really quickly, absolutely love someone, cast ominous shadows, and can kill action figures with lightbulb mind bullets. Sony did mention that this is just the first episode of five, so it is possible that the later videos will be more informative. However, if you just want to see what an Echochrome 2-esque city has to do with Android tablets then be sure to watch the next four commercials.
We apologize for the lack of a podcast, but you don't often get a chance to see your city humiliate itself
Subject: General Tech | June 17, 2011 - 06:36 PM | Jeremy Hellstrom
Tagged: friday, PC Perspective Forums
Before your weekly tour through the PC Perspective Forums, it would behoove you to check out the bottom of some of our front page stories. If you click on the comments link, or just scroll down after clicking on an article, you will notice it is possible to have a discussion about that article right there on the front page with other readers and with the creator of the review or news post. It is easy and you don't even need to sign up, though we would prefer that you do, as membership at PC Perspective does have its privileges. If you want to remain anonymous or unverified then certainly do, though we do require you to fill something into the email address box and read a captcha, as there are still spammers out there on the internets.
To get really in-depth advice and opinions you should head off to the PC Perspective Forums; we can't give you in-depth instructions on using Microsoft's debugging tools in the comments section, but it is a piece of cake (slightly old cake, but still) to provide step-by-step instructions with pictures in the Forum itself. Some things just shouldn't be on the front page, but in the forums you will find kind souls ready to help your wetware as well as your hardware. Sometimes you will even find independent reviews in the Forums.
In the Cases'n'Cooling Forum, a new case mod has appeared; the Blood Ice HAF 922 is worth a look whether you are into case modding or just want to see an impressive tool set and workbench. In the Storage Forum a member is having an unpleasant time with a recent SSD upgrade as is someone in the Linux Forum.
The live watchers are already upset with us and the rest will just be receiving the bad news, but for the first time in quite a while we failed to provide our dedicated viewers a fresh PC Perspective Podcast on Wednesday. With Ryan in Seattle schmoozing with AMD and a pending riot of Vancouver residents, we decided to call it off. Perhaps next week we shall torture you with a double length episode?
Subject: General Tech | June 17, 2011 - 02:37 PM | Scott Michaud
We have long been battling online menaces that are looking to generate money off of the grief of others. It used to be simple for the attack to be successful: release virus; ???; profit. Now that worms are much less common, the focus has shifted from invading a person’s computer to tricking the person into letting you into their computer, or attacking the service they are accessing. Now, what was once a far-fetched joke by a popular comic strip is true: people are being contacted at home and told to infect their computer.
Your call is VERY important to us.
The story for security has always been the same: be careful what you do, keep your attack surface as small as possible, and limit the damage in the event of a breach. You need to be aware, regardless of what platform you utilize, that you are only as safe as your level of complacency allows. If someone is attempting to get you to do something quickly, they are likely playing on your complacency by distracting you with a sense of urgency. The disappointing part is that in the heat of the moment even someone aware of these attacks could still be susceptible to them, because social engineering is simply very effective.
All of the above said, the silver lining to this whole problem is that the attackers are getting substantially more desperate, which means that it is only a matter of time before the pool of attackers shrinks due to lack of profitability. The problem will never go away, but as the difficulty steadily increases for the attackers (which it is, otherwise they would not be so inventive) the draw of the money will seem much less lucrative.
Subject: General Tech | June 17, 2011 - 02:24 PM | Jeremy Hellstrom
Tagged: TSMC, southern islands, northern islands, llano, global foundries, arm, amd, 40nm, 32nm, 28nm
Back in April there was a kerfuffle in the news about a deal penned between AMD, Global Foundries and TSMC. It is not worth repeating completely as you can follow the story by using the previous link, suffice to say that it did not indicate problems with the relationship between AMD and Global Foundries.
The previous post was specifically about 40nm and 32nm process chips; however, today we hear from DigiTimes that TSMC has scored a deal with AMD for the 28nm Southern Islands GPUs, of which we have seen much recently. The 40nm Northern Islands GPUs will also continue to be produced by TSMC. That leaves a lot of production capacity free at Global Foundries to work on ARM processors.
"AMD reportedly has completed the tape-out of its next-generation GPU, codenamed Southern Islands, on Taiwan Semiconductor Manufacturing Company's (TSMC) 28nm process with High-k Metal Gate (HKMG) technology, according to a Chinese-language Commercial Times report. The chip is expected to enter mass production at the end of 2011.
TSMC will also be AMD's major foundry partner for the 28nm Krishna and Wichita accelerated processing units (APUs), with volume production set to begin in the first half of 2012, the report said.
TSMC reportedly contract manufactures the Ontario, Zacate and Desna APUs for AMD as well as the Northern Island family of GPUs. All of these use the foundry's 40nm process technology.
TSMC was quoted as saying in previous reports that it had begun equipment move-in for the phase one facility of a new 12-inch fab (Fab 15) with volume production of 28nm technology products slated for the fourth quarter of 2011. The foundry previously said it would begin moving equipment into the facility in June, with volume production expected to kick off in the first quarter of 2012."
Here is some more Tech News from around the web:
- ARM acquires Obsidian Software @ The Inquirer
- Mozilla pushes out final Firefox 5 test build @ The Register
- Sega Hacked @ XSReviews
- Tablets of 2011: What to Look For - June Update @ TechSpot
- A few thoughts on Ultrabooks @ The Tech Report
- Disabling Windows Pagefile & Hibernation to Reclaim SSD Space @ Techgage
- Overclockers Benchmarking Party II: Where the Bell Tolls!
- Post Computex 2011 - Part 2 @ Bjorn3D
Subject: General Tech, Graphics Cards, Mobile | June 17, 2011 - 04:35 AM | Scott Michaud
Tagged: webgl, microsoft
WebGL: Heaven or Hell?
(Image from MrDoob WebGL demo; contains Lucy model from Stanford 3D repository)
WebGL is an API very similar to OpenGL ES 2.0, the API used for OpenGL features in embedded systems, particularly smartphones. The goal of WebGL is to provide a light-weight, CSS-obeying, 3D and shader system for websites that require advanced 3D graphics, or even general purpose calculations performed on the shader units of the client’s GPU. Mozilla and Google currently support it in their public browsers, with Opera and Apple shipping support in the near future. Microsoft has stated that allowing third-party websites that level of access to the hardware is dangerous, as security vulnerabilities that formerly needed to be exploited locally can now be exploited from the web browser. This is an area of expertise that Microsoft knows all too well from its past attempts at active(x)ly adding scripting functionality to the web browser, which evolved into a decade-long game of whack-a-mole for security holes.
But skeptics to Microsoft’s position could easily point to their effort to single out the one standard based on OpenGL, competitor to their still-cherished DirectX standard. Regardless of Microsoft’s motives it seems to put to rest the question of whether Microsoft will be working towards implementing WebGL in any release of Internet Explorer currently in development.
Do you think Microsoft is warning its competitors about its past ActiveX woes, or is this more politically motivated? Comment below (registration not required.)
Subject: General Tech, Storage | June 16, 2011 - 03:02 PM | Scott Michaud
Tagged: ssd, Intel, enterprise
Intel is currently in the process of releasing its 2011 lineup of solid state drives. A lot of news and products came out regarding the consumer 300-series and enthusiast 500-series lines; however, things have been pretty quiet regarding the enterprise 700-series products. That changed recently when specifications surfaced thanks to AnandTech’s coverage of the German hardware website ComputerBase.de.
And how does it compare to OCZ?
Intel will be releasing two enterprise SSDs: the SATA 3 Gbps based 710 SSD, codenamed Lyndonville, and the PCI Express 2.0 based 720 SSD, codenamed Ramsdale. The SATA based 710 will feature 25nm MLC-HET flash at capacities of 100, 200, and 300 GB. The 710 will offer read/write speeds of 270/210 MB/s, 35,000/3,300 read/write IOPS at 4KB, and a 64MB cache. The PCIe based 720 will feature 34nm SLC flash at capacities of 200 and 400 GB. The 720 will be substantially faster than the 710, with read/write speeds of 2,200/1,800 MB/s, 180,000/56,000 read/write IOPS at 4KB, and a 512MB cache. On the security front, the 710 will use 128-bit AES encryption while the 720 will use 256-bit AES.
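To put the published figures side by side, here is a quick back-of-the-envelope comparison; a sketch using only the numbers quoted above, so real-world performance will of course vary:

```python
# Published specs for Intel's upcoming enterprise SSDs, as quoted above.
specs = {
    "710": {"read_mbs": 270, "write_mbs": 210,
            "read_iops": 35_000, "write_iops": 3_300},
    "720": {"read_mbs": 2_200, "write_mbs": 1_800,
            "read_iops": 180_000, "write_iops": 56_000},
}

# Ratio of the PCIe 720 to the SATA 710 on each headline figure.
ratios = {metric: specs["720"][metric] / specs["710"][metric]
          for metric in specs["710"]}

for metric, ratio in sorted(ratios.items()):
    print(f"720 vs 710 {metric}: {ratio:.1f}x")
```

On paper, then, the PCIe based 720 is roughly 8x faster in sequential throughput and nearly 17x faster in 4KB write IOPS than its SATA sibling.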
While there has been no hint at pricing for these drives, Intel is still expected to hit a second-quarter release date for the SATA based 710 SSD. If you are looking for a PCI Express SSD you will need to be a bit more patient, as the 720 is still expected in the fourth quarter. It will be interesting to see how the Intel vs. OCZ fight for dominance in the PCIe-based SSD space plays out in 2012.
Subject: General Tech | June 16, 2011 - 12:57 PM | Jeremy Hellstrom
Tagged: amd, Intel, nvidia
In some sort of bizarre voyeuristic hardware love/hate triangle, AMD, Intel and NVIDIA are all semi-intertwined and being observed by Microsoft. Speaking with The Inquirer, Leslie Sobon, VP of product and platform marketing at AMD, stated that there was no chance Intel would attempt to purchase NVIDIA as AMD did with ATI. AMD's purchase was less about the rights to the Radeon series than about taking possession of the intellectual property ATI had built over a decade of creating GPUs, and it led directly to the APUs that AMD has recently released, which will likely become its main product. Intel already has a working architecture that combines GPU and CPU and doesn't need to purchase another company's IP in order to develop that type of product.
There is another reason for purchasing NVIDIA though, one which has very little to do with its discrete graphics card IP and everything to do with Tegra and Fermi, two specialized products for which Intel so far has no answer. A vastly improved and shrunken Atom might be able to push Tegra off of mobile platforms, and perhaps specialized Sandy Bridge CPUs could accelerate computation the way the Fermi products do, but so far there are no solid leads, only speculation.
If you learn more from your failures than your successes then Intel knows a lot about graphics.
"CHIP DESIGNER AMD believes that it is on a divergent path from Intel thanks to its accelerated processor unit (APU) and that Intel buying Nvidia "would never happen"."
Here is some more Tech News from around the web:
- Find Out if Your Passwords Were Leaked by LulzSec Right Here @ Gizmodo
- Adobe patches critical bugs in Flash and Reader @ The Register
- Umi, we hardly knew ye: contemplating the fate of the videophone in 2011 @ Ars Technica
- 'A SHARK attacked my ROBOT', gasps ex-Sun exec @ The Register
- We’ve got a real bone to pick with this mouse @ Hack a Day
- Fun Quotes from the AFDS Media Roundtable @ SemiAccurate
Subject: General Tech, Shows and Expos | June 15, 2011 - 09:14 PM | Scott Michaud
Tagged: opencl, amd, AFDS
If you are a developer of applications which require more performance than a CPU alone can provide then you are probably having a gleeful week. Today Microsoft announced its competitor to OpenCL, and we have a large write-up about that aspect of the keynote address. If you are currently an OpenCL developer you are not left out, however, as AMD has announced new tools designed to make your life easier too.
General Purpose GPU utilities: Because BINK won't satisfy this crowd.
(Logo trademark Apple Inc.)
AMD’s spectrum of enhanced tools includes:
- gDEBugger: An OpenCL and OpenGL debugger, profiler, and memory analyzer released as a plugin for Visual Studio.
- Parallel Path Analyzer (PPA): A tool designed to profile data transfers and kernel execution across your system.
- Global Memory for Accelerators (GMAC) API: Lets developers use multiple devices without needing to manage multiple data buffers in both the CPU and the GPU.
- Task Manager API: A framework to manage scheduling kernels across devices.
These tools and utilities should make software development easier and allow more developers to take a risk on the new technology. The GPU has already proven itself worthy of more and more important tasks, and it is only a matter of time before it is ubiquitous enough to be a default component as important as the CPU itself. As an ironic aside, that should spur the adoption of PC gaming, given how many people would then have sufficient hardware.