Subject: General Tech | May 9, 2012 - 07:09 PM | Jeremy Hellstrom
Tagged: gaming, witcher 2
[H]ard|OCP noticed that The Witcher 2 underwent a large update, to the tune of 10GB or so, which gave them an opportunity to pit the newest GPUs from NVIDIA and AMD against a game that has a reputation for being hard on graphics cards. The Radeon HD 7970 and GeForce GTX 680 both struggled equally at 2560x1600, which is quite impressive for a year-old game. The most demanding feature is UberSampling, which incorporates antialiasing and anisotropic filtering and can bring even these powerful graphics cards to their knees. When it was enabled, [H] saw frame rates reduced to 20fps or lower, something the old standby Crysis just cannot do. You can argue that a DX9 game lacks many of the optimizations of DX10 and 11 which help the performance of these cards, but as far as testing the raw graphics power of a card goes, The Witcher 2 seems hard to beat.
"The Witcher 2: Assassins of Kings just underwent a major update. The Enhanced Edition offers a new zone, more quests, more cinematics, and several bug fixes. This DX9 game has the reputation of being one of the most demanding and stressful games on modern video cards. We put this theory to the test."
Here is some more Tech News from around the web:
- Lone Survivor PC Review @ eTeknix
- Saturday Morning Satan: Diablo III Gets A Cartoon! @ Rock, Paper, SHOTGUN
- Why Elder Scrolls Online Needs To Be A Sandbox @ Rock, Paper, SHOTGUN
- No BioShock Infinite This Year, Not For You @ Rock, Paper, SHOTGUN
- Kid Icarus: Uprising Nintendo 3DS @ Tweaktown
- Dragon's Dogma Preview (PlayStation 3) @ HardwareHeaven
- Running Sega Dreamcast Games on Your PC with nullDC @ Techgage
Subject: General Tech | May 9, 2012 - 06:28 PM | Jeremy Hellstrom
Tagged: TSMC, 28nm, nvidia
The seemingly never-ending saga of TSMC, NVIDIA, and the mysteriously scarce supply of 28nm GPUs on the market gets another update from DigiTimes. TSMC is going to give priority to NVIDIA on its production lines, though if TSMC is still at 95% capacity that may not mean a great increase in supply. This doesn't refute the rumours that NVIDIA is shopping around for a new supplier for its chips, nor that it may be revamping the mask it uses for them, but it does imply that TSMC does not want to lose NVIDIA's business and might have some capacity to spare.
"Since Nvidia has been unsatisfied with TSMC's 28nm process, while the company has also not refuted rumors that the company may cooperate with Samsung Electronics or Globalfoundries, TSMC, to sooth Nvidia, has put the GPU maker on its supply priority, allowing Nvidia to be able to release its 28nm GPUs on schedule in May and June."
Here is some more Tech News from around the web:
- Micron chucks down $2.5bn lifeline to Elpida @ The Register
- Calligra Suite, the Promising Not-An-Office Suite @ Linux.com
- Microsoft makes good with a 23-fix Patch Tuesday @ The Register
- Ubuntu 12.04 Precise Pangolin @ The Inquirer
- Virtu Universal MVP Review @ Hardware Secrets
- Nikon Coolpix P510 Review @ TechReviewSource
- Cisco Linksys E4200 V2 review @ Hardware.Info
- HardwareHeaven 10 year Anniversary Giveaway
Subject: General Tech | May 8, 2012 - 09:32 PM | Jeremy Hellstrom
Tagged: zotac, AMP!, 4gb gtx 680, 4GB, factory overclocked, gtx 680
The AMP! is back from Zotac, and don't let the base clock fool you into thinking that is its only advantage; it is the extra boost capability that really makes this card special. On the other hand, if it is your buffer that is causing you to suffer, the 4GB GTX 680 might be a little slower, but it can power four monitors and has enough GDDR5 to make sure it does so smoothly.
HONG KONG – May 2, 2012 – ZOTAC International, a global innovator and channel manufacturer of graphics cards, mainboards and mini-PCs, today unveils the amplified ZOTAC GeForce GTX 680 AMP! Edition and high-resolution dominating ZOTAC GeForce GTX 680 4GB graphics cards for the most demanding gamers and enthusiasts.
Catered to gamers that require that extra performance kick in the most demanding gaming situations, the ZOTAC GeForce GTX 680 AMP! Edition harnesses the untapped power of the NVIDIA GeForce GTX 680 graphics processor to maximize frame rates in the latest visually stunning Microsoft DirectX 11-enabled games. Extreme high-definition and triple-display gamers can opt for the ZOTAC GeForce GTX 680 4GB which doubles the amount of video memory for that extra bit of smoothness at resolutions above 2560x1600, including 4K and 3840x1080.
“The new ZOTAC GeForce GTX 680 AMP! Edition and GeForce GTX 680 4GB usher in a new era of visual computing that pushes details and pixel density for unbelievably clear and crisp graphics,” said Carsten Berger, marketing director, ZOTAC International. “The cards deliver world-class performance and are the fastest single GPU graphics cards available this generation too.”
Quad-display output capability enables the ZOTAC GeForce GTX 680 AMP! Edition and GeForce GTX 680 4GB to simultaneously power four independent displays in desktop mode at resolutions beyond 2560x1600, including new and upcoming 4K resolution displays. NVIDIA 3D Vision Surround technology joins three 2D or 3D displays together to render a massive wide display for superior immersion and enhanced field of view when gaming with the ZOTAC GeForce GTX 680 AMP! Edition and GeForce GTX 680 4GB.
The ZOTAC Assassin’s Creed 3-Game Pack is bundled with the two cards to let gamers take advantage of the newfound graphics power right out of the box. Assassin’s Creed I, II and Revelations take gamers through an epic historical journey with beautiful graphics, captivating storylines and immersive gameplay.
It’s time to play with the ZOTAC GeForce GTX 680 AMP! Edition and GeForce GTX 680 4GB!
New ZOTAC GeForce GTX 680 AMP! Edition & GeForce GTX 680 4GB graphics cards
- ZOTAC GeForce GTX 680 AMP! Edition
- Engine clock: 1110 MHz (base), 1176 MHz (boost)
- 2GB GDDR5 memory
- Memory clock: 6608 MHz
- Custom dual-fan cooler
- ZOTAC GeForce GTX 680 4GB
- Engine clock: 1006 MHz (base), 1058 MHz (boost)
- 4GB GDDR5 memory
- Memory clock: 6008 MHz
- 1536 SMX unified shaders
- 256-bit memory interface
- DVI-I, DVI-D, HDMI & DisplayPort outputs
- PCI Express 3.0 interface
- NVIDIA GPU Boost technology
- NVIDIA 3D Vision Surround capable
- NVIDIA FXAA technology
- NVIDIA TXAA technology
- NVIDIA SLI ready
- NVIDIA Adaptive Vertical Sync
- DirectX 11 technology & Shader Model 5.0
- OpenGL 4.2 compatible
- Hardware-accelerated Full HD video playback
- Blu-ray 3D ready
- Lossless audio bitstream capable
- ZOTAC Assassin’s Creed 3-Game Pack included: Assassin’s Creed, Assassin’s Creed II, Assassin’s Creed: Revelations
- TrackMania 2 Canyon 3-Day Game Pass included
Infectious fear is infectious
PCMag and others have released articles based on a blog post from Sophos. The original post discussed how frequently malware designed for Windows is found on Mac computers. What these articles mostly demonstrate is that we really need to understand security: what it is, and why it matters. The largest threats to security are complacency and misunderstanding; users need to grasp the problem rather than have it buried under weak analogies and illusions of software crutches.
Your data and computational ability can be very valuable to people looking to exploit it.
The point of security is not to avoid malware, nor is it to remove it if you failed to avoid it. Those actions are absolutely necessary components of security -- do those things -- but they are not the goal of security. The goal of security is to retain control of what is yours. At the same time, be a good neighbor and make it easier for others to do the same with what is theirs.
Your responsibility extends far beyond just keeping a current antivirus subscription.
The problem goes far beyond throwing stones...
The distinction is subtle.
Your operating system is irrelevant. You could run Windows, Mac, Android, iOS, the ‘nixes, or whatever else. Every useful operating system has vulnerabilities and runs vulnerable applications. Users are also very often tricked into loading untrusted code, either directly or by delivering it within data to a vulnerable application.
Blindly fearing malware -- such as what would happen if someone were to draw parallels to Chlamydia -- does not help you to understand it. There are reasons why malware exists; there are certain things which malware is capable of; and there are certain things which it is not.
The single biggest threat to security is complacency. Your information is valuable and you are responsible for preventing it from being exploited. The addition of a computer does not change the fundamental problem. Use the same caution on your computer and mobile devices as you would on the phone or in person. You would not leave your credit card information unmonitored on a park bench.
Subject: General Tech | May 8, 2012 - 06:13 PM | Jeremy Hellstrom
Tagged: ddr4, jedec, micron
If you are not familiar with JEDEC you might not realize why they are constantly referred to when news breaks about a new technology; if that is the case you should acquaint yourself with them. The DDR4 standard is almost finalized, with the specifics being that the DIMM's VDDQ must remain constant at 1.2V, with plans to reduce VDD, and speeds starting at 1.6 gigatransfers per second (GT/s) with an initial objective of 3.2 GT/s. That starting point seems low considering DDR3-2400 can already hit 2.4 GT/s, so when DDR4 arrives we may see speeds overlap, just as they did when DDR3 first came onto the stage.
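To put those transfer rates in perspective, peak theoretical bandwidth for a DDR-style interface is just the transfer rate multiplied by the bus width. A minimal back-of-the-envelope sketch (the 64-bit channel width is the standard DIMM assumption; the rates are the figures quoted above, not a Micron spec):

```python
# Peak theoretical bandwidth for a DDR-style memory interface:
# bandwidth (GB/s) = transfer rate (GT/s) x bus width (bytes per transfer)

def peak_bandwidth_gbs(gigatransfers_per_s, bus_width_bits=64):
    """Bytes moved per second over a standard 64-bit DIMM channel."""
    return gigatransfers_per_s * (bus_width_bits / 8)

# DDR4's starting rate vs. its initial objective, per the figures above
print(peak_bandwidth_gbs(1.6))  # 12.8 GB/s at 1.6 GT/s
print(peak_bandwidth_gbs(3.2))  # 25.6 GB/s at the 3.2 GT/s objective
print(peak_bandwidth_gbs(2.4))  # 19.2 GB/s -- where DDR3-2400 already sits
```

The overlap is easy to see: early DDR4 at 1.6 GT/s would actually trail a top-end DDR3 module, just as early DDR3 trailed fast DDR2.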
Micron has fabbed 30nm DDR4 chips, both DIMM and SODIMM varieties which operate at the lower voltage. The initial speed of 4Gbit/s that The Inquirer reports on may seem conservative but for this initial run we are only looking for a proof of concept which can be refined. Micron expects to see production swing into gear by the end of 2012 but they may not have many customers as neither AMD nor Intel have DDR4 support scheduled by that time.
"Although JEDEC has yet to finalise the DDR4 specification, Nanya and Micron have been forging ahead designing and now fabricating 30nm 4Gbit DDR4 chips that will be part of the two firms' DDR4 product range that will include registered and low-voltage registered DIMMs and SODIMMs. According to Micron, it is already sampling DDR4 modules and expects its customers to support quick implementation in 2013."
Here is some more Tech News from around the web:
- Attackers target unpatched PHP bug allowing malicious code execution @ Ars Technica
- AMD G series APUs support Windows Embedded Compact 7 @ The Inquirer
- AMD readies Trinity APU in May and preparing more CPUs for later @ DigiTimes
- Ninjalane Podcast - Diablo 3 and Game Demos What is Kickstarter and Prepping for MOA
- A bit about the diode @ Hack a Day
Subject: General Tech | May 8, 2012 - 06:58 AM | Tim Verry
Tagged: xbox 360, microsoft, gaming, console
The Xbox 360 has now been available in some form for almost seven years and has sold approximately 67.2 million units. Consumers are able to get the updated Xbox 360 4GB model for $199 USD at many retailers along with the Kinect add-on for $99. If that price still seems too steep, Microsoft has started to offer a subsidized Xbox 360 and Kinect bundle for those users lucky enough to live close to a physical Microsoft Store. There are currently 17 stores in a number of US states, with four more listed as "coming soon."
Microsoft is offering a two-year Xbox Live Gold contract for $14.99 a month. As a promotion for signing the contract, the company will sell a 4GB Xbox 360 S and Microsoft Kinect add-on for $99 USD. In total, the system will cost $458.76 plus applicable taxes–$359.76 for the monthly contract and $99 for the hardware. Interestingly, the subsidized cost ends up being more expensive than buying it outright. In under five minutes of searching Amazon, I found two one-year subscriptions to Xbox Live Gold plus an Xbox 360 S 4GB and Kinect hardware bundle for $380.20.
It isn’t surprising, but it is still interesting, that the subsidized model with a contract ends up being more expensive. If you can’t afford the upfront cost the subscription may be worth it, especially with the Xbox Next not coming this year. Buying the hardware outright is going to cost less; however, considering the Xbox is rather dated at this point, paying $99 for the hardware may be the better deal should the next Xbox be released within that two-year window, since the Xbox Live Gold contract is not locked to that one console. At least then you can apply the contract towards the new console and not be out as much money on the original hardware. In the end, it is a nice alternative method for getting the console and Kinect hardware.
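The arithmetic behind that comparison is simple enough to sketch; all of the figures below are the prices quoted above (the $380.20 outright total was the Amazon find, not an official bundle price):

```python
# Total cost of Microsoft's subsidized Xbox 360 bundle vs. buying outright,
# using the prices quoted in the article.

MONTHLY_GOLD = 14.99      # two-year Xbox Live Gold contract, per month
CONTRACT_MONTHS = 24
SUBSIDIZED_HW = 99.00     # 4GB Xbox 360 S + Kinect with contract
OUTRIGHT_BUNDLE = 380.20  # console/Kinect bundle + two 1-year Gold cards

subsidized_total = SUBSIDIZED_HW + MONTHLY_GOLD * CONTRACT_MONTHS
print(f"Subsidized: ${subsidized_total:.2f}")                    # $458.76
print(f"Outright:   ${OUTRIGHT_BUNDLE:.2f}")                     # $380.20
print(f"Premium:    ${subsidized_total - OUTRIGHT_BUNDLE:.2f}")  # $78.56
```

So the contract route costs roughly $79 more over two years, which is the premium you pay for spreading the cost out.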
Subject: Editorial, General Tech | May 8, 2012 - 04:39 AM | Ryan Shrout
Tagged: netgear, giveaway, contest, broadcom, 802.11ac
Broadcom and Netgear came to PC Perspective recently to discuss some upcoming products based on the new 802.11ac protocol, a new technology that will enable a minimum of 1 Gigabit wireless networking in the 5 GHz spectrum.
While we were learning about the new products that the two companies are partnering on, they offered up a few prizes for our readers: each of three winners will receive a new Netgear R6300 dual-band 802.11ac router!
While not on the market yet, these routers will offer some impressive new features including:
The NETGEAR R6300 WiFi Router delivers next generation WiFi at Gigabit speeds. It offers the ultimate mobility for WiFi devices with speeds up to 3x faster than 802.11n.
Compatible with next generation WiFi devices and backward compatible with 802.11 a/b/g and n devices, it enables HD streaming throughout your home. The R6300 with simultaneous dual band WiFi technology offers speeds up to 450+1300‡ Mbps† and avoids interference, ensuring top WiFi speeds and reliable connections. This makes it ideal for larger homes with multiple devices. In addition, four Gigabit Ethernet ports offer ultra-fast wired connections. Wirelessly access and share USB hard drive and USB printer using the two USB 2.0 ports.
The NETGEAR Genie® app provides easy installation from an iPad®, tablet, computer or smartphone. It includes a personal dashboard, allowing you to manage, monitor, and repair your home network. NETGEAR customers can download the app at http://www.netgear.com/genie or from the Google Play or App Store.
All you have to do to enter this contest is submit your answer to the one simple question below, and be sure to include your REAL email address so we can contact you! The survey will run through the rest of this week (until May 11th), and you can enter from anywhere in the world!
Subject: General Tech | May 7, 2012 - 10:01 PM | Tim Verry
Tagged: windows media center, Windows 8 Pro, windows 8, upgrade, htpc
News is circulating around the Internet that Microsoft is taking Windows Media Center out of Windows 8 and offering it as a separate paid add-on for Windows 8 Pro users. Many are not happy about the decision.
Windows Media Center is an application developed by Microsoft that provides a TV friendly interface for all the media on your computers including photos, videos, music, and television. That last function is quite possibly the biggest feature of WMC as it allows users to ditch their cable set top box (STB) and turn their computer into a TV tuner and DVR with the proper hardware.
Windows 8 Metro With Media Center Icon
The program debuted as a special edition of Windows called Windows XP Media Center Edition. It was then rolled into the general release of Windows Vista and then into many editions of Windows 7. Windows Media Center has a relatively small user base relative to the number of general Windows users, but they are a vocal and enthusiastic minority. About a month ago, I got a CableCard from Comcast (after a week of... well, let’s just say it was not a pleasant experience) and after pairing it with the HDHomeRun Prime and my Windows 7 machines, I was able to watch and record TV on any of the computers in my house as well as on the living room TV via an Xbox 360 acting as a Windows Media Center extender. I have to say that the setup is really solid, I have all the expandable DVR space I could want, and the WMC interface is so much snappier than any cable or satellite set top box I’ve ever used. Windows 7 became that much more valuable once I was able to utilize Windows Media Center.
With that said, it is still a niche feature and I understand that not everyone needs or wants to use it. It is even a feature that I would pay for should Microsoft unbundle it. Yet, when I read a bit of news concerning Windows 8 and WMC over the weekend, I was not happy at all. According to an article at Tested.com, Microsoft is going to unbundle Windows Media Center for Windows 8 into a separate downloadable Media Center pack with a currently unknown price (so far, I’m disappointed but still willing to accept it). The Media Center pack will be made available for purchase and download using the “Add Features To Windows 8” control panel option–what was known as Windows Anytime Upgrade in previous versions of Windows.
Windows Media Center in Windows 7 - TV Guide
What is confusing (and what I find infuriating) is that users will only be able to purchase the Media Center pack if they are using the Pro version of Windows 8, leaving home users out of luck. Due to Windows 8 Pro essentially being the Ultimate Edition of previous Windows versions, it is definitely going to cost more than the base version, and that is rather disconcerting. I have no problem paying for the Media Center pack, but I do have a problem with Microsoft artificially limiting who has the right to purchase it to begin with. It just seems downright greedy of them and is a big disservice to Media Center’s faithful users. Microsoft should go with one method or the other, not both. For example, they should unbundle Media Center, and allow users of any desktop (not RT, in other words) Windows 8 version to purchase it. Alternatively, if they are going to limit Media Center to be a Pro version only feature, it should be a free download. Users should not have to pay for the privilege to pay for the software, especially when Microsoft has said that Windows 8 Media Center will not be very different from the one in Windows 7 and will only contain minor improvements.
Rick Broida of PC World has been a bit more straightforward in stating his opinion of Microsoft’s decision in saying “I’m hopping mad.” And I tend to agree with his sentiments, except for WMC needing to be free. I’d be happy to pay for it if it means Microsoft continues to support it. I just have an issue with the pricing situation that the news of the decision is suggesting. To be fair, Microsoft has not yet released final pricing information, so it may not be as bad as I’m thinking. Even so, the news that they are making WMC a paid add-on and limiting it to Windows 8 Pro leaves a rather bad aftertaste. Mr. Broida encourages HTPC users to not upgrade, and to stick with Windows 7. I don’t think I’m at that point yet (though I get where he’s coming from), but I will say that Windows 8 was a tough sell before I heard this news, and the WMC news isn’t helping. I can only hope that Microsoft will reconsider and, dare I say it, do the right thing for their users here.
Subject: General Tech, Graphics Cards | May 7, 2012 - 07:26 PM | Scott Michaud
Tagged: pc gaming, diablo iii
Tom’s Hardware took a look at the recent beta of Diablo III and published benchmarks of its performance across multiple profiles. They have found that, for minimum quality settings, a GeForce GT 440 or Radeon 6670 will be very smooth at lower resolutions and even handle 1080p. Maximum quality settings do not lower framerate by all that much even with antialiasing enabled.
Blizzard works on its own personal time zone centered on its offices. It seems quite nebulous to most, but apparently 12 years somehow signifies the end of a release cycle. The last couple of years have seen a flurry of releases from the company, with two of its three major franchises seeing updates twelve years after their last installments.
The latter of those two franchises is Diablo and Diablo III is set to launch in just over a week. If you wonder how your machine will handle the game, and you missed the open beta a little over a week ago, Tom’s Hardware did not miss it and has put it up against several of their test systems.
Not quite a demonic presence on your hardware…
Oddly enough, raising your settings from minimum to high with antialiasing only drops your framerate by approximately 20-21% at 1920x1080. It is possible that when the full game is released the highest quality settings could have features enabled which increase that difference slightly.
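That 20-21% figure is just the usual relative framerate loss; a quick helper makes the calculation explicit (the fps values below are hypothetical placeholders for illustration, not Tom's Hardware's measured results):

```python
def framerate_drop(baseline_fps, new_fps):
    """Relative framerate loss, as a percentage of the baseline."""
    return 100.0 * (baseline_fps - new_fps) / baseline_fps

# Hypothetical numbers only: e.g. 85 fps at minimum settings falling to
# 67 fps at high quality with AA would be roughly the ~21% drop above.
print(round(framerate_drop(85, 67), 1))  # 21.2
```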
The other possibility is that the game quality settings are quite CPU-bound. Unfortunately Tom’s Hardware did not test various CPUs between low and highest to see how they scale.
If that is not the case, however, the addition of quality settings seems more about allowing the user to personalize their experience rather than supporting lesser hardware. This could be one of the rare occasions where a mild overclock has a functional use.
For those wishing to see how the game will work on mobile parts, you will likely need to wait just a little longer. The benchmark focuses on desktop components. If your PC has a minimum of a GT440 or a Radeon 6670 then you should not be concerned in the slightest about Diablo III even if you output to a 1080p TV or monitor.
Also, if you are running AMD cards, be sure to check out our recent article about what to do with the 12.4 drivers. Diablo III likes some cards on them, but not others. How about yours?
Diablo III is scheduled to be released May 15th.
Subject: General Tech | May 7, 2012 - 05:05 PM | Jeremy Hellstrom
Tagged: audio, Creative, Sound Blaster Recon3D Fatal1ty, SoundCore3D
The newest flagship card from Creative is the Fatal1ty-branded Sound Blaster Recon3D PCIe x1 card, the first to feature their new SoundCore3D chipset, which brings 192kHz, 24-bit sampling to SoundBlaster. It comes with a microphone and, like many of the high end cards on the market, a front panel, which adds RCA stereo input jacks, DSP mode selection buttons, and analog volume and recording level knobs that can be pushed in flush with the face of the panel to lock them and let you close the door on your case. [H]ard|OCP used RightMark Audio Analyzer as well as their own ears to gauge the quality of sound produced by this new SoundBlaster series, which you can read about right here.
"Creative's latest Sound Blaster flagship sound card features its new SoundCore3D chipset along with a powerful headphone amplifier, a beam forming microphone, and the return of the company's popular front panel audio I/O bay. Is this card a worthy successor to its Audigy and X-Fi brethren?"
Here is some more Tech News from around the web:
- HIS Multi-View+Sound Adapter Review @ Madshrimps
- Enermax DreamBass Genie AP001 USB Audio Adapter Review @ Hi Tech Legion
- ROCCAT Kave 5.1 Surround Gaming Headset Review @ Legit Reviews
- Noontec Zoro Headset Review @ XtremeComputing
- KRK Systems KNS 8400 Headphones @ techPowerUp
- Ozone Onda 3HX Headset Review @ eTeknix
- Thermaltake eSports Isurus In-ear Headset Review @ Neoseeker
- IF500 Luna 5 Encore iPhone iPod Speaker Dock Review @ MissingRemote
- Tritton Primer XBOX 360 Wireless Stereo Headset Review @ HardwareHeaven
- CM Storm Sirus 5.1 USB Gaming Headset @ Tweaktown
- Play .MKV's in Windows Media Center @ CoD
- Audioengine A5+ Speakers and Wireless Audio Adapter @ SPCR
- Genius SP-i250G LED Portable Gaming Speakers @ Tweaktown