Subject: General Tech | September 29, 2014 - 03:41 AM | Scott Michaud
Tagged: Realsense 3D, realsense, kinect, Intel
RealSense is Intel's 3D camera initiative for bringing face recognition, gesture control, speech input, and augmented reality to the PC. Its closest analogue is Microsoft's Kinect for Windows. The technology has been shown at Intel keynotes for a while now, embodied in the "Intel Perceptual Computing SDK 2013" under the company's "Perceptual Computing" initiative.
Since August 31st, that has been removed from their site and replaced with the Intel RealSense SDK. While the software is free, you will probably need compatible hardware to do anything useful. None is available yet, but the "Intel RealSense Developer Kit" hardware (not to be confused with the "Intel RealSense SDK", which is software) is available for reservation at Intel's website. The camera is manufactured by Creative Labs and will cost $99. Intel is also very clear that this is a developer tool, and forbids its use in "mission critical applications". Basically, don't trust it with your life, or with anyone else's life or health.
The developer kit will be available for many regions: the US, Canada, much of Europe, Brazil, India, China, Taiwan, Japan, Malaysia, South Korea, New Zealand, Australia, Russia, Israel, and Singapore.
Subject: General Tech | September 29, 2014 - 02:59 AM | Scott Michaud
Tagged: assassin's creed, pc gaming, ubisoft
Ubisoft's upcoming Assassin's Creed: Rogue is currently only announced for the Xbox 360 and PlayStation 3, but it might see a PC release, too. This is particularly weird because Rogue is scheduled to launch, for the two aforementioned consoles, on the same day as Assassin's Creed: Unity is scheduled for the Xbox One, PlayStation 4, and PC. Unless they are planning a delayed PC launch, PC gamers might receive two games, in the same franchise, on the same day.
I would have to expect that its PC release would be staggered, though, right? How would they market two similar, and fairly long, games at the same time? Thanks, Ubisoft, but it just seems like an unnecessary risk of market cannibalization.
About that evidence, though. PC Gamer apparently found a reference to the title at the Brazilian software ratings board, and the title was mentioned on one of Ubisoft's Uplay pages for the PC. Those are pretty good pieces of evidence, although we need to take their word on it, which means implicitly trusting screenshots from NeoGAF. Also, PC Gamer really needs to link to the exact thread at NeoGAF, because it was buried under the first several pages by the time I got there.
Assassin's Creed: Unity launches on November 11th, 2014 (not the 14th). Rogue -- maybe?
Subject: Graphics Cards, Processors, Mobile | September 29, 2014 - 01:53 AM | Scott Michaud
Tagged: apple, a8, a7, Imagination Technologies, PowerVR
First, Chipworks released a die shot of the new Apple A8 SoC (stored at archive.org). It is based on TSMC's 20nm fabrication process, the entire capacity of which Apple allegedly bought. From there, a bit of a debate arose over what each group of transistors represents. All sources agree that it is based around a dual-core CPU, but the GPU is a bit polarizing.
Image Credit: Chipworks via Ars Technica
Most sources, including Chipworks, Ars Technica, Anandtech, and so forth believe that it is a quad-core graphics processor from Imagination Technologies. Specifically, they expect that it is the GX6450 from the PowerVR Series 6XT. This is a narrow upgrade over the G6430 found in the Apple A7 processor, which is in line with the initial benchmarks that we saw (and not in line with the 50% GPU performance increase that Apple claims). For programmability, the GX6450 is equivalent to a DirectX 10-level feature set, unless it was extended by Apple, which I doubt.
Image Source: DailyTech
DailyTech has their own theory, suggesting that it is a GX6650 that is horizontally aligned. From my observation, their "Cluster 2" and "Cluster 5" do not look at all identical to the other four, so I doubt their claims. I expect that they heard Apple's 50% claims, expected six GPU cores as the rumors originally indicated, and saw cores that were not there.
Which brings us back to the question of, "So what is the 50% increase in performance that Apple claims?" Unless they had a significant increase in clock rate, I still wonder if Apple is claiming that their increase in graphics performance will come from the Metal API even though it is not exclusive to new hardware.
But from everything we have seen so far, it is just a handful of percent better.
Subject: General Tech | September 28, 2014 - 08:30 PM | Scott Michaud
Tagged: mount & blade, taleworlds, mount & blade ii, bannerlord, pc gaming
The Mount & Blade franchise is enjoyed by a relatively small, dedicated group of fans. One leading reason for this is the large base of third-party content from its modding community. One mod, Mount & Musket, led to the creation of a game studio, Flying Squirrel Entertainment, when the mod was picked up as an official expansion, Mount & Blade: Warband: Napoleonic Wars. Sometimes taxonomy can be proper but a little bit excessive.
Developer TaleWorlds builds games atop its own proprietary engine and designs it with modders in mind. The studio is currently developing Mount & Blade II: Bannerlord, a prequel to Mount & Blade: Warband. In the video below, they explain that every feature shown is available to third parties. This includes painting layers of materials and foliage, generating terrain from height-maps, and tessellation.
Hopefully they also add "connect to IP"...
While the game was first announced two years ago, it is still in a "when it's done" phase. The publisher is still unknown. Paradox Interactive was attached to the first three games, and Napoleonic Wars, but is not involved with Bannerlord, according to a Reddit AMA from last year. As popular as it is, at least for what it is, TaleWorlds could even self-publish on digital distribution platforms like Steam, Desura, GOG, and others, but that is just speculation.
Subject: General Tech, Graphics Cards, Motherboards, Cases and Cooling | September 28, 2014 - 12:25 AM | Ryan Shrout
Tagged: X99, video, maxwell, live, GTX 980, GTX 970, evga
UPDATE: If you missed the live stream with myself and Jacob, you can catch the entire event in the video below. You won't want to miss out on seeing the first ever GTX 980 water block as well as announcements on new Torq mice!
EVGA has been a busy company recently. It has continued to innovate with new coolers for the recent GTX 980 and GTX 970 card releases, newer power supplies that offer unique features and improved quality and power output, and a new line of X99 chipset motherboards including a Micro ATX variant. Hey, the company even released a line of high-performance mice this year! PC Perspective has covered basically all of these releases (and will continue to do so with pending GPU and motherboard reviews), but there is a lot that needs explaining.
To help out, an industry and community favorite from EVGA will be stopping by the PC Perspective offices: Jacob Freeman. You might know him as @EVGA_JacobF on Twitter, or have seen him on countless forums, but he will be making an in-person appearance on Friday, September 26th on PC Perspective Live! We plan to discuss the brand new ACX 2.0 cooler on the Maxwell GPUs released last week, go over some of the highlights of the new X99 motherboards, and even touch on power supplies and the Torq mice line as well.
EVGA GTX 980/970, X99, PSU and Torq Live Stream featuring Jacob Freeman
3pm ET / 12pm PT - September 26th
EVGA has been a supporter of PC Perspective for a long time and we asked them to give back to our community during this live stream - and they have stepped up! Look at this prize list:
- 1 x EVGA GeForce GTX 980 SC
- 1 x EVGA GeForce GTX 970 SC ACX 2.0
- 1 x EVGA X99 Classified
- 1 x EVGA X99 FTW
- 1 x EVGA X99 SLI
- 1 x EVGA SuperNOVA 750 G2 PSU
- 4 x Torq Mice
How can you participate and win these awesome pieces of hardware? Just be here at 3pm ET / 12pm PT on http://www.pcper.com/live and we'll be announcing winners as we go for those who tune in. It really couldn't be simpler!
If you have questions you want to ask Jacob about EVGA, or any of its line of products, please leave them in the comments section below and we'll start compiling a list to address on the live stream Friday. Who knows, we may even save some prizes for some of our favorite questions!
To make sure you don't miss our live stream events, be sure you sign up for our spam-free PC Perspective Live! Mailing List. We email that group a couple hours before each event gets started.
Subject: Graphics Cards | September 27, 2014 - 07:24 PM | Ryan Shrout
Tagged: nvidia, maxwell, gsync, g-sync, freesync, adaptive sync
During an interview that we streamed live with NVIDIA's Tom Petersen this past Thursday, it was confirmed that NVIDIA is not currently working on, or has any current plans to, add support for the VESA-based and AMD-pushed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:
"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."
To be clear, the Adaptive Sync portions of DP 1.2a and 1.3+ are optional parts of the VESA spec and are not required for future graphics processors or even future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.
The ASUS ROG Swift PG278Q G-Sync monitor
With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users with Radeon cards. (Where Intel falls in this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what it has developed with G-Sync is difficult, and not at all something that could be solved with the blunt instrument that is Adaptive Sync. NVIDIA has a history of producing technologies and then keeping them in-house, focusing on development specifically for GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.
When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."
The future of G-Sync is still in development. Petersen stated:
"Don't think that we're done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays... G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."
Diagram showing how G-Sync affects monitor timings
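The scheduling difference the diagram depicts can be illustrated with a toy simulation (a simplified sketch of my own, not NVIDIA's implementation; the frame times below are hypothetical). With a fixed 60 Hz refresh and v-sync, a finished frame waits for the next refresh boundary; with a variable refresh display, scan-out begins as soon as the frame is ready.

```python
# Toy model: when do frames actually appear on screen?
# Assumes the GPU never stalls (e.g. triple buffering) so render times
# simply accumulate; real pipelines are more complicated.

def fixed_refresh_display_times(render_times_ms, refresh_hz=60):
    """Fixed refresh with v-sync: each frame appears at the next refresh tick."""
    interval = 1000.0 / refresh_hz
    t, displayed = 0.0, []
    for r in render_times_ms:
        t += r  # time the frame finishes rendering
        ticks = int(t // interval) + (1 if t % interval else 0)
        displayed.append(ticks * interval)  # wait for the next boundary
    return displayed

def variable_refresh_display_times(render_times_ms):
    """Variable refresh (G-Sync style): the display scans out when ready."""
    t, displayed = 0.0, []
    for r in render_times_ms:
        t += r
        displayed.append(t)
    return displayed

frames = [14.0, 18.0, 20.0, 15.0]  # hypothetical per-frame render times (ms)
print(fixed_refresh_display_times(frames))     # frames snap to 16.7 ms ticks
print(variable_refresh_display_times(frames))  # frames appear immediately
```

In the fixed-refresh case, the third frame slips a full refresh interval behind, which the eye perceives as judder; the variable-refresh timeline simply tracks the actual render times.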
So now we await the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has put a lot of pressure on itself for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if it can meet that goal. Despite any ill feelings that some users might have about NVIDIA and some of its policies, the company typically does a good job of maintaining a high-quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more about that before we get too much further into fall.
You can check out our stories and reviews covering G-Sync here:
- PCPer Live! NVIDIA Maxwell, GTX 980, GTX 970 Discussion with Tom Petersen, Q&A
- Acer XB280HK 28-in 4K G-Sync Monitor Review
- NVIDIA G-Sync Surround Impressions: Using 3 ASUS ROG Swift Displays
- PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A
- ASUS ROG Swift PG278Q 27-in Monitor Review - NVIDIA G-Sync at 2560x1440
Subject: General Tech, Processors, Mobile | September 27, 2014 - 02:38 PM | Scott Michaud
Tagged: Intel, spreadtrum, rda, Rockchip, SoC
A few months ago, Intel partnered with Rockchip to develop low-cost SoCs for Android. The companies would work together on a design that could be fabricated at TSMC. This time, Intel is partnering with Tsinghua Unigroup Ltd. and, unlike with Rockchip, also investing in the company. The deal will be worth up to $1.5 billion USD in exchange for an approximately 20% share of a division of Tsinghua.
Image Credit: Wikipedia
Intel is hoping to use this partnership to develop mobile SoCs for smart (and "feature") phones, tablets, and other devices, and to gain a significant presence in the Chinese mobile market. Tsinghua acquired Spreadtrum Communications and RDA Microelectronics within the last two years. The "holding group" that owns these divisions is apparently the specific part of Tsinghua in which Intel is investing.
Spreadtrum will produce SoCs based on Intel's "Intel Architecture". This sounds like they are referring to the 32-bit IA-32, which means that Spreadtrum would be developing 32-bit SoCs, but it is possible that they could be talking about Intel 64. These products are expected for 2H'15.
Subject: General Tech, Graphics Cards | September 27, 2014 - 02:59 AM | Scott Michaud
Tagged: rage, pc gaming, consolitis
Shinji Mikami has been developing a survival horror game, which makes sense given a good portion of his portfolio. He created Resident Evil and much of the following franchise. The Evil Within is about to release, having recently gone gold. At around this time, publishers begin to release system requirements and Bethesda does not disappoint in that regard.
Are the requirements... RAGE-inducing?
A case could be made for disappointing requirements, themselves, though.
Basically, Bethesda did not release minimum requirements. Instead, they said "This is what we recommend. It will run on less. Hope it does!" This would not be so problematic if one of their requirements wasn't a "GeForce GTX 670 with 4GBs of VRAM".
They also recommend a quad-core Core i7, 4GB of system memory, 50GB of hard drive space, and a 64-bit OS (Windows 7 or Windows 8.x).
Before I go on, I would like to mention that The Evil Within is built on the RAGE engine. Our site dealt extensively with that technology when it first came out in 2011. While I personally did not have many showstopping performance problems with that game, it did have a history of texture-streaming issues. Keep that in mind as you continue to read.
A typical GTX 670 does not have 4GB of VRAM. In fact, even the GTX 780 Ti does not have 4GB of VRAM. Thankfully, both of the newly released Maxwell GPUs, the GTX 970 and the GTX 980, have at least 4GB of VRAM. Basically, Bethesda is saying, "I really hope you bought the custom model from your AIB vendor". They literally say:
Note: We do not have a list of minimum requirements for the game. If you’re trying to play with a rig with settings below these requirements (you should plan to have 4 GBs of VRAM regardless), we cannot guarantee optimal performance.
Each time I read, "You should plan to have 4 GBs of VRAM regardless", it becomes more difficult for me to form an opinion about it. That is a lot of memory. Personally, I would wait for reviews and benchmarks, specifically for the PC, before purchasing the title. These recommended settings could be fairly loose, to suit the vision of the game developers, or the game could be a revival of RAGE, this time without the engine's original architect on staff.
The Evil Within launches on October 14th.
Subject: General Tech, Systems | September 27, 2014 - 02:16 AM | Scott Michaud
Tagged: msi, kingbox, ms-9a66, fanless, industry, ruggedized
This is not usually a category of computer that we report on, but MSI has just released a fanless, embedded desktop for industrial applications. Silent PCs seem to be talked about more and more frequently, and I am not sure how much of that is industry trends (as opposed to me just paying more attention). MSI's focus with this design is performance while remaining rugged and, as mentioned a few times, fanless.
Note that it supports CPUs with a maximum TDP of 35W. This leaves room for MSI to include up to a Core i7-4785T in the device, but we do not know if that is actually offered. It has four expansion slots: one PCIe x16 and three legacy PCI. It does not have an ISA slot, though. I am sure this will be disappointing to some enterprises, and to Josh. He probably still has a graphics card for it. You might think I would be joking. I am, but sadly I also am not.
For power, the device can accept anywhere from 9 to 36V DC. Basically, it seems to be based on laptop components with expansion slots for add-in boards. You can also purchase a fan "module" for it if, for one reason or another, it is still the best PC for the job even if it wasn't fanless.
Pricing and specific availability are not provided, but it is apparently released.
Subject: Graphics Cards | September 26, 2014 - 02:22 PM | Jeremy Hellstrom
Tagged: asus, ROG, gtx 780 ti, MATRIX Platinum, DirectCU II
With the release of the new Maxwell cards comes an opportunity for those with a smaller budget to still get a decent upgrade for their systems. Early adopters will often sell their previous GPUs once they've upgraded, allowing you to get a better card than your budget would usually allow, though with the risk of ending up with a bum card. The ASUS ROG GTX 780 Ti MATRIX Platinum is a good example, with a DirectCU II air cooler for general usage, while the LN2 switch also allows more extreme cooling methods for those looking for something a little more impressive. The factory overclock is not bad at 1006/1072MHz core and 7GHz effective memory, but the overclock [H]ard|OCP managed, 1155/1220MHz core with 7.05GHz memory, pushes the performance above that of the R9 290X of the same family. If you can find this card used at a decent price, it could give you more of an upgrade than you thought you could afford.
"In today's evaluation we are breaking down the ASUS ROG GTX 780 Ti MATRIX Platinum video card. We put this head-to-head with the ASUS ROG R9 290X MATRIX Platinum. Which provides a better gaming experience, best overclocking performance, and power and temperature? Which one provides the best value? "
Here are some more Graphics Card articles from around the web:
- MSI GTX 980 OC @ HardwareHeaven
- Taking It To The Limit: Overclocking NVIDIA’s GeForce GTX 970 & 980 @ Techgage
- Gigabyte G1 Gaming Geforce GTX 980 Review @ HiTech Legion
- Palit GTX 970 JetStream 4 GB @ techPowerUp
- ASUS Strix Edition GeForce GTX 970 Graphics Card Review @ Techgage
- ASUS GTX 970 STRIX OC Review @ Hardware Canucks
- Palit GTX970 JetStream OC @ Kitguru
- Testing Nvidia’s GeForce GTX 980 4GB Graphics Cards In SLI @ eTeknix
- Gigabyte G1 Gaming GeForce GTX 980 4GB @ eTeknix
- ASUS STRIX GTX 970 DirectCU II OC 4GB Review @HiTech Legion
- Nvidia GeForce GTX 980 4GB @ eTeknix
- MSI GTX 970 Gaming 4G Review @HiTech Legion
- Nvidia Quadro K5200, K4200 and K2200 Professional Graphics Cards @ X-bit Labs
- Gigabyte R7 250X OC Performance Review @ Neoseeker
- Sapphire R9 285 ITX Compact v MSI GTX760 Gaming Mini ITX @ Kitguru
Subject: General Tech | September 26, 2014 - 01:26 PM | Jeremy Hellstrom
Tagged: PCIe SSD, Samsung, NVMe, SM1715, 3d nand
Samsung's new SM1715 NVMe PCIe SSD will use their new 3D V-NAND and come as a 3.2TB card, double the capacity of the previous model and perhaps the smallest of the new line of SSDs they are working on. The stats are fairly impressive at 750,000/130,000 random read/write IOPS, or 3GB/s read and 2.2GB/s write bandwidth if you prefer that measurement. Samsung offers a nice mix of bandwidth and capacity with the new model, and you can expect the competition to start releasing models with increased capacities and speeds soon. The Register was not provided the full set of specifications for the drive, but those should be forthcoming in the near future.
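For a bit of perspective on those sequential numbers, a quick back-of-the-envelope calculation (my own, using the quoted best-case figures) shows how long it would take to read or write the entire 3.2TB card end-to-end:

```python
# Time to sweep the full capacity at the quoted sequential rates.
capacity_gb = 3200                 # 3.2TB, counting 1TB as 1000GB
read_gbps, write_gbps = 3.0, 2.2   # GB/s, from Samsung's quoted specs

read_minutes = capacity_gb / read_gbps / 60
write_minutes = capacity_gb / write_gbps / 60
print(f"full read:  {read_minutes:.1f} minutes")
print(f"full write: {write_minutes:.1f} minutes")
```

That works out to roughly 18 minutes to read and 24 minutes to fill the whole card, at least in theory; sustained real-world workloads will be slower.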
"Faster, fatter flash cards that speed up server applications are in demand, and Samsung has announced it is mass-producing a 3.2TB NVMe PCIe SSD using its 3D V-NAND technology. It says higher capacities are coming."
Here is some more Tech News from around the web:
- TSMC releases networking processors based on 16nm FinFET @ DigiTimes
- China market: SSD prices drop sharply, say memory module makers @ DigiTimes
- 6 IT Certifications for SysAdmins to Consider @ Linux.com
- Outage fears as Amazon's always on elastic cloud gets rebooted @ The Inquirer
- The Great Lightbulb Conspiracy @ Slashdot
Subject: Graphics Cards | September 26, 2014 - 12:14 PM | Ryan Shrout
Tagged: vxgi, video, tom petersen, nvidia, mfaa, maxwell, livestream, live, GTX 980, GTX 970, dsr
UPDATE: If you missed the live stream yesterday, I have good news: the interview and all the information/demos provided are available to you on demand right here. Enjoy!
Last week NVIDIA launched GM204, otherwise known as Maxwell and now branded as the GeForce GTX 980 and GTX 970 graphics cards. You should, of course, have already read the PC Perspective review of these two GPUs, but undoubtedly there are going to be questions and thoughts circulating through the industry.
To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Thursday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Thursday, September 25th at 4pm ET / 1pm PT to discuss the new Maxwell GPU, the GTX 980 and GTX 970, new features like Dynamic Super Resolution, MFAA, VXGI and more! You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA Maxwell Live Stream
1pm PT / 4pm ET - September 25th
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but oftentimes there is a lot of noise to deal with.
So be sure to join us on Thursday afternoon!
UPDATE: We have confirmed at least a handful of prizes for those of you that tune into the live stream today. We'll give away an NVIDIA SHIELD as well as several of the brand new SLI LED bridges that were announced for sale this week!
Subject: General Tech | September 26, 2014 - 02:46 AM | Scott Michaud
Tagged: free games, swap, arena shooter, pc gaming
Subterfuge Weapons Assessment Program, an obvious backronym for S.W.A.P., takes the first-person shooter genre and removes the whole "damage" mechanic. Basically, shooting an opponent causes you and your opponent to "exchange bodies". The point is apparently to prevent the enemy from delivering a payload to your base, or to put them into situations where they will kill themselves once they are in your position.
While I have yet to play the game, it is free. No micro-transactions, DLC, or subscriptions. They are using this project to gauge interest for a full, Unreal Engine release. It has an interesting art style, reminiscent of Unreal Tournament (1999) or the original Tribes. It could be worth a download, especially if you like old-fashioned arena shooters and unusual game mechanics.
Those are two genres which do not get mixed a lot...
Subject: General Tech, Graphics Cards | September 26, 2014 - 02:03 AM | Scott Michaud
Tagged: steam, precisionx 16, precisionx, overclocking, nvidia, evga
If you were looking to download EVGA Precision X recently, you were likely disappointed. For a few months now, the software has been unavailable because of a disagreement between the add-in board (AIB) partner and Guru3D (and the RivaTuner community). EVGA maintains that it was a completely original work, and that references to RivaTuner were a documentation error. As a result, they pulled the tool just a few days after launching Precision X 15.
This new version, besides probably cleaning up all of the existing issues mentioned above, adds support for the new GeForce GTX 900-series cards, a new interface, an on-screen display (OSD) for use inside applications, and Steam Achievements (??). You can get a permanent badge on your Steam account for breaking 1200 MHz on your GPU, taking a screenshot, or restoring settings to default. I expect that last badge is meant as one of shame, like the Purple Heart from Battlefield, but it is not actually a bad thing, and pressing the button says nothing less of your overclocking skills. Seriously, save yourself some headache and just press default if things do not seem right.
PrecisionX 16 is free, available now, and doesn't require an EVGA card (just a site sign-up).
Subject: General Tech, Mobile | September 26, 2014 - 01:45 AM | Scott Michaud
Tagged: tablet, Nexus, google, nexus 9, nvidia, tegra k1
The Nexus line is due for an update, with each current product having been out for at least a year. They are devices which embody Google's vision... for their own platform. You can fall on either side of that debate (whether the line guides OEM partners, or simply adds another shard to the fragmentation issue, if you even believe that fragmentation is bad), but they are easy to recommend and a good benchmark for Android.
We are expecting a few new entries in the coming months, one of which is the Nexus 9. Of note, it is expected to mark the return of HTC to the Nexus brand. They were the launch partner with the Nexus One and then promptly exited stage left as LG, Samsung, and ASUS performed the main acts.
We found this out because NVIDIA spilled the beans in its lawsuit filing against Qualcomm and Samsung. Apparently, "the HTC Nexus 9, expected in the third quarter of 2014, is also expected to use the Tegra K1". The filing has since been revised to remove the reference. While the K1 has a significant GPU to back it up, it will likely be driving a very high resolution display. The Nexus 6 is expected to launch at around the same time, along with Android 5.0 itself, and the 5.2-inch phone is rumored to have a 1440p display. It seems unlikely that a larger tablet display will be lower resolution than the phone it launches alongside, and there is not much room above it.
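To see why there is not much room above that phone panel, consider its pixel density (a quick calculation of my own; the size and resolution are the rumored figures, not confirmed specs):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density along the panel diagonal, in pixels per inch."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Rumored Nexus 6: 5.2-inch, 2560x1440 ("1440p")
print(f"{ppi(2560, 1440, 5.2):.0f} PPI")
```

That works out to roughly 565 PPI; matching even a fraction of that density on a physically larger tablet panel already implies a resolution well above 1080p.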
The Google Nexus 9 is expected for "Q3".
Subject: Storage | September 25, 2014 - 06:36 PM | Jeremy Hellstrom
Tagged: corsair, Voyager Air 2, wireless hdd
The Corsair Voyager Air 2 is the second iteration of Corsair's wireless drive, this year's model coming with a 1TB drive, a totally redesigned shell, and a $20 drop in price. Legit Reviews warns that, while the price drop is appreciated, the drive no longer comes with the charging kit, which will cost you extra. It supports USB 3.0 and 802.11b/g/n transfers as well as Internet passthrough; keep in mind that WiFi is disabled once the USB plug is connected. The overall speeds were in line with what was expected, and the battery life is impressive for 720p streaming, though 1080p streaming drains it much more quickly. See the Voyager in action right here.
"Last year we took a look at Corsair’s first wireless hard drive, called Voyager Air, which was a very sleek and impressive unit that we really liked. Today, we’re going to take a look at the more recently revamped version, conveniently called Voyager Air 2. We’ll take a look and see what this drive all has to offer and if there is anything new brought to the table."
Here are some more Storage reviews from around the web:
- RAIDON Runner GR2660 SSD/HDD RAID Enclosure @ Kitguru
- Silicon Power Stream S03 2TB USB 3.0 Portable Hard Drive Review @ NikKTech
- QNAP TS-251 High Performance NAS for SOHO and Home Users Review @ Madshrimps
- Team Group Micro SDHC UHS-1 U3 32GB Review @ Madshrimps
- SanDisk Ultra II 240GB SSD Review @ Legit Reviews
- Corsair Force LX 256GB @ eTeknix
- Kingston SM2280S3 M.2 SATA 120 GiB SSD Review @ Hardware Secrets
- Kingston SM2280S3 M.2 SATA SSD @ The SSD Review
Subject: Processors | September 25, 2014 - 02:56 PM | Jeremy Hellstrom
Tagged: linux, X99, core i7-5960x, Haswell-E
After the smoke cleared from their previous attempt at testing the i7-5960X, Phoronix picked up a Gigabyte X99-UD4-CF and has now had a chance to test Haswell-E performance on Linux. The new processor was compared to over a dozen others on machines running Ubuntu, and it really showed up the competition on benchmarks that took advantage of its 8 cores. Single-threaded applications that depend on a higher clock speed proved to be a weakness, as the 4790K's higher frequency allowed it to outperform the new Haswell-E processor. Check out the very impressive results of Phoronix's testing right here.
"With the X99 burned-up motherboard problem of last week appearing to be behind us with no further issues when using a completely different X99 motherboard, here's the first extensive look at the Core i7 5960X Haswell-E processor running on Ubuntu Linux."
Here are some more Processor articles from around the web:
- Intel's Xeon E5-2687W v3 @ The Tech Report
- Intel Core i7-5960X Extreme @ Benchmark Reviews
- Intel Core i7-4790K and Core i5-4690K @ X-bit Labs
- Return of the Athlon: AMD Brings Kabini to the desktop @ Bjorn3d
- AMD FX8370E @ Kitguru
- AMD FX-8370E @ eTeknix
Subject: General Tech | September 25, 2014 - 01:04 PM | Jeremy Hellstrom
Tagged: euclideon, voxels, larrabee, point cloud
Could the next Elder Scrolls game you play look like the screenshot below? Euclideon is working to make that a reality with their new voxel engine. The engine is strictly CPU-based, similar to the long-dead Larrabee architecture, but with one major difference: currently, they are capable of rendering 2000x1000 frames at around 32 FPS on a six-core processor. They are properly referred to as frames because this is a point-cloud solution, not a pixel-based one. They generated the images in the video you can see at The Tech Report by rendering 3D scans of real objects and locations, but programmers will still be able to create scenes with Maya or 3ds Max. Euclideon feels that they can still get a lot more performance out of a CPU with software refinements, and they are not planning on moving to the GPU at this time. With two unannounced games using this new engine in development, it might be time to make sure your machine has at least six cores so that you can be ready for their launch.
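The core idea of a point-cloud renderer is straightforward to sketch, even though Euclideon's actual (and proprietary) algorithm is far more sophisticated. In this toy illustration of my own, each 3D point is projected through a pinhole camera, and the nearest point wins each pixel, a per-point version of a depth test:

```python
# Toy point-cloud renderer: project points, keep the nearest one per pixel.
# This is an illustrative sketch, not Euclideon's proprietary algorithm.

def render_points(points, width, height, focal=100.0):
    """points: iterable of (x, y, z, color) in camera space; +z is forward."""
    depth = {}  # (px, py) -> nearest z seen so far
    image = {}  # (px, py) -> color of that nearest point
    for x, y, z, color in points:
        if z <= 0:
            continue  # point is behind the camera
        # pinhole projection with the origin at the screen center
        px = int(width / 2 + focal * x / z)
        py = int(height / 2 - focal * y / z)
        if 0 <= px < width and 0 <= py < height:
            if (px, py) not in depth or z < depth[(px, py)]:
                depth[(px, py)] = z
                image[(px, py)] = color
    return image

cloud = [
    (0.0, 0.0, 2.0, "near"),  # projects to the screen center
    (0.0, 0.0, 5.0, "far"),   # same direction, farther away: occluded
    (1.0, 0.5, 4.0, "side"),
]
img = render_points(cloud, 200, 200)
print(img[(100, 100)])  # the nearer of the two center points survives
```

The hard problems, which a sketch like this sidesteps entirely, are streaming and indexing billions of points so that only the ones that could plausibly win a pixel are ever touched.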
"We first heard about Euclideon back in 2011, when the company posted a video of a voxel-based rendering engine designed to enable environments with unlimited detail. This month, the firm made headlines again with a new video showing the latest iteration of is technology, which uses 3D scanners to capture real-world environments as point-cloud data. We spoke to Euclideon CEO Bruce Dell to find out more about these innovations—and about the first games based on them."
Here is some more Tech News from around the web:
- Gigabyte Technology may still see losses from non-motherboard businesses in 2014 @ DigiTimes
- Diamond Dual Band Wireless 802.11n Range Extender Review @ Neoseeker
- Patch Bash NOW: 'Shell Shock' bug blasts OS X, Linux systems wide open @ The Register
- Supercapacitors have the power to save you from data loss @ The Register
Subject: General Tech | September 25, 2014 - 12:24 PM | Ken Addison
Tagged: podcast, video, GTX 980, GTX 970, maxwell, nvidia, amd, noctua, NH-D15, acer, 4k, 4k gsync, XB280HK, 840, 840 evo, Samsung
PC Perspective Podcast #319 - 09/25/2014
Join us this week as we discuss our GTX 980 and 970 Review, Noctua NH-D15, Acer's 4K G-Sync Display and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:34:52
Subject: General Tech, Systems | September 24, 2014 - 06:10 PM | Scott Michaud
Tagged: Samsung, Chromebook, laptop
This does not apply to our North American readers, although it is good for them to know. To our European fans: Samsung has pulled out of the laptop market, for devices running either Windows or ChromeOS, in your region. The company is not commenting on how many jobs will be lost as a result of this decision. Samsung is not halting operations in any other region and this decision "is not necessarily reflective of conditions in other markets".
Parallels are being drawn with Sony and its VAIO division, but this is significantly different. Sony sold its PC business to Japanese Industrial Partners, who relaunched the brand in Japan in July. Samsung has not sold any division, although there are rumors of upcoming restructuring. While Samsung will retain its brand and continue to develop products for the other regions, pulling away is always concerning for customers. It really could be a geographic anomaly, like the Xbox was in Japan, or it could be a warning tremor. We simply do not know.