Subject: Graphics Cards | October 5, 2015 - 07:13 AM | Scott Michaud
Tagged: graphics drivers, amd
Apparently users of AMD's Catalyst 15.9 drivers have been experiencing issues. Specifically, “major memory leaks” could be caused by adjusting windows, such as resizing them or snapping them to edges of the desktop. According to PC Gamer, AMD immediately told users to roll back when they found out about the bug.
They have since fixed it with Catalyst 15.9.1 Beta. This subversion driver also fixes crashes and potential “signal loss” problems with a BenQ FreeSync monitor. As such, if you were interested in playing around with the Catalyst 15.9 beta driver, it should be safe to do so now. I wish I could offer more input, but I just found out about it and it seems pretty cut-and-dried: if you had problems, they should be fixed. The update is available here.
Subject: General Tech | October 5, 2015 - 07:01 AM | Scott Michaud
Tagged: physics, microsoft, Intel, Havok
Microsoft has just purchased Havok from Intel for an undisclosed price. This group develops one of the leading physics engines for video games and other software. It was used in every Halo title since Halo 2, including Halo Wars, and a fork of it drives the physics for Valve's Source Engine. It has been around since 2000, but didn't really take off until Max Payne 2 in 2003.
And the natural follow-up question for just about everything is “why?”
Hopefully this isn't bad taste...
Photo Credit: Havok via Game Developer Magazine (June 2013)
There are good reasons, though. First, Microsoft has been in the video game middleware and API business for decades. DirectX is the obvious example, but they have also created software like Games for Windows Live and Microsoft Gaming Zone. Better software drives sales for platforms, and developers can always use help accomplishing that.
Another reason could be Azure. Microsoft wants to bring cloud services to online titles; offloading some of the tasks that are insensitive to latency allows developers to lower system requirements or do more with what they have (which is especially true when consoles flatten huge install bases to a handful of specifications). If they plan to go forward with services that run on Azure or Xbox Live, then it would make sense to have middleware that's as drop-in as possible. Creating a physics engine from scratch is a bit of a hassle, but so is encouraging existing engines to use it.
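To make the idea concrete, here is a toy sketch of splitting game work by latency sensitivity. This is entirely hypothetical; none of these names correspond to any real Azure or Xbox Live API, and a real engine would do this asynchronously rather than with plain function calls:

```python
# Hypothetical sketch: route frame-critical work locally and ship
# latency-insensitive work (e.g. large-scale destruction physics,
# background pathfinding) to a remote service.
LATENCY_CRITICAL = {"player_collision", "input_response", "camera"}

def dispatch(task_name, run_local, run_remote):
    """Run frame-critical tasks locally; everything else can be offloaded."""
    if task_name in LATENCY_CRITICAL:
        return run_local(task_name)
    return run_remote(task_name)

# Toy handlers standing in for the real local/remote execution paths.
local_log, remote_log = [], []
dispatch("player_collision", local_log.append, remote_log.append)
dispatch("building_destruction", local_log.append, remote_log.append)

print(local_log)    # ['player_collision']
print(remote_log)   # ['building_destruction']
```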
It would be better to just buy someone that everyone is using. Currently, that's Havok; the main alternatives are Bullet, an open-source solution that is rarely used outside of other open-source projects, and PhysX, which is owned by NVIDIA (and probably won't leave their grip until their fingers are frigid and lifeless).
That's about all we know, though. The deal doesn't have a close date, value, or official purpose. Intel hasn't commented on the deal, only Microsoft has.
Subject: Graphics Cards | October 5, 2015 - 02:33 AM | Sebastian Peak
Tagged: rumor, report, radeon, graphics cards, Gemini, fury x, fiji xt, dual-GPU, amd
The AMD R9 Fury X, Fury, and Nano have all been released, but a dual-GPU Fiji XT card could be on the way soon according to a new report.
Back in June at AMD's E3 event we were shown Project Quantum, AMD's concept for a powerful dual-GPU system in a very small form-factor. It was speculated that the system was actually housing an unreleased dual-GPU graphic card, which would have made sense given the very small size of the system (and mini-ITX motherboard therein). Now a report from WCCFtech is pointing to a manifest that just might be a shipment of this new dual-GPU card, and the code-name is Gemini.
"Gemini is the code-name AMD has previously used in the past for dual GPU variants and surprisingly, the manifest also contains another phrase: ‘Tobermory’. Now this could simply be a reference to the port that the card shipped from...or it could be the actual codename of the card, with Gemini just being the class itself."
The manifest also indicates a cooler for the card from Cooler Master, the maker of the liquid cooling solution for the Fury X. As the Fury X has had its share of criticism for pump whine issues, it would be interesting to see how a dual-GPU cooling solution would fare in that department, though we could be seeing an entirely new generation of the pump as well. Of course, speculation on an unreleased product like this could be incorrect, and verifiable hard details aren't available yet. Still, if the dual-GPU card is based on a pair of full Fiji XT cores, the specs could be very impressive to say the least:
- Core: Fiji XT x2
- Stream Processors: 8192
- GCN Compute Units: 128
- ROPs: 128
- TMUs: 512
- Memory: 8 GB (4GB per GPU)
- Memory Interface: 4096-bit x2
- Memory Bandwidth: 1024 GB/s
In addition to the specifics above, the report also discussed the possibility of 17.2 TFLOPS of performance, double that of the Fury X, which would make the Gemini product one of the most powerful single-card GPU solutions in the world. The card seems close enough to the final stage that we should expect to hear something official soon, but for now it's fun to speculate; unless, of course, the speculation concerns a high initial retail price, as something at or above $1000 is unfortunately quite likely. We shall see.
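The 17.2 TFLOPS figure checks out arithmetically if Gemini keeps the Fury X's 1050 MHz clock, which is an assumption on my part; a dual-GPU card could well be down-clocked for power and thermal reasons. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope math for the rumored Gemini specs.
# The 1050 MHz clock is assumed from the Fury X and is NOT confirmed
# for the dual-GPU card.
stream_processors = 8192   # 4096 per Fiji XT GPU x 2
clock_ghz = 1.05           # Fury X reference clock (assumed)
flops_per_clock = 2        # one fused multiply-add per SP per cycle

tflops = stream_processors * flops_per_clock * clock_ghz / 1000
print(f"Peak single-precision: {tflops:.1f} TFLOPS")  # 17.2

# HBM bandwidth: each GPU has a 4096-bit bus at 1 Gbps effective per pin.
bus_bits = 4096
gbps_per_pin = 1.0
per_gpu_gbs = bus_bits * gbps_per_pin / 8  # GB/s per GPU
print(f"Total bandwidth: {per_gpu_gbs * 2:.0f} GB/s")  # 1024
```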
Subject: General Tech | October 3, 2015 - 11:04 PM | Scott Michaud
Tagged: Starcraft II, legacy of the void, blizzard
Third time's the charm, unless they plan another release at some point.
The StarCraft II interface isn't perfect. Even though it is interesting and visually appealing, some tasks are unnecessarily difficult and space is not used in the most efficient way. To see what I mean, try to revert the multiplayer mode to Wings of Liberty or, worse, find your Character Code. Blizzard released a new UI with Heart of the Swarm back in 2013, and they're doing a new one for the release of Legacy of the Void on November 10th. Note that my two examples probably won't be fixed in this update; they are just examples of UX issues.
While the update aligns with the new expansion, Blizzard will patch the UI for all content levels, including the free Starter Edition. This honestly makes sense, because it's easier to patch a title when all variations share a common core. Then again, not every company patches five-year-old titles like Blizzard does, so the back-catalog support is appreciated.
The most heartwarming change for fans, if pointless otherwise, is in the campaign selection screen. As the StarCraft II trilogy will be completed with Legacy of the Void, the interface aligns them as three episodes in the same style as the original StarCraft did.
On the functional side, the interface has been made more compact (which I alluded to earlier). This was driven by the new chat design, which is bigger yet less disruptive than it was in Heart of the Swarm. The column of buttons on the side is now a top bar, which expands down for sub-menu items.
While there are several things that I don't mention, a final note for this post is that Arcade will now focus on open lobbies. Players can look for the specific game they want, but the initial screen will show lobbies that are waiting to fill. The hope seems to be that players will spend less time waiting for a game to start. This raises two questions. First, Arcade games tend to have a steep learning curve, so I wonder if this feature will slump off after people try a few rounds and realize that they should stick with a handful of games. Second, I wonder what this means for player numbers in general -- this sounds like a feature that is added during player declines, which Blizzard seems to hint is not occurring.
I'm not sure when the update will land, but it will probably be around the launch of Legacy of the Void on November 10th.
Subject: Displays | October 3, 2015 - 09:12 PM | Sebastian Peak
Tagged: UP3216Q, ultrasharp, UHD, monitor, ips, HDMI 2.0, display, dell, calibration, Adobe RGB, 4k
While not officially launched in the U.S. just yet, on Thursday Tom's Hardware reported news of a trio of upcoming UltraSharp monitors from Dell, the largest of which - the UP3216Q - I was able to locate on Dell's Bermuda site.
For anyone looking for a 4K display for photo or video editing (or any other color critical work) the new Dell UltraSharp UP3216Q looks like a great - and likely very pricey - option. Just how much are we talking? The existing 31.5-inch 4K UP3214Q carries a $1999 MSRP (though it sells for $1879 on Dell's site). For this kind of money there are probably those who will never consider a 16:9 option (or ever give up their 16:10 30-inch displays), but the specifications of this new UP3216Q are impressive:
- Diagonal Viewing Size: 31.5 inch
- Aspect Ratio: Widescreen (16:9)
- Panel Type, Surface: In-Plane Switching
- Optimal resolution: 3840 x 2160 @ 60Hz
- Active Display Area (H x V): 273,996 sq-mm (424.7 sq-inches)
- Contrast Ratio: 1000:1 (typical), 2,000,000:1 (dynamic)
- Brightness: 300 cd/m2 (typical)
- Response Time: 6 ms GtG (fast mode)
- Viewing Angle: 178° vertical / 178° horizontal
- Adjustability: Tilt, Swivel, Height Adjust
- Color Support: 1.07 billion colors
- Pixel Pitch: 0.182 mm
- Backlight Technology: LED light bar system
- Display Screen Coating: Anti-Glare with 3H hardness
- Connectivity: DP, mDP, HDMI (MHL), 4 x USB3 with one charging port, 1 x USB3 upstream, Media Card Reader
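As a sanity check, the pixel pitch and active area in the list above follow directly from the resolution and the 31.5-inch diagonal; a quick sketch (small rounding differences against Dell's official figures are expected):

```python
import math

# Derive the UP3216Q's pixel pitch and active area from its listed
# resolution and diagonal size.
h_px, v_px, diag_in = 3840, 2160, 31.5

diag_px = math.hypot(h_px, v_px)          # diagonal length in pixels
pitch_mm = diag_in * 25.4 / diag_px       # physical size of one pixel
ppi = 25.4 / pitch_mm                     # pixels per inch

area_mm2 = (h_px * pitch_mm) * (v_px * pitch_mm)
print(f"Pixel pitch: {pitch_mm:.3f} mm ({ppi:.0f} PPI)")  # ~0.182 mm, ~140 PPI
print(f"Active area: {area_mm2:,.0f} sq-mm")              # ~274,000 sq-mm
```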
With the 60 Hz 4K (UHD) IPS panel offering full sRGB and 99.5% Adobe RGB coverage, and factory color calibration promising a deltaE of less than 2, the UP3216Q sounds pretty much ready to go out of the box. However, for those inclined to strive for a more perfect calibration, Dell is offering an X-Rite i1Display Pro colorimeter as an optional accessory, paired with their own Dell UltraSharp Color Calibration Solution software.
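For context on that spec: deltaE measures the distance between a target color and what the panel actually displays, in the CIELAB color space, and a value below roughly 2 is generally considered imperceptible in normal viewing. The sketch below uses the simplest formulation (CIE76, a straight Euclidean distance) with made-up sample values; Dell doesn't state which deltaE formula its factory calibration report uses, and modern calibration workflows typically use the more involved CIEDE2000:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.dist(lab1, lab2)

# Hypothetical reference patch and panel measurement (L*, a*, b*).
target = (50.0, 20.0, -10.0)
measured = (50.5, 21.0, -10.8)

de = delta_e_cie76(target, measured)
print(f"deltaE (CIE76): {de:.2f}")  # ~1.37, within a deltaE < 2 spec
```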
A couple of points of interest with this monitor: while it offers DisplayPort and mini-DP inputs, it also supports 4K at 60 Hz via HDMI 2.0. Color support is listed as 1.07 billion colors, but it's not specified whether this indicates a true 10-bit panel or 10-bit color processing on an 8-bit panel, though at a price in the $2k range it is probably safe to assume a 10-bit panel. Lastly, in keeping with the UltraSharp branding, the monitor will also carry Dell's Premium Panel Guarantee and 3-Year Advanced Exchange Service warranty.
Subject: Mobile | October 2, 2015 - 04:09 PM | Tim Verry
Tagged: Tegra X1, tablet, pixel, nvidia, google, android 6.0, Android
During its latest keynote event, Google unveiled the Pixel C, a powerful tablet with optional keyboard that uses NVIDIA’s Tegra X1 SoC and runs the Android 6.0 “Marshmallow” operating system.
The Pixel C was designed by the team behind the Chromebook Pixel. The Pixel C features an anodized aluminum body that looks (and reportedly feels) smooth with clean lines and rounded corners. The tablet itself is 7mm thick and weighs approximately one pound. The front of the Pixel C is dominated by a 10.2” display with a resolution of 2560 x 1800 (308 PPI, 500 nits brightness), wide sRGB color gamut, and 1:√2 aspect ratio (which Google likened to the size and aspect ratio of an A4 sheet of paper). A 2MP front camera sits above the display while four microphones sit along the bottom edge and a single USB Type-C port and two stereo speakers sit on the sides of the tablet. Around back, there is an 8MP rear camera and a bar of LED lights that lights up to indicate the battery charge level when you double-tap it.
The keyboard is an important part of the Pixel C, and Google has given it special attention to make it part of the package. The keyboard attaches to the tablet using self-aligning magnets that are powerful enough to keep the display attached while holding it upside down and shaking it (not that you'd want to do that, mind you). It can be attached to the bottom of the tablet for storage and used like a slate, or you can attach the tablet to the back of the keyboard and lift the built-in hinge to use the Pixel C in laptop mode (the hinge can hold the display anywhere from 100 to 135 degrees). The internal keyboard battery is good for two months of use, and it recharges inductively from the tablet portion when you close the Pixel C like a laptop. The keyboard is around 2mm thick and nearly full size with an 18.85mm pitch, and the chiclet keys have 1.4mm of travel, similar to that of the Chromebook Pixel. There is no track pad, but it does offer a padded palm rest, which is nice to see.
Internally, the Pixel C is powered by the NVIDIA Tegra X1 SoC, 3GB of RAM, and 32GB or 64GB of storage (depending on model). The 20nm Tegra X1 consists of four ARM Cortex A57 and four Cortex A53 CPU cores paired with a 256-core Maxwell GPU. The Pixel C is a major design win for NVIDIA, and the built in GPU will be great for gaming on the go.
The Pixel C will be available in December ("in time for the holidays") for $499 for the base 32 GB model, $599 for the 64 GB model, and $149 for the keyboard.
First impressions, such as this hands-on by Engadget, seem to be very positive, describing sturdy yet sleek hardware that is comfortable to type on. While the hardware looks more than up to the task, the operating system of choice is a concern for me. Android is not the most productivity- and multi-tasking-friendly software. There are some versions of Android that enable multiple windows or side-by-side apps, but it has always felt rather clunky and limited in its usefulness. With that said, Computerworld's JR Raphael seems hopeful. He points out that the Pixel C is, in Batman fashion, not the hardware Android wants, but the hardware that Android needs (to move forward), and that it is primed for a future of Android that is more friendly to such productive endeavors. Development versions of Android 6.0 included support for multiple apps running simultaneously side-by-side, and while that feature will not make the initial production code cut, it does show that Google is looking into pursuing it and possibly enabling it at some point. The Pixel C has an excellent aspect ratio to take advantage of app splitting, with the ability to display four windows each with the same aspect ratio.
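That four-windows claim is the same trick A-series paper relies on: halve a 1:√2 sheet and each half keeps the original aspect ratio. A quick sketch, treating the Pixel C's 2560 x 1800 panel as an approximation of 1:√2:

```python
# Halving a 1:sqrt(2) rectangle across its long side yields two
# rectangles with (approximately) the same ratio; halving twice
# gives four same-shaped panes.
w, h = 2560, 1800  # Pixel C resolution; 2560/1800 ~ 1.422, close to sqrt(2)

def split(a, b):
    """Halve the long side; return the new (long, short) dimensions."""
    long_side, short_side = max(a, b), min(a, b)
    return short_side, long_side / 2

print(f"Full screen ratio:    {w / h:.3f}")    # 1.422
w2, h2 = split(w, h)
print(f"Half-screen ratio:    {w2 / h2:.3f}")  # 1.406
w4, h4 = split(w2, h2)
print(f"Quarter-screen ratio: {w4 / h4:.3f}")  # 1.422
# All three ratios stay between 1.40 and 1.43 (sqrt(2) ~ 1.414), so
# four side-by-side apps can share the screen without changing shape.
```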
I am not sure how well received the Pixel C will be by business users who have several convertible tablet options running Windows and Chrome OS. It certainly gives the iPad-and-keyboard combination a run for its money and is a premium alternative to devices like the Asus Transformers.
What do you think about the Pixel C, and in particular, it running Android?
Even if I end up being less-than-productive using it, I think I'd still want the sleek-looking hardware as a second machine, heh.
Subject: Cases and Cooling | October 2, 2015 - 03:29 PM | Jeremy Hellstrom
Tagged: PSU, modular psu, coolermaster, V750W, 80+ gold
Cooler Master's V750W PSU is fully modular and comes with a nice selection of cabling including four PCIe 6+2 connectors and eight SATA power connectors. At 150x140x86mm (5.9x5.5x3.4") it also takes up less space than many PSUs, though not enough to fit in a truly SFF case. A single 12V rail can provide 744W at 62A which is enough to power more than one mid to high range GPU and Bjorn3D's testing shows that it can maintain that 80+ GOLD rating while it is being used. The five year warranty is also a good reason to pick up this PSU, assuming you are not in the market for something in the kilowatt range.
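To put that 80+ Gold rating into perspective, here's a rough sketch of what a 750 W PSU would pull from the wall at the 80 Plus Gold minimum efficiency points for 115 V input (87% at 20% and 100% load, 90% at 50% load); Bjorn3D's measured numbers for the V750 will differ, usually for the better:

```python
# Estimate wall draw for a 750 W PSU at the 80 Plus Gold (115 V)
# minimum efficiency floors. Real units typically beat these floors.
rated_watts = 750
gold_115v = {0.20: 0.87, 0.50: 0.90, 1.00: 0.87}  # load fraction -> efficiency

for load_frac, eff in gold_115v.items():
    dc_out = rated_watts * load_frac
    wall_draw = dc_out / eff
    print(f"{dc_out:>5.0f} W load -> ~{wall_draw:.0f} W from the wall "
          f"({wall_draw - dc_out:.0f} W lost as heat)")
```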
"One available option soon to be available on the market for <179$, and our specimen of review today, is the CoolerMaster V750. CoolerMaster has partnered with Seasonic to produce the high quality compact “V” series PSUs which made a huge statement for CoolerMaster and told the world they were ready to push some serious power."
Here are some more Cases & Cooling reviews from around the web:
- Corsair RM1000x @ Kitguru
- Corsair RM750x @ Kitguru
- Antec HCP Platinum Continuous Power 1000W @ eTeknix
- Be Quiet! Dark Power Pro 11 1000W Power Supply Unit Review @ NikKTech
- Cooler Master V Series 550W @ Kitguru
- be quiet! Dark Power Pro 11 550W @ eTeknix
Subject: Editorial | October 2, 2015 - 12:41 PM | Jeremy Hellstrom
Tagged: google, chromecast, AT&T, apple tv, amd, amazon
There is more discouraging news out of AMD as another 5% of their roughly 10,000-strong workforce, around 500 employees, will be let go by the end of 2016. The move will hurt their bottom line before the end of this year, with $42 million in severance, benefit payouts, and other costs associated with restructuring, but should save around $60-70 million in costs by the end of next year. This is on top of the 8% cut to their workforce which occurred earlier this year, and it shows just how deep AMD needs to cut to stay alive; unfortunately, reducing costs is not as effective as raising revenue. Before you laugh, point fingers, or otherwise disparage AMD, consider for a moment a world in which Intel has absolutely no competition selling high-powered desktop and laptop parts. Do you really think the already slow product refreshes will speed up, or that prices will remain the same?
Consider the case of AT&T, who have claimed numerous times that they provide the best broadband service to their customers that they are capable of, and at the lowest price they can sustain. It seems that if you live in a city which has been blessed with Google Fiber, AT&T is somehow able to afford to charge $40/month less than in a city which only has the supposed competition of Comcast or Time Warner Cable. Interesting how the presence of Google in a market has an effect that the other two supposed competitors do not.
There is of course another way to deal with the competition, and both Amazon and Apple have that one down pat. Apple removed the iFixit app, which showed you the insides of your devices and had the temerity to suggest ways you might fix hardware issues yourself. Today Amazon have started to kick both Apple TV and Chromecast devices off of their online store. As of today no new items can be added to the virtual inventory, and as of the 29th of this month anything not sold will disappear. Apparently not enough people are choosing Amazon's Prime Video streaming, and so instead of making the service compatible with Apple or Google's products, Amazon has opted to attempt to prevent, or at least hinder, the sale of those products.
The topics of competition, liquidity, and other market forces are far too complex to be dealt with in a short post such as this, but it is worth asking yourself: do you, as a customer, feel like competition is still working in your favour?
"AMD has unveiled a belt-tightening plan that the struggling chipmaker hopes will get its finances back on track to profitability."
Here is some more Tech News from around the web:
- Microsoft brings LinkedIn to Cortana, and Likes and mentions to Outlook @ The Inquirer
- Microsoft previews less buggy OneDrive for Business client @ The Register
- Toshiba CEO: Yeah, we MAY need to chop some heads @ The Register
- UK scientists create quantum cryptology world record with 'unhackable' data @ The Inquirer
Subject: Mobile | October 2, 2015 - 02:02 AM | Tim Verry
Tagged: LG, ultrathin, Broadwell, ips display
Earlier this week, LG revealed three new notebooks under its Gram series that are set to compete with Apple’s Macbook Air (The Verge has a photo comparison of the two) and various Ultrabooks from other manufacturers (e.g. Lenovo and Asus). The new series includes one 13-inch and two 14-inch laptops that weigh in at 2.16 pounds and are 0.5” thick. The LG Gram with 13” display is the smallest of the bunch at 11.9” x 8.4” x 0.5” and the chassis is constructed of magnesium and polycarbonate (plastic). Meanwhile, the two notebooks with the 14” display measure 12.8” x 8.94” x 0.5” and feature a body made from a combination of carbon-magnesium and lithium-magnesium alloys. The difference in materials accounts for the larger notebooks hitting the same weight target (2.16 lbs).
The 14-inch LG Gram 14 (gram-14Z950-A.AA4GU1) notebook.
LG is packing every Gram notebook with a 1080p IPS display (13.3 or 14 inches), dual mics, a 1.3 MP webcam, six row island-style keyboard, and a spacious track pad. External IO includes two USB 3.0 ports, HDMI output, micro SD card slot, and a micro USB port that (along with the included dongle) supports the 10/100 Ethernet network connection.
The base Gram 13-inch comes in Snow White while both Gram 14-inch notebooks are clad in Champagne Gold.
The LG Gram 13 Broadwell-powered laptop (gram-13Z950-A.AA3WU1).
Internally, LG has opted to go with Intel’s Broadwell processor and its built-in HD 5500 GPU. The LG Gram 13 uses the Intel Core i5-5200U (2 cores, 4 threads at 2.2-2.7GHz). The 14-inch models can be configured with an Intel i5 or an Intel Core i7-5500U which is a dual core (with HyperThreading for four threads) processor clocked at 2.4 GHz that can boost to 3.0 GHz. Additional specifications include 8GB of DDR3L memory, a solid state drive (128 GB on the Gram 13, up to 256 GB on the Gram 14), Intel 802.11ac Wi-Fi, and rated battery life of up to 7.5 hours (which is not great, but not too bad).
The Gram series is LG's first major thin-and-light entry into the US market, and while some compromises were made for the sake of portability, the price points seem competitive. Interestingly, LG is positioning these notebooks as MacBook Air competitors, allegedly offering you a larger, yet lighter, notebook. The Gram is not actually the lightest notebook on the market, however. Below is a brief weight comparison to some of the major recent thin-and-lights the Gram is going up against:
- 12” Apple MacBook: 2.03 lbs
- 11” Apple MacBook Air: 2.38 lbs
- 13” Apple MacBook Air: 2.96 lbs
- 13.3" ASUS Zenbook UX305FA (Core M): 2.65 lbs
- 13.3" ASUS Zenbook UX301LA (Core i7): 3.08 lbs
- 13.3” LaVie Z: 1.87 lbs
- 13.3” LaVie Z 360: 2.04 lbs
- 12.2" Samsung ATIV Book 9: 2.09 lbs
We will have to wait for reviews to see how the build quality stacks up, especially for the 14-inch models using the lithium-magnesium bodies, which, while light, may not be the sturdiest flex-wise. If they can hold up to the stress of the daily commute, the retail pricing is far from exorbitant and, if you can live with the compromises, fairly attractive.
Subject: Mobile | October 1, 2015 - 07:13 PM | Jeremy Hellstrom
Tagged: nexus 6p, google, Android
The Nexus 6P is new from Google and is designed to compete directly against the new Apple devices, with an all-aluminium body and a new USB Type-C connection for charging. One of the benefits of the new USB is charging speed: Google claims a 10 minute charge will give you 7 hours of usage. They have also added a 12.3MP camera complete with 1.55μm pixels on its sensor, though in the style of the Nokia Lumia 1520, that camera does project beyond the casing. The 5.7in 2560x1440 AMOLED screen is made of Gorilla Glass 4, and the phone is powered by a 2GHz Qualcomm Snapdragon 810 v2.1 octa-core processor, which may display that chip's tendency to get a little warm during use. The Inquirer has not had a chance to fully review the Nexus 6P, but you can catch their preview right here.
"THE NEXUS 6P is the first truly premium Android device from Google. Last year's Nexus 6 divided opinion with its bulky design and lacklustre features, but the firm is hoping that its successor, with the premium case and next-gen specs, will finally fill the void for those after a stock Android device."
Here are some more Mobile articles from around the web:
- ASUS ZenFone 2 Review @ Hardware Secrets
- Asus ZenPad S 8.0 @ Kitguru
- Samsung Galaxy Tab S Review @ Hardware Secrets
- B2C Telecom Premium Custom Bookstyle Case for iPhone Review @ Madshrimps
Subject: Storage | October 1, 2015 - 05:42 PM | Allyn Malventano
Tagged: Samsung, firmware, 840 evo, msata
It took them a while to get it right, but Samsung did manage to fix their read degradation issue in many of their TLC equipped 840 Series SSDs. I say many because there were some models left out when firmware EXT0DB6Q was rolled out via Magician 4.6. The big exception was the mSATA variant of the 840 EVO, which was essentially the same SSD just in a more compact form. This omission was rather confusing as the previous update was applicable to both the 2.5" and mSATA form factors simultaneously.
The Magician 4.7 release notes included a bullet for Advanced Performance Optimization support on the 840 EVO mSATA model, but it took Samsung some time to push out the firmware update that enabled this possibility. We know from our previous testing that the Advanced Performance Optimization feature was included with other changes that enabled reads from 'stale' data at full speeds, compensating for the natural voltage drift of flash cell voltages representing the stored data.
Now that the firmware has been made available (it came out early this week but was initially throttled), I was able to apply it to our 840 EVO 1TB mSATA sample without issue, and could perform the Advanced Performance Optimization and observe the expected effects. However, my sample was recently used for some testing and did not have data old enough to show a solid improvement with the firmware applied *and before* running the Optimization. Luckily, an Overclock.net forum member was able to perform just that test on his 840 EVO 500GB mSATA model:
Kudos to that member for being keen enough to re-run his test just after the update.
It looks like the only consumer 840 TLC model left to fix is the original 840 SSD (not 840 EVO, just 840). This was the initial model launched that was pure TLC flash with no SLC TurboWrite cache capability. We hope to see this model patched in the near future. There were also some enterprise units that used the same planar 19nm TLC flash, but I fear Samsung may not be updating those as most workloads seen by those drives would constantly refresh the flash and not give it a chance to become stale and suffer from slowing read speeds. The newer and faster V-NAND equipped models (850 / 950 Series) have never been susceptible to this issue.
Subject: General Tech, Mobile | October 1, 2015 - 02:45 PM | Ryan Shrout
Tagged: iphone 6s, iphone, ios, google, apple, Android
PC Perspective’s Android to iPhone series explores the opinions, views and experiences of the site’s Editor in Chief, Ryan Shrout, as he moves from the Android smartphone ecosystem to the world of the iPhone and iOS. Having been entrenched in the Android smartphone market for 7+ years, he intends the editorial series less as a review of the new iPhone 6s than as an exploration of how the current smartphone market compares to each side’s expectations.
Full Story Listing:
- Day 0: What to Expect
- Day 3: Widgets and Live Photos
- Day 6: Battery Life and Home Screens
- Day 17: SoC Performance
- Day 31: Battery Life and Closing
It probably won’t come as a shock to the millions of iPhone users around the globe, but the more days I keep the 6s in my pocket, the more accepting I am becoming of the platform. The phone has been fast and reliable – I have yet to come across any instability or application crashes despite my incessant installation of new apps. And while I think it’s fair to say that even new Android-based phones feel snappy to user interactions out of the box, the iPhone is about a week in without me ever thinking about performance – which is exactly what you want from a device like this.
There are some quirks, and features missing from the iPhone 6s that I had on my Droid Turbo and wish I could implement in settings or through third-party applications. I fell in love with the Droid's double wrist rotation as a shortcut to opening up the camera. It helped me capture quite a few photos when I only had one hand free, without having to unlock the phone, find an icon, etc. The best the iPhone offers is a “drag up from the bottom” motion on the lock screen, but I find myself taking several thumb swipes before successfully activating it one-handed. Trying to use the home button to reach the lock screen, and thus the camera shortcut, is actually hindered because the Touch ID feature is TOO FAST, taking me to a home screen (that may not have the camera app icon on it) where I need to navigate around.
I have been a user of the Pebble Time since it was released earlier this year and I really enjoy the extended battery life (measured in days not hours) when compared to Android Wear devices or the Apple Watch. However, the capabilities of the Pebble Time are more limited with the iPhone 6s than they are with Android – I can no longer use voice dictation to reply to text messages or emails and the ability to reply with easy templates (yes, no, I’ll be there soon, etc.) is no longer available. Apple does not allow the same level of access to the necessary APIs as Android does and thus my Time has effectively become a read-only device.
Finally, my concern about missing widgets continues to stir within me; it is something that I think the iPhone 6s could benefit from greatly. I also don’t understand the inability to arrange the icons on the home screens in an arbitrary fashion. Apple will not let me move icons to the bottom of the page without first filling up every other spot on the screen – there can be no empty spaces!! So while my organizational style would like to have a group of three icons in the bottom right hand corner of the screen with some empty space around it, Apple doesn’t allow me to do that. If I want those icons in that location I need to fill up every empty space on the screen to do so. Very odd.
Subject: General Tech | October 1, 2015 - 02:17 PM | Ken Addison
Tagged: podcast, video, fable legends, dx12, apple, A9, TSMC, Samsung, 14nm, 16nm, Intel, P3608, NVMe, logitech, g410, TKL, nvidia, geforce now, qualcomm, snapdragon 820
PC Perspective Podcast #369 - 10/01/2015
Join us this week as we discuss the Fable Legends DX12 Benchmark, Apple A9 SoC, Intel P3608 SSD, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, and Allyn Malventano
Program length: 1:42:35
Week in Review:
0:54:10 This episode of PC Perspective is brought to you by…Zumper, the quick and easy way to find your next apartment or home rental. To get started and to find your new home go to http://zumper.com/PCP
News item of interest:
Hardware/Software Picks of the Week:
Subject: General Tech | October 1, 2015 - 01:43 PM | Jeremy Hellstrom
Tagged: corsair, VOID Wireless, gaming headset, 7.1 headset
On paper these headphones are impressive: wireless performance out to 40' with 16 hours of charge, a frequency response of 20Hz to 20kHz from the 50mm drivers, and 7.1 surround sound. Many previous software-emulated 7.1 directional gaming headsets have disappointed users, but in this case Benchmark Reviews quite liked the performance of the VOID while gaming and listening to music. The noise-cancelling microphone is dubbed an “InfoMic,” as it has LED lights which can be illuminated in different ways depending on your preferences and even the game you happen to be playing. You can also sync the lights with other Corsair RGB devices using the Cue software, if you are so inclined. Check out the full review right here.
"In the world of computer peripherals and hardware, most of us are well aware of Corsair’s existence. This is an organization that has well-earned reputation for producing quality components; components that are going to be high-performing, intelligently designed, and very likely to provide its owners with years of service."
Here is some more Tech News from around the web:
- ASUS Strix 7.1 Headset Review @HiTech Legion
- Tt eSPORTS Shock 3D 7.1 Gaming Headset Review @ NikKTech
- Inateck MercuryBox Bluetooth Speaker & Mobile Products @ eTeknix
- Creative Sound Blaster Roar 2 @ Legion Hardware
Subject: General Tech | October 1, 2015 - 12:44 PM | Jeremy Hellstrom
Tagged: stagefright, security, Android
Assuming you have a carrier with a sense of responsibility and a reasonably modern phone, chances are you are patched against the original Stagefright vulnerability. That is not the case for the recently reported vulnerabilities dubbed Stagefright 2.0. On Android 5.0 and later, it has been confirmed that opening a specially and nefariously crafted MP3 or MP4 file can trigger remote code execution via libstagefright. On older versions the vulnerability lies in libutils instead, and can be used for the same purpose: gaining access to the data stored on your device. From the security company reports that The Register has linked, it sounds like we can expect many repeat performances, as the Stagefright library was poorly written and contains many mistakes. Worse is the fact that it is not sandboxed in any way and has significantly higher access than a media playback component should ever have.
"Joshua Drake from the security outfit Zimperium zLabs introduced us to StageFright earlier this summer, and he is now back with a similar warning and a brace of problems, according to a post on the Kaspersky Threatpost news site."
Here is some more Tech News from around the web:
- Microsoft rolls out Skype Translator to Windows desktop app @ The Inquirer
- Windows 10's second month sees sluggish growth in market share @ The Inquirer
- Weird garbled Windows 7 update baffles world – now Microsoft reveals the truth @ The Register
- Tear teardown down, roars Apple: iFixit app yanked from store @ The Register
- Acer: We're not laying off staff, just shifting 'em out of the PC biz @ The Register
- Tenda AC15 AC1900 Dual-Band WiFi Router @ Benchmark Reviews
Subject: Processors | September 30, 2015 - 09:55 PM | Josh Walrath
Tagged: TSMC, Samsung, FinFET, apple, A9, 16 nm, 14 nm
So the other day the nice folks over at Chipworks got word that Apple is in fact sourcing its A9 SoC from both TSMC and Samsung. This is really interesting news on multiple fronts. From the information gleaned, the two parts are the APL0898 (Samsung fabbed) and the APL1022 (TSMC fabbed).
These process technologies have been in the news quite a bit. As we well know, it has been hard for any foundry not named Intel to get below 28 nm in an effective way. Even Intel has had some pretty hefty issues in its march to sub-32 nm parts, but it has the resources and financial ability to push through those hurdles. One of the bigger problems for the foundries was the decision to push FinFETs back further than originally planned: the idea was to hit 22/20 nm with planar transistors and reserve FinFET technology for 16/14 nm.
The Chipworks graphic that explains the differences between Samsung's and TSMC's A9 products.
There were many reasons why this did not work effectively for the majority of products that the foundries were looking to serve with a 22/20 nm planar process. Yes, many parts were fabricated on these nodes, but none of them were the higher-power, higher-performance parts that typically garner headlines. No CPUs, no GPUs, and only a handful of lower-power SoCs (most notably Apple's A8, which was around 89 mm² and consumed up to 5 to 10 watts at maximum). The node simply did not scale power effectively. It provided a smaller die size, but it did not significantly improve power efficiency or switching performance compared to 28 nm high-performance nodes.
The information Chipworks has provided also verifies that Samsung's 14 nm FF process is more size-optimized than TSMC's 16 nm FF. There was originally some talk about both nodes being very similar in overall transistor size and density, but Samsung has a slightly tighter design. Neither of them is smaller than Intel's latest 14 nm process, which is going into its second-generation form; Intel still has a significant performance and size advantage over everyone else in the field. Going back to size, the Samsung chip is around 96 mm² while the TSMC chip is 104.5 mm². The difference is not huge, but it does show that the Samsung process is a little tighter and can squeeze more transistors per square millimeter than TSMC's.
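Assuming both dies hold essentially the same transistor count (a reasonable assumption, since they implement the same A9 design), a quick back-of-the-envelope calculation shows what those die areas imply:

```python
# Die areas reported by Chipworks for the two A9 variants (mm^2)
samsung_a9 = 96.0    # APL0898, Samsung 14 nm FF
tsmc_a9 = 104.5      # APL1022, TSMC 16 nm FF

# How much larger the TSMC die is than the Samsung die
size_penalty = (tsmc_a9 - samsung_a9) / samsung_a9 * 100

# Equivalently, Samsung's density advantage if transistor counts match
density_gain = tsmc_a9 / samsung_a9

print(f"TSMC die is {size_penalty:.1f}% larger")          # ~8.9% larger
print(f"Samsung density advantage: {density_gain:.3f}x")  # ~1.089x
```

In other words, "a little tighter" works out to roughly a 9% area saving for Samsung on this particular chip.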
In terms of actual power consumption and clock scaling, we have nothing to go on yet. Both chips appear in the iPhone 6S and 6S Plus, and testing so far has not shown significant differences between the two SoCs. In theory one could be performing better than the other, but in reality nobody has tested these chips at a low enough level to discern any major performance or power differences. My gut feeling is that Samsung's process is more mature and running slightly better than TSMC's, but the differences are going to be minimal at best.
The next piece of information we can glean is that there simply isn't enough line space for all of the chip companies who want to fabricate their parts at either Samsung or TSMC. From a chip standpoint, a lot of work has to be done to port a design to two different process nodes. While 14 and 16 nm are similar in overall size and both use FinFETs, the standard cells and design libraries for Samsung and TSMC are very different. It is not a simple thing to port over a design; a lot of work has to be done in the design stage to make a chip work on both nodes. I can tell you that there is no way the two chips are identical in layout. This is not a "dumb port" where they just adjust the optics on the same masks and magically make the chips work right off the bat. Each fab requires its own mask sets, its own design verification, and its own yield troubleshooting via metal layer changes.
In the end, this means there simply was not enough capacity at either TSMC or Samsung to handle the demand Apple was expecting. Because Apple has deep pockets, it contracted both TSMC and Samsung to produce two very similar, but still different, parts. Apple also likely outbid other major chip firms and locked down whatever wafer capacity Samsung and TSMC had, much to those firms' dismay. I have no idea what is going on in the background with the likes of NVIDIA and AMD when it comes to line space for manufacturing their next-generation parts. At least for AMD, it seems that its partnership with GLOBALFOUNDRIES and that foundry's version of 14 nm FF is having a hard time taking off. Eventually more production capacity will come online, and yields and bins will improve. Apple will stop taking up so much space and other products can start rolling off the line. In the meantime, enjoy that cutting-edge iPhone 6S/6S Plus with the latest 14/16 nm FF chips.
Subject: General Tech | September 30, 2015 - 04:05 PM | Jeremy Hellstrom
Tagged: roccat, Nyth, gaming mouse, input
That is no typo: the Twin-Tech Laser Sensor R1 on the Nyth really does go all the way up to 12000 DPI, and it also has an adjustable lift-off distance. There are also 18 buttons, and with the shift key function each can be assigned a second function as well. The Swarm software used to program the mouse is rather impressive: not only can you assign profiles to games, you can program a light show into your mouse if you so desire. It will set you back $120, but if the price tag does not scare you off you can see how it performs in MadShrimps' review.
"ROCCAT Nyth is like a breath of fresh air in the already crowded gaming mice market which sports quite a modular design with replaceable right side panel, no less than four different sets of buttons, a smooth durable plastic texture, catchy LED light effects and a comfortable shape for lengthy gaming sessions."
Here is some more Tech News from around the web:
- Ozone Argon Ocelote World @ techPowerUp
- Razer Mamba Chroma Tournament Edition Review @ Bjorn3d
- CM Storm Quick Fire XTi Mechanical Keyboard @ eTeknix
- E-Blue Mazer K727 Mechanical Gaming Keyboard @ Bjorn3d
- Corsair Strafe Gaming Keyboard @ techPowerUp
Subject: Mobile | September 30, 2015 - 02:33 PM | Sebastian Peak
Tagged: X12 Modem, SoC, snapdragon 820, qualcomm, phones, mu-mimo, mobile, LTE, cell phones
The upcoming Snapdragon 820 is shaping up to be a formidable SoC after the disappointing response to the previous flagship, the Snapdragon 810, which ended up in far fewer devices than expected for reasons still shrouded in mystery and speculation. One of the biggest aspects of the upcoming 820 is Qualcomm's new X12 modem, which will provide the most advanced LTE connectivity seen to date when the SoC launches. The X12 features Cat 12 LTE on the downlink for speeds up to 600 Mbps, and Cat 13 on the uplink for up to 150 Mbps.
LTE connectivity isn’t the only new thing here, as we see from this slide there is also tri-band Wi-Fi supporting 2x2 MU-MIMO.
“This is the first publicly announced processor for use in mobile devices to support LTE Category 12 in the downlink and Category 13 in the uplink, providing up to 33 percent and 200 percent improvement over its predecessor’s download and upload speeds, respectively.”
The specifications for this new modem are densely packed:
- Cat 12 (up to 600 Mbps) in the downlink
- Cat 13 (up to 150 Mbps) in the uplink
- Up to 4x4 MIMO on one downlink LTE carrier
- 2x2 MU-MIMO (802.11ac)
- Multi-gigabit 802.11ad
- LTE-U and LTE+Wi-Fi Link Aggregation (LWA)
- Next Gen HD Voice and Video calling over LTE and Wi-Fi
- Call Continuity across Wi-Fi, LTE, 3G, and 2G
- RF front end innovations
- Advanced Closed Loop Antenna Tuner
- Qualcomm RF360™ front end solution with CA
- Wi-Fi/LTE antenna sharing
Rumored phones that could end up running the Snapdragon 820 with this X12 modem include the Samsung Galaxy S7 and around 30 other devices, though final word is of course pending on shipping hardware.
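The quoted improvement figures check out against the predecessor X10 modem's LTE Cat 9 ratings (assumed here to be 450 Mbps down and 50 Mbps up, per Qualcomm's published Cat 9 peak rates); a quick sanity check:

```python
# Peak LTE rates for the new X12 modem (from the spec list above), in Mbps
x12_down, x12_up = 600, 150  # Cat 12 downlink / Cat 13 uplink

# Predecessor X10 modem, LTE Cat 9 (assumed figures), in Mbps
x10_down, x10_up = 450, 50

down_gain = (x12_down - x10_down) / x10_down * 100  # percent improvement
up_gain = (x12_up - x10_up) / x10_up * 100

print(f"Downlink improvement: {down_gain:.0f}%")  # matches the quoted "33 percent"
print(f"Uplink improvement: {up_gain:.0f}%")      # matches the quoted "200 percent"
```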
Subject: General Tech | September 30, 2015 - 02:31 PM | Jeremy Hellstrom
Tagged: gaming, fallout 4
Fallout 4 is sounding less and less like a Fallout game and more like a game which happens to bear the name Fallout. Apparently the skill system which has been a core of Fallout is confusing people, although how is unclear, and the example given is rather poor: "What's better, the Charisma SPECIAL, or the Speech Skill?", considering you can't have more than 10 Charisma. Perhaps it is too early to be negative. There will be 70 perks, 10 for each SPECIAL stat, and each perk will have five ranks to increase its effectiveness. Your perks are limited by the stat: if you have a Perception of 7 then you will never be able to gain the perks associated with levels 8 and higher; then again, if you have a stat of 10 at level 1, nothing is stopping you from taking a level 10 perk right away.
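The gating rule described above is simple enough to sketch. This is a hypothetical model (function and variable names are mine, not Bethesda's): each point in a SPECIAL stat unlocks one more perk slot in that stat's column of ten.

```python
# Hypothetical sketch of Fallout 4's perk gating as described above:
# a stat of 7 makes perk slots 1-7 reachable and leaves 8-10 locked.

SPECIAL = ["Strength", "Perception", "Endurance", "Charisma",
           "Intelligence", "Agility", "Luck"]
PERKS_PER_STAT = 10  # 7 stats x 10 perks = 70 perks total

def available_perks(stats: dict) -> dict:
    """Return, per stat, which perk slots (1-10) a character can take."""
    return {stat: list(range(1, min(value, PERKS_PER_STAT) + 1))
            for stat, value in stats.items()}

character = {"Perception": 7, "Charisma": 10}
unlocked = available_perks(character)
print(unlocked["Perception"])  # slots 1-7; slots 8-10 stay locked
print(unlocked["Charisma"])    # a 10 stat opens the whole column at level 1
```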
There are going to be a lot of apparent differences in Fallout 4, and it will be interesting to see how they affect gameplay. Excitement is waning for some long-time fans, but perhaps for gamers new to the series who are in love with crafting and base management, and are easily confused by numbers, this will be a perfect introduction to the wasteland. Follow the link to RPS to see the video explaining the new system.
"Here’s the big news: as many suspected, Skills are indeed gone, with their effects rolled into a bounteous system of perks with levels of their own. I’ll explain."
Here is some more Tech News from around the web:
- BATTLETECH by Harebrained Schemes LLC @ Kickstarter
- Killing Floor 2 - NVIDIA FleX Technology @HiTech Legion
- SC2: Legacy Of The Void Trailer Pledges Its Life For Aiur @ Rock, Paper, SHOTGUN
- Chess, guns, and chainswords collide in Warhammer 40,000: Regicide @ The Tech Report
- Made It! 80 Days Out On PC Today @ Rock, Paper, SHOTGUN
- SafeDisc, SecuROM DRM support removed from Windows 8, 7, Vista @ HEXUS
- What I Want From The Next BioShock @ Rock, Paper, SHOTGUN
- Humble Indie Bundle 15: Gang Beasts, Skullgirls, More @ Rock, Paper, SHOTGUN
Subject: General Tech | September 30, 2015 - 01:04 PM | Jeremy Hellstrom
Tagged: amd, carrizo pro, Godavari Pro, 28nm, hp, elitebook
The Carrizo-based AMD Pro A12 APU is going to be familiar to anyone who read our coverage of the non-Pro Carrizo models. The A12 will have a boost clock of 3.4GHz, eight 800MHz Radeon R7 cores, 2MB of L2 cache, and hardware-based HEVC decoding, exactly like the FX-8800P. Indeed, there is nothing obvious that differentiates the two processors apart from AMD's tag line that the Pro models are designed for corporate desktops and laptops. The Inquirer lists three laptops using the new mobile processor that should already be available: the HP EliteBook 725, 745 and 755. No news yet on Godavari Pro-powered desktops.
"AMD HAS ANNOUNCED its "most powerful" line of Pro A-Series mobile and desktop processors, formerly codenamed Carrizo Pro and Godavari Pro."
Here is some more Tech News from around the web:
- Google literally dangles its new dongle in front of gasping TV audiences @ The Register
- Hack Anything into a Phone @ Hack a Day
- Critical WinRAR flaw puts a nation of unzippers in harm's way @ The Inquirer
- Microsoft eats its Dynamics CRM young with Adxstudio buy @ The Register
- New Attack Bypasses Mac OS X Gatekeeper @ Slashdot
- Linux-powered botnet can kick out a huge and persistent DoS attack @ The Inquirer
- AVM FRITZ!Powerline 546E WLAN Adapter Review @ NikKTech