
Turn that Note 7 off, it won't work in your Gear VR

Subject: General Tech | October 11, 2016 - 12:38 PM |
Tagged: Samsung, recall, gear vr, galaxy note 7

It is official: Samsung has halted production of the Note 7, and not just because it is likely impossible to insure a building in which the phones are manufactured or stored.  The recall of 2.5 million handsets damaged the company and its reputation, but the incidents of replacement units suffering the same catastrophic battery failures have spelled the end of this device.  Samsung suggests you immediately power down your device and contact your provider or retailer for a refund or for credit toward a different handset.

Ars Technica also spotted a pertinent message in the current update to the Gear VR headset, stating that support for the Note 7 has been discontinued and that the app can no longer be installed on a Note 7.  Thankfully there have been no reports of a battery failing while a Note 7 was inside a Gear VR, and this move should prevent that from ever happening.  Expect more statements from Samsung on this topic throughout the week.


"Oculus and Samsung have obviously realized this and has pushed out an update preventing the volatile phone from working with the Gear VR headset."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

Western Digital Gets Back in the SSD Game With Blue and Green SSDs!

Subject: Storage | October 11, 2016 - 11:50 AM |
Tagged: western digital, wdc, WD, ssd, Green, Blue

It has been over six years since we saw an SSD come out of Western Digital, but we suspected new ones might be coming after the company's recent acquisition of SanDisk. That day has come, and today we have two new SSD models announced by WD:


These new SSDs naturally use SanDisk 15nm TLC flash, but drive it with third-party controllers. The Blue employs a Marvell 88SS1074 controller while the Green uses a Silicon Motion SM2256S. The Blue carries the SATA 6Gbps-saturating specs typical of modern SSDs, while the Green is derated a bit. Detailed specifications are below:

  • Form Factors: 2.5"/7mm cased, M.2 2280
  • Endurance (Blue):
    • 250GB: 100 TBW
    • 500GB: 200 TBW
    • 1TB: 400 TBW
  • Power (Blue):
    • Slumber: 42mW-52mW
    • DEVSLP: 4.9mW-9.7mW
    • Average Active Power: 70mW
  • Warranty (Blue and Green): 3 years

(Image: WD Blue SSD specifications)

The WD Green is more budget minded and will be offered only in 120GB and 240GB capacities, with reduced endurance ratings of 40 TBW and 80 TBW, respectively.
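For context, a TBW rating translates into the more intuitive drive-writes-per-day (DWPD) figure over the 3-year warranty. Here is a quick sketch of that standard conversion, using only the endurance numbers quoted above:

```python
# Convert a TBW endurance rating into drive writes per day (DWPD)
# over the drive's warranty period. Figures from the specs above.

WARRANTY_YEARS = 3

def dwpd(tbw, capacity_gb, years=WARRANTY_YEARS):
    """Terabytes written / (capacity in TB * days of warranty)."""
    capacity_tb = capacity_gb / 1000
    days = years * 365
    return tbw / (capacity_tb * days)

for name, tbw, cap in [("Blue 250GB", 100, 250), ("Blue 500GB", 200, 500),
                       ("Blue 1TB", 400, 1000), ("Green 120GB", 40, 120),
                       ("Green 240GB", 80, 240)]:
    print(f"{name}: {dwpd(tbw, cap):.2f} DWPD")
# The Blues work out to ~0.37 full drive writes per day and the Greens
# to ~0.30, both typical figures for consumer TLC drives.
```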

Pricing (for the WD Blue SSD):

  • 250 GB $79.99
  • 500 GB $139.99
  • 1 TB $299.99

(Image: WD Blue and WD Green SATA SSDs)

The WD Green SSD will be available 'later this quarter', and we do not yet have pricing for that model, but it should come in at a lower cost than the Blue prices above. We have a Blue model in for testing and should see how it fares on our new storage suite later this week.

Press blast after the break.

Western Digital Refreshes Colorful My Passport and My Book Lines

Subject: Storage | October 11, 2016 - 10:22 AM |
Tagged: western digital, wdc, WD, my passport, my book

Western Digital has refreshed their My Passport and My Book lines with a new industrial design:


The My Passport line (pictured above) features a new design and colors. Capacities now extend all the way up to 4TB. Prices:

  • 1 TB $79.99
  • 2 TB $109.99
  • 3 TB $149.99
  • 4 TB $159.99

These feature password protection and AES-256 hardware encryption. There is also a 'My Passport for Mac' model which parallels the above series but comes pre-formatted for use with a Mac. Amazing that they are now fitting 4TB of capacity into a 2.5" enclosure.


Also up is a redesign of the My Book. This bookshelf-style drive is now a chunkier version of the My Passport products mentioned earlier. Thanks to helium-filled HelioSeal technology from HGST (recently acquired by Western Digital), capacities now extend up to 8TB on this line. Prices follow:

  • 3 TB $129.99
  • 4 TB $149.99
  • 6 TB $229.99
  • 8 TB $299.99

I like the more squared off design, especially for the My Book, as it should make them more stable and less likely to be tipped over by accidental bumps. These also support hardware encryption. All models of both the My Book and My Passport come with a 2-year limited warranty as well as backup software to help ease the process of automating your backups. 

Press blast after the break.

Subject: Storage
Manufacturer: Drobo

Introduction and Packaging

The Drobo 5D launched a few years ago and remains a pricey solution, running close to $600. That is due to the added complexity of its mSATA hot-data cache and other features, which drove the price higher than some potential buyers were happy with. Sure, the cache was nice, but many photographers and videographers edit their content on a faster internal SSD and only shift their media to external storage in bulk sequential file copies. These users don't necessarily need a caching tier built into their mass storage device; they just want good straight-line speed to offload their data as fast as possible.

With new management and a renewed purpose with a focus on getting lower cost yet performant products out there, Drobo relaunched their base 4-bay product in a third-generation form. We tested that unit back in December of 2014, and its performance was outstanding for a unit that typically runs in the mid-$200 price range. The price and performance were great, but things were a bit tight when trying to use Dual Disk Redundancy while limited to only four installed drives. A fifth bay would have certainly been handy, as would USB-C connectivity, which brings me to the subject of today’s review:


I present to you the Drobo 5C, essentially a 5-bay replacement for the 4-bay 3rd-gen Drobo. This will become the new base model, meaning there will no longer be any 4-bay models in Drobo's product lineup:

(Image: Drobo 5C lineup)

Read on for our review of the new Drobo 5C!

Intel Launches Stratix 10 FPGA With ARM CPU and HBM2

Subject: Processors | October 10, 2016 - 02:25 AM |
Tagged: SoC, Intel, FPGA, Cortex A53, arm, Altera

Intel and its recently acquired Altera division have launched a new FPGA product built on Intel’s 14nm Tri-Gate process, featuring an ARM CPU, a 5.5-million-logic-element FPGA, and HBM2 memory in a single package. The Stratix 10 is aimed at data center, networking, and radar/imaging customers.

The Stratix 10 is an Altera-designed FPGA (field programmable gate array) with 5.5 million logic elements and a new HyperFlex architecture that optimizes registers, pipelining, and critical paths (feed-forward designs) to increase core performance and raise logic density to five times that of previous products. Further, the upcoming FPGA SoC reportedly can run at twice the core performance of the Stratix V, or use up to 70% less power than its predecessor at the same performance level.

(Image: Intel Altera Stratix 10)

The increases in logic density, clockspeed, and power efficiency are a combination of the improved architecture and Intel’s 14nm FinFET (Tri-Gate) manufacturing process.

Intel rates the FPGA at 10 TFLOPS of single precision floating point DSP performance and 80 GFLOPS/watt.

Interestingly, Intel is using an ARM processor to feed data to the FPGA rather than one of its own Quark or Atom processors. Specifically, the Stratix 10 pairs an ARM CPU with four Cortex A53 cores and four stacks of on-package HBM2 memory offering 1TB/s of bandwidth to feed the FPGA. There is also a “secure device manager” to ensure data integrity and security.
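Two back-of-the-envelope figures fall out of the stated specs; treat these as my inferences rather than numbers Intel has published directly:

```latex
% Implied power draw at peak single-precision throughput:
P \approx \frac{10\ \text{TFLOPS}}{80\ \text{GFLOPS/W}} = 125\ \text{W}
% Implied per-stack bandwidth, assuming the 1 TB/s is split evenly
% across the four HBM2 stacks (matching HBM2's 256 GB/s per-stack spec):
\frac{1\ \text{TB/s}}{4} \approx 256\ \text{GB/s per stack}
```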

The Stratix 10 is aimed at data centers and will be used in specialized tasks that demand high throughput and low latency. According to Intel, the chip is a good candidate for co-processor duty, offloading and accelerating encryption/decryption, compression/decompression, or Hadoop tasks. It can also be used to power specialized storage controllers and networking equipment.

Intel has started sampling the new chip to potential customers.

(Image: Intel Altera Stratix 10 FPGA SoC)

In general, FPGAs are great at highly parallelized workloads, efficiently taking huge numbers of inputs and processing the data in parallel through custom-programmed logic gates. An FPGA is essentially a program in hardware that can be rewired in the field (though depending on the chip it is not necessarily a “fast” process, and it can take hours or longer to switch things up heh). These processors are used in medical and imaging devices, high frequency trading hardware, networking equipment, signals intelligence (cell towers, radar, guidance, etc.), bitcoin mining (though ASICs stole that show a few years ago), and even password cracking. They can be almost anything you want, which gives them an advantage over traditional CPUs and graphics cards, though cost and increased coding complexity are prohibitive.
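To make the "program in hardware" idea concrete: the basic building block of most FPGAs is the lookup table (LUT), where any n-input boolean function is simply stored as a 2^n-entry truth table. A minimal software model of a 4-input LUT, purely illustrative and not tied to Altera's actual architecture:

```python
# Software model of a 4-input FPGA lookup table (LUT).
# "Programming" an FPGA amounts to filling in truth tables like this
# and wiring LUT outputs to other LUT inputs.

def make_lut4(func):
    """Precompute a 16-entry truth table for any 4-input boolean function."""
    table = [func((i >> 3) & 1, (i >> 2) & 1, (i >> 1) & 1, i & 1)
             for i in range(16)]
    def lut(a, b, c, d):
        return table[(a << 3) | (b << 2) | (c << 1) | d]
    return lut

# "Rewiring in the field": the same hardware block becomes an AND gate...
and4 = make_lut4(lambda a, b, c, d: a & b & c & d)
# ...or a parity checker, just by loading a different table.
parity4 = make_lut4(lambda a, b, c, d: a ^ b ^ c ^ d)

assert and4(1, 1, 1, 1) == 1 and and4(1, 0, 1, 1) == 0
assert parity4(1, 0, 1, 0) == 0 and parity4(1, 0, 0, 0) == 1
```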

The Stratix 10 stood out as interesting to me because of its claimed 10 TFLOPS of single precision performance, which is reportedly the important metric when it comes to training neural networks. In fact, Microsoft recently began deploying FPGAs across its Azure cloud computing platform and plans to build the “world’s fastest AI supercomputer.” The Redmond-based company’s Project Catapult saw Stratix V FPGAs deployed to nearly all of its Azure datacenters, where the programmable silicon serves as part of an “acceleration fabric” in Microsoft’s “configurable cloud” architecture. That fabric will initially accelerate the company’s Bing search and AI research efforts, and later be opened up to customers for their own applications.

It is interesting to see Microsoft going with FPGAs, especially as efforts to use GPUs for GPGPU and neural network training and inferencing duties have ramped up so dramatically over the years (with NVIDIA being the one pushing the latter). It may well be a good call on Microsoft’s part, as it could enable better performance, and researchers would be able to code their AI accelerator platforms down to the gate level to really optimize things. Using higher-level languages and cheaper GPU hardware does have a lower barrier to entry, though. I suppose it will depend on just how much Microsoft charges customers to use the FPGA-powered instances.

FPGAs are in kind of a weird middle ground and while they are definitely not a new technology, they do continue to get more complex and powerful!

What are your thoughts on Intel's new FPGA SoC?

Also read:

Source: Intel

Google WiFi Bringing Wireless Mesh Networking to the Home

Subject: Networking | October 9, 2016 - 01:42 AM |
Tagged: wifi, onhub, mesh, google wifi, google, 802.11ac

Building on the company’s OnHub WiFi router program, the search giant will offer its own mesh WiFi networking solution for home users later this year, aptly named “Google WiFi.” Available for pre-order in November, Google will offer single and triple packs of its puck-shaped, smartphone-controlled WiFi nodes.

(Image: Google WiFi node)

Google WiFi is a new product that takes advantage of an old technology: mesh networking. While most home setups rely on a single powerful access point to distribute the wireless signal throughout the home, mesh networks place nodes around the home so that their WiFi coverage overlaps. Devices can connect to any node and transition between nodes automatically. The nodes communicate with each other wirelessly and connect end devices to the router and Internet by taking the best path (fewest hops and/or highest signal strength). This model shares a disadvantage with WiFi repeater solutions: as much as 50% (or worse!) of the bandwidth can be lost at each hop, since each device's radio is used both for communicating with end devices and for the backbone to the router. The advantage, though, is that you need only find a power outlet to set up a mesh node; there is no need to run Ethernet or deal with Powerline or MoCA setups.
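That "best path" decision can be sketched as a simple cost function over candidate routes. Real mesh protocols (802.11s and vendor schemes) use more sophisticated airtime-based metrics, so treat this toy, where the node names and weights are made up for illustration, as the idea only:

```python
# Toy mesh route selection: prefer fewer hops, then penalize weak links.
# Purely illustrative; real mesh protocols track far more state.

def route_cost(hop_signals_dbm):
    """hop_signals_dbm: per-hop signal strengths in dBm (closer to 0 is better)."""
    hops = len(hop_signals_dbm)
    weakest = min(hop_signals_dbm)   # a route is only as good as its worst hop
    return hops * 10 + abs(weakest) / 10

candidates = {
    "node A -> router":           [-45],
    "node A -> node B -> router":  [-60, -40],
}
best = min(candidates, key=lambda name: route_cost(candidates[name]))
print(best)  # the single-hop route wins here (cost 14.5 vs 26.0)
```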

Fortunately, it looks as though Google has mitigated that disadvantage by including two radios. The circular Google WiFi nodes (which measure 4.17” in diameter and 2.7” tall) pack a dual-band 802.11ac WiFi chip that can operate at both 2.4 GHz and 5 GHz. Using the 5 GHz network for in-room end devices (PCs, smartphones, game consoles, Rokus, et al.) and the 2.4 GHz network for node-to-node communication will help eliminate a major bottleneck. There will likely still be some bandwidth lost, especially over multiple hops, due to interference, but it should be much less than a 50% loss per hop.
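The halving follows from a single radio splitting its airtime between the client link and the backhaul; a dedicated second radio avoids that split. A rough model, where the link rate and per-hop loss factors are assumptions for illustration rather than Google specs:

```python
# Rough model of effective throughput after N mesh hops.
# A shared radio alternates between serving clients and relaying upstream,
# so each hop roughly halves throughput; a dedicated backhaul radio loses
# much less per hop (just interference and protocol overhead).

def effective_mbps(link_mbps, hops, per_hop_factor):
    return link_mbps * (per_hop_factor ** hops)

LINK = 400  # assumed usable link rate in Mbps, not a Google spec

for hops in (1, 2):
    shared = effective_mbps(LINK, hops, 0.5)      # single-radio repeater
    dedicated = effective_mbps(LINK, hops, 0.85)  # assumed dual-radio loss
    print(f"{hops} hop(s): shared {shared:.0f} Mbps "
          f"vs dedicated backhaul {dedicated:.0f} Mbps")
# 1 hop(s): shared 200 Mbps vs dedicated backhaul 340 Mbps
# 2 hop(s): shared 100 Mbps vs dedicated backhaul 289 Mbps
```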

(Image: Google WiFi mesh diagram)

Each Google WiFi node features two Gigabit Ethernet ports that can be set up as LAN or WAN ports, Bluetooth, and an 802.11ac 2x2 WiFi radio with beamforming support. The nodes are powered by an unspecified quad-core processor, 512MB of DDR3L memory, and 4GB of eMMC flash storage. Each node apparently draws as much as 15 watts.

Of course, being Google, the Google WiFi can be controlled using an Android or iOS app that allows the administrator to pause WiFi on a per-device basis (e.g. set time limits for children), monitor device bandwidth usage and prioritize traffic, and automatically apply firmware updates to mitigate security risks. Additionally, Google WiFi automatically configures each node to use the best channel and band to get the best performance that supports all devices.

The nodes currently come only in white and are constructed of plastic, with blue LEDs around the middle of the puck-shaped device. Google WiFi will be available for pre-order in November. A single node will cost $129 while a three-pack will cost $299. Google is not first to the wireless mesh party, but it looks like it will be competitively priced (the three-pack is $200 cheaper than eero's, for example).

This looks like it might be a simple to setup solution if you or your family are currently running a single access point that can’t quite cover the entire home. I don’t really see this as a product for enthusiasts, but it might be worth recommending to people that just want WiFi that works with little setup. I will have to wait for reviews to say for sure though.

What are your thoughts on Google WiFi?

Also read:

Source: Google

Ubisoft Has Some Games for Free

Subject: General Tech | October 8, 2016 - 07:33 AM |
Tagged: ubisoft, pc gaming, free games, free

This has apparently been going on since June, but I just found out that Ubisoft is giving away some of their older titles for free. Like EA's “On the House” promotion, you can keep the title, but only if you add it to your UPlay account before the cutoff date. We're just before the change in months, so for the next few days you can add The Crew. Then, starting on October 12th, you can pick up the original Beyond Good and Evil for free.


As expected, you will need to have a UPlay account for this to work. Still, it's an otherwise free game, and a cult classic at that. While this promotion is officially for Ubisoft's 30th anniversary, and two games will go free after Beyond Good and Evil, Ubisoft took the opportunity to announce that a sequel to Beyond Good and Evil is being developed. I guess this means that we'll only have a couple more E3s where journalists write top ten “I want to see announced” lists containing Beyond Good and Evil 2. Yet another thing that will probably be released before Half-Life 2: Episode 3.

Source: Ubisoft

NVIDIA Releases GeForce 373.06 Drivers

Subject: Graphics Cards | October 8, 2016 - 07:01 AM |
Tagged: nvidia, graphics drivers, geforce

On Thursday, NVIDIA released their latest graphics drivers to align with Gears of War 4, Mafia 3, and Shadow Warrior 2. The drivers were published before each of these games launched, which lets gamers optimize their PCs ahead of time. Graphics vendors work with many big-budget studios during their development cycles, and, as usual, any tweaks found over those months and years land in this release.


Beyond tweaking for these games, NVIDIA has also announced a couple of fixes. If you were experiencing issues in Overwatch, then these new drivers fix how decals are drawn. The major fix claims to reduce inconsistent performance in multiple VR titles, which is very useful for these applications.

You can get these drivers from their website, or just install them from GeForce Experience.

Source: NVIDIA

Like matte rubber on your mice? Check out the Dream Machines DM1 Pro

Subject: General Tech | October 7, 2016 - 02:48 PM |
Tagged: dream machines, gaming mouse, DM1 Pro

Dream Machines is not a well-known brand but does have a line of computer equipment, including the newly released DM1 Pro gaming mouse.  The mouse uses a PMW3310DH optical sensor with DPI ranging from 400 to 5000 and indicates your current setting in a unique way: the colour displayed by the LED shows the current sensitivity.  There are six buttons, including the sensitivity toggle, with thumb buttons positioned for right-handed users.  Kitguru liked the mouse but was disappointed by the complete lack of software to customize it; take a peek and see what you think.
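As a sketch of how that LED scheme works, here is a hypothetical mapping of sensitivity steps to colours. The 400 and 5000 DPI endpoints come from the post, but the intermediate steps and every colour assignment are invented for illustration:

```python
# Hypothetical sketch of the DM1 Pro's LED indicator: DPI steps mapped
# to LED colours. The actual steps and colour assignments are the
# manufacturer's; these values are made up for illustration.

DPI_COLOURS = {
    400: "red",
    800: "green",
    1600: "blue",
    3200: "purple",
    5000: "white",
}

def next_dpi(current):
    """Cycle to the next DPI step when the toggle button is pressed."""
    steps = sorted(DPI_COLOURS)
    return steps[(steps.index(current) + 1) % len(steps)]

dpi = next_dpi(400)
print(dpi, DPI_COLOURS[dpi])  # 800 green
```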


"The latest mouse to come in for review is the Dream Machines DM1 Pro. A Polish company, you would be forgiven for not having heard of them. However, they supply laptops, speakers and mice so we were pleased to be sent the DM1 Pro mouse. Priced at £39 in the UK, it sports an ambidextrous design and optical sensor – but how does it fare in the real world?"

Here is some more Tech News from around the web:

Tech Talk

Source: Kitguru

Accessorize your Oculus, if you are still interested in the Rift

Subject: General Tech | October 7, 2016 - 01:56 PM |
Tagged: oculus rift

The Rift just got a lot more expensive to set up for those of you who prefer it to the Vive.  The accessory list has expanded, with new prices for those who want to move around in VR or want something more accurate than the basic remote.  Upgrading from the basic remote to the Touch controllers is $199, and the additional sensor to track your body movement is $79.  That is not too bad for added features, but Oculus had the incredibly bad taste to use a proprietary audio connector, which means that if you want upgraded audio fed from the same source as your video you need to fork over an additional $49.  As The Register points out, this is somewhat more than the originally quoted $350 price tag for a functional VR headset.
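Tallying the figures in this post and The Register's shows just how far a fully equipped Rift has drifted from that quote:

```python
# Total cost of a fully kitted Rift, per the figures quoted above.
headset = 599        # base Rift price
touch = 199          # Touch controller upgrade over the basic remote
extra_sensor = 79    # additional tracking sensor for body movement
earphones = 49       # audio upgrade via the proprietary connector

total = headset + touch + extra_sensor + earphones
print(f"Fully equipped: ${total} vs the originally quoted $350")
# Fully equipped: $926 vs the originally quoted $350
```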


"It's bad enough that the basic system costs $599 – almost double the expected price of $350. Today, the Facebook-owned biz revealed a range of accessories that will push its cost even higher."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Microsoft Focusing Efforts, Forming AI and Research Group

Subject: General Tech | October 6, 2016 - 11:37 PM |
Tagged: supercomputer, microsoft, deep neural network, azure, artificial intelligence, ai

Microsoft recently announced it would be restructuring 5,000 employees as it focuses its efforts on artificial intelligence under a new AI and Research Group. The Redmond giant is pulling computer scientists and engineers from Microsoft Research, the Information Platform, Bing, and Cortana groups, and the Ambient Computing and Robotics teams. Led by 20-year Microsoft veteran Harry Shum (who has worked in both research and engineering roles at the company), the new AI team promises to "democratize AI" and be a leader in the field with intelligent products and services.

(Image: AI Cortana)

It seems that "democratizing AI" is less about free artificial intelligence and more about making the technology accessible to everyone. The AI and Research Group plans to develop artificial intelligence to the point where it will change how humans interact with their computers (read: Cortana 2.0) with services and commands being conversational rather than strict commands, new applications baked with AI such as office and photo editors that are able to proof read and suggest optimal edits respectively, and new vision, speech, and machine analytics APIs that other developers will be able to harness for their own applications. (Wow that's quite the long sentence - sorry!)

Further, Microsoft wants to build the world's fastest AI supercomputer using its Azure cloud computing service. The Azure-powered AI will be available to everyone for their application and research needs (for a price, of course!). Microsoft certainly has the money, brain power, and computing power to throw at the problem, and this may be one of the major areas where looking to "the cloud" is a smart move: the up-front capital needed for the hardware, engineers, and support staff to do something like this in-house would be extremely prohibitive. It remains to be seen whether Microsoft will beat its competitors to being first, but it is certainly staking its claim and does not want to be left out completely.

“Microsoft has been working in artificial intelligence since the beginning of Microsoft Research, and yet we’ve only begun to scratch the surface of what’s possible,” said Shum, executive vice president of the Microsoft AI and Research Group. “Today’s move signifies Microsoft’s commitment to deploying intelligent technology and democratizing AI in a way that changes our lives and the world around us for the better. We will significantly expand our efforts to empower people and organizations to achieve more with our tools, our software and services, and our powerful, global-scale cloud computing capabilities.”

Interestingly, this announcement comes shortly after industry giants Amazon, Facebook, Google-backed DeepMind, IBM, and Microsoft founded the not-for-profit Partnership on AI, an organization that will collaborate on and research best practices for AI development and use (and hopefully how to teach them not to turn on us heh).

I am looking forward to the future of AI and the technologies it will enable!

Source: Microsoft

MSI Will Support Kaby Lake Processors On All 100-Series Motherboards

Subject: Motherboards | October 6, 2016 - 10:06 PM |
Tagged: Z170, msi, kaby lake, Intel B150, Intel, H110

MSI has announced that it will support Intel's next-generation "Kaby Lake" LGA 1151 processors via a BIOS update, and has posted new UEFI/BIOS builds on its website that add Kaby Lake support to all of its 100-series motherboards.


According to MSI, in addition to Kaby Lake support, the updates improve stability and overclocking potential. Currently, the following Z170, B150, and H110 chipset based motherboards have a BIOS update available.

  • Z170 Motherboards
    • Z170A GAMING M9 ACK
    • Z170A GAMING M7
    • Z170A GAMING M6
    • Z170A GAMING M5
    • Z170A-G45 GAMING
    • Z170A GAMING M3
    • Z170A GAMING PRO
    • Z170A TOMAHAWK
    • Z170M MORTAR
  • B150 Motherboards
    • B150 GAMING M3
    • B150M NIGHT ELF
    • B150M GAMING PRO
    • B150I GAMING PRO
    • B150M MORTAR
    • B150M BAZOOKA
    • B150M BAZOOKA D3
    • B150M GRENADE
  • H110 Motherboards
    • H110M GAMING
    • H110M GRENADE

To grab the latest BIOS, head over to https://www.msi.com/support#support_download and use the drop-down menus to find your motherboard model. The BIOS download will be toward the top of the list of available downloads.

MSI and ASUS have both announced Kaby Lake support on their existing motherboards, which is good to see. If leaks are true, Intel is readying the Z270 Express chipset for release in late 2016 or early 2017, but it is nice to know that you will not have to upgrade your motherboard just to get the latest Intel CPU.

Source: MSI

VR Lets Legally Blind Man Experience Clear Vision For First Time

Subject: General Tech | October 6, 2016 - 07:00 PM |
Tagged: virtual reality, htc vive, assistive technology

As technology continues to advance, virtual reality is slowly but surely becoming more of a reality. For many readers, VR is the next step in gaming and achieving an immersive (virtual) experience. For Jamie Soar, however, virtual reality offers something else: the chance to experience what it is like to have "normal" vision. Mr. Soar lives with a genetic and progressive eye condition called Retinitis Pigmentosa, as well as diplopia (double vision), which means that he has severely limited night and peripheral vision. Jamie uses a white cane for mobility and needs to get close to things like computer monitors and signs in order to read them.


EIC Ryan Shrout using the HTC Vive to enter a VR world (Job Simulator) during a live stream.

Enter the HTC Vive and its dual-lens design that puts the displays (and the virtual world) front and center. After donning the virtual reality headset at a PC World demo in the UK, Jamie was amazingly able to experience the virtual world in much the same way many people see the real one. His eyes were able to refocus on the close-up displays, and thanks to the illusion of depth created by the dual lenses, he was able to look around the virtual world and see everything clearly and in brilliant color, both near and far!


Via Blindness.org: an example of vision with Retinitis Pigmentosa in an advanced stage. Peripheral and night vision are generally the first to be lost, as the photoreceptors (rods) on the outer edges of the retina die.

In an interview with Upload VR, Mr. Soar had this to say to those with similar visual impairments:

“Try VR. Find a means to try it because I went so long without ever knowing that this extra dimension existed that you can see. Try out as many experiences as possible. It might not be for everyone but it might give people a lot more freedom or independence in what they do.”

This is a very cool story and I am excited for Mr. Soar. The aspiring music producer plans to continue experimenting with VR, and I hope that as the technology advances it can help him even more. My first thought jumped to Scott's desire to use VR for productivity work on an infinite virtual desktop, and how that could help Jamie compose and produce his music with the same – or better – benefits most people get from multiple-monitor setups, without having to lean in to each monitor. I do not have nearly the vision loss that Mr. Soar has, but I can definitely empathize with him on many points. I think it is awesome that he was able to test out VR and explore how he can use it to help him!

In my case I am more looking forward to AR (augmented reality) and future products built on things like OrCam, Microsoft's Seeing AI project (which I thought I wrote about previously but can not find via Google heh), and apps like AiPoly (iOS) that use neural networks to identify objects, people and their facial expressions, and even describe what is happening in natural language (we are not quite there yet, but we are definitely getting there).

Regardless of whether it is AR or VR, the advances in technology in just my 26 years have been amazing, and the assistive technology available now is unbelievable. The future is exciting indeed, and I can't wait to see what comes next!

Source: Upload VR

Thermaltake's new Toughpower PSU; its RGB skill causes 850 DPS

Subject: General Tech | October 6, 2016 - 05:46 PM |
Tagged: toughpower DPS G RGB, thermaltake, RGB, modular psu, 850W

Enough is enough, marketing departments!  The Toughpower branding is recognizable, but when it becomes an RGB addition to the DPS G-unit one starts to wonder if this is a PSU or a new professional gaming team.  Oh, lest we forget to mention it, the box proclaims this is indeed a VR Ready PSU; perhaps it provides 3D virtual electrons?   Apparently the DPS G portion indicates it is compatible with your cellphone, as the PSU provides both modular and mobile features.  Lastly, the RGB portion of the branding: if you guessed it has a fan capable of producing 256 different colours then you got it! It is even possible it creates airflow at the same time.

Does it actually work as a PSU?  Does anyone even care when it has all of these wonderous features?  Only [H]ard|OCP knows.


"Flashy lights are cool if you are into that kind of thing, but we want to know about the new Thermaltake power supply beyond the pretty hues of red, green, and blue. Toughpower units have weighed in well in the past, but how about in today's market as it is a lot more competitive now."

Here are some more Cases & Cooling reviews from around the web:


Source: [H]ard|OCP

Podcast #420 - REVEEN JUSTICE, Silverstone Strider 550W PSU, NVIDIA Xavier and more!

Subject: General Tech | October 6, 2016 - 03:26 PM |
Tagged: Xavier SoC, video, Silverstone Strider Platinum 550W, REVEEN JUSTICE, podcast, logitech, GTX 1050 Ti, Drobo 5C, Crimson 16.10.1, C922

PC Perspective Podcast #420 - 10/06/16

Join us this week as we discuss the REVEEN JUSTICE Air Cooler, Silverstone Strider 550W PSU, NVIDIA Xavier SoC, Google Pixel, Logitech C922, 1050Ti, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts:  Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom, and Sebastian Peak

Program length: 1:21:45

  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week
  4. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Gigabyte Launches GTX 1080 TT With Blower Style Cooler

Subject: Graphics Cards | October 6, 2016 - 03:17 PM |
Tagged: windforce, pascal, nvidia, GTX 1080, gigabyte

Gigabyte is launching a new graphics card with a blower-style cooler that it is calling the GTX 1080 TT. The card, which is likely based on the NVIDIA reference PCB, uses a single lateral-blower “WindForce Turbo Fan.” The orange and black shroud takes design cues from the company’s higher-end Xtreme Gaming cards, and it has a very Mass Effect / Halo Forerunner vibe to it.

The GV-N1080TTOC-8GD is powered by a single 8-pin PCI-E power connector and has a 180W TDP. Despite the single external power connector, the card still has a bit of overclocking headroom (a total of 225W per the PCI-E spec: 75W from the slot plus 150W from the 8-pin, though overdrawing on the 8-pin has been done before if the card's BIOS is not locked against it heh). External video outputs include one DVI, one HDMI, and three DisplayPorts. I wish the DVI port had been cut so that the blower cooler could have a much larger vent to exhaust air out of the case, but it is what it is.

(Image: Gigabyte GTX 1080 TT blower-style cooled graphics card)

Out of the box, the Gigabyte GTX 1080 TT runs its Pascal-based 2560-CUDA-core GPU at 1632 MHz base and 1772 MHz boost; in OC Mode the GPU runs at 1657 MHz base and 1797 MHz boost. The 8 GB of GDDR5X memory is left at the stock 10 GHz in either case. For comparison, reference clock speeds are 1607 MHz base and 1733 MHz boost. As factory overclocks go, these are not bad (they are usually at least this conservative).
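Expressed as percentages over the reference clocks quoted above, a quick check:

```python
# Percentage overclocks of the GTX 1080 TT versus NVIDIA reference clocks,
# using the MHz figures quoted above.
REF_BASE, REF_BOOST = 1607, 1733

for mode, base, boost in [("Stock", 1632, 1772), ("OC Mode", 1657, 1797)]:
    print(f"{mode}: base +{(base / REF_BASE - 1) * 100:.1f}%, "
          f"boost +{(boost / REF_BOOST - 1) * 100:.1f}%")
# Stock:   base +1.6%, boost +2.3%
# OC Mode: base +3.1%, boost +3.7%
```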

The heatsink uses three direct-contact 6mm copper heat pipes for the GPU, plus aluminum plates on the VRM and memory chips that transfer heat into aluminum fin channels; the blower fan at the back of the card pushes case air through those channels and out of the case. It may be possible to push the card beyond the OC Mode clocks, though it is not clear how stable boost clocks will be under load (or how loud the fan will get); we will have to wait for reviews on that. If you have a cramped case, this may be a decent GTX 1080 option that is cheaper than the Founders Edition design.

There is no word on pricing or an exact release date yet, but I would estimate it at around $640 at launch. 

Also read:

The GeForce GTX 1080 8GB Founders Edition Review - GP104 Brings Pascal to Gamers

CES 2016: Gigabyte Xtreme Gaming Series GPU with RGB LED Fans (video) @ PC Perspective

Source: TechPowerUp

Putting fingerprints on the Google Pixel

Subject: Mobile | October 6, 2016 - 01:20 PM |
Tagged: google, pixel, pixel xl, nougat, Android 7.1

The Inquirer had a chance to lay hands on the new Google Pixel and Pixel XL and has shared their experiences here.  We have covered the specs of the phone previously and so will not reiterate them here; check out Tim's coverage for the details.  The Inq's immediate impression upon grasping the phone is that it feels very much like a slimmer HTC 10, which they were not overly impressed by. That HTC phone rated 88 on DxOMark and the Pixel an 89, while the iPhone 7 garnered a rating of 86, if you follow that particular benchmark tool.  They had a strong feeling that Google may have missed too many marks on this phone to justify the pricing; read on to see if you agree with their experiences.


"On first impressions, we can't help but feel that the Pixel is a bit of a wasted opportunity. The handset has a largely boring design, doesn't offer much in the way of innovation and is expensive compared with previous Nexus smartphones."

Here are some more Mobile articles from around the web:

More Mobile Articles

Source: The Inquirer

Not everyone will be allowed to make fruit preserves; an interview with Blackberry

Subject: General Tech | October 6, 2016 - 12:36 PM |
Tagged: blackberry, Android, licensing

The Register sat down with Alex Thurber, a BlackBerry senior VP, to discuss the company's plans to license its particular flavour of Android to other phone manufacturers. Thurber has worked at Cisco, at McAfee after Intel's purchase of that company, and at a firewall company called WatchGuard, so he has some experience with locking down kit.  We will still see two more BlackBerry devices before the company finally stops selling hardware, but you should expect to see other brands running BlackBerry-licensed versions of Android soon.  They will carry NIAP (National Information Assurance Partnership) certification, the same certification that Samsung's KNOX and LG's GATE qualify for.  Drop by for a deeper look at what they discussed.


"BlackBerry says it won’t license its brand and security hardened Android “to any Tom Dick and Harry” as it tries to maintain the value of its brand."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

AMD Discusses Multi-GPU Frame Pacing in DirectX 12

Subject: Graphics Cards | October 5, 2016 - 09:01 PM |
Tagged: amd, frame pacing, DirectX 12

When I first read this post, it was on the same day that AMD released their Radeon Software Crimson Edition 16.10.1 drivers, although it was apparently posted the day prior. As a result, I thought that their reference to 16.9.1 was a typo, but it apparently wasn't. These changes have been in the driver for a month, at least internally, but it's unclear how much it was enabled until today. (The Scott Wasson video suggests 16.10.1.) It would have been nice to see it on their release notes as a new feature, but at least they made up for it with a blog post and a video.

If you don't recognize him, Scott Wasson used to run The Tech Report, and he shared notes with Ryan while we were developing our Frame Rating testing methodology. He focused on benchmarking GPUs by frame time, rather than frame rate, because the number of frames a user sees matters less than how smooth the resulting animation is. Our sites diverged on implementation, though: The Tech Report focused on software measurement, while Ryan determined that capturing and analyzing output frames, intercepted between the GPU and the monitor, would tell a more complete story. Regardless, Scott Wasson left his site to work for AMD last year, with the intent to lead User Experience.
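To make the frame time vs. frame rate distinction concrete, here is a minimal sketch using synthetic numbers (not from any actual benchmark run): two runs share nearly the same average FPS, but one stutters badly.

```python
# Why frame times tell you more than frame rates: two synthetic runs
# with (nearly) the same average FPS but very different smoothness.

def summarize(frame_times_ms):
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    p99 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    return avg_fps, p99

smooth = [16.7] * 100                 # steady ~60 FPS
stutter = [12.0] * 90 + [59.0] * 10   # same average, big spikes

for name, run in [("smooth", smooth), ("stutter", stutter)]:
    fps, p99 = summarize(run)
    print(f"{name}: {fps:.0f} avg FPS, 99th percentile frame time {p99:.1f} ms")
# Both land near 60 avg FPS, but the second run's worst frames take
# ~59 ms -- visible hitching that an FPS average completely hides.
```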

We're now seeing AMD announce frame pacing for DirectX 12 Multi-GPU.

This feature is particularly interesting because, depending on the multi-adapter mode, a lot of that control should be in the hands of the game developers. It seems that the three titles they announced, 3DMark: Time Spy, Rise of the Tomb Raider, and Total War: Warhammer, are using implicit linked multi-adapter, which basically maps to CrossFire. I'd be interested to see whether they can affect this in explicit mode via driver updates as well, but we'll need to wait and see on that (and there aren't many explicit-mode titles anyway -- basically just Ashes of the Singularity for now).

If you're interested to see how multi-GPU load-balancing works, we published an animation a little over a month ago that explains three different algorithms, and how explicit APIs differ from OpenGL and DirectX 11. It is also embedded above.

Source: AMD

AMD Releases Radeon Software Crimson Edition 16.10.1

Subject: Graphics Cards | October 5, 2016 - 08:37 PM |
Tagged: graphics drivers, amd

Earlier today, AMD released their Radeon Software Crimson Edition 16.10.1 drivers. These continue AMD's trend of releasing drivers alongside major titles, which, this time, are Mafia III (October 7th) and Gears of War 4 (October 11th). Both of these titles are still multiple days out, apart from a handful of insiders with advance copies, which is nice for gamers because it lets them optimize their machines ahead of time, on their own schedule, before launch.


The driver also includes a handful of interesting fixes. First, a handful of games, such as Overwatch, Battlefield 1, and Paragon, should no longer flicker when set to CrossFire mode. Also, performance issues in The Crew should be fixed with this release.

You can download AMD Radeon Software Crimson Edition 16.10.1 from their website.

Source: AMD