Creatively testing GPUs with Google's Tilt Brush

Subject: Graphics Cards | August 23, 2016 - 01:43 PM |
Tagged: amd, nvidia, Tilt Brush, VR

[H]ard|OCP continues their foray into testing VR applications, this time moving away from games to try out the rather impressive Tilt Brush VR drawing application from Google.  If you have yet to see this software in action, it is quite incredible, although you still require an artist's talent and practical skills to create true 3D masterpieces. 

Artistic merit may not be [H]'s strong suit, but testing how well a GPU can power VR applications certainly lies within their bailiwick.  Once again they tested five NVIDIA GPUs and a pair of AMD cards for dropped frames and reprojection caused by a drop in FPS.

1471635809gU37bh4rad_6_1.jpg

"We are changing gears a bit with our VR Performance coverage and looking at an application that is not as GPU-intensive as those we have looked at in the recent past. Google's Tilt Brush is a virtual reality application that makes use of the HTC Vive head mounted display and its motion controllers to allow you to paint in 3D space."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

What dwells in the heart of HoloLens? Now we all know!

Subject: General Tech | August 23, 2016 - 12:40 PM |
Tagged: hololens, microsoft, Tensilica, Cherry Trail, hot chips

At Hot Chips, Microsoft revealed information about the internals of the new holographic processor used in their HoloLens, the first peek we have had at the chip.  The new headset is another win for Tensilica, as they provide the DSP and instruction extensions; previously we have seen them work with VIA to develop an SSD controller and with AMD on TrueAudio solutions.  Each of the 24 cores is hardwired for a different task, offering more efficient processing than software running on flexible hardware.

The processing power for your interface comes from a 14nm Cherry Trail processor with 1GB of DDR, and yes, your apps will run on Windows 10.  For now the details are still sparse; there is still a lot to be revealed about Microsoft's answer to VR.  Drop by The Register for more slides and info.

hololens_large.jpg

"The secretive HPU is a custom-designed TSMC-fabricated 28nm coprocessor that has 24 Tensilica DSP cores. It has about 65 million logic gates, 8MB of SRAM, and a layer of 1GB of low-power DDR3 RAM on top, all in a 12mm-by-12mm BGA package. We understand it can perform a trillion calculations a second."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

AMD's 7870 rides again, checking out the new cooler on the A10-7870K

Subject: Processors | August 22, 2016 - 05:37 PM |
Tagged: amd, a10-7870K

Leaving aside the questionable naming, let's focus on the improved cooler on this ~$130 APU from AMD.  Neoseeker fired up the fun-sized, 125W-rated cooler on top of the A10-7870K and were pleasantly surprised at the lack of noise, even under load.  Encouraged by the performance, they overclocked the chip by 500MHz to 4.4GHz and were rewarded with a stable and still very quiet system.  The review focuses more on the improvements the new cooler offers than on the APU itself, which has not changed.  Check out the review if you are considering a lower cost system that only speaks when spoken to.

14.jpg

"In order to find out just how much better the 125W thermal solution will perform, I am going to test the A10-7870K APU mounted on a Gigabyte F2A88X-UP4 motherboard provided by AMD with a set of 16 GB (2 x 8) DDR3 RAM modules set at 2133 MHz speed. I will then run thermal and fan speed tests so a comparison of the results will provide a meaningful data set to compare the near-silent 125W cooler to an older model AMD cooling solution."

Here are some more Processor articles from around the web:

Processors

Source: Neoseeker

A trio of mechanical keyboards from AZIO, the new MGK L80 lineup

Subject: General Tech | August 22, 2016 - 03:14 PM |
Tagged: AZIO, MGK L80, Kailh, gaming keyboard, input

The supply of mechanical keyboards continues to grow; once Cherry MX was the only supplier of switches and only a few companies sold the products, but now we have a choice of manufacturer as well as switch type, beyond the usual Red, Brown, Blue and so on.  AZIO chose to use Kailh switches in their MGK L80 lineup, offering your choice of click type and also including a wrist rest for those who desire such a thing.  Modders Inc tested out the three models on offer; they are a bit expensive but do offer a solid solution for your mechanical keyboard desires.

IMG_9339.jpg

"The MGK L80 series is the latest line of gaming keyboards manufactured by AZIO. Available in red, blue or RGB backlighting, the MGK L80 offers mechanical gaming comfort with a choice of either Kailh brown or blue switch mounted on an elegant brushed aluminum surface."

Here is some more Tech News from around the web:

Tech Talk

Source: Modders Inc

Use Bing in Edge for 30 hours a month and get ...

Subject: General Tech | August 22, 2016 - 01:26 PM |
Tagged: microsoft, microsoft rewards, windows 10, bing, edge

If you remember Bing Rewards then this will seem familiar; otherwise, the gist of the deal is that if you browse in Edge and use Bing to search for 30 hours every month, you get a bribe similar to what credit card companies offer.  You can choose between Skype credit, ad-free Outlook or Amazon gift cards, perhaps for aspirin to ease your Bing-related headache, if such things seem worth your while.  The Inquirer points out that this is another reminder that Microsoft tracks all usage of Edge; otherwise they would not be able to verify the amount of Bing you used. 

Then again, to carry on the credit card analogy ...

Bing-logo-2013-880x660.png

"Microsoft Rewards is a rebrand of Bing Rewards, the firm's desperate attempt to get people using the irritating default search engine, and sure enough the bribes for using Edge apply only if you use Bing too."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer

Mozilla Publishes WebVR 1.0 to Nightly Releases

Subject: General Tech | August 20, 2016 - 05:36 PM |
Tagged: mozilla, webvr, Oculus

Earlier this month, the W3C published an Editor's Draft for WebVR 1.0. The specification has not yet been ratified, but the proposal is backed by engineers from Mozilla and Google. It enables the use of VR headsets in the web browser, including all the security required, such as isolating input to a single tab (in case you need to input a password while the HMD is on your face).

Mozilla_Firefox_logo_2013.png

Firefox Nightly, as of August 16th, now supports the draft 1.0 specification.

The browser currently supports the Oculus CV1 and DK2 on Windows. It does not work with the DK1, although Oculus provided backers of that Kickstarter with a CV1 anyway, and it does not (yet) support the HTC Vive. It also only deals with the headset itself, not any motion controllers. I guess, if your application requires this functionality, you will need to keep working on native applications for a little while longer.
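
If you are curious what this looks like from the web developer's side, here is a minimal sketch against the WebVR 1.0 draft API as it existed at the time; the canvas handling and the loose typing are illustrative assumptions rather than anything Mozilla prescribes.

```typescript
// Minimal WebVR 1.0 sketch: find a headset and drive a per-eye render loop.
// Assumes a WebGL canvas already exists; types are cast loosely because the
// draft API was not yet part of standard browser type definitions.
async function enterVR(canvas: HTMLCanvasElement): Promise<void> {
  const nav = navigator as any;
  if (!nav.getVRDisplays) {
    console.log("WebVR not available in this browser build");
    return;
  }

  const displays = await nav.getVRDisplays();
  if (displays.length === 0) {
    console.log("No VR display detected");
    return;
  }

  const display = displays[0];          // e.g. Oculus CV1 or DK2 on Windows
  await display.requestPresent([{ source: canvas }]);

  const frameData = new (window as any).VRFrameData();

  const onFrame = () => {
    display.getFrameData(frameData);    // per-eye view/projection matrices + pose
    // ... render the left eye with frameData.leftViewMatrix / leftProjectionMatrix,
    //     and the right eye with the corresponding right-eye matrices ...
    display.submitFrame();              // hand the finished frame to the compositor
    display.requestAnimationFrame(onFrame);
  };
  display.requestAnimationFrame(onFrame);
}
```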

Source: Mozilla

GlobalFoundries Will Allegedly Skip 10nm and Jump to Developing 7nm Process Technology In House (Updated)

Subject: Processors | August 20, 2016 - 03:06 PM |
Tagged: Semiconductor, lithography, GLOBALFOUNDRIES, global foundries, euv, 7nm, 10nm

UPDATE (August 22nd, 11:11pm ET): I reached out to GlobalFoundries over the weekend for a comment and the company had this to say:

"We would like to confirm that GF is transitioning directly from 14nm to 7nm. We consider 10nm as more a half node in scaling, due to its limited performance adder over 14nm for most applications. For most customers in most of the markets, 7nm appears to be a more favorable financial equation. It offers a much larger economic benefit, as well as performance and power advantages, that in most cases balances the design cost a customer would have to spend to move to the next node.

As you stated in your article, we will be leveraging our presence at SUNY Polytechnic in Albany, the talent and know-how gained from the acquisition of IBM Microelectronics, and the world-class R&D pipeline from the IBM Research Alliance—which last year produced the industry’s first 7nm test chip with working transistors."

An unexpected bit of news popped up today via TPU alleging that GlobalFoundries is not only developing 7nm technology (expected), but that the company will skip production of the 10nm node altogether in favor of jumping straight from the 14nm FinFET technology it licensed from Samsung to 7nm manufacturing based on its own in-house process.

Reportedly, the move to 7nm would offer 60% smaller chips at three times the design cost of 14nm, which is to say that this would be both an expensive and impressive endeavor.  Aided by Extreme Ultraviolet (EUV) lithography, GlobalFoundries expects to be able to hit 7nm production sometime in 2020, with prototyping and limited use of EUV in the year or so leading up to it.  The in-house process tech is likely thanks to the research being done at the APPC (Advanced Patterning and Productivity Center) in Albany, New York, along with the engineering expertise, design patents, and technology (e.g. ASML NXE 3300 and 3300B EUV scanners) GlobalFoundries picked up when it acquired IBM Microelectronics.  The APPC is reportedly working simultaneously on research and development of manufacturing methods (especially EUV, where an extremely short wavelength of ultraviolet light, around 13.5nm, is used to pattern silicon) and on supporting production of chips at GlobalFoundries' "Malta" fab in New York.

APPC in Albany NY.jpg

Advanced Patterning and Productivity Center in Albany, NY, where GlobalFoundries, SUNY Poly, IBM engineers, and other partners are forging a path to 7nm and beyond in semiconductor manufacturing. Photo by Lori Van Buren for the Times Union.

Intel's Custom Foundry Group will start pumping out ARM chips in early 2017, followed by Intel's own 10nm Cannon Lake processors in 2018, and Samsung will be offering up its own 10nm node as soon as next year.  Meanwhile, TSMC has reportedly already taped out 10nm wafers, will begin production in late 2016/early 2017, and claims that it will hit 5nm by 2020.  With its rivals all expecting production of 10nm chips as soon as Q1 2017, GlobalFoundries will be at a distinct disadvantage for a few years and will have only its 14nm FinFET (from Samsung) and possibly its own 14nm tech to offer until it gets 7nm production up and running (hopefully!).

Previously, GlobalFoundries has stated that:

“GLOBALFOUNDRIES is committed to an aggressive research roadmap that continually pushes the limits of semiconductor technology. With the recent acquisition of IBM Microelectronics, GLOBALFOUNDRIES has gained direct access to IBM’s continued investment in world-class semiconductor research and has significantly enhanced its ability to develop leading-edge technologies,” said Dr. Gary Patton, CTO and Senior Vice President of R&D at GLOBALFOUNDRIES. “Together with SUNY Poly, the new center will improve our capabilities and position us to advance our process geometries at 7nm and beyond.” 

If this news turns out to be correct, it is an interesting move and certainly a gamble.  However, I think it is a gamble that GlobalFoundries needs to take to be competitive.  I am curious how this will affect AMD, though.  While I had expected AMD to stick with 14nm for a while, especially for Zen/CPUs, will this mean that AMD will have to go to TSMC for its future GPUs, or will contract limitations (if any? I think they have a minimum amount they need to order from GlobalFoundries) mean that GPUs will remain at 14nm until GlobalFoundries can offer its own 7nm?  I would guess that Vega will still be 14nm, but Navi in 2018/2019?  I guess we will just have to wait and see!

Source: TechPowerUp

IDF 2016: G.Skill Shows Off Low Latency DDR4-3333MHz Memory

Subject: Memory | August 20, 2016 - 01:25 AM |
Tagged: X99, Samsung, ripjaws, overclocking, G.Skill, ddr4, Broadwell-E

Early this week at the Intel Developer Forum in San Francisco, California, G.Skill showed off new low latency DDR4 memory modules for desktops and notebooks. The company launched two Trident series DDR4-3333 MHz kits and one Ripjaws-branded DDR4-3333 MHz SO-DIMM kit. While these speeds are not close to the fastest we have seen from them, these modules offer much tighter timings. All of the new memory modules use Samsung 8Gb chips and will be available soon.

On the desktop side of things, G.Skill demonstrated a 128GB (8x16GB) DDR4-3333 kit with timings of 14-14-14-34 running on an Asus ROG Rampage V Edition 10 motherboard with an Intel Core i7 6800K processor. They also showed a 64GB (8x8GB) kit clocked at 3333 MHz with timings of 13-13-13-33 running on a system with the same i7 6800K and an Asus X99 Deluxe II motherboard.

128GB 3333MHz CL14 demo.JPG

G.Skill demonstrating 128GB DDR4-3333 memory kit at IDF 2016.

In addition to the desktop DIMMs, G.Skill showed a 32GB Ripjaws kit (2x16GB) clocked at 3333 MHz running on an Intel Skull Canyon NUC. The SO-DIMM had timings of 16-18-18-43 and ran at 1.35V.

Nowadays lower latency is not quite as important as it once was, but there is still a slight performance advantage to be had from tighter timings, and pure clock speed is not the only RAM metric that matters. Overclocking can get you lower CAS latencies (sometimes at the cost of more voltage), but if you are not into that tedious process and are buying RAM anyway, you might as well go for the modules with the lowest latencies out of the box at the clock speeds you are looking for. I am not sure how popular RAM overclocking is these days outside of benchmark runs and extreme overclockers, though, to be honest.
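
For a rough sense of what those timings are worth, first-word latency can be estimated from the CAS latency and the data rate; the sketch below simply plugs in the kits mentioned above (the DDR4-2133 CL15 comparison kit is an assumed, typical JEDEC configuration, not one G.Skill showed).

```typescript
// Back-of-the-envelope CAS latency in nanoseconds:
//   cycle time (ns) = 2000 / data rate (MT/s), since DDR transfers twice per clock
//   first-word latency ≈ CAS latency in cycles * cycle time
function casLatencyNs(casCycles: number, dataRateMTs: number): number {
  return casCycles * (2000 / dataRateMTs);
}

const kits = [
  { name: "DDR4-3333 CL14 (128GB desktop kit)", cl: 14, rate: 3333 },
  { name: "DDR4-3333 CL13 (64GB desktop kit)",  cl: 13, rate: 3333 },
  { name: "DDR4-3333 CL16 (32GB SO-DIMM kit)",  cl: 16, rate: 3333 },
  { name: "DDR4-2133 CL15 (typical JEDEC kit)", cl: 15, rate: 2133 }, // assumed comparison point
];

for (const k of kits) {
  console.log(`${k.name}: ~${casLatencyNs(k.cl, k.rate).toFixed(1)} ns`);
}
// Prints roughly 8.4, 7.8, 9.6, and 14.1 ns respectively.
```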

Technical Session OC System.JPG

Overclocking Innovation session at IDF 2016.

With regard to extreme overclocking, there was reportedly an "Overclocking Innovation" event at IDF where G.Skill and Asus overclocker Elmor achieved a new CPU overclocking record of 5,731.78 MHz on the i7 6950X, running on a system with G.Skill memory and an Asus motherboard. The company's DDR4 record of 5,189.2 MHz was not beaten at the event, G.Skill notes in its press release (heh).

Are RAM timings important to you when shopping for memory? What are your thoughts on the ever-increasing clocks of new DDR4 kits, given how overclocking works on the newer processors and motherboards?

Source: G.Skill

Lian Li Releases 550W and 750W SFX-L Power Supplies

Subject: Cases and Cooling | August 19, 2016 - 04:15 PM |
Tagged: Lian Li, SFX, SFX-L, power supply, PSU, small form-factor, 550W, 750w, PE-550, PE-750

First announced back in April, Lian Li's pair of SFX-L power supplies for small form-factor systems, the PE-550 and PE-750, have now been released. The pair offer fully-modular designs with flat, ribbon-style cables, and carry 80 Plus Gold and Platinum certifications respectively.

pe750.jpg

The PE-750 power supply

"The PE-550 is 80Plus Gold-rated for a maximum 89.5% efficiency; the PE-750 is 80Plus Platinum-rated for a maximum 92% efficiency. Both use a near-silent 120mm smart fan and minimize noise by operating fanlessly when output power is below 30%. Both PSUs use a single 12V rail design for the best possible stability under heavy system load, matched with myriad protection features to ensure reliable operation."

pe550.jpg

The PE-550 power supply

For more information and full specs, the product page for the 550W PE-550 is here, and the 750W PE-750 page is here. The PE-550 and PE-750 retail for $115 and $169 respectively, and both are available now.
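
As a rough illustration of what those peak efficiency figures mean at the wall (a simplification, since real-world efficiency varies with load):

```typescript
// Rough wall-power draw implied by the quoted peak efficiencies.
// Simplified: efficiency varies with load; these are the quoted maxima at full output.
function wallPowerWatts(dcLoadWatts: number, efficiency: number): number {
  return dcLoadWatts / efficiency;
}

// PE-550 at a full 550W DC load, 89.5% peak efficiency (80 Plus Gold)
console.log(wallPowerWatts(550, 0.895).toFixed(0)); // ≈ 615 W from the wall, ~65 W lost as heat

// PE-750 at a full 750W DC load, 92% peak efficiency (80 Plus Platinum)
console.log(wallPowerWatts(750, 0.92).toFixed(0));  // ≈ 815 W from the wall, ~65 W lost as heat
```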

Source: Lian Li

ASUS Announces the ROG GX800 4K G-SYNC Gaming Laptop with GTX 1080 SLI

Subject: Mobile | August 19, 2016 - 03:46 PM |
Tagged: UHD, ROG, Republic of Gamers, notebook, laptop, GX800, GTX 1080, gaming, g-sync, asus, 4k

ASUS has announced perhaps the most impressively equipped gaming laptop to date. Not only does the new ROG GX800 offer dual NVIDIA GeForce GTX 1080 graphics in SLI, but these cards are powering an 18-inch 4K display panel with NVIDIA G-SYNC.

gx800_1.jpg

Not enough? The system also offers liquid cooling (via the liquid-cooling dock) which allows for overclocking of the CPU, graphics, and memory.

gx800_2.jpg

"ROG GX800 is the world’s first 18-inch real 4K UHD gaming laptop to feature the latest NVIDIA GeForce GTX 1080 in 2-Way SLI. It gives gamers desktop-like gaming performance, silky-smooth gameplay and detailed 4K UHD gaming environments. The liquid-cooled ROG GX800 features the ROG-exclusive Hydro Overclocking System that allows for extreme overclocking of the processor, graphics card and DRAM. In 3DMark 11 and Fire Strike Ultra benchmark tests, a ROG GX800 equipped with the Hydro Overclocking System scored 76% higher than other gaming laptops in the market.

ROG GX800 comes with NVIDIA G-SYNC technology and has plug-and-play compatibility with leading VR headsets to allow gamers to enjoy truly immersive VR environments. It has the MechTAG (Mechanical Tactile Advanced Gaming) keyboard with mechanical switches and customizable RGB LED backlighting for each key."

gx800_3.jpg

Specifics on availability and pricing were not included in the announcement.

Source: ASUS

Serious mobile gaming power from ASUS, if you can afford it

Subject: Mobile | August 19, 2016 - 03:15 PM |
Tagged: asus, ROG, gtx 1070, G752VS OC Edition, pascal, gaming laptop

The mobile version of the GTX 1070, referred to here as the GTX 1070M even if NVIDIA doesn't use that name, is an interesting part sporting 128 more cores than the desktop version, albeit at a lower clock.  Hardware Canucks received the ASUS ROG G752VS OC Edition gaming laptop, which uses the mobile GTX 1070, overclocked by 50MHz on the core and by 150MHz on the 8GB of RAM, along with an i7-6820 running at 3.8GHz.  This particular model will set you back $3000 US and offers very impressive performance on either its 17.3" 1080p G-SYNC display or an external display of your choice.  The difference in performance between the new GTX 1070(M) and the previous GTX 980M is marked; check out the full review to see just how much better this card is ... assuming the price tag doesn't immediately turn you off.

GTX1070-NOTEBOOK-10.jpg

"The inevitable has finally happened: NVIDIA's Pascal architecture has made its way into gaming notebooks....and it is spectacular. In this review we take a GTX 1070-totting laptop out for a spin. "

Here are some more Mobile articles from around the web:

More Mobile Articles

Now we know what happened to Josh's stream; does your camera do YUY2 encoding?

Subject: General Tech | August 19, 2016 - 01:06 PM |
Tagged: yuy2, windows 10, skype, microsoft, idiots

In their infinite wisdom, Microsoft has disabled MJPEG and H.264 encoding on USB webcams for Skype in their Adversary Update to Windows 10, leaving YUY2 encoding as your only choice.  The supposed reasoning behind this is to ensure that there is no duplication of encoding which could lead to poor performance; ironically, the result of this change is poor performance for the majority of users, such as Josh.  Supposedly there will be a fix released sometime in September, but for now the only option is to roll back your AU installation, assuming you are not already past the 10-day deadline.  You can thank Brad Sams over at Thurrott.com for getting to the bottom of the issue which has been plaguing Skype users, and pick up some more details in his post.
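
A quick back-of-the-envelope calculation shows why webcams compress in hardware in the first place, and why being stuck with uncompressed YUY2 is a problem; the 1080p30 figures below are illustrative assumptions, not any specific camera:

```typescript
// YUY2 is uncompressed 4:2:2 at 16 bits (2 bytes) per pixel, so the raw
// stream bandwidth is simply width * height * 2 bytes * frames per second.
function yuy2MegabitsPerSecond(width: number, height: number, fps: number): number {
  const bytesPerSecond = width * height * 2 * fps;
  return (bytesPerSecond * 8) / 1_000_000;
}

// Illustrative 1080p30 webcam (assumed values, not a specific camera)
console.log(yuy2MegabitsPerSecond(1920, 1080, 30).toFixed(0));
// ≈ 995 Mbps raw - far beyond what a USB 2.0 webcam can deliver,
// versus the tens of Mbps a hardware MJPEG or H.264 stream would need.
```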

4520-max_headroom_31.jpg

"Microsoft made a significant change with the release of Windows 10 and support for webcams that is causing serious problems for not only consumers but also the enterprise. The problem is that after installing the update, Windows no longer allows USB webcams to use MJPEG or H264 encoded streams and is only allowing YUY2 encoding."

Here is some more Tech News from around the web:

Tech Talk

Source: Thurrott

Gamescom 2016: Mount & Blade II: Bannerlord Video

Subject: General Tech | August 19, 2016 - 01:40 AM |
Tagged: mount & blade ii, taleworlds

Mount & Blade is quite a popular franchise in some circles. It is based around a fairly simple but difficult-to-master combat system, which mixes melee, blocking, and ranged attacks. These are balanced by reload time (and sometimes accuracy) to make all methods viable. A 100 vs 100 battle, including cavalry and other special units, is quite something to see. It is also a popular mod platform, although Warband's engine can be a little temperamental.

As such, there's quite a bit of interest in the upcoming Mount & Blade II: Bannerlord. The Siege game mode involves an attacking wave beating down a fortress, trying to open as many attack paths as possible, and eventually overrunning the defenders. The above video is from the defending perspective. Mechanically, it seems to have changed significantly from Warband, particularly from the Napoleonic Wars DLC that I'm used to. In that mod, attackers spawn infinitely until a time limit is reached. This version apparently focuses on single-life AI armies, which Warband had as Commander Battles.

Hmm. Still no release date, though.

AMD Announces TrueAudio Next

Subject: Graphics Cards | August 18, 2016 - 07:58 PM |
Tagged: amd, TrueAudio, trueaudio next

Using a GPU for audio makes a lot of sense. That said, the original TrueAudio was not really about that, and it didn't really take off. The API was only implemented in a handful of titles, and it required dedicated hardware that they have since removed from their latest architectures. It was not about using the extra horsepower of the GPU to simulate sound, although they did have ideas for “sound shaders” in the original TrueAudio.

amd-2016-true-audio-next.jpg

TrueAudio Next, on the other hand, is an SDK that is part of AMD's LiquidVR package. It is based around OpenCL; specifically, it uses AMD's open-source FireRays library to trace the paths that audio can take from source to receiver, including reflections. Treating sound as rays is a good approximation for high-frequency audio, and that range of frequencies is more useful for positional awareness in VR anyway.

Basically, TrueAudio Next has very little to do with the original.

Interestingly, AMD is providing an interface for TrueAudio Next to reserve compute units, but optionally (and under NDA). This allows audio processing to be unhooked from the video frame rate, provided that the CPU can keep both fed with actual game data. Since audio is typically a secondary thread, it could be ready to send sound calls at any moment. Various existing portions of asynchronous compute could help with this, but allowing developers to wholly reserve a fraction of the GPU should remove the issue entirely. That said, when I was working on a similar project in WebCL, I was looking to the integrated GPU, because it's there and it's idle, so why not? I would assume that, in actual usage, CU reservation would only be enabled if an AMD GPU is the only device installed.

Anywho, if you're interested, then be sure to check out AMD's other post on it, too.

Source: AMD

More space than even Jimmy Stewart would need to satisfy his voyeurism

Subject: Storage | August 18, 2016 - 02:59 PM |
Tagged: skyhawk, Seagate, rear window, hitchcock, 10TB

Seagate designed the 10TB SkyHawk HDD for recording video surveillance, adding firmware they refer to as ImagePerfect.  The drive is designed to handle 24/7 surveillance and supports a rated workload of 180TB a year for the length of the three-year warranty.  Constantly recording video means this drive will write far more often than in most other usage scenarios, with reads far less important.  eTeknix tried the drive out in their usual suite of benchmarks, it being somewhat difficult to set up a test to verify the claimed support for up to 64 simultaneous HD recordings.  If you are looking for durable storage at a reasonable price, and might even need more than eight drives' worth of storage, you should check the SkyHawk out.
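
To put that 180TB-per-year workload rating in perspective, it boils down to a fairly modest sustained write rate; the per-stream bitrate below is purely an assumed figure for illustration, not a Seagate specification:

```typescript
// Average sustained write rate implied by a 180TB/year workload rating.
const terabytesPerYear = 180;
const secondsPerYear = 365 * 24 * 3600;
const bytesPerSecond = (terabytesPerYear * 1e12) / secondsPerYear;
const megabitsPerSecond = (bytesPerSecond * 8) / 1e6;

console.log(megabitsPerSecond.toFixed(0)); // ≈ 46 Mbps of continuous writes

// At an assumed ~4 Mbps per HD surveillance stream (illustrative only),
// that write budget covers roughly a dozen cameras recording around the clock;
// larger multi-camera setups would spread recording across several drives.
console.log(Math.floor(megabitsPerSecond / 4)); // ≈ 11 streams
```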

Seagate_SkyHawk-Photo-top-angle.jpg

"I’ve recently had a look at the 10TB IronWolf NAS HDD from Seagate and today it is time to take a closer look at its brother, the brand new SkyHawk DVR and NVR hard disk drive with a massive 10TB capacity. Sure, you could use NAS optimized drives for simple video setups, but having a video and camera optimized surveillance disk does bring advantages. Especially when your recorded video is critical."

Here are some more Storage reviews from around the web:

Storage

Source: eTeknix

NVIDIA Officially Announces GeForce GTX 1060 3GB Edition

Subject: Graphics Cards | August 18, 2016 - 02:28 PM |
Tagged: nvidia, gtx 1060 3gb, gtx 1060, graphics card, gpu, geforce, 1152 CUDA Cores

NVIDIA has officially announced the 3GB version of the GTX 1060 graphics card, and it indeed contains fewer CUDA cores than the 6GB version.

GTX1060.jpg

The GTX 1060 Founders Edition

The product page on NVIDIA.com now reflects the 3GB model, and board partners have begun announcing their versions. The MSRP for this 3GB version is set at $199, and availability of partner cards is expected in the next couple of weeks. The two versions will be designated only by their memory size, and no other capacities of either card are forthcoming.

                        GeForce GTX 1060 3GB    GeForce GTX 1060 6GB
Architecture            Pascal                  Pascal
CUDA Cores              1152                    1280
Base Clock              1506 MHz                1506 MHz
Boost Clock             1708 MHz                1708 MHz
Memory Speed            8 Gbps                  8 Gbps
Memory Configuration    3GB                     6GB
Memory Interface        192-bit                 192-bit
Power Connector         6-pin                   6-pin
TDP                     120W                    120W

As you can see from the above table, aside from memory capacity the only specification that has changed is the CUDA core count, with base/boost clocks, memory speed and interface, and TDP all identical. As to performance, NVIDIA says the 6GB version holds a 5% performance advantage over this lower-cost version, which at $199 is 20% less expensive than the existing GTX 1060 6GB.
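
Taking NVIDIA's 5% figure and the 20% price gap at face value (which implies the familiar $249 MSRP for the 6GB card), a quick bit of arithmetic puts the value proposition in perspective; note the 3GB card also has exactly 10% fewer CUDA cores (1152 vs 1280):

```typescript
// Relative value of the 3GB card, taking NVIDIA's 5% performance delta at face value.
const price3GB = 199;
const price6GB = price3GB / (1 - 0.20);        // "20% less expensive" implies ≈ $249
const relativePerformance = 1 - 0.05;          // 3GB card ≈ 95% of the 6GB card

const perfPerDollarGain = (relativePerformance / (price3GB / price6GB)) - 1;
console.log(price6GB.toFixed(0));                        // ≈ 249
console.log((perfPerDollarGain * 100).toFixed(0) + "%"); // ≈ 19% more performance per dollar
```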

Source: NVIDIA

Intel's new SoC, the Joule

Subject: General Tech | August 18, 2016 - 02:20 PM |
Tagged: Intel, joule, iot, IDF 2016, SoC, 570x, 550x, Intel RealSense

Intel has announced the Joule, the follow-up to Edison and Curie, their current SoC devices.  They have moved away from the Quark processors they previously used to a current-generation Atom.  The device is designed to compete against NVIDIA's Jetson, as it is far more powerful than a Raspberry Pi and destined for different usage.  It will support Intel RealSense, perhaps appearing in the newly announced Project Alloy VR headset.  Drop by Hack a Day for more details on the two soon-to-be-released models, the Joule 570x and 550x.

intel-joule-1-2x1-720x360.jpg

"The high-end board in the lineup features a quad-core Intel Atom running at 2.4 GHz, 4GB of LPDDR4 RAM, 16GB of eMMC, 802.11ac, Bluetooth 4.1, USB 3.1, CSI and DSI interfaces, and multiple GPIO, I2C, and UART interfaces."

Here is some more Tech News from around the web:

Tech Talk

Source: Hack a Day

Podcast #413 - NVIDIA Pascal Mobile, ARM and Intel partner on 10nm, Flash Memory Summit and more!

Subject: Editorial | August 18, 2016 - 02:20 PM |
Tagged: video, podcast, pascal, nvidia, msi, mobile, Intel, idf, GTX 1080, gtx 1070, gtx 1060, gigabyte, FMS, Flash Memory Summit, asus, arm, 10nm

PC Perspective Podcast #413 - 08/18/2016

Join us this week as we discuss the new mobile GeForce GTX 10-series gaming notebooks, ARM and Intel partnering on 10nm, Flash Memory Summit and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts:  Allyn Malventano, Sebastian Peak, Josh Walrath and Jeremy Hellstrom

Program length: 1:29:39
  1. Week in Review:
  2. This episode of PC Perspective is brought to you by Casper!! Use code “PCPER”
  3. News items of interest:
    1. 0:42:05 Final news from FMS 2016
  4. Hardware/Software Picks of the Week
    1. Ryan: VR Demi Moore
  5. Closing/outro

Gigabyte BRIX Gaming UHD Combines 2.6L Chassis with Discrete GPU

Subject: Systems | August 17, 2016 - 04:37 PM |
Tagged: UHD, SFF, IDF 2016, idf, gigabyte, gaming, brix

While wandering around the exhibit area at this year’s Intel Developer Forum, I ran into our friends at Gigabyte showing off a brand new BRIX small form factor PC. The BRIX Gaming UHD takes the now-standard NUC/BRIX block shape and literally raises it up, extending the design vertically to allow for higher performance components and the added cooling capability needed to integrate them.

brixuhd06.jpg

The design of the BRIX Gaming UHD combines a brushed aluminum housing with a rubber base and bordering plastic sections to create a particularly stunning design that is both simple and interesting. Up top is a fan that pulls air through the entire chassis, running over the heatsink for the CPU and GPU. This is similar in function to the Mac Pro, though this is a much more compact device with a very different price point and performance target.

brixuhd08.jpg

Around the back you’ll find all the connections that the BRIX Gaming UHD supplies: three (!!) mini DisplayPort connections, a full size HDMI output, four USB 3.0 ports, a USB 3.1 connection, two wireless antenna ports, Gigabit Ethernet, and audio input and output. That is a HUGE amount of connectivity and more than many consumers’ current full-size desktops offer.

brixuhd07.jpg

The internals of the system are impressive and required some very custom design for cooling and layout.

brixuhd02.jpg

The discrete NVIDIA graphics chip (in this case the GTX 950) sits in one chamber, while the Core i7-6500HQ Skylake processor sits on the other side along with the memory slot and wireless card.

brixuhd04.jpg

Gigabyte measures the size of the BRIX Gaming UHD at 2.6 liters. Because of that compact space there is no room for hard drives: you get access to two M.2 2280 slots for storage instead. There are two SO-DIMM slots for DDR4 memory up to 2133 MHz, integrated 802.11ac wireless, and support for quad displays.

Availability and pricing are still up in the air, though early reports put the starting cost at $1300. Gigabyte has since updated me: the BRIX Gaming UHD will be available in October, and an accurate MSRP has not yet been set. It would not surprise me if this model never actually saw the light of day and Gigabyte instead waited for NVIDIA’s next low-powered Pascal-based GPU, likely dubbed the GTX 1050. We’ll keep an eye on the BRIX Gaming UHD from Gigabyte to see what else transpires, but it seems the trend of small form factor PCs that sacrifice less in terms of true gaming potential continues.

HP's Latest Omen Desktop Puts a Gaming System in a Cube

Subject: Systems | August 17, 2016 - 04:25 PM |
Tagged: PC, Omen 900, Omen, hp, gaming, desktop, cube, computer

HP has introduced a new pre-built gaming desktop, and while the Omen series has existed for a while, the new model offers a very different chassis design.

omen_01.jpg

This Omen isn't just cube-like, it's actually a cube (Image credit: HP)

Inside, the specifications look like a typical pre-built gaming rig, with processors up to an Intel Core i7 6700K and graphics options including AMD's Radeon RX 480 and the NVIDIA GeForce GTX 1080. Configurations on HP's online store start at $1799 for a version with a GTX 960, a 1TB spinning hard drive, and a single 8GB DIMM. (Curiously, though reported as the "Omen X", the current listing is for an "Omen 900".)

OMEN_X_Radeon_Performance.jpg

A look inside an AMD Crossfire configuration (Image credit: HP via The Verge)

HP is certainly no stranger to unusual chassis designs, as those who remember the Blackbird 002 (which Ryan stood on - and reviewed - here) and subsequent Firebird 803 systems will know. The Verge is reporting that HP will offer the chassis as a standalone product for $599, itself an unusual move for the company.

omen_2.jpg

(Image credit: HP)

The new Omen desktop goes on sale officially starting tomorrow.

Source: The Verge